26th June 2020

Part 3: Get your baseline

This is the third in a series of blogs outlining an assessment plan for back-to-school 2020, after the coronavirus pandemic forced schools to close their buildings and deliver teaching remotely instead. The next blogs in this series will be released weekly, and you’ll be able to view them all here.

It’s shockingly quiet in most school buildings these days, even with certain year groups going back. The constant sound of voices and the continual movement of students and staff going from place to place have largely been replaced by silence. If we’re lucky, these conditions allow for a little more reflection than usual. If we take a few moments to prepare for the potential academic realities we’ll face in the new school year, we’ll be a lot better off. As we noted in our first blog, multiple predictions about back-to-school (BTS) 2020 have already been published. Will we see the dire ‘COVID-19 slide’ that some have predicted? If so, how significant will this slide be? The EEF is suggesting that “the gap at the end of primary school could widen by between 11% and 75% between March and September,” affecting children of disadvantaged families especially. But the truth is, we just don’t know until pupils return to school.

The authors of one recent analysis made some alarming predictions based on the typical “summer slide” (Kuhfeld and Tarasawa, 2020). However, findings about the impact of summer learning loss actually vary widely. Koury (2019) noted that “the recent research on summer learning loss is quite mixed, with some studies showing significant loss and others finding little evidence” (emphasis added). This raises a further question: to what extent do recent school closures truly reflect the dynamics of a typical summer?

During a typical summer, the majority of students are away from school and not necessarily engaged in learning activities of any kind. During COVID-19, by contrast, most students have had learning continue in some form, whether online or with printed material provided by their schools.

Another author recently mused about the potential academic impacts of COVID-19 by looking at longitudinal performance data around long-term school closures in Argentina, which were caused by extensive teacher strikes in the 1980s and 1990s (Barnum, 2020). The amount of time students were away from school had some parallels to recent events, but the context does not. The researcher who originally analysed the Argentinian data echoes an idea we noted above, explaining that conclusions likely cannot be drawn because “the situation today is much different,” as schools “have online learning” that “would at least mute part of the negative results” (Jaume cited in Barnum, 2020).

Educational heroes of the COVID-19 era

In some ways, the dire decline in student performance that some have projected unintentionally belittles the tremendous efforts that so many educators have made with different learning activities. And this has been no small feat. Most schools have had an ‘all hands on deck’ sense of urgency to make sure that learning continues, despite the sudden school closures. During the spring of 2020, a massive amount of schoolwork was completed, often via distance learning, and social media feeds are rife with evidence that many parents are also actively involved in keeping things going for their children academically. We are even offering schools the option to make this summer more academic than most by using myON.

It’s been a century since the last global pandemic. During this time, technology has transformed the world almost beyond recognition. We have more ways to support remote learning than ever before, and educators have made the most of these resources over the last few months to make this transition as smooth as possible. They are clearly among the heroes of the COVID-19 response!

An all-important first step

Let’s return to the dire predictions we discussed earlier. In a broad sense, the core of these predictions is accurate: we will likely see lower test scores and a wider variance in scores when the new school year begins this autumn. The primary concern, however, is the magnitude of the impact. There does not appear to be a viable way to predict the scale of the decline or the range of score variability with any degree of reliability. That said, the EEF’s analysis estimates that the pandemic will likely reverse the progress made in ‘closing the gap’ since 2011 (though effective remote learning should mitigate some of the effects).

Ultimately, however, these predictions do not matter. They will not change the realities we’ll face in the next academic year. Let’s not spend too much time wringing our hands over dire predictions; instead, let’s “control what we can control” and make plans for using our assessment data to guide teaching and learning in the new school year.

An excellent first step that school leaders can take right now is to get a clear picture of their ‘baselines’ for student proficiency and growth. In other words, take the time now to gather data you can use to compare performance during the more typical BTS 2019 with the anything-but-typical BTS 2020. Before we can fully understand the new realities of the 2020–2021 school year, we must have a firm grasp of how things were going before the disruptions.

Identifying metrics to compare

Most schools have key metrics on which they rely to make decisions and drive instruction. Some use Percentile Rank (PR) scores, while others favour Student Growth Percentiles (SGP), Reading Ages (RA), Scaled Scores (SS) or — most likely — a combination of these.

Other schools may be less sure of which scores to consult. For those seeking guidance, we suggest that you have scores that provide each of the following:

  • normative reference
  • proficiency-based reference (if available)
  • growth reference

Some sort of normative reference is critical because it provides a position within a national sample. While many schools gravitate toward Reading Age (RA) because it’s considered intuitive, we’d suggest the use of the Percentile Rank (PR) score. This is a norm-referenced score that indicates how a student is performing compared to their academic peers. The score is presented on a scale of 1 to 99, where 50 is average: a PR of 50 represents a student performing at or above 50% of their peer group. The PR considers the student’s year group, chronological age, and the time of year they are testing.

This allows you to compare students to what was typical in 2019: a drop in Percentile Rank gives an indication of the impact the COVID-19 closures have caused. Using the Star Screening report to contrast performance at BTS 2019 and BTS 2020 will also highlight where there is an opportunity to intervene.
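To make this concrete, below is a minimal sketch in Python of one way to compare average PR across the two baselines. The scores and variable names are purely illustrative assumptions, not an actual Star export format.

```python
from statistics import mean

# Illustrative PR scores for the same cohort at each baseline.
# In practice these would come from your Star Screening reports.
pr_bts_2019 = [23, 41, 50, 62, 78, 55, 34, 69, 47, 58]
pr_bts_2020 = [18, 35, 44, 60, 71, 49, 28, 66, 40, 52]

avg_2019 = mean(pr_bts_2019)
avg_2020 = mean(pr_bts_2020)

# A negative change indicates a drop relative to the 2019 baseline.
print(f"Average PR at BTS 2019: {avg_2019:.1f}")
print(f"Average PR at BTS 2020: {avg_2020:.1f}")
print(f"Change: {avg_2020 - avg_2019:+.1f} percentile points")
```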

Using the Star Screening report for national performance benchmarks

One helpful reference is a proficiency-based score. Given that Star Assessments are statistically linked to the KS2 SATs, many of our customers use benchmarks tied to SATs expectation levels when they view Star Screening Reports. Those reports, then, reflect data through the lens of likely performance on the eventual summative test, and they provide estimates of the number of students who fall above or below expectations. For secondary school students, the KS2 linking study is a great resource for Year 7 catch-up.

Using the Star Screening report for KS2 benchmarks

Proficiency-based scores, which frame things through the lens of a school’s students in relation to a criterion-based assessment, add an element not provided by normative scores. While an observation such as “We saw a 6-point drop in the average PR score among incoming Year 6s” provides a helpful norm-referenced perspective, the proficiency-based score can provide additional insight, such as “We now have 32 more students falling below benchmark.”
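As a rough sketch of how this might look in practice, the example below counts students falling below a benchmark at each baseline. The cut-off score and data are invented for illustration; real benchmarks come from the KS2 linking study.

```python
# Hypothetical scaled scores and an invented benchmark cut-off.
BENCHMARK = 100

scores_2019 = [92, 104, 110, 97, 101, 88, 115, 99]
scores_2020 = [89, 100, 106, 93, 96, 85, 112, 94]

below_2019 = sum(1 for s in scores_2019 if s < BENCHMARK)
below_2020 = sum(1 for s in scores_2020 if s < BENCHMARK)

print(f"Below benchmark at BTS 2019: {below_2019}")
print(f"Below benchmark at BTS 2020: {below_2020}")
print(f"Additional students below benchmark: {below_2020 - below_2019}")
```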

Using the Star Growth report

Finally, it is also important to have a reliable growth reference. Historically, many schools used a pre-test/post-test model, which is easily accommodated by Star’s Growth report. But with the advent of Student Growth Percentile (SGP) scores, which are also clearly displayed on the Growth report, many now look to this metric instead. The SGP is a growth score that indicates high growth (66+), typical growth (35–65) and low growth (below 35) relative to students of a similar academic ability. SGP scores consider additional elements (such as the phenomenon of ‘regression to the mean’ and the comparison of students to their true ‘academic peers’) that are not considered in a pre-test/post-test model, resulting in a more comprehensive measurement of student growth.
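To illustrate how these bands map onto SGP values, here is a small sketch. This is illustrative only: the SGP itself is calculated within Star, not by you.

```python
def sgp_band(sgp: int) -> str:
    """Map a Student Growth Percentile to the bands described above."""
    if sgp >= 66:
        return "high growth"
    if sgp >= 35:
        return "typical growth"
    return "low growth"

for sgp in (12, 35, 50, 65, 66, 90):
    print(f"SGP {sgp}: {sgp_band(sgp)}")
```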

Ensuring educational equity

Once you’ve identified the metrics that are representative of overall performance, the final step — if at all possible — is to disaggregate your data to check for any variances in performance by student subgroup. Many authors have suggested that our most disadvantaged students will be disproportionately impacted by the COVID-19-related school closures. In analysing your data, you might find, for example, only a small dip in scores overall, but a more significant decrease for Pupil Premium students or for EAL students. Understanding performance across subgroups is critical for ensuring equity.
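Here is a minimal sketch of that disaggregation, assuming each student’s baseline scores can be exported alongside a subgroup label (all names and numbers below are hypothetical):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical export: (subgroup, PR at BTS 2019, PR at BTS 2020).
students = [
    ("Pupil Premium", 40, 30),
    ("Pupil Premium", 55, 44),
    ("EAL", 48, 41),
    ("EAL", 62, 57),
    ("All other", 60, 58),
    ("All other", 72, 70),
]

changes_by_group = defaultdict(list)
for group, pr_2019, pr_2020 in students:
    changes_by_group[group].append(pr_2020 - pr_2019)

# A markedly larger drop for one subgroup may signal an equity concern.
for group, changes in changes_by_group.items():
    print(f"{group}: average PR change {mean(changes):+.1f}")
```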

Depending on how your Renaissance site was set up, you may or may not have imported the demographic information necessary to disaggregate scores by subgroup. The customisation fields within Star reports support disaggregation when this demographic data is available. If you have Renaissance Data Integration (RDI), you can also pull this data into Renaissance through Wonde integration.

Getting started

So, how can you most effectively gather baseline data? To help, we’ve created a webinar that will guide you in identifying the data you want to collect as a baseline for later comparison.

Of course, getting your baselines will help you prepare to meet students’ academic needs during the new school year. In the next blog in this series, we’ll discuss another critical component of planning for BTS 2020: the social-emotional aspect of student learning. How can insights from social-emotional learning support your assessment plans? Get the answer there, along with tips for embedding Star Early Literacy in daily instruction.

The next instalment of this blog will be published on 3rd July 2020, and you’ll be able to view it here. Follow us on social media to stay up to date and share your thoughts with us.

References
