Guest Post: My teachers have more information than ever on learning and progress, and a lower workload

By Gary Alexander, Deputy Head Teacher

I am Deputy Head Teacher of Battle & Langton CE Primary School, a two-form entry primary school on the south coast of the UK.

In 2014, I was keen to move on from old assessment procedures and took the abolition of National Curriculum levels as an opportunity to innovate our practice. Although National Curriculum levels were familiar and comforting, there is no doubt in my mind that they needed to go. To start with, teacher assessment against National Curriculum levels took far too long, and generated so much work for teachers that it became an onerous task that dominated their thoughts. Accountability pressures were rising far too high, and assessment information was being used to judge teachers rather than to support learning.

I stumbled across the power of Renaissance Star Assessments™ when looking at using Renaissance Accelerated Reader™ to support reading and organise my library. I was looking for something that would remove the burden of teacher assessment, whilst still giving strong and reliable information about learning. I needed macro-level information about group progress and the percentage of children ‘on track’ to achieve end of key stage expectations, as well as granular information about next steps in learning for individual children.

To begin with, we introduced Star™ to years 3 to 5 as a six-month trial. Children completed the short and painless computer-adaptive tests, and teachers received information about learning instantly. It took us a while to understand what all the information was telling us and which parts we needed to look at first. Once we had a basic handle on the information, teachers began to realise that it more often than not confirmed their own perception of how well a child was doing, even though they hadn’t ‘ticked a sheet’ to say so. After the initial six-month trial, the other teachers became agitated that they were still filling in ‘APP-style’ sheets, whilst the trial year groups received all this information with minimal workload. The time being saved was huge.

In the first full year we launched for years 2 to 6, and Star Assessments™ became our only formal assessment tool. Although teachers were still informally assessing day to day (how else would they teach?), there was no need to formally record these judgements. Around February of that year, I began to realise that either my school was incredible, or the default benchmark for reaching the expected standard was too low. I wish that 90% of my children had achieved the expected standard in their SATs, as Star™ was predicting, but they didn’t. But, of course, this was the first year of the new ‘more challenging’ SATs tests, so the Star™ data couldn’t yet predict outcomes against them accurately.

After the results came in, and Primary Head Teachers across the country had finished weeping over the reading results, I did a correlation study for each test, comparing the SATs scores to the preceding Star™ scores. This showed that the correlation between the two tests was very strong (0.86 in Maths, 0.79 in Reading), which reassured me that Star™ could give a very accurate indication of KS2 performance.

Graph showing correlation between Star Maths and SATs outcomes
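
For anyone wanting to run a similar check on their own data, a minimal sketch in Python might look like this. The file and column names here are placeholders for illustration, not our actual data layout.

```python
# Minimal sketch: Pearson correlation between Star scores and KS2 SATs scores.
# "year6_scores.csv" and the column names are illustrative assumptions.
import pandas as pd
from scipy.stats import pearsonr

scores = pd.read_csv("year6_scores.csv")  # one row per child

r_maths, _ = pearsonr(scores["star_maths_score"], scores["sats_maths_scaled"])
r_reading, _ = pearsonr(scores["star_reading_score"], scores["sats_reading_scaled"])

print(f"Maths correlation: {r_maths:.2f}")      # the post reports 0.86
print(f"Reading correlation: {r_reading:.2f}")  # the post reports 0.79
```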

The problem was that nobody knew where to draw the ‘pass’ line.  I looked at the two sets of data and worked out what children needed to score in Star™ to be able to pass their SATs.  This gave me a benchmark to work backwards through the years.  Now we have a system that judges attainment very accurately, and can tell me at any point in the year which children are on track (given typical progress) to meet the expected standard at the end of the year.  This gives me the ability to predict SATs scores this year with greater confidence.
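
As an illustration of how a ‘pass’ line can be drawn from matched data, the sketch below tries each observed Star™ score as a cut-off and keeps the one that best separates children who met the expected standard (a KS2 scaled score of 100 or more) from those who didn’t. The column names are assumptions, and this is only one simple way of doing it, not Renaissance’s own benchmarking method.

```python
# Hedged sketch: derive a Star 'pass line' from matched Star/SATs data.
import pandas as pd

scores = pd.read_csv("year6_scores.csv")  # hypothetical matched data

def best_cut_off(star_scores, met_standard):
    """Return the Star score that best predicts meeting the expected standard."""
    best_score, best_accuracy = None, 0.0
    for candidate in sorted(star_scores.unique()):
        predicted = star_scores >= candidate
        accuracy = (predicted == met_standard).mean()
        if accuracy > best_accuracy:
            best_score, best_accuracy = candidate, accuracy
    return best_score

# A KS2 scaled score of 100 is the expected standard.
maths_benchmark = best_cut_off(scores["star_maths_score"],
                               scores["sats_maths_scaled"] >= 100)
print(f"Children scoring {maths_benchmark}+ in Star Maths were on track.")
```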

Graph showing correlation between KS2 SATs outcomes and Star Reading Standardised Scores

In addition to robust attainment data, Star™ gives me relative progress information, which allows me to see instantly how children are progressing compared to all children using Star™ in the UK who started the year at a similar point. This is similar to the DfE ‘value added’ measure, in that it groups children according to prior attainment and judges their progress against an average for the group. This way I can see how much progress my ‘high flyers’ are making when compared to all the other ‘high flyers’ in the system. So very powerful.
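
To illustrate the idea behind this kind of measure (a simplified analogy, not Renaissance’s actual calculation), the sketch below bands children by their starting score and compares each child’s gain with the average gain for that band. The file and column names are hypothetical.

```python
# Illustrative sketch of a relative-progress / value-added-style measure.
import pandas as pd

progress = pd.read_csv("star_autumn_summer.csv")  # hypothetical file
progress["gain"] = progress["summer_score"] - progress["autumn_score"]

# Band children by prior attainment (here, quintiles of the autumn score).
progress["band"] = pd.qcut(progress["autumn_score"], 5, labels=False)
band_average = progress.groupby("band")["gain"].transform("mean")

# Positive: more progress than similar starters; negative: less.
progress["relative_progress"] = progress["gain"] - band_average
print(progress[["autumn_score", "gain", "relative_progress"]].head())
```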

All in all, I believe we have as good an assessment system for reading and maths as is possible in these times. My teachers have more information than they’ve ever had on learning and progress, and a lower workload. Children, parents and governors are given robust information when they need it, and I have more confidence in my judgements of how my school is performing.

But don’t ask me about Writing!


Learn more about the robust assessment system which Gary Alexander talks about: click here to download the Assessment Guide.


Gary Alexander
Deputy Head Teacher

Gary Alexander has been working in education in the UK and abroad for 17 years. He is currently Deputy Head Teacher at a large, successful primary school in East Sussex, where he leads on teaching, learning and assessment. He is interested in all things educational, but has a particular interest in using technology to reduce teacher workload and enable teachers to spend more time thinking about and improving their practice.


