
Why bother with school analytics?

By Joshua Perry

We’ve asked Joshua Perry, education technology expert and entrepreneur, to write a series of blogs about analytics and assessment. This is the first instalment, which looks at the role analytics play in schools. Joshua is on Twitter as @bringmoredata

We shouldn’t take the usefulness of school analytics as a given. I’ve been working in this space for six years now, and I’ve encountered plenty of scepticism about school data. People have said things to me like:

“You data people just love creating pretty charts for the sake of it.”

“School data is so unreliable that we can’t trust it anyway.”

“I know my class better than any report could.”

“Ofsted don’t even look at internal data any more so why should we bother?”

“We’re drowning in data.”

Now, I’m a school data guy, so perhaps you expect me to recoil at such views. But you’d be wrong! All of these things can be true! To be more specific…

Data people DO love to create pretty charts

I sometimes stay up on a Friday night to write blog posts with pretty charts, and it’s my favourite part of the week. So, yes, guilty as charged. But more importantly, just because a chart is pretty, it doesn’t mean it’s useful. If you’ve followed coronavirus Twitter at any point you’ll have seen plenty of examples of this. There are several beautiful visualisations out there illustrating the number of confirmed cases… But what does “confirmed cases” really mean? Is it really just a proxy for the number of tests a country has performed, rather than the true number of cases? And are the tests themselves accurate? So, yeah, a beautiful chart is fun to produce (if you’re someone like me), but it can also be dangerous if the underlying dataset isn’t suited to the task in hand.

“A beautiful chart is fun to produce, but it can also be dangerous if the underlying dataset isn’t suited to the task in hand.”

School data CAN be unreliable

We shouldn’t hide from this: attendance data can be entered incorrectly; behaviour data is hard to compare between teachers; parents may not want to reveal their free school meal eligibility status. Academic achievement is a particularly thorny area, since an assessment can only ever sample a learner’s knowledge; it’s just not possible to test understanding of every concept in a summative assessment without it taking impossibly long to sit. (I’ll come back to this in a subsequent blog.) That’s true even of national assessments like GCSEs, so it’s going to be extra true when schools write their own assessments. And since primary schools often use teacher judgement, which adds a further layer of subjectivity, then yes, of course, school data can be unreliable!

A teacher SHOULD know their class better than a report

Look, I love data, but I don’t think that analysis is somehow superior to professional judgement. Rather, a good report adds to a professional’s body of knowledge; it influences their thinking but it can never replace their thinking.

“School data can be unreliable, and we shouldn’t hide from this.”

It’s GOOD that Ofsted don’t look at your internal data any more

A major anxiety in recent years has been that teachers have felt pressure to produce a certain set of grades to fit into the narrative that a school wishes to communicate to Ofsted. That sounds calculating, but in my experience it was more likely to arise from a subconscious bias than a deliberate skew. If you’re deciding your teacher-assessed grades for the summer term, and you know that you’re due an Ofsted shortly, well, perhaps the borderline grades get rounded up. Or if you’re writing your end of year assessment, maybe you don’t sweat too hard to make the questions sufficiently challenging. So, yes, Ofsted’s interest in internal data did too often change the incentive calculation in unhelpful ways.

Many schools ARE drowning in data

If your school still collects data six times a year, that makes me nervous. You really can’t spot meaningful changes in performance over six-week periods, and every summative test cycle takes time away from teaching. Moreover, if your primary teacher assessment process involves tracking every learning objective at a super-granular level, well, you’ll never have any time to, you know, teach.

So why do we bother with analytics at all?

Well, I think the key reason – above all others – is that we should use data to make more informed decisions.

That may sound obvious, but so many analysis strategies lose sight of this. You might look at weekly attendance numbers, but are you really doing it because you know what actions you’ll take if numbers go up or down? For example, are you looking for early warning signs of attendance issues so that you can swing into action with a strategy to turn that around? Or are you just collecting data and reporting it to governors because, you know, it’s what you’re supposed to do?
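
To make that concrete, here’s a minimal sketch of what an attendance early-warning check might look like. The pupil names, the data layout and the 90% threshold are all illustrative assumptions, not a standard or a real system – in practice this data would come from your MIS.

```python
# Illustrative early-warning check on weekly attendance.
# Names, layout and the 90% threshold are assumptions for this sketch.

weekly_attendance = {
    # pupil -> attendance % for the last four weeks, oldest first
    "Pupil A": [98, 97, 99, 98],
    "Pupil B": [95, 91, 88, 84],  # declining every week
    "Pupil C": [89, 90, 88, 89],  # persistently low
}

THRESHOLD = 90  # illustrative cut-off, not an official figure

for pupil, weeks in weekly_attendance.items():
    # Flag a week-on-week decline, or an average below the threshold
    declining = all(a > b for a, b in zip(weeks, weeks[1:]))
    persistently_low = sum(weeks) / len(weeks) < THRESHOLD
    if declining or persistently_low:
        print(f"Early warning: {pupil} - last four weeks: {weeks}")
```

Even something this simple forces the question the paragraph above is really asking: what, specifically, will you do when a pupil is flagged?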

“If you don’t have a strategy, it’s impossible to ensure that schools are avoiding the dodgy, time-consuming practices that have given school data a bad name.”

That’s why these days my advice to any school or Multi Academy Trust (MAT) is to write a strategy document covering your approach to data. It can be short – I’ve seen great ones that fit on two pages – and it doesn’t need to cover every detail and process. But if you don’t have a strategy, it’s impossible to ensure that schools are avoiding the dodgy, time-consuming practices that have given school data a bad name.

The things I’d cover in my strategy document

    1. How will analysis feed into the organisation’s decision-making process? It’s important to consider this at each level (class teacher, school leadership, MAT) and good to be as specific as possible – I’ve seen examples that actually list the questions they ask of the data and what decisions the answers feed into. I’ll explore this more in my next two blogs.
    2. What is the dataset or assessment being relied upon, and is it up to the job? The more objective and reliable the data, the better, which is why standardised assessments (like Renaissance’s Star Assessments) have increased in popularity in recent years.
    3. What’s the audience for an assessment? In other words, what data does leadership need to gather, and what is better left with class teachers only? Again, I’ll come back to this in later blogs.
    4. How frequently will data be gathered? This shouldn’t just involve writing down what you’ve always done. Instead, ask whether that frequency could be reduced without harming the quality of insight. I know of a few MATs now who are just collecting data twice a year – and to the best of my knowledge, no one’s died as a result.
    5. What will the test conditions be when pupils sit summative assessments? Will students sit in classrooms or go to the exam hall? This may sound trivial but without a policy, teachers and schools will do different things, and the choices made do have an impact on results.
    6. How will that data be collected and analysed? Are there tools that can automate the process, and/or improve reliability?

Of course, these areas are interrelated. For example, if the question you’re trying to answer is “Do I need to reteach some element of the curriculum to this class?” you’ll need to think about what assessment allows you to do this well. You’ll also want to consider whether you need to analyse strand and question-level data, as the overall grade on its own may not be enough. Timing will be key – if you only collect data at the end of the summer term, you haven’t left any time for reteaching. And simple, intuitive online tools may be needed to minimise the work burden for staff.
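
To illustrate that worked example, here’s a minimal sketch of question-level analysis. The topics, scores and 60% facility threshold are invented for the sake of the sketch; the point is simply that an overall grade alone can hide exactly which parts of the curriculum need revisiting.

```python
# Illustrative question-level analysis of one summative assessment.
# Topics, scores and the 60% threshold are invented for this sketch.

# question -> (curriculum topic, pupils answering correctly, class size)
results = {
    "Q1": ("fractions", 24, 30),
    "Q2": ("fractions", 22, 30),
    "Q3": ("ratio", 11, 30),
    "Q4": ("ratio", 9, 30),
    "Q5": ("algebra", 25, 30),
}

# Aggregate the proportion correct ("facility") per topic
totals = {}
for topic, correct, size in results.values():
    got, asked = totals.get(topic, (0, 0))
    totals[topic] = (got + correct, asked + size)

for topic, (correct, asked) in totals.items():
    facility = correct / asked
    note = "  <- consider reteaching" if facility < 0.6 else ""
    print(f"{topic}: {facility:.0%}{note}")
```

In this invented dataset the class’s overall mark looks respectable, but the ratio questions reveal a topic that clearly needs reteaching – which is exactly what a single overall grade would have hidden.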

“The insights arising from good analysis can play a central part in school effectiveness, but only if we’re gathering the right thing for the right reason.”

In my next blog I’m going to dive into some specifics around exactly what kind of decisions data can be used to inform. But for now, I hope I’ve persuaded you that not all school data types are slavishly addicted to gathering data for data’s sake. I believe very strongly that we should care about school data, and the insights arising from good analysis can play a central part in school effectiveness, but only if we’re gathering the right thing for the right reason.

 

Joshua’s blog series can be read here. To see how we’re supporting students and teachers during school closures, click here. You can follow Joshua on Twitter at @bringmoredata and Renaissance at @RenLearnUK

