Doing data differently

This article originally appeared in Leader magazine
Autumn Term 2 2022

Tom Middlehurst reflects on the core messages that came out of ASCL's recent Data and Regulation Conference.

It’s been two years since we’ve had any national data which, for a sector used to producing, analysing and comparing data extensively, may feel like a long time. But with the return of Key Stage 2 assessments, GCSEs, A levels and vocational and technical qualification (VTQ) exams last summer, schools and colleges now largely have access to the data sets they’re used to. So what does the data tell us – and what are its limitations?

These were the key questions at ASCL’s Data and Regulation Conference in October. The last data conference was in autumn 2020: since then, we’ve had five secretaries of state, centre assessment grades (CAGs), teacher assessed grades (TAGs) and a return to exams. The decision to publish Key Stage 4 and 16–18 outcomes was a political one, and although primary data will not be published in the same way, it is available to the system through ASP (analyse school performance) and the IDSR (inspection data summary report). There is a general sense that this data can tell us some things though, as always, not everything. Historical data always has its limitations: it tells us the ‘what’, not the ‘why’, but perhaps those limitations are even greater this year.

The policy context 
The interpretation of this data – the why – therefore has to be set against the wider educational context. At a national level, while GCSE, A level and other Level 2 and 3 exam data is being published, there were significant adaptations to the exams in 2022. Conversely, KS2 national assessments went ahead without any adaptations or generosity applied to marking, but these results are not being published at school level in the public domain.

These assessments sit within what ASCL Council (www.ascl.org.uk/Council) fears is a context of increasing – albeit covert – curriculum centralisation. In the last six months we’ve had a plethora of non-statutory guidance from the DfE, from the Parent Pledge to the length of the school week, Music for All and changes to National Tutoring Programme reporting. While all of these are presented as ‘non-statutory’, they nonetheless use the language of ‘should’ and ‘expect’ throughout. It may feel to many school and college leaders that any sense of curriculum autonomy is being eroded.

At the same time, leaders are making tough decisions about funding, with uncertainty about energy costs beyond the six-month guarantee and the pressure of unfunded pay rises. This has raised fears that curriculum offers may be narrowed as every curriculum area is looked at through the lens of affordability.

So, when we talk about data, we need to look at it in this wider context. 

Exams 2023 
We also now know the DfE and Ofqual’s plans for next year. Grading will return to 2019 outcomes, with an important caveat: overall, 2023 grades will not be lower than in 2019, even if the performance standard is lower. There will be only limited adaptations, in the form of formula sheets in GCSE maths, physics and combined science. There will be no optionality or advance information in other subjects.

As with 2019–22, this means that 2023 data cannot meaningfully be compared with previous years, because each of the five data sets will have been produced under different conditions and in different policy contexts.

What’s in a name? 
The core message from the DfE and Ofqual is that although data is important for accountability purposes, it is unhelpful to compare data between schools and colleges, or with previous years. The impact of Covid was highly variable, even within single areas and within single institutions. Instead, parents and other stakeholders should use data as the starting point for a conversation: what was the context of the school or college over the last two years? With the checking exercise now complete, and confirmed data due in February, it will be important to remind stakeholders of this message.

This is perhaps best reflected in the change in the way data is presented this year. Performance tables are now known as ‘Find and check the performance of schools and colleges’ rather than ‘Compare school and college data’ – something we hope continues in perpetuity. As leaders will have already seen, the data carries significant caveats. There is no whole-school list ranking schools from top to bottom on a range of measures, and the tool that allowed users to create their own tables by selecting individual schools has been removed – all with the intention of discouraging comparisons.

How will Ofsted use data? 
The same messages were shared by Ofsted. Under the education inspection framework (EIF), data has only ever been a starting point for inspection, and barely appears in reports. Instead, Ofsted is interested in how you’ve used that data to inform the curriculum and pedagogy. Again – the why, not the what. 

Ofsted reminded leaders at the conference that it doesn’t have any access to data from 2020 or 2021 and will not look at internal data sets. How then, delegates asked, can we demonstrate curriculum impact? 

The advice given was to respect the disciplinary nature of assessment rather than impose overly rigid, top-down assessment policies. Instead of asking staff to fill in countless red, amber, green (RAG)-rated spreadsheets, talk to staff about what success looks like in their subject and how they know whether students have learned what was intended and mastered a skill.

What does this mean for leadership practice? 
Throughout the day, there was a sense of optimism: this allows us to do things differently. Whether it’s a more intelligent approach to performance management that doesn’t rely solely on a teacher’s last exam results, or questioning current school policies and why data is being collected at certain times, this is an opportunity to think afresh. School and college leaders could ask the following questions: 

  • How useful is the internal data we collect? What’s the purpose of its collection?

  • How do we know pupils are making progress? What’s the most meaningful way of recording this? What does success look like in different subjects?

  • How do we encourage a disciplinary approach to assessment?

  • What context is important to explain our national data sets? How do we talk to governors, trustees and other stakeholders about this?

Many of the messages about 2022’s data are ones we have heard before: an over-reliance on data is a dangerous thing. This year’s results just bring that message into even sharper relief.

Tom Middlehurst 
ASCL Curriculum, Assessment and Inspection Specialist 
@Tom_Middlehurst
