Research briefing 32: How should we measure and hold schools accountable for the progress of their pupils?
10 October 2016
Each year the UK Government publish ‘school league tables’ holding secondary schools to account for the average progress and final exam results of their pupils. Our work critiques their three most recent progress measures: Contextual Value-Added (CVA, 2006-2010), Expected Progress (EP, 2011-2015) and Progress 8 (P8, 2016-).
- CVA recognised that poor pupils make less progress than their richer peers and adjusted for this to make fair and meaningful comparisons between schools. That some schools misused CVA to set differential targets for pupils with different socioeconomic and ethnic status reflects the perverse incentives that arise from high-stakes testing;
- EP was an ideological shift away from CVA whereby the Government declared all pupils must make the same progress, irrespective of their prior attainment and socioeconomic circumstances. This severely biased EP in favour of Grammar and other schools with advantaged intakes;
- P8 represents a partial return to CVA in that it again recognises that pupils with higher prior attainment make more progress. However, it continues to ignore the very large socioeconomic and demographic differences between schools which also drive results.
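The core logic of a P8-style measure, comparing each pupil's result with the average result of pupils with similar prior attainment, can be sketched as follows. This is a simplified illustration only; the schools, bands and scores are invented, and the real measure uses fine-grained Key Stage 2 prior-attainment groups and the Attainment 8 points system.

```python
# Simplified sketch of a Progress 8-style value-added score.
# All schools, bands and scores below are hypothetical.
from collections import defaultdict
from statistics import mean

pupils = [
    # (school, prior-attainment band, exam outcome)
    ("A", "low", 38), ("A", "mid", 50), ("A", "high", 62),
    ("B", "low", 34), ("B", "mid", 46), ("B", "high", 66),
]

# Step 1: national average outcome within each prior-attainment band.
by_band = defaultdict(list)
for _, band, outcome in pupils:
    by_band[band].append(outcome)
band_mean = {band: mean(vals) for band, vals in by_band.items()}

# Step 2: a pupil's progress is their outcome minus the band average;
# a school's score is the mean progress of its pupils.
by_school = defaultdict(list)
for school, band, outcome in pupils:
    by_school[school].append(outcome - band_mean[band])
school_score = {s: mean(vals) for s, vals in by_school.items()}
```

Because the comparison is only within prior-attainment bands, any socioeconomic or demographic differences between school intakes pass straight through into the school scores, which is precisely the criticism made above.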
Policy makers, schools and parents should view progress measures and school league tables more generally with far more caution and scepticism than has often been the case to date.
Our aim is to draw attention to recent dramatic changes to Government school progress measures which underlie the way schools are held accountable. We show that seemingly academic and technical differences in the construction and interpretation of CVA, EP and P8 lead to fundamentally different school rankings. In particular, the move from CVA to EP greatly benefited schools with high prior attaining pupils. We also find that a third of schools judged by the Government to be ‘underperforming’ by EP in 2010 are in the top half of schools nationally in terms of CVA.
Many have argued, and we would agree, that school league tables are best used as tools for school self-evaluation and as a first step towards identifying successful school policies and practices. Where they are used by governments and school inspection systems, they may be better used as monitoring and screening devices to identify schools performing unexpectedly poorly for the purpose of careful and sensitive further investigation.
We start by discussing and statistically critiquing the Government’s justifications for scrapping CVA, which they outlined in their 2010 White Paper. We then describe their current EP measure and show that it suffers from fundamental design flaws. We illustrate these problems using the Government’s own 2014 school league table data for all schools in England. We then reanalyse their official 2010 school league table data, as this is the only year for which CVA and EP were published simultaneously. Using these data, we show that CVA and EP lead to very different school rankings and judgements as to which schools are underperforming. Lastly, we describe the new P8 measure and predict which schools are likely to win and lose from this latest change.
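The contextual adjustment that distinguished CVA can likewise be sketched as a regression: predict each pupil's outcome from prior attainment and contextual factors, then score each school by its pupils' average residual. This is a deliberately simplified single-level least-squares sketch with invented data and a single contextual factor (free school meal eligibility); the actual CVA measure was a multilevel model with a much richer set of pupil characteristics.

```python
# Hypothetical sketch of a CVA-style contextual adjustment.
# All pupil data below are invented for illustration.
import numpy as np

# Columns: prior attainment score, FSM eligibility (0/1); rows are pupils.
X = np.array([[28, 1], [30, 0], [27, 1], [31, 0], [29, 0], [26, 1]], dtype=float)
y = np.array([40.0, 52.0, 38.0, 55.0, 50.0, 36.0])   # exam outcomes
schools = np.array(["A", "A", "A", "B", "B", "B"])

# Fit outcome ~ intercept + prior attainment + FSM by least squares.
design = np.column_stack([np.ones(len(y)), X])
coef, *_ = np.linalg.lstsq(design, y, rcond=None)

# A school's value-added is the mean gap between its pupils' actual and
# predicted outcomes, after adjusting for prior attainment and FSM.
residuals = y - design @ coef
cva = {s: float(residuals[schools == s].mean()) for s in np.unique(schools)}
```

Because the prediction adjusts for pupil background as well as prior attainment, schools are compared against what would be expected given their intake, rather than against a single national standard.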
For further information, please see our working paper which will soon appear in the British Educational Research Journal.
- Leckie, G., & Goldstein, H. (2016). The evolution of school league tables in England 1992-2016: ‘contextual value-added’, ‘expected progress’ and ‘progress 8’. Bristol Working Paper in Education Series, Working Paper 2/16.
- Leckie, G., & Goldstein, H. (Forthcoming). The evolution of school league tables in England 1992-2016: ‘contextual value-added’, ‘expected progress’ and ‘progress 8’. British Educational Research Journal.
This work was funded under Leckie’s ESRC Future Research Leaders grant (ES/K000950/1). Further details can be found on the grant website.