Is it worth calculating Progress 8 anymore?

England’s secondary schools are held accountable for their performance by the Department for Education’s (DfE) ‘Progress 8’ performance measure. It measures the average progress pupils make between Key Stage 2 (age 11) and their GCSEs (age 16) and represents a long-called-for improvement over the previous raw attainment measure. Nevertheless, concerns have been voiced as to how valid Progress 8 is for holding schools to account, with implications for the policymakers, schools, and parents who use Progress 8 to inform decision making around schools.
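
For readers unfamiliar with the mechanics, the sketch below illustrates, in simplified form and with invented data, how a school-level Progress 8 score is derived: each pupil’s Attainment 8 score is compared with the average Attainment 8 achieved nationally by pupils with the same Key Stage 2 prior attainment, and the pupil-level differences are then averaged within the school. The prior-attainment groups and benchmark values here are illustrative assumptions, not the DfE’s published figures.

```python
# Illustrative sketch of a Progress 8-style calculation (simplified).
# The prior-attainment groups and benchmark averages below are invented;
# the DfE's actual methodology uses fine-grained KS2 prior-attainment
# groups and nationally derived benchmarks.

from statistics import mean

# Hypothetical national benchmark: average Attainment 8 score for pupils
# in each KS2 prior-attainment group.
benchmark_attainment8 = {"low": 30.0, "middle": 47.0, "high": 62.0}

# Pupils at one school: (KS2 prior-attainment group, Attainment 8 score).
pupils = [
    ("low", 34.5),
    ("middle", 45.0),
    ("middle", 52.5),
    ("high", 60.0),
]

# Pupil-level Progress 8: actual Attainment 8 minus the benchmark for
# pupils with the same prior attainment, divided by 10 (the number of
# Attainment 8 subject slots) to express it on a grades-per-subject scale.
pupil_p8 = [(score - benchmark_attainment8[group]) / 10 for group, score in pupils]

# School-level Progress 8: the simple mean of pupil-level scores.
school_p8 = mean(pupil_p8)
print(f"School Progress 8 (illustrative): {school_p8:+.2f}")
```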

About the research

This research comprised a comprehensive review of Progress 8, complemented with new analyses of the underlying data, focusing on assessing the statistical strengths and weaknesses of the measure. We explored five areas of concern: choice of pupil outcome attainment measure; potential adjustments for prior attainment and background characteristics; decisions around which schools and pupils are excluded from the measure; presentation of Progress 8 to users, choice of statistical model, and calculation of uncertainty; and issues related to the volatility of school performance over time.

Policy recommendations

• Present, alongside Progress 8, a measure which is less focused on the English Baccalaureate (EBacc); Progress 8 is currently weighted 70:30 in favour of the traditional academic subjects of the EBacc. This would provide a more holistic picture of school performance and give schools greater freedom to pursue more varied curriculums.

• A version of Progress 8 that considers a pupil’s background should also be presented alongside the current measure to provide a perspective on school performance reflective of the different contexts and challenges which schools face.

• Recognise pupil mobility by making school Progress 8 scores an average of all pupils who attended each school, weighted by their time in each school (see the illustrative sketch after this list). This would hold schools accountable for all the pupils they have taught.

• Communicate more clearly the limited importance of school Progress 8 scores in general, compared with pupil, family and other factors, and how much weight, if any, should be placed on each individual school’s score.

• Report multiyear averages alongside Progress 8 to address the instability of scores based on a single year of data and emphasise to parents choosing schools that Progress 8 scores represent very uncertain predictions as to the future performance of schools.

• Progress 8, and the wider use of school performance data for accountability in England, should be rethought in light of the disruption to schools, examinations, and current and future measures of Progress 8 caused by the pandemic.
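
As a rough illustration of the pupil-mobility recommendation above, the sketch below shows one way a school’s score could be computed as a time-weighted average over every pupil the school has taught, rather than only those on roll at the time of the exams. The pupil scores and time fractions are invented for illustration; this weighting is not part of the DfE’s current methodology.

```python
# Illustrative sketch: time-weighted school Progress 8 (hypothetical).
# Each pupil contributes to every school they attended, weighted by the
# fraction of their secondary schooling spent at that school.

# (pupil Progress 8 score, fraction of time spent at this school)
pupils_at_school = [
    (+0.40, 1.0),   # attended for the full five years
    (-0.10, 0.6),   # joined partway through
    (+0.25, 0.2),   # left after one year
]

total_weight = sum(w for _, w in pupils_at_school)
weighted_p8 = sum(p8 * w for p8, w in pupils_at_school) / total_weight
print(f"Time-weighted Progress 8 (illustrative): {weighted_p8:+.2f}")
```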

Key findings

• Progress 8 offers improvements over the previous headline measure of school performance, which summarised the percentage of students achieving five GCSEs at grades A* to C: it controls for school differences in prior attainment at intake; it encourages a focus on students across the ability distribution rather than just at the GCSE grade C/D boundary; and it indicates whether scores are significantly different from average school performance.

• Our statistical review highlighted several areas where Progress 8 could be improved, mainly concerning clearer communication of the meaning and importance of results to users, and the provision of companion metrics to broaden the scope of the performance measures. This would lessen the stakes attached to Progress 8 and provide a more nuanced picture of school performance.

• More general and long-standing concerns remain with the way school performance data is used to inform school accountability in England, in particular the perverse incentives and unintended negative consequences induced by the high stakes attached to school performance measures.


Further information

The journal article of this work is available open access from the Review of Education.

This work was funded by ESRC grants ES/R010285/1 (Prior and Leckie) and ES/T003677/1 (Jerrim).

Authors

Dr Lucy Prior, University of Bristol; Professor John Jerrim, University College London; Dave Thomson, FFT Education Datalab; Professor George Leckie, University of Bristol

Contact the researchers

Dr Lucy Prior
Research Associate in Quantitative Methods in Education
School of Education
University of Bristol
lucy.prior@bristol.ac.uk
 
Professor John Jerrim
Professor of Education and Social Statistics
UCL Institute of Education
University College London
j.jerrim@ucl.ac.uk

Dave Thomson
Chief Statistician
FFT Education Datalab
Fischer Family Trust
dave.thomson@fft.org.uk

Professor George Leckie
Professor of Social Statistics
School of Education
University of Bristol
g.leckie@bristol.ac.uk
