Published: 12th November, 2025
The 16-18 Progress Measures: How they work and what they measure
Written by: Alps Senior Education Consultant, John Philip
Another Covid Legacy: Data Gaps for newer Heads of Sixth
The 16–18 Progress Measures (16–18 PMs) released in February 2025 were the first since 2019. As a result, many Heads of Sixth who have taken up their roles since then may be encountering these measures for the first time.
This paper, along with the free Alps webinar on 8th December, is designed to support Heads of Sixth in deepening their understanding of the measures and exploring strategies to further improve their Progress scores and bandings.
The webinar will be jointly presented by John Philip from Alps and Rachel Johnson from The PiXL Network.
To register, click here.
Why were there no 16-18 progress measures published between 2019 and 2024?
The Department for Education (DfE) will not publish progress measures in the Performance Tables unless both the prior attainment (PA) and the results were achieved in examinations.
As Table A demonstrates, this was not the case from 2020 through to and including 2023.
Table A
High Level Summaries: What you need to know:
Which students are included in the 16-18 progress measures?
To be included in the 16 to 18 value-added (VA) measure, a student must:
- Be aged 16 – 18* (so Year 14 students will not be included)
- Have results at the end of Key Stage 4
- Have completed an academic, applied general or tech level qualification
- If they entered and failed, they are included
- If they withdrew and did not enter, they are not included
How is value-added progress calculated by the DfE?
The calculation of the 16-18 value-added progress measures involves five main steps:
- Calculating each student’s KS4 prior attainment
- Calculating the national average grade for students with similar prior attainment in the same subject for comparison
- Calculating student value-added (VA) scores in each subject
- Calculating school and college VA scores
- Calculating confidence intervals – a score is only ‘significant’ if both the upper and lower confidence intervals lie on the same side of the zero line
How is a student’s value-added progress calculated in each subject studied?
For all students, the DfE calculates their average attainment at key stage 4 (KS4).
The subjects included in the prior attainment score calculation differ depending on the qualification type, as Table B shows.
Table B
AS levels taken before a student reaches the end of key stage 4 are included in the prior attainment calculation for academic and applied general/tech level value added.
Students are compared with other students (with similar prior attainment) studying the same qualifications nationally.
In each subject, the DfE divides students nationally into up to 20 bands based on their prior attainment.
The DfE then calculates the average attainment in each subject for each of these bands. This allows them to compare each student’s result in each subject with the average result of students with equivalent prior attainment taking the same qualification. The difference between the two is the student’s value-added score in that qualification.
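The band-comparison step above can be sketched in a few lines of Python. The band averages below are illustrative placeholders, not official DfE figures; the points scale is the one used for A levels in the 16-18 performance tables (A* = 60 down to U = 0).

```python
# National average A level points for three hypothetical prior-attainment bands
# (illustrative values only, not the DfE's published band averages).
NATIONAL_BAND_AVERAGES = {1: 15.0, 10: 35.0, 20: 52.5}

# A level grade-to-points scale used in the 16-18 performance tables.
A_LEVEL_POINTS = {"A*": 60, "A": 50, "B": 40, "C": 30, "D": 20, "E": 10, "U": 0}

def student_va(grade: str, pa_band: int) -> float:
    """Value-added = the student's points minus the national average
    points for students in the same prior-attainment band."""
    return A_LEVEL_POINTS[grade] - NATIONAL_BAND_AVERAGES[pa_band]

# A student in band 10 (illustrative national average: 35.0 points)
# who achieves a grade B (40 points) has a VA score of +5.0 points.
print(student_va("B", 10))  # 5.0
```

A positive score means the student outperformed similar-ability students nationally in that subject; a negative score means they fell short of that average.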
Model A from the 16-18 Accountability Measures Technical Guidance demonstrates this visually for two fictitious students in different PA bands.
Model A
The 16 to 18 value-added measure shows the progress each student makes between key stage 4 and graded level 3 qualifications compared with the actual progress made by students nationally who had similar levels of attainment at KS4.
How is a subject’s value-added progress score calculated?
The progress of all students in each subject in your school / college is aggregated to create each subject’s VA score.
All scores are shown with upper and lower confidence intervals. A subject score is only regarded as ‘significant’ if both the upper and lower confidence intervals lie on the same side of the zero line. Confidence intervals widen as the subject cohort shrinks and narrow as it grows.
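The significance rule reduces to a simple check that both interval bounds share the same sign. A minimal sketch, with invented bound values for illustration:

```python
def is_significant(lower: float, upper: float) -> bool:
    """A VA score is 'significant' only when both confidence-interval
    bounds sit on the same side of zero."""
    return (lower > 0 and upper > 0) or (lower < 0 and upper < 0)

# Illustrative intervals (values invented):
print(is_significant(0.05, 0.45))    # True  - whole interval above zero
print(is_significant(-0.10, 0.30))   # False - interval straddles zero
print(is_significant(-0.50, -0.05))  # True  - whole interval below zero
```

This is why small cohorts rarely produce ‘significant’ scores: their wider intervals are more likely to straddle zero even when the point score itself looks strong.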
Model B from the 16-18 Accountability Measures Technical Guidance demonstrates this.
The figure below shows an example of how a school or college VA score is calculated from five student VA scores in an individual qualification.
Model B
How are a school or college’s value-added Progress scores calculated?
Four different value-added Progress scores may be calculated for your school or college, depending on which subjects are in your Post-16 curriculum:
- A Level
- Academic (A Levels + EPQ + Core Maths)
- Applied General
- Tech Level
Calculating value-added scores for qualification types
After the VA scores for each subject have been determined, the qualification-type VA scores for the school or college can be calculated: the VA scores for every subject within a qualification type are summed, then divided by the total number of students taking those subjects.
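That aggregation can be sketched as follows, using made-up subject figures. Because the combination is weighted by entries, large cohorts dominate the qualification-type score:

```python
# Hypothetical subject data: (VA score summed across students, number of entries).
subjects = {
    "Maths":   (12.0, 60),
    "History": (-4.0, 25),
    "French":  (1.5, 5),
}

total_va = sum(va for va, _ in subjects.values())     # 9.5
total_entries = sum(n for _, n in subjects.values())  # 90
qualification_va = total_va / total_entries

print(round(qualification_va, 3))  # 0.106
```

In this sketch the 60-entry Maths cohort contributes far more to the final score than the 5-entry French cohort, mirroring the point made in Model C below.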
Model C from the 16-18 Accountability Measures Technical Guidance demonstrates this for A Level. It also shows that the larger subject cohorts in a school or college have the biggest impact on the value-added score, as well as on the overall average grades.
More importantly, your larger-cohort subjects have the greatest impact on your students’ grades and post-18 pathways.
Model C
How were school or college value-added progress scores banded in 2024?
As with Progress 8, five bands are used:
- Well Above Average
- Above Average
- Average
- Below Average
- Well Below Average
The proportions of schools and colleges placed in each band in 2024 varied across qualification types as Tables 3 – 5 demonstrate.
Table 3 – A Level and Academic Progress Bands in 2024
Table 4 – Applied General Progress Bands in 2024
Table 5 – Tech Level Progress Bands in 2024
Alps value-added analysis and the 16-18 Progress Measures compared
Both the Alps value-added analysis and the 16-18 PMs calculate the progress of students from a starting point to an end result. However, there are differences between the two. The most significant difference between Alps and the DfE measures is one of purpose, as Table 6 demonstrates.
Table 6
The second key difference is timing: Alps analysis is available from the start of term, whereas the validated 16-18 PMs are not available until the following February, which is too late to set improvement priorities for your current cohorts, linking back to the purpose of Alps analysis.
If I use Alps and the 16-18 PMs – why are my outcomes sometimes different?
The outcomes tend to be similar between the two sets of data. However, several factors can cause the performance scores to differ between the two analyses, and some of the main ones are outlined below.
1. Difference in methodology (1): The DfE has five progress bands and Alps has nine, giving more detail against which to determine performance-improvement actions.
2. The data included can be different (1): Alps’ analysis in August is based on the prior year’s DfE dataset and the current year’s customer benchmarks. Alps has access to the national DfE data for the current year after the publication of the 16-18 Performance Tables in February.
3. The data included can be different (2): The DfE separates vocational qualifications into Applied General and Tech Level categories when calculating school or college Progress scores whereas Alps aggregates all such qualifications when calculating our BTEC Quality Indicator.
4. The data included can be different (3): The DfE only includes students in the 16-18 age bracket. Alps includes all uploaded students regardless of age to give a comprehensive analysis for a school or college.
5. Difference in methodology (2): The DfE places students into 20 prior attainment bands that are unique to that subject, so a student could be in a different prior attainment band for different subjects. Alps places students into 10 prior attainment bands, and they remain in the same bands regardless of subject studied, to allow sufficient analysis to enable a strategic focus on student-level improvement strategies, without undue over-complexity.
6. Difference in methodology (3): Subject cohorts of fewer than six students are suppressed in the 16-18 PMs subject analysis. Alps includes analysis of all cohorts, no matter how small, to enable a comprehensive analysis. Small cohorts are still graded 1-9 in Alps, as it is still necessary to measure the progress of these students: the progress of every student matters.
About the author: John Philip
John started working with Alps in 2008, while he was working at Little Heath Comprehensive School. At Little Heath, John used Alps to achieve top 2% performance in value-added terms. He also worked with schools regionally and nationally through the Raising Achievement Partnership Programme. Since leaving Little Heath in 2010, John has also worked as an associate for 22 secondary schools through PiXL.
Join our free webinar
16-18 Progress measures: How they work and what they measure
Join our free webinar on the 8th December at 4pm, where Alps Senior Education Consultant John Philip and PiXL CEO Rachel Johnson unpack the complexities of the 16-18 progress measures, explaining how they work and why none were published between 2019 and 2024. We’ll finish with a live Q&A, where you can ask Rachel and John any questions you have.
Join us on the 8th December at 4pm: Register your place here.
