A common theme throughout all of the recent commentary on the state of CEIAG in schools has been that the publication of Destination statistics for all schools is a ‘good thing.’ In the modern world, the argument goes, transparency of outcomes for schools should not just rely on qualifications gained by students but on the stability and suitability of the progress those students then make in their first steps beyond the school gates.
With this in mind I wanted to post something concentrating purely on the Destination data of my own school’s leavers, to show how this does and does not offer insight when looking at figures at the level of a single school.
I’ll be using 4 sets of Destination data to give some context.
Firstly, there is the data currently on the DfE performance tables website. This relates to our 2009/10 leavers.
Second is the data for the 2010/2011 cohort, which is due to be published on the performance tables site in June.
So, what to notice between those two? The trend in our numbers to FE seems to be downward while the numbers to Sixth Form College are rising; Apprenticeships are steady and “Destinations not sustained” are falling. The FE and Sixth Form trends have the biggest swing in numbers so could tell the story of a more definitive trajectory. The Apprenticeships and Not Sustained numbers are pleasing but I’m wary of hanging out the bunting because, as you can see from the second table, the numbers of students involved are small. One or two students either way and those percentages alter significantly.
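The sensitivity of those percentages can be sketched with some back-of-the-envelope arithmetic (the cohort of 180 leavers and the category size of 8 here are purely hypothetical figures for illustration, not our school’s actual numbers):

```python
# Hypothetical illustration of how small category counts swing percentages.
# In a cohort of 180 leavers, a destination category holding 8 students
# sits at roughly 4.4%; just two students moving in or out shifts the
# figure by more than a percentage point either way.
cohort = 180  # hypothetical cohort size

for students in (6, 8, 10):
    pct = 100 * students / cohort
    print(f"{students} students -> {pct:.1f}%")
```

Running this prints 3.3%, 4.4% and 5.6% for 6, 8 and 10 students respectively: a two-student change is enough to move a headline percentage visibly, which is why single-year, single-school figures need careful handling.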
A hugely important factor to bear in mind is that this data is based not on a snapshot but on an extended time period. As the guidance tells us, Participation is determined as enrolment for “the first two terms (defined as October to March) of the year after the young person left KS4”, and not sustained destinations are defined as “young people who had participated at an education destination during the academic year but did not complete the required six months participation.” There is much to commend in the longer-term measurement being used here, which more thoroughly tests a school’s CEIAG legwork in suitably placing its students post KS4. A negative consequence of this more considered approach, though, is the sheer amount of time that has to be allowed before publication to let the students travel through the system. The most recent set of data above covers students who left us 3 years ago. 3 years can be a lifetime of change in a school, with new initiatives, new curriculum, staff turnover, Leadership changes, new priorities and events, so using this data to judge that school in the here and now seems a little redundant.
The third set of data, for our 2011/2012 cohort, is from our Local Authority, who, alongside their Youth Service partners, work their way through enrolment lists, phone calls and house visits to gather the stats which the DfE then utilises in future publications.
The first thing to notice is that some of the Destination terms are not the same, which immediately causes issues with comparison. Compared to the first two sets of data, the trend away from FE routes and towards Sixth Form (not differentiated here between School Sixth Form and Sixth Form College) reduces but continues. The NEET category (a category not known in the DfE data) is pleasing again (with the same caveat as above), while the Part Time Education numbers are odd and appear towards the larger end of the local spread (more about this below). They also lead to another concern: any conclusions we draw are only as sound as the data collection and entry job that went before them.
The biggest difference in the data sets is that the Local Authority data is a snapshot taken on the 1st of November 2012, just a few short weeks after the GCSE results. If published then, the immediacy of this data could give interested parties such as Ofsted or parents much more reactive numbers on which to judge local secondary schools, but this immediacy could also cause problems. Any snap measurement could offer a warped view of a reality that would produce very different data if captured on a different date (were the statistics exactly the same on the 2nd of November?) and might not highlight gradual drop-out as those learners went through the first term of their KS5 routes. To combat this, and to show trends, the Authority repeats the exercise the following April with the same year group; the results of this follow-up snapshot for the 2012 leavers are in the columns on the right below.
Clearly the largest change between the November and April snapshots is that the Part Time Education number now reads zero and the number of Apprenticeships has jumped by the same number to 12. How much of this change can be attributed to data entry decisions, and how much to the steady progress of our leavers securing Apprenticeships in the year after leaving school, would only be known to those with local knowledge of our alumni. It’s a tale not told in the stats.
So, what can we learn from all this data?
1) The considered publication timeframe on the DfE performance tables has both good and bad sides for judging school performance
2) When you drill down to school level, the numbers of students moving from category to category can be small enough that a handful of students shifting between them significantly alters the percentages
3) Trends in destination growth or reduction for different routes can only be properly identified with multiple data sets over a longer period
If Ofsted and stakeholders such as parents are to get the most out of Destination data in its current form, a considered and measured view and a desire to understand the stories behind the figures really will be required.