Destination statistics

An important destinations distinction

In October 2018 the DfE published “Destinations Data: Good Practice guide for schools” which, as DfE guidance documents go, is a snappy publication. It sets out how schools and colleges should approach collecting destination information from their learners, the duties placed on Local Authorities in this area, where this information is then published and how it can be used to adapt provision.

The section I want to highlight for this post is the definition of “Destinations data” vs “Destinations Measures”, a distinction I had never considered before. I will endeavour to stick to it in future posts and discussions about destinations, and I would hope that other practitioners join me in doing the same.

  • What is Destinations data?

destinations1

  • What are Destinations Measures?

destinations2

This is important because, as the Gatsby Benchmarks and the Careers Strategy gain momentum and Ofsted continues to inspect CEIAG provision in schools, positive destination data will become more of a badge of honour for schools keen to show they are taking Careers work seriously. Differences could then arise between what a school claims as its Destinations data and what is published by the DfE and included in its performance tables, because the school's data may rely on leavers' intended destinations while the DfE data looks back at sustained destinations.

In fact this has already happened with UTCs, who have long claimed extremely positive destination data as a significant benefit of their model of education, only to recently have their claims undermined by the more robust and historically confirmed DfE Destination Measures. As the DfE Measures record

the number of students who have been in a sustained destination for six months in the year after finishing key stage 4 or 16-18 study (from October to March, or any six consecutive months for apprenticeships). The headline accountability measure at both key stage 4 and 16-18 study is the proportion of students staying in education or employment for at least two terms

they will be a much better reflection of the actual destinations of learners.
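As a concrete illustration of that rule, here is a minimal sketch of how the sustained-destination check could work, assuming we hold monthly participation flags for each leaver. The field names and example data are hypothetical, not drawn from any DfE dataset:

```python
# A minimal sketch of the "sustained destination" rule quoted above, assuming
# monthly participation flags for each leaver. Field names and data are
# hypothetical, not drawn from any DfE dataset.

MONTHS = ["Aug", "Sep", "Oct", "Nov", "Dec",
          "Jan", "Feb", "Mar", "Apr", "May", "Jun", "Jul"]

def is_sustained(participation: dict, apprenticeship: bool = False) -> bool:
    """True if a leaver counts as sustained: participating every month from
    October to March, or (for apprenticeships) any six consecutive months."""
    flags = [participation.get(m, False) for m in MONTHS]
    if apprenticeship:
        # Any run of six consecutive participating months qualifies.
        return any(all(flags[i:i + 6]) for i in range(len(flags) - 5))
    # Otherwise the six months must be October to March specifically.
    start = MONTHS.index("Oct")
    return all(flags[start:start + 6])

# Example: enrolled in September but dropped out after December -> not sustained.
leaver = {m: True for m in ["Sep", "Oct", "Nov", "Dec"]}
print(is_sustained(leaver))  # False
```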

It is important that schools do not solely use their own data to evaluate their CEIAG provision but use the Destination Measures as well; comparison between the two may also highlight useful factors (for example, if many learners were intending to secure apprenticeships but then did not, or if learners from disadvantaged backgrounds were struggling to progress). It is also vital that Ofsted inspectors brief themselves on the historical trends in a school's Destination Measures before an inspection, which may show steady progress in leavers securing more apprenticeships or other positive and sustained destinations and so reflect well on the school's Careers work.

So, from this point on – Destinations data = a school’s intended or checked leaver destination figures. Destination Measures = the DfE published figures.


The CEC State of the Nation report

The latest publication from the Careers & Enterprise Company (CEC), continuing their expanding library of research, State of the Nation 2017: Careers and enterprise provision in England's schools, was published earlier this month. Utilising the “State of the Nation” title also employed by the annual updates from the Social Mobility Commission (and so helping affirm the aims of the CEC with policy makers), this is a publication which shows the Company moving on from earlier releases that audited the CEIAG landscape to a new stage of reporting on progress made.

The report is based on 578 responses from secondary schools that have completed the online careers programme auditing tool, Compass, and a comparison of this data with that collected for the original Gatsby Good Career Guidance report in 2014.

The CEC makes a number of claims from this exercise but the accompanying media coverage focused on the responses which indicate an improvement in school provision since 2014 as more schools report that they are meeting more benchmarks.

There is evidence of improvement since the original Gatsby survey in 2014. Schools in 2016/2017 are achieving an average of half a Benchmark more than they were in 2014/2015 (1.87 versus 1.34). The proportion of schools not achieving any Benchmarks has fallen by one third from 31% to 21%. The proportion of schools achieving half the Benchmarks has more than doubled from 6% to 16%

This sounds positive, but these figures should be treated with caution and, like the rest of the report, taken in the round alongside other data. These are the points I found most interesting in the report:

1. This is a small number of schools and a narrow method of evidence collection

As can be seen in the appendices, the 2014 Gatsby report used multiple sources of evidence to form its benchmarks, recommendations and costings. Six overseas visits took place, with interviews conducted with practitioners, policy makers and stakeholders in those countries. Visits and interviews with six independent schools also added to the evidence base, as did a review of eighteen previous reports on CEIAG provision. Finally, an online survey was completed by 361 secondary schools in winter 2014.

gatsby school profiles

The breakdown of the responding schools

As a baseline, 361 schools (from approximately 3,329 secondary schools at the time) is a thin slice, so it is positive that 578 have used the Compass tool, but this is still small. The 2014 figures included only 9 schools then judged as Requires Improvement by Ofsted; the 2017 report does not include this figure. In 2017 there are now 3,408 secondary schools in England, so 578 equates to roughly 17% of secondary schools responding.
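For anyone wanting to sanity-check those response rates, the back-of-envelope arithmetic (using the school counts quoted above) looks like this:

```python
# Survey coverage implied by the figures quoted in the two reports.
survey_2014 = 361 / 3329   # Gatsby online survey, winter 2014
survey_2017 = 578 / 3408   # Compass responses in the 2017 report
print(f"2014 coverage: {survey_2014:.0%}")  # 11%
print(f"2017 coverage: {survey_2017:.0%}")  # 17%
```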

2. This is based on self-evaluation

Asking any professional if they do a good job isn't going to get objective responses. Both the 2014 and 2017 reports are careful to point out that questions of validity could arise both from the bias of the overall sample (those taking the time to complete the survey may be more likely to be interested in CEIAG, for example) and from responses being overly generous about the CEIAG provision on offer in their establishment (via the overconfidence effect).

None of this data relates to outcomes. No students are asked by an objective third party for their view of provision, no destination data is monitored, no LEO data is cross-referenced, no employers are surveyed. Self-evaluation via online questionnaire is an extremely limited (but cheap) method of providing reference points and evaluating progress.

This is typified by the inclusion of one of the case study schools, which reported itself to be meeting “seven or eight” of the Gatsby Benchmarks. Looking at the most recent KS4 destination data (2015) for that school, you can see that on all of the measures a school with a strong CEIAG offer should be performing well, the school isn't:

  • Pupils staying in education or employment for at least 2 terms after KS4 is 86%, well below the 94% average for English state-funded schools
  • Pupils not staying in education or employment for at least 2 terms after KS4 is 11%, well above the 5% national average
  • The percentage of KS4 leavers moving into apprenticeships is 3%, half the national average of 6%

It's important to remember that behind all of those statistics are actual students, each with their own story, background and challenges to overcome, but these are not statistics that highlight the positive social justice levelling work of CEIAG.

The report references these omissions on page 26 and makes the somewhat valid point that

One limitation of attainment and progression data is that it is backward looking and thus if we look for relationships between the Compass data and outcomes, we are comparing one cohort’s career provision with another cohort’s outcomes

and concludes that the destination data sources mentioned above could be used to correlate with Compass data over a longer period of time. This would enable any relationships between consistently high-quality CEIAG provision and student outcomes to be found. This is an admirable goal to be supported in future, but it isn't how accountability in education works. Ofsted gradings are held by schools for years after the inspection took place; a young person leaving Year 11 this summer might have attended an “outstanding” school whose verdict is based on provision inspected seven years ago. There is always a lag between the monitoring of provision and the provision itself.

3. Further bad social mobility vibes

Another of the included case studies is also a little tone deaf for an organisation keen to show that it is playing its role in the Government's social mobility agenda through the Opportunity Area policy. Including Simon Langton Girls Grammar School, a selective entry school whose pupils, including the 5.6% eligible for free school meals, must take the Kent Procedure for Entrance to Secondary Education tests to enrol, is at odds with the overall aims of both the document and the CEC. The CEIAG work at Simon Langton might be exceptional, and it certainly features prominently on their website, but this is not helping disadvantaged pupils. Areas with selection at age 11 fail the poorest children, and the CEC should steer clear of involving itself in work that perpetuates these outcomes.

4. If the survey responses are to be believed, then Quality Mark Awards are far too generous

The 2017 survey data reports that schools holding a Careers Quality Mark (now all joined together under the Quality in Careers Standard) achieve a higher number of Gatsby Benchmarks than schools without, but that this still only reaches an average of 2.63 of the 8 Benchmarks. This is a blow to those who advocate that Quality Marks are a valid indicator of provision quality. The results of a self-reported survey, with all of the biases mentioned above, show CEIAG provision falling short of the benchmarks that the externally monitored Quality Marks claim those schools meet. That there is so little congruence between these results is evidence that Careers Quality Mark assessment and monitoring processes have not been anywhere near stringent or demanding enough and need to improve. As the report says

As the Quality in Careers Standard works towards aligning fully with the Benchmarks we would expect to see schools achieving the Quality in Careers Standard reaching all eight Benchmarks

but this will be a challenge to achieve for a service paid for by the very schools that volunteer to be assessed.

Showing the impact of the type of strategic work the CEC is involved with is always going to be difficult. With so many stakeholders involved in the delivery of provision and so many factors influencing the outcomes for young people, concentrating on the input factors to begin with is sensible but, due to a total reliance on self-evaluation, it comes with its downsides. Over the forthcoming months I would expect to see the CEC transition towards utilising more quantitative data sources on which to base their judgements of progress.

We are beset on all sides by the tyranny of bad CEIAG reports

kxu81h

“say jobs of the future again, I dare you, I double dare you”

A lot of reports get published that look at the state of CEIAG provision for young people in the UK and offer improvement ideas. As well as policy makers, there are a vast number of stakeholder organisations in this arena and across areas such as social mobility, apprenticeships and vocational education that all overlap with Careers advice. Some of these organisations are more upfront about the policy ambitions of their backers than others, but all have found that publishing a report is a proven method of gaining those all-important media column inches if you want to advance your agenda.

Some sink, never to pass over the desks of Ministers, while others take centre stage in shaping Government thinking. The quality spectrum of these reports is wide, and two came out recently that, to my mind, should be filed at the weaker end of the publication pool.

First up came Beyond the Numbers: Incentivising & implementing better apprenticeships from the University of Sheffield. Branded under their “Sheffield Solutions” research arm, the publication was based on a number of interviews with

local and national stakeholders in education, training and youth services, staff members – including tutors, trainers and employers

views from apprentices which were collected from

two focus groups and a number of in-depth, semi-structured interviews

as well as previous publications. The report includes quotes and stories that rehash the clichés of school CEIAG's relationship with apprenticeships, including a lack of information on alternatives to HE routes, a belief that apprenticeships are treated as a second-class pathway and that high-achieving pupils are actively discouraged from applying for them. The actual application figures of young people, compared to the opportunities on offer, aren't considered.

Where the report really falls down, though, is in its recommendations for schools

sheffield1

  1. Rethinking school league tables to include apprenticeships – this already happens. When you go to the DfE school comparison site you can find individual school data by school name, distance from your postcode, Local Authority area or Parliamentary Constituency. Users can then scroll down past huge amounts of information about the school to find the “Pupil Destinations – what pupils did after key stage 4” drop-down menu and, hey presto, there is that information.

sheffield2

You can also find this data about key stage 5 leavers on the 16-18 tab further up the page.

Apprenticeship destination information is a single drop on a website that is an ocean of information about each school from the number of teachers, to the performance of disadvantaged pupils, to the number of pupils entered in Physics, Biology and Chemistry. Data on pupils remaining in education or employment after leaving the school is included in the headline data

sheffield3

but the sheer amount of other information means that users are left to navigate to find what is important to them.

2. Extra training and resources for Careers Advisers in school about apprenticeships – nobody is ever going to say ‘no’ to more resources or extra training, which is why the DfE has contracted organisations across the country to offer exactly this to schools. The provider across the Midlands is Workpays. They will come into school to offer provision for students, send you resources and offer training. The DfE has a page of resources for schools and advisers, and the ASK (Apprenticeship Support & Knowledge) providers will come and offer events for students. The University of Sheffield is, again, recommending something that already exists.

3. Coordinated, single application process for apprenticeships – guess what, it already exists. Find An Apprenticeship is not a great website (its text search is terrible) but it is a single, coordinated portal for apprenticeships. All of the apprenticeships, they're all on there. What it is not, though, is a single application process, as many apprenticeship vacancies require an applicant to click through to the employer website to register (again) and complete an application. This is out of the hands of Government, as many employers will insist on their own hiring methods that are standardised across their business for all job roles. This is part of the challenge when supporting a young person through a labyrinthine registration process on a company website full of business jargon, but it fits established employer HR practices.

So all three of the recommendations for education are, to some extent, already in place, which highlights how, while diagnosing problems with CEIAG provision may be achievable, offering solutions requires a real understanding of the landscape.

The other report that caught my attention was Averting a £90bn GDP crisis: A report on the image and recruitment crisis facing the built environment, carried out by Kier Group by polling “2000 secondary school teachers, advisers and parents.” The Group, a profitable player in the UK construction market, looks very keen to play its part in improving student careers advice by pledging 1% of its workforce to act as ambassadors and placing a “virtual world plaque” on sites to help the public “explore a digital world of information on a project.” They hope that these initiatives will begin to change widely held views of their industry, as their poll reports 73% of parents not wanting their child to pursue a career in the sector and, despite 76% knowing that apprenticeships lead to careers in construction, 45% would not encourage their child to take an apprenticeship when leaving school. To its credit, the report gives context to the current CEIAG landscape by devoting a whole page to the loss of funding and the placing of the legal duty on schools in 2012.

Where the report fails to offer much value is, again, in the solutions it recommends, both from within the construction industry and from government, to improve the situation. Despite clearly identifying that parents are a persuasive and influential negative voice against young people aspiring to work in the industry, it suggests nothing to engage with parents. That parents are an important voice in shaping the career views of a young person is backed up by other data, and we also have clear indications of how young people would like to receive their CEIAG and what types of provision help them most. An important type of provision is work experience and workplace visits; the report fails to acknowledge the dearth of these opportunities in the sector or to offer a proposal to grow them.

dioz0yrw4aaemp8

The 1% workforce ambassador pledge will hopefully, from a very low base, improve the number of work inspiration opportunities.

From Government they ask that the Careers & Enterprise Company is allowed to continue its work (it will be, so this isn't much of a recommendation) and

2. Mandate that every school gives children a minimum of three one hour careers advice sessions – the first session with a school advisor, follow up sessions with ambassadors from relevant industries.
3. Ensuring the frameworks and resources are in place to support schools and colleges to meet all of the eight benchmarks identified by the Gatsby Foundation for best practice careers advice
4. Mandate that the careers advice process begins as early as possible in a young person’s life to enable them to make informed choices about their subject/course selection

which are all useful and worthwhile suggestions but after earlier acknowledging that

as part of the difficult choices made through austerity measures, funding for Connexions was cut, leaving a significant responsibility largely resting with schools themselves

and that

given substantial and repeated budget cuts, other schools are unable to provide the kind of service that they would aspire to

the report fails to then make the obvious point that these (uncosted) increases in service provision would require more funding. This shows a lack of willingness to bring up the funding of public services for the wider benefit, and a failure to acknowledge the financial reality in schools.

Reports that help shine a light on the issues with employer engagement and CEIAG in schools, and then also offer constructive solutions that work within the realities of the landscape, are to be welcomed. Reports that point the finger at a Careers service underfunded and unable to solve all of the problems laid at its door without significant collaboration and investment have only one purpose: to shift the focus of blame away from the other stakeholders.

Apprenticeships & KS4 & KS5 students from ethnic minority backgrounds

Released this week was a slew of data on the destinations of 2015 Key Stage 4 & 5 leavers, providing nerds like me hours of interesting noodling about in Excel. There's plenty to get through, not only focusing on the number of students taking up each type of route but also on the characteristics of those students.

This post looks at the trend from the data (now going back to 2010, although breakdowns that include ethnicity only start in 2012) in the percentages of KS4 & 5 leavers of different ethnicities going into apprenticeships. The charts below detail both the percentage of all leavers in each year accessing apprenticeships and the percentage of students from each named ethnic background accessing apprenticeships (a sketch of the underlying calculation follows the charts).

ks4-apprenticeship-destinations

ks5-apprenticeship-destinations
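As flagged above, the calculation behind these charts is straightforward: for each cohort year, the share of leavers from each background whose destination was an apprenticeship. Here is a minimal sketch in pandas, with a handful of invented rows standing in for the DfE underlying data files (which use their own column names):

```python
import pandas as pd

# Invented example rows; the real DfE files hold one row per leaver.
leavers = pd.DataFrame({
    "year":        [2012, 2012, 2012, 2012, 2013, 2013],
    "ethnicity":   ["White", "Asian", "Black", "White", "White", "Asian"],
    "destination": ["Apprenticeship", "FE", "Sixth Form",
                    "FE", "Apprenticeship", "Apprenticeship"],
})

rates = (
    leavers.assign(apprentice=leavers["destination"].eq("Apprenticeship"))
           .groupby(["year", "ethnicity"])["apprentice"]
           .mean()           # share of each group starting an apprenticeship
           .mul(100).round(1)
)
print(rates)
```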

The first thing to notice is that these percentages are small compared to other routes. The 6% of KS4 students starting apprenticeships is well below the 38% starting at an FE college and another 38% starting at a school sixth form. That 6% equates to approximately 34,000 young people, while the 7% of KS5 leavers equates to approximately 25,400 students from this age group.
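Those headcounts also imply rough cohort sizes; as a back-of-envelope check (my own back-calculation from the quoted figures, not published numbers):

```python
# Converting the quoted percentages and headcounts into implied cohort sizes.
ks4_rate, ks4_heads = 0.06, 34_000
ks5_rate, ks5_heads = 0.07, 25_400
print(f"Implied KS4 cohort: ~{ks4_heads / ks4_rate:,.0f}")   # ~566,667
print(f"Implied KS5 cohort: ~{ks5_heads / ks5_rate:,.0f}")   # ~362,857
print(f"Combined leavers to apprenticeships: ~{ks4_heads + ks5_heads:,}")  # ~59,400
```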

For all the talk from policy makers of apprenticeships solving youth employment concerns, it is largely the growth of 25+ apprenticeship starts that has allowed the previous Government's target of 2 million to be reached. The starts of those under 19 are inching up, as evidenced by the slow growth in the Total bars in the chart above and this chart

apprenticeships-blog-post-1

from The House of Commons Library.

This year 130,000 under-19s commenced an apprenticeship. This is more than double the (approx.) 59,400 combined leavers above, which shows that many young people are accessing apprenticeships later than the first two terms after leaving their KS4 & KS5 providers (the window used by the destination statistics).

What seems to be happening, though, is that the apprenticeship route is failing to find much growth with students from ethnic minority backgrounds straight out of school and college. While the percentage of White students pursuing this route is edging up, the percentage of Black or Asian students is either stalled or lagging behind.

This is not to say that the numbers of students from these backgrounds taking apprenticeships have not increased; they have

apprenticeships-blog-post-2

but that, as a percentage of the total number of students from each of those backgrounds leaving KS4 & KS5, their progression is being outpaced by their White counterparts.

The reasons for this are going to be numerous and complex for different groups.

In Luton we have a diverse student cohort drawn from the local community, and this wider trend in apprenticeship progression is reflected in the 2015 destination figures for the town. Of the four schools with the highest percentages of students whose first language is not English (from Icknield High School at 64.9% to Denbigh High at 94.6%), only one reaches a 4% progression rate to apprenticeships, while the figures for two of the others are so small they are suppressed to protect confidentiality. There is an attainment factor in play here: all four of those schools are in the top six in the town for academic progress. 86% of Denbigh's leavers progressed to the local sixth form college. The reasons for this huge majority are not only that these students (and their families) see this route as desirable but that the grades they have achieved make it available. We also know that, comparatively, white working-class students (particularly boys) underachieve in their academic progress and so find academic routes post-16 harder to access. Does this mean that this ethnic group has fewer routes to pursue, so a greater percentage opt for apprenticeships?

Is it a dissuading factor that the majority of apprenticeships on offer are at qualification levels perceived to be accessible to (slightly) lower academic achievers while offering little challenge or progression to higher achievers?

apprenticeships-blog-post-3

Another favourite tactic from policy makers when discussing destinations is to revert to the safe waters of “a lack of aspiration” to explain low progression figures, but I've blogged in the past about how this is a sound bite hiding a more nuanced situation. Could it be that students from White backgrounds actually have greater social capital around apprenticeships and are more able to access the networks of support needed to gain a foothold into this route?

Whatever the reasons, it seems that schools and colleges are not currently impacting on the trend for students from Asian and Black backgrounds to access apprenticeships despite a current, diverse advertising campaign. The numbers of young people of all backgrounds accessing apprenticeships needs to increase but the messages are so far struggling to reach all of the students in our schools.

The stories told and not told by school Destination data

A common theme throughout all of the recent commentary on the state of CEIAG in schools has been that the publication of Destination statistics for all schools is a ‘good thing.’ In the modern world, the argument goes, transparency of outcomes for schools should not just rely on qualifications gained by students but on the stability and suitability of the progress those students then make in their first steps beyond the school gates.

With this in mind, I wanted to post something concentrating purely on the Destination data of my own school's leavers to show how this does and does not offer insight when looking at figures at school level.

I’ll be using 4 sets of Destination data to give some context.

Firstly, there is the data currently on the DfE performance tables website. This relates to our 2009/10 leavers.

Second, is the data for the 2010/2011 cohort that is due to be published on the performance table site in June.

So, what to notice between those two? The trend in our numbers to FE seems to be falling while the numbers to Sixth Form College are rising; Apprenticeships are steady and “Destinations not sustained” are falling. The FE and Sixth Form trends have the biggest swing in numbers, so could tell the story of a more definitive trajectory. The Apprenticeship and Not Sustained numbers are pleasing, but I'm wary of hanging out the bunting because, as you can see from the second table, the numbers of students involved are small. One or two students either way and those percentages alter significantly.
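To show just how sensitive these percentages are at school scale, here is a quick illustration (the cohort and counts below are made up, not my school's actual figures):

```python
# With a cohort of a few hundred leavers, each individual student moves the
# headline percentage noticeably. Illustrative numbers only.
cohort = 180
for apprentices in (3, 4, 5):
    print(f"{apprentices} students -> {apprentices / cohort:.1%}")
# 3 students -> 1.7%
# 4 students -> 2.2%
# 5 students -> 2.8%
```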

A hugely important factor to bear in mind is that this data is based not on a snapshot but on an extended time period. As the guidance tells us, participation is determined as enrolment for “the first two terms (defined as October to March) of the year after the young person left KS4”, and not sustained destinations are defined as “young people who had participated at an education destination during the academic year but did not complete the required six months participation.” There is much to commend in the longer-term measurement being used here, which more thoroughly tests a school's CEIAG legwork to suitably place their students post-KS4. A negative consequence of this more considered approach, though, is the sheer amount of time that has to be allowed before publication to let the students travel through the system. The most recent set of data above covers students who left us 3 years ago. 3 years can be a lifetime of change in a school, with new initiatives, new curriculum, staff turnover, leadership changes, new priorities and events, so to use this to judge that school in the here and now seems a little redundant.

The third set of data, for our 2011/2012 cohort, is from our Local Authority who, alongside their Youth Service partners, work their way through enrolment lists, phone calls and house visits to get all of the stats which the DfE then utilise in future.

The first thing to notice is that some of the destination terms are not the same, which immediately causes issues in comparison. Compared to the first two sets of data, the trend away from FE routes and towards Sixth Form (not differentiated between school Sixth Form and Sixth Form College here) reduces but continues. The NEET category (not known in the DfE data) is pleasing again (with the same caveat as above), while the Part Time Education numbers are odd and appear towards the larger end of the local spread (more about this below). They lead to another concern: any conclusions we draw are only as sound as the data collection and entry job that went before them.

The biggest difference in the data sets is that the Local Authority data is a snapshot taken on the 1st of November 2012, just a few short weeks after the GCSE results. If published then, the immediacy of this data could provide interested parties such as Ofsted or parents much more reactive numbers on which to judge local secondary schools, but this immediacy could also cause problems. Any snap measurement could offer a warped view of a reality that would produce very different data if captured on a different date (were the statistics exactly the same on the 2nd of November?) and perhaps not highlight gradual drop-out as those learners went through the first term of their KS5 routes. To combat this and to show trends, the Authority repeat the exercise the following April with the same year group, and the results of this follow-up snapshot for the 2012 leavers are in the columns on the right below.

Clearly the largest change between November and April is that the Part Time Education number now reads zero and the number of Apprenticeships has jumped by the same number, to 12. How much of this change can be attributed to data entry decisions or to the steady progress of our leavers securing apprenticeships in the year after leaving school would only be known to those with local knowledge of our alumni. It's a tale not told in the stats.

So, what can we learn from all this data?

1) The considered publication timeframe on the DfE performance tables has both good and bad sides for judging school performance

2) When you drill down to school level, the numbers of actual students involved moving from category to category can be small enough so that only a few students fluctuating between them can significantly impact the percentages

and that

3) Trends in destination growth or reduction for different routes can only be properly identified with multiple data sets over a longer period

If Ofsted and stakeholders such as parents are to get the most out of Destination data in its current form, a considered and measured view and a desire to understand the stories behind the figures really will be required.

 

The updated (April 2014) Statutory Careers guidance is here

Published today we get:

A Press Release:

https://www.gov.uk/government/news/pupils-to-be-advised-by-employers-to-pursue-ambitious-careers

The actual Statutory guidance:

https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/302422/Careers_Statutory_Guidance_-_9_April_2014.pdf

and some supplementary guidance with extra information, some wonderful case studies of work in schools, a Q&A section and some helpful web links:

https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/302424/Careers_Non-Statutory_Departmental_Advice_-_9_April_2014.pdf

Read, enjoy and debate but my initial thoughts are that no school leader could read those documents and claim “not to know” what to do for Careers work in their school.

2013 Luton Destination Statistics: Fewer 16 and 17 year olds in full-time outcomes

 

Continuing an intermittent series of posts concentrating on the Destination data of Luton school leavers (previous posts here and here), this post looks at the number of 16 & 17 year olds in Luton who were in full-time education or employment in December 2013.

At this time 4,940 16 & 17 year olds were known to the Local Authority, and of these 88.2% were in full-time outcomes. This is a fall of 3.3 percentage points against December 2012, the only fall in participation in the Eastern Region, and compares to a national figure of 89.8%. This regression is also disappointing against the positive news across the country as Local Authorities and schools move to comply with the Raising of the Participation Age legislation.
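For clarity, the arithmetic behind those headline figures (assuming, as the regional comparisons imply, that the 3.3% fall is in percentage points):

```python
# The Luton headline figures above, worked through.
known_cohort = 4_940
rate_dec_2013 = 0.882
print(f"In full-time outcomes: ~{known_cohort * rate_dec_2013:,.0f}")  # ~4,357
print(f"Implied December 2012 rate: {rate_dec_2013 + 0.033:.1%}")      # 91.5%
print(f"Gap to the national figure: {0.898 - rate_dec_2013:.1%}")      # 1.6%
```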

Of those learners only 1.4% were on an apprenticeship route, the second lowest in the Eastern Region (behind Southend), which shows that while awareness of this route is growing, the gap between positions sought and positions acquired is still huge. 84.1% were still in full-time education.

Delving deeper, it's clear where the fall in participation can be attributed. At 16, both girls and boys show healthy participation figures (95.8% and 96.6% respectively), but at 17 something happens and the figures drop to 82.2% for girls and 78.3% for boys. This is the worst participation rate for 17-year-old boys in the Eastern Region. There could be a number of reasons for this: students are signing up to unsuitable courses to begin with (i.e. the Careers advice is poor), the safety nets to stop them dropping out of Key Stage 5 provision are weak, or the provision itself is not engaging enough to keep them enrolled in the first place. It's a conundrum that needs solving.