The latest publication from the Careers & Enterprise Company (CEC), continuing its expanding library of research, is State of the Nation 2017: Careers and enterprise provision in England’s schools, published earlier this month. By adopting the “State of the Nation” title also used by the Social Mobility Commission’s annual updates (and so helping affirm the CEC’s aims with policy makers), this publication shows the Company moving on from earlier releases that audited the CEIAG landscape to a new stage of reporting on progress made.
The report is based on 578 responses from secondary schools that have completed Compass, the online careers programme auditing tool, and compares this data set with the data collected for the original Gatsby Good Career Guidance report in 2014.
The CEC makes a number of claims from this exercise, but the accompanying media coverage focused on the responses indicating an improvement in school provision since 2014, as more schools report that they are meeting more benchmarks.
There is evidence of improvement since the original Gatsby survey in 2014. Schools in 2016/2017 are achieving an average of half a Benchmark more than they were in 2014/2015 (1.87 versus 1.34). The proportion of schools not achieving any Benchmarks has fallen by one third from 31% to 21%. The proportion of schools achieving half the Benchmarks has more than doubled from 6% to 16%.
This sounds positive, but these figures should be treated with caution and, like the rest of the report, taken in the round alongside other data. These are the points I found most interesting in the report:
1. This is a small number of schools and a narrow method of evidence collection
As can be seen in the Appendices, the 2014 Gatsby report used multiple sources of evidence to form its benchmarks, recommendations and costings. Six overseas visits took place, with interviews conducted with practitioners, policy makers and stakeholders in those countries. Visits to and interviews with six independent schools also added to the evidence base, as did a review of eighteen previous reports on CEIAG provision. Finally, an online survey was completed by 361 secondary schools in winter 2014.
As a baseline, 361 schools (from approximately 3,329 secondary schools at the time) is a thin slice, so it is positive that 578 have used the Compass tool, but this is still small. The 2014 figures included only 9 schools then judged as Requiring Improvement by Ofsted; the 2017 report does not include this figure. In 2017 there are 3,408 secondary schools in England, so 578 responses equate to roughly 17% of secondary schools.
2. This is based on self-evaluation
Asking any professional if they do a good job isn’t going to get objective responses. Both the 2014 and 2017 reports are careful to point out that questions of validity could arise both from the bias of the overall sample (those taking the time to complete the survey could be more likely to be interested in CEIAG, for example) and from responses being overly generous about the CEIAG provision on offer in their establishment (via the Overconfidence Effect).
None of this data relates to outcomes. No students are asked by an objective third party for their view of provision, no destination data is monitored, no LEO data cross-referenced, no employers surveyed. Self-evaluation via online questionnaire is an extremely limited (but cheap) method of providing reference points and evaluating progress.
This is typified by the inclusion of one of the case study schools, which reported itself to be meeting “seven or eight” of the Gatsby benchmarks. Looking at the most recent KS4 destination data (2015) for that school, you can see that, on all of the measures a school with a strong CEIAG offer should be performing well on, the school isn’t:
- The proportion of pupils staying in education or employment for at least 2 terms after KS4 is 86%, well below the 94% average for English state-funded schools
- The proportion of pupils not staying in education or employment for at least 2 terms after KS4 is 11%, well above the 5% national average
- The percentage of KS4 leavers moving into Apprenticeships is 3%, half the national average of 6%
It’s important to remember that behind all of these statistics are actual students, each with their own story, background and challenges to overcome, but these are not statistics that highlight the positive social-justice levelling work of CEIAG.
The report references these omissions on page 26 and makes the somewhat valid point that
One limitation of attainment and progression data is that it is backward looking and thus if we look for relationships between the Compass data and outcomes, we are comparing one cohort’s career provision with another cohort’s outcomes
and concludes that the destination data sources mentioned above could be correlated with Compass data over a longer period of time. This would enable any relationships between consistently high-quality CEIAG provision and student outcomes to be found. This is an admirable goal to be supported in future, but it isn’t how accountability in education works. Ofsted gradings are held by schools for years after the inspection took place: a young person leaving Year 11 this summer might have attended an “outstanding” school whose verdict was based on provision inspected seven years ago. There is always a lag between the monitoring of provision and actual provision.
3. Further bad social mobility vibes
Another of the included case studies is also a little tone deaf for an organisation that is keen to show it is playing its role in the Government’s social mobility agenda through the Opportunity Area policy. Including Simon Langton Girls Grammar School, a selective-entry school whose pupils (including the 5.6% eligible for free school meals) must take the Kent Procedure for Entrance to Secondary Education tests to enrol, is at odds with the overall aim of both the document and the CEC. The CEIAG work at Simon Langton might be exceptional, and it certainly features prominently on their website, but this is not helping disadvantaged pupils. Areas with selection at age 11 fail the poorest children, and the CEC should steer clear of involving itself in work that perpetuates these outcomes.
4. If the survey responses are to be believed, then Quality Mark Awards are far too generous
The 2017 survey data reports that schools holding a Careers Quality Mark (now all joined together in the Quality in Careers Standard) achieve a higher number of Gatsby benchmarks than schools without, but that this still only reaches an average of 2.63 of the 8 benchmarks. This is a blow to those who advocate that Quality Marks are a valid indicator of provision quality. The results of a self-reported survey, with all the biases mentioned above, indicate that these schools’ CEIAG provision does not meet the benchmarks the externally monitored Quality Marks claim it does. That there is so little congruence between these results is evidence that Careers Quality Mark assessment and monitoring processes have not been anywhere near stringent or demanding enough and need to improve. As the report says
As the Quality in Careers Standard works towards aligning fully with the Benchmarks we would expect to see schools achieving the Quality in Careers Standard reaching all eight Benchmarks
but this will be a challenge to achieve for a service paid for by the very schools that volunteer to be inspected.
Showing the impact of the type of strategic work the CEC is involved with is always going to be difficult. With so many stakeholders involved in the delivery of provision and so many factors influencing the outcomes for young people, concentrating on the input factors to begin with is sensible, but, due to a total reliance on self-evaluation, this approach also has its downsides. Over the forthcoming months I would expect the CEC to transition towards using more quantitative data sources on which to base its judgments of progress.