I think it’s safe to say that the Gatsby Benchmarks have been a game changer in the CEIAG world. Not because they offer new ways of working with young people or reinvent the purpose of CEIAG but because they included practice with a solid research backing and standardised what a comprehensive careers programme in schools and colleges looks like. This standardisation has enabled other stakeholders and professionals to quickly understand a common rationale and purpose behind such a programme and so buy in to the shared goals.
The Benchmarks have been supercharged in some areas of the country with the support of local, CEC-supported Careers Hubs. With the second wave of 20 Hubs coming online in September 2019, the CEC has been keen to show evidence of pacy progress towards meeting the technical goals around the Benchmarks, and also to highlight that this work is happening in disadvantaged areas of the country.
Schools and colleges in this first wave of Careers Hubs are already outperforming the national average across all aspects of careers education. After two terms, schools and colleges which are part of the first wave of Hubs are:
- outperforming the national average on every single one of the eight Gatsby Benchmarks of good careers guidance
- in a majority of cases (58%), providing every student with regular encounters with employers
- in a majority of cases (52%), providing every student with workplace experiences such as work experience, shadowing or workplace visits
Most striking is that improvements are strongest in disadvantaged areas, including in Careers Hubs located in Tees Valley, Lancashire, the Black Country and Liverpool City Region.
There are two issues which should provoke some discussion about the work of Hubs.
1. Starting from a higher base point
The CEC Prospectus for the Careers Hubs bidding process was clear on the criteria areas had to meet to put forward successful bids.
It would logically follow then that Compass data for those schools and colleges involved in Hubs should be, on average, at a lower point than schools and colleges not in Hubs at the start of the scheme. Sure, other factors such as destinations and achievement feed into the definition of Cold Spot areas, but CEIAG and employer engagement provision is a central metric. There may also be some individual exceptions of providers offering highly Gatsby-compliant provision within those Cold Spot areas of course but, taken in the round, if the self-reported Compass data is a consistent picture of practice and provision then it makes sense for the initial Hub Compass data to be below the national average. Yet this wasn’t the case. Using the July 2018 data (the left-hand blue bars) from the CEC tweet below
and comparing it to the nationwide State of the Nation figures from 2018
we can see that the Hubs were already reporting a higher percentage of schools and colleges meeting every Benchmark than the national average (apart from one – Benchmark 3) before the Hub scheme had even begun. The CEC is right to say in its press releases that by March 2019, Hub schools and colleges were
outperforming the national average on every single one of the eight Gatsby Benchmarks of good careers guidance
but what they don’t include is that this was the case for all but one of the Benchmarks before the Hubs had even started work.
This is concerning for the questions it raises about the reliability of the Hub awarding process and of Compass as a self-evaluation tool, but it should also prompt queries for the CEC over the pace of progress of those institutions involved in Hubs. Is it easier to roll the CEIAG snowball down the far slope once it’s already closer to the summit?
2. The more you know, the more you doubt
At the recent National Careers Leader Conference in Derby I was fortunate to attend some brilliant sessions, including this from Dr Jill Hanson, who is undertaking the Gatsby Pilot evaluation for iCeGS. I posted about the interim report back in March 2019 and it was great to hear about the positives the Pilot has resulted in. After two years of the pilot, young people at pilot schools and colleges were more likely to recall participating in CEIAG provision
and the 2018 cohorts reported much higher scores on a career readiness index
with a clear correlation between higher readiness scores and attendance at providers which had fully achieved more Benchmarks.
A cause for concern, though, comes in responses from the same students who completed the career readiness index in both 2016 and 2018. These show significant drops in pupil confidence in career management and planning, and in information and help-seeking skills, but not in work readiness skills.
As Tom Staunton notes in Dr Hanson’s slides, there could be a number of overlapping explanations for this. In the room, the practitioners present concluded that this might be a case of young people being introduced to a wider variety of routes, which had pushed them beyond their comfort zone and, in doing so, reduced their confidence and certainty in the routes they had previously been aware of (if any). If suddenly the world seems larger, your place in it will seem smaller. This is a theme which was described in the previous CEC research “Moments of Choice” and it will be interesting to see a) whether this trend in the data continues and b) what steps the providers involved should take to address the issue (if any). Could remedying work come through personal guidance offering more support to those students reporting a lower level of confidence in those areas, or through more “nudge”-based interventions aimed at groups? Or nothing at all?
Up-scaling a model such as the Gatsby Benchmarks comes with pitfalls to avoid, particularly the temptation for providers to over-rate their progress or reach for box-ticking solutions that don’t translate into substantive outcomes for learners. As Sam Freedman notes here
about a different education policy proposal, compliance isn’t always the full recipe, and the intangibles that can help make a good school CEIAG programme (parental relationships, drive of the practitioner, heck, even office placement in the school) are difficult to measure. The forthcoming Compass Plus has the potential to address some of those issues as it more closely ties provision to self-evaluation.
Regarding the negative effects on student confidence in their future planning skills, the results of the Careers Registration Learning Gain project in Higher Education are a useful longitudinal comparator. Using a similar method (asking students to complete a career readiness set of questions year on year), these show that more mature learners can move towards higher-rated career-ready states. By the final year of a degree, an additional 18.28% of students reported themselves to be in the Compete category (see NICEC Journal, April 2019). Could it be that the less confident younger students Dr Hanson found are a perfectly natural, even desirable, outcome of Gatsby-compliant CEIAG provision and that confidence in career planning only comes with greater maturity? Should CEIAG practitioners in schools revel in the fact that their students are less confident about their route but more aware of the diversity of options? These are fascinating questions that we have the potential to find answers to as the Gatsby Benchmarks standardise provision across the country.