Gatsby benchmarks

Where are Careers Hubs starting from and ending with?

I think it’s safe to say that the Gatsby Benchmarks have been a game changer in the CEIAG world. Not because they offer new ways of working with young people or reinvent the purpose of CEIAG, but because they included practice with a solid research backing and standardised what a comprehensive careers programme in schools and colleges looks like. This standardisation has enabled other stakeholders and professionals to quickly understand the common rationale and purpose behind such a programme and so buy in to the shared goals.

The Benchmarks have been supercharged in some areas of the country with the support of local, CEC-supported Careers Hubs. With the second wave of 20 Hubs to come online in September 2019, the CEC has been keen to show evidence of pacy progress towards the technical goals around the Benchmarks, and to highlight that this work is happening in disadvantaged areas of the country.

Schools and colleges in this first wave of Careers Hubs are already outperforming the national average across all aspects of careers education. After two terms, schools and colleges which are part of the first wave of Hubs are:

  • outperforming the national average on every single one of the eight Gatsby Benchmarks of good careers guidance
  • in the majority of cases (58%), providing every student with regular encounters with employers
  • in the majority of cases (52%), providing every student with workplace experiences such as work experience, shadowing or workplace visits

Most striking is that improvements are strongest in disadvantaged areas, including in Careers Hubs located in Tees Valley, Lancashire, the Black Country and Liverpool City Region.

There are two issues which should provoke some discussion about the work of Hubs.

1. Starting from a higher base point

The CEC Prospectus for the Careers Hubs bidding process was clear on the criteria areas had to meet to put forward successful bids.

cec hubs 1

It would logically follow, then, that Compass data for those schools and colleges involved in Hubs should be, on average, at a lower point than schools and colleges not in Hubs at the start of the scheme. Sure, other factors such as destinations and achievement feed into the definition of Cold Spot areas, but CEIAG and employer engagement provision is a central metric. There may also be some individual exceptions of providers offering highly Gatsby-compliant provision within those Cold Spot areas of course but, taken in the round, if the self-reported Compass data is a consistent picture of practice and provision then it makes sense for the initial Hub Compass data to be below the national average. Yet this wasn’t the case. Using the July 2018 data (the left-hand blue bars) from the CEC tweet below

and comparing it to the nationwide State of the Nation figures from 2018

state of the nation2

we can see that the Hubs were reporting a higher percentage of schools and colleges already meeting every Benchmark than the national average (apart from one – Benchmark 3) before the Hub scheme had even begun. The CEC is right to say in its press releases that by March 2019, Hub schools and colleges were

outperforming the national average on every single one of the eight Gatsby Benchmarks of good careers guidance

but what they don’t include is that this was the case for all but one of the Benchmarks before the Hubs had even started work.

This is concerning for the questions it raises about the reliability of the Hub awarding process and of Compass as a self-evaluation tool, but it should also prompt queries for the CEC over the pace of progress of those institutions involved in Hubs. Is it easier to roll the CEIAG snowball the rest of the way once it’s already closer to the summit?

2. The more you know, the more you doubt

At the recent National Careers Leader Conference in Derby I was fortunate to attend some brilliant sessions, including this one from Dr Jill Hanson, who is undertaking the Gatsby Pilot evaluation for iCeGS. I posted about the interim report back in March 2019 and it was great to hear about the positives the Pilot resulted in. After two years of the pilot, young people at pilot schools and colleges were more likely to recall participating in CEIAG provision

icegpilot1

and the 2018 cohorts reported much higher scores on a career readiness index

icegpilot2

with a clear correlation between higher readiness scores and attending providers that had fully achieved more Benchmarks.

A cause for concern, though, comes in responses from the same students who completed the career readiness index in both 2016 and 2018. These show significant drops in pupil confidence in career management and planning and in information and help-seeking skills, but not in work readiness skills.

icegpilot3.JPG

As Tom Staunton notes in Dr Hanson’s slides, there could be a number of overlapping explanations for this. In the room, the practitioners present concluded that this might be a case of young people being introduced to a wider variety of routes that had pushed them beyond their comfort zone and, in doing so, reduced confidence and certainty in the routes they had previously been aware of (if any). If suddenly the world seems larger, your place in it will seem smaller. This is a theme which has been described in previous CEC research (“Moments of Choice”) and it will be interesting to see a) whether this trend in the data continues and b) what steps the providers involved should take to address the issue (if any). Could remedial work come through personal guidance offering more support to those students reporting a lower level of confidence in those areas, or through more “nudge”-based interventions aimed at groups? Or is nothing needed at all?

Going forward

Up-scaling a model such as the Gatsby Benchmarks comes with pitfalls to avoid, particularly the temptation for providers to over-rate their progress or look for tick-box solutions that don’t translate into substantive outcomes for learners. As Sam Freedman notes here

about a different education policy proposal, compliance isn’t always the full recipe, and the intangibles that can help make a good school CEIAG programme (parental relationships, the drive of the practitioner, heck, even office placement in the school) are difficult to measure. The forthcoming Compass Plus has the potential to address some of those issues as it more closely ties provision to self-evaluation.

Regarding the negative effects on student confidence in their future planning skills, the results of the Careers Registration Learning Gain project in Higher Education are a useful longitudinal comparator. Using a similar method (asking students to complete a career readiness set of questions year on year), these show that more mature learners can move towards higher-rated career-ready states. By the final year of a degree, an additional 18.28% of students reported themselves to be in the Compete category (see NICEC Journal, April 2019). Could it be that the less confident younger students Dr Hanson found are a perfectly natural, even desirable, outcome of Gatsby-compliant CEIAG provision and that confidence in career planning only comes with greater maturity? Should CEIAG practitioners in schools revel in the fact that their students are less confident about their route but more aware of the diversity of options? These are fascinating questions that we have the potential to find answers to as the Gatsby Benchmarks standardise provision across the country.


The numbers in the Careers Hubs Benchmark 8 progress stats are pretty wild

Launched in September 2018 with 20 Hubs across the country (plus the original North East pilot area), the Careers & Enterprise Company is now expanding this policy with another 20 Hubs. When they launched, I was positive about the structure of support they would be able to offer local areas and could see the rationale behind expanding the North East pilot, but was concerned that the funding model those schools and colleges enjoyed was not also being replicated. The initial wave of Hubs covers locales across the country:

  1. Black Country – 36 schools and colleges
  2. Bucks Careers Hub – 21
  3. Cornwall – 40
  4. Cumbria – 40
  5. Greater Manchester – number of schools & colleges involved not clear
  6. Heart of the south west – 40
  7. Humber – 26
  8. Lancashire – number not clear
  9. Leeds City Region – 35
  10. Leicester – 20
  11. Liverpool City Region – 34
  12. New Anglia – 32
  13. North East – 40 (plus 10 colleges?)
  14. Solent – 32
  15. South East – ?
  16. Stoke – 20
  17. Swindon – 40
  18. Tees Valley – 35
  19. West of England – 25
  20. Worcestershire – 40
  21. York – 35

The CEC says the total number of schools and colleges involved is 710.

As we reach the end of the first academic year of their existence, the CEC claims that schools and colleges in those Hubs are progressing faster towards meeting the Gatsby Benchmarks than schools and colleges not located in Hubs, and that large proportions of them are already meeting a number of the Benchmarks.

 

The CEC’s chart shows rapid improvement in the percentage of Hub schools and colleges reporting that they are fully meeting Gatsby Benchmarks. Within those figures, though, a truly eye-opening amount of work must be happening.

d6r-s2mw4aab-8f

Let’s take one benchmark in particular – Benchmark 8, Personal Guidance. The claim from the CEC is that 61% of Hub schools and colleges are reporting that they are fully meeting this Benchmark.

The school guidance for this Benchmark is clear that, to achieve it, every pupil should have a guidance interview with a Careers Adviser by the age of 16 and, if the school has a Sixth Form, another if required by the age of 18.

gatsby8

In the Sixth Forms & Colleges guidance the wording is slightly different, taking into account that students can complete Entry, Level 1, Level 2 or Level 3 study programmes at different ages up to 19. The age of the student isn’t the limiting factor, just as long as the IAG interview occurs during the learner’s study programme.

fegatsby8

But the aim remains the same: every young person gets a 1:1 Careers meeting with a qualified professional.

Across the 710 schools and colleges in the Hubs it’s hard to find published figures for the exact number of schools and the exact number of dedicated post-16 providers (I’ve included the total number of providers for each Hub above where I could find it), but whatever those figures are, the CEC is now claiming that 61% of Hub providers are fully meeting Benchmark 8. This is extraordinary in itself, but what I find even more remarkable is that 56% of those providers were reporting that they were already fully compliant with Benchmark 8 back in July 2018, before the Hubs started. That is a very high level of provision in terms of pupil numbers.

DfE data shows that, on average, there are 948 pupils in a secondary school.

Across the 20 Hubs let’s say, conservatively, that 700 of the 710 participants are secondary schools. That gives a total school pupil population of 663,600.

That leaves around 10 Sixth Forms or Colleges (in reality, it’s likely that these post-16 providers make up a greater number) and these providers can vary tremendously in size. For example, Sunderland College has around 4,800 full-time learners, while Sixth Form Colleges have, on average, 1,823 students and school Sixth Forms are even smaller at 202 students on average.

Sunderland College was part of the North East pilot Hub so I’ll include its learners but be conservative on the other participants and say the rest are smaller Sixth Form Colleges. That would result in a total of 21,207 post-16 learners included in the Benchmark 8 figures.

So the total number of students covered by the Hubs = 684,807 pupils (although this is likely to be larger)

If 61% of providers are now reporting fully meeting Benchmark 8, and we assume those providers are of average size, then that’s approximately 417,700 young people in those 20 areas who have had a Careers interview. In July 2018, before the Hubs started, approximately 383,500 (56%) of young people were having a Careers interview. This is a huge amount of careers and guidance provision occurring in those localities.
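The back-of-envelope arithmetic above can be sketched in a few lines of Python. Every input is this post’s own assumption (average roll sizes, the 700/10 provider split, and treating providers meeting the Benchmark as average-sized), not an official figure:

```python
# Rough estimate of pupils covered by Benchmark 8 in the first-wave Hubs.
# All inputs below are the post's assumptions, not official figures.
avg_secondary_roll = 948       # DfE average secondary school size
n_secondary = 700              # assumed secondary schools among the 710 providers
sunderland_learners = 4800     # Sunderland College full-time learners
avg_sfc_learners = 1823        # average Sixth Form College size
n_other_post16 = 9             # remaining post-16 providers, treated as SFCs

school_pupils = avg_secondary_roll * n_secondary                         # 663,600
post16_pupils = sunderland_learners + n_other_post16 * avg_sfc_learners  # 21,207
total_pupils = school_pupils + post16_pupils                             # 684,807

# Applying provider percentages to pupil numbers assumes the providers
# meeting Benchmark 8 are of average size.
meeting_now = round(total_pupils * 0.61)        # ~417,700 pupils at 61%
meeting_july_2018 = round(total_pupils * 0.56)  # ~383,500 pupils at 56%
print(total_pupils, meeting_now, meeting_july_2018)
```

Small changes to any of these assumptions (more post-16 providers, larger colleges) would shift the totals by tens of thousands, so the figures are indicative only.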

There should be huge lessons for those practitioners in the rest of the country to learn from these figures.

  1. What was the practice and structure already in place that allowed that 56% of providers to already meet every one of their students for a Careers interview? Consider that Hub areas were chosen specifically in response to the CEC’s own Cold Spots research, which was meant to indicate a dearth of careers provision.

cec 2

There should be learning opportunities here for the CEC as well, as their Personal Guidance Fund is another pot of money looking to support innovative practice in this area of CEIAG. Their publication in the “What works” series on Personal Guidance shows, though, that there are not many shortcuts to providing provision in this area, and how time- and cost-intensive Personal Guidance is by its very nature.

personal guidance1

In a 948-roll secondary school, a Year 11 cohort would equal around 190 pupils. Seeing five of those pupils a day for a Careers interview would take around 38 days, or over 7.5 weeks, so this is a significant staffing allocation, and that is just one year group. As a practitioner in an FE College with around 3,000 full-time students attending, I am another Careers Leader looking for ways to offer a guidance service that meets all of the quality points above but is also flexible enough to maximise capacity.
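As a sanity check on that staffing arithmetic (five interviews a day and an even split of pupils across five year groups are assumptions, not prescribed figures):

```python
import math

roll = 948          # average secondary school size (DfE)
year_groups = 5     # Years 7-11, assumed evenly split
per_day = 5         # assumed guidance interviews an adviser can hold per day

cohort = round(roll / year_groups)           # ~190 Year 11 pupils
adviser_days = math.ceil(cohort / per_day)   # 38 working days of interviews
weeks = adviser_days / 5                     # 7.6 working weeks
print(cohort, adviser_days, weeks)
```

That is more than half a term of one adviser’s time for a single year group, before any follow-up interviews, drop-ins or group work are counted.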

Hopefully the CEC is learning from those providers in the Hub areas how, despite scoring poorly on the Cold Spot metrics, over half of them were already able to achieve Benchmark 8.

2. How does that level of provision compare to providers outside of Hub areas?

Other sources offer insights but not directly comparable data. The most recent DfE omnibus survey (surveying pupils in Years 8 to 11) reports that 47% of (under-16) pupils say they experienced a face-to-face guidance session

face to face iag

while the 2019 Youth Employment Survey (3,008 young people aged 14–24) reports that 67% of young people had an interview with a Careers Adviser.

face to face youth iag

The most recent CEC State of the Nation report shows that 48% of all schools and colleges completing Compass reported that they were fully meeting Benchmark 8 in their first submission, but this figure had risen to 55.4% on second rating.

personal guidance2

So the Hub areas were already starting from a higher base than the rest of the country before the Hubs had even started.

3. Is this stable and should a new Benchmark 8 rating be submitted by the provider every year?

Deirdre Hughes asks this here for Benchmarks 5 & 6, but her question is equally applicable to Benchmark 8.

 

 

Each academic year will bring new students for a school or college to work with, and many things (loss of staff, internal restructures, an expanding school roll) could result in a provider not maintaining 100% compliance with Benchmark 8. Could the percentage of providers meeting Benchmark 8 in a Hub area fall as well as rise?

4. What changes have led to the increased capacity to offer more Careers interviews, or attain more take-up of them, since the Hubs started?

Is it more schools and colleges dedicating more staffing to this provision, or something else?

It will be interesting to see how the new Hubs add to the lessons the CEC is learning over the next academic year, and whether the rate of progress against the Benchmarks continues, particularly in areas which require high resource allocation.

Primary CEIAG and the preparation for choice

At the beginning of March, Damian Hinds reannounced £2m of funding for the CEC to research and invest in CEIAG in primary schools. This sparked a comment piece by the Headteacher blogger Michael Tidd which argued against the initiative. It’s worth saying that Tidd’s concerns seem to fall into two categories: 1) resources are tight and the funding for this initiative is small, so any impact will be slight, and 2) careers education is not a priority at this stage of education. Leaving aside the zero-sum view of resourcing (just because Primary CEIAG receives some funds, it doesn’t automatically follow that other areas shouldn’t or can’t receive funds), it’s his view on CEIAG that this post will concentrate on.

Tidd asks,

what are we hoping that 10-year-olds will take from these new lessons? I think many primary children have no idea what they want to do when they grow up – and I think that’s okay. Primary education shouldn’t be about preparation for the world of work.

And then goes on to reason that

The world of careers is enormous, and there should be no hurry to make any decisions. It’s bad enough that we force young people to deliberately narrow their curriculum at 14; I certainly don’t want children to be ruling anything in or out any sooner

I understand that media articles have tight word counts, so complexity and subtlety can be lost, but I’ll take Tidd at his word and offer the following in rebuttal. The first point is that primary schools report they are already offering CEIAG provision to pupils

primary ceiag

through a range of activities. So this funding is not for squeezing new things into crammed timetables but for improving the efficacy of provision that is already happening.

Second is the need to tackle this conceptual view of Primary (or any) CEIAG as only a mechanism of immediate choice, as this is a damaging and false starting point for the aims and outcomes of good CEIAG provision. Some CEIAG provision does enable and facilitate choices, but other provision lays the groundwork for them. This has long been advocated by the Education & Employers Taskforce charity, who established their Inspiring the Future offshoot, Primary Futures, to achieve just this.

Here the provision is framed not as choice-limiting or as insisting on choices being made, but as a method for expanding and broadening horizons. The CEC publication “What works: Careers related learning in primary schools” draws together much of the nascent research in this field to evidence why this is the correct approach.

The evidence suggests that career related learning in primary schools has the potential to help broaden children’s horizons and aspirations, especially (though not exclusively) those most disadvantaged.

Some of the challenges that all CEIAG provision aims to overcome are laid out:

Robust longitudinal studies have shown that having narrow occupational expectations and aspirations can, and do, go on to influence the academic effort children exert in certain lessons, the subjects they choose to study, and the jobs they end up pursuing. Research has also shown that the jobs children aspire to may be ones that their parents do, their parents’ friends do or that they see on the TV and/or social media.

The passages (page 2) which describe how young children base their career knowledge and aspirations on their close circle of influencers (social capital), conceive their view of their place and opportunities in society (cultural capital) and establish their belief in their ability to determine their own outcomes against other factors (identity capital) lucidly offer the rationale for careers provision at primary school age. The argument that Primary CEIAG is not beneficial because young minds would subsequently preclude routes falls away, as the very rationale for informed Primary CEIAG provision is for young minds to expand routes and options.

How these aims can be achieved is explained in detail in a recent LKMco/Founders4Schools report, “More than a job’s worth: Making careers education age-appropriate”. In its sections covering the rationale and design of CEIAG provision at secondary and post-16 level, the report retreads much ground already covered by the CEC’s What Works series and the original Gatsby report. Where the report adds value to the ever-increasing library of CEIAG publications, though, is in its clear direction for practitioners as to what sorts of provision could be offered to children of different ages.

lmkco report1

The inclusion of the 2–4 pre-school age group caused enough of a stir to get media coverage, which also tended towards Tidd’s take on the concepts being discussed.

Finally, it’s worth saying that I agree with the concern around the narrowing of options (read: curriculum) at 14, as the benefits of continuing with a broader curriculum for longer are well evidenced. Where I would disagree with Tidd is that the methods and age-appropriate delivery of CEIAG provision the LKMco publication outlines might actually prove to have benefits for students once they reach the later stages of secondary schooling. At these Moments of Choice (to use the CEC terminology), when students currently struggle through a complex choice system without the skills and knowledge to navigate that choice architecture, the pay-off from the horizon-broadening and stereotype-challenging Primary CEIAG work he disparages could be evident.

The Destinations data still isn’t there for the Gatsby pilot

It has now been three and a half academic years since the North East Gatsby pilot kicked off with the aim of implementing the Gatsby Benchmarks in a range of education providers. The Pilot was funded by the Gatsby Foundation to the tune of £9,000 per school or college, plus the central support offered by an appointed Pilot Lead.

Any impacts and lessons of the pilot should be key indicators for both the CEC and the DfE as they monitor the value of the Gatsby benchmarks in improving CEIAG provision. The fact that the Pilot is already being replicated (with lower funding levels) across the country as the CEC rolls out its Careers Hubs programme should only reinforce this need.

To learn these lessons, iCeGS has been commissioned to conduct an evaluation of the pilot which aims to

document a systematic attempt by education providers to implement all eight Benchmarks and establish what impacts might result from the implementation of the Benchmarks.

and, last month, an interim report was published giving an update on the findings of the evaluation.

The interim report finds that the schools and colleges involved in the Pilot self-reported that they did expand or improve their CEIAG provision according to the Benchmarks

icegs report1

This finding is not itself cross-referenced against any external assessment of the schools’ or colleges’ CEIAG provision, such as Ofsted reports. On a positive note, though, the report does show that improvement (again, with the self-reported caveat) is possible across all types of education establishment. It seems that the clarification of CEIAG offered by the Benchmarks and the categorisation of the success criteria can be moulded to work across the variety of providers in the English education system, which is a major positive for policy makers looking for whole-sector improvement strategies.

The report also finds that students across the providers responded with “significantly higher career readiness scores”, which is an important variable to measure but, of the potential impacts, not the one that I imagine would hold the most sway with policy makers. For that, further work would be needed to link higher career readiness scores to actual employment and earnings outcomes for young people, much like the now very well-known employer engagement research from the Education & Employers Taskforce.

The report also notes that, during the Pilot evaluation period, the schools involved reported an increase in the number of A–C GCSE grades, but that it is not possible to draw any conclusions from this as there is no historical data to judge trends against and no causal ties to the Gatsby pilot. The success rates of FE College learners are not mentioned, nor is any impact on the attendance statistics of pupils in the area.

The interim report then praises the role of the Pilot area facilitator in

Creating a community of shared knowledge, networks and practice

which is clearly a benefit that a supported Pilot scheme could deliver, but one that would cause many professionals working in Local Authority skills development to wonder how it differs from their own work which, in my experience at least, usually includes a termly forum where school and college leaders or Careers Leaders can meet.

Lessons

Perhaps the part of the report most useful for other Hub Leads and Careers Leaders is the Emergent Challenges section (page 14). The fact that FE Colleges will struggle with Benchmark 8 (Personal Guidance) won’t be news to any practitioner grappling with the requirement to offer this highly individual service within current funding envelopes, but the issue of tracking provision against the Benchmarks is one which I think the CEC does need to be on top of. Their Tracker tool has the basis to offer much support in this area, but the wide range of approaches to how providers input activities will soon cause reporting problems. I’ve seen examples of a school inputting every lunchtime Careers drop-in session it runs, meaning huge numbers of repetitive entries being added which inflate the total number of activities in a Benchmark.

Destinations

This then leaves the elephant in the room. The destinations data of the actual students participating in all of this provision is not mentioned in this interim report, but this is due to the timescales involved and it will be referenced in the final report. Many of the pupils attending those providers during the pilot will not have left them yet and, for those that have, only one year’s worth of data (2017) has been published by the DfE. I blogged here about that 2017 data and the lack of perceivable (positive) impact in those destination indicators, so it will be interesting to see what the final report concludes with a researcher’s eye looking at a more complete data picture.

Conclusion

At the moment the Pilot evidence shows that providers are reporting that their provision has grown and their institutional behaviours have changed because of the Gatsby Benchmarks, and that pupils are more confident about their career readiness. These are small rewards for the sheer scale of system change that has occurred with the formation of the CEC and its subsequent policies (including Hubs, Careers Leaders and employer encounters). The evidence for actual outcomes for students is still lacking. What this is proving to policy makers, though, is that system change in education is possible (without much funding) if the provision aims are formalised and grouped into ‘benchmark’-style targets. It seems that schools and colleges, despite their protestations to the contrary, do like to be measured on some of the things they do.

 

The 2017 student destinations of the original Gatsby pilot group

With the recent release by the DfE of the 2016/17 Destinations Data, I thought it would be a useful exercise to look at the data for those institutions that were involved in the original Gatsby Benchmarks pilot to see how that improvement in CEIAG provision is affecting student outcomes.

All of the 2017 Destination Data used for this post is sourced from the DfE KS5 & KS4 tables (revised editions) here. Any 2015 Destination Data is sourced from the DfE KS5 & KS4 tables for that cohort which can be found here.

In the original North East pilot, which started in September 2015, 16 providers (including 3 Further Education providers) used the Gatsby Benchmarks to assess and plan their own provision. With the support of a LEP-appointed area lead and £9,000 of central funding for each institution, they made significant progress in improving their CEIAG offer against the Benchmarks.

In 2015, 50% of the schools and colleges in the pilot achieved no benchmarks, but after two years of hard work over 85% now reach between six and eight benchmarks.

I’ve taken the Destinations Data for those institutions from the DfE tables above and put them in their own Excel table (with the national and regional North East figures), which you can download here > gatsby providers destinations

You can also compare that data against the trends in nationwide Destinations Data in table 1 of the accompanying report to the 2017 release.

national destinations data

Destinations Data

Each year’s Destinations Data is a snapshot of a cohort of leavers, so it is always wise to a) not draw too definitive a set of conclusions and b) place it in the context of regional and historical Destinations Data where possible. In my table above I have also included the regional figures from 2015 and 2017.

You will also have your own personal approach to using Destinations Data as a tool. I think that (with the above caveats) it is useful for judging the impact of CEIAG work. If a school is enabling leavers to progress into sustained destinations that cover the variety of routes, and perhaps even buck regional or national trends, then I am much more convinced by the efficacy of its CEIAG provision.

We can see that for 2017 KS4 leavers, the Gatsby schools were under-performing on overall sustained destinations against both 2017 regional and national averages. In fact, the schools’ achieved average of 89% in a positive sustained destination has been left behind nationally since the 2012/13 leavers cohort (table 1). The percentage of KS4 leavers securing an Apprenticeship (5.8%) is a touch above the national average, but only in line with the regional average and below the 2015 regional average of 8%. Perhaps the effects of the Apprenticeship Levy, and the lag it has caused in young people securing apprenticeships, are shown here. Elsewhere, the destination-not-sustained average of 9.5% is higher than both the regional and national averages (excluding alternative provision providers) and the 2015 regional figure. The percentage of learners moving on to Further Education or Sixth Form providers varies and can depend heavily on the locally available institutions students can travel to and their offer, so not much value can be drawn from those data points.

At KS5, the three institutions involved offer a more mixed story. (It is worth noting at the outset the clear size differences between the institutions involved: Bishop Auckland College had only 60 KS5 leavers in the data while Sunderland College included 1,082.) The Gatsby group’s 79% of leavers transitioning into any positive sustained destination is below both regional and national averages, while its 9% of learners moving into apprenticeships is above both comparison rates. The greatest distinction can be found in the destination-not-sustained results, as an average of 16% of students not achieving a sustained destination is well above regional and national averages.

Conclusions

With the roll-out across much of the country of both the Gatsby Benchmarks (as part of the Careers Strategy and DfE school and college guidance) and the Hub structure, I would expect that most officials within the DfE will be wanting to see the green shoots of a more sustained and significant impact on positive student destinations in the original pilot area. These may yet come, as the 2017 Destinations Data is only looking at the second cohort of leavers to exit KS4 or KS5 since the start of the pilot area’s Gatsby journey. But the desire for improvement in CEIAG provision must come with goals. Benchmarks are either a method of standardising provision types that has an impact on outcomes or they’re not. All CEIAG practitioners (and, I would guess, researchers) are aware of the difficulty of capturing the value of CEIAG work; so much happens in a young person’s life that can have an impact on the journey they take. But if we all really believe that CEIAG can have a positive impact on those young people, that comes with the responsibility of accepting that some metrics will be valued by policy makers. Currently, one of those metrics isn’t moving.

Our Further Education Careers Programme statement (v2)

With the move to a new College, one of my first jobs has been to update our public facing Careers pages on the College website. According to the DfE FE Careers Guidance this should include a Careers Programme Statement that I have previously blogged about here.

This is content that doesn’t sit naturally alongside the main purpose of a Further Education website, as so much of this public facing tool is dedicated solely to marketing the College. Usually written and designed by a dedicated marketing team, the website is an important part of the recruitment drive that all FE Colleges must consistently undertake to prosper.

While the majority of the rest of the site is dedicated to informing but also enticing potential learners to use the College, the Careers Programme Statement, as dictated by the Guidance, has to be written

in a way that enables learners, parents, college staff and employers to access and understand it.

So it has slightly different audiences to reach, but ultimately all with the same wider strategic goals of a modern FE College in mind: to inform your local community of the work the College does with its students, and to invite collaboration and integration with that community, including the local labour market.

I am seeing plenty of schools start to upload similar Statements to their websites, but examples from FE Colleges are currently a little sparser. If you know of any, please drop a link in the comments below.


The importance of trust

Working with young people (and their parents, but more on that later) as a Careers Adviser/Leader often means assisting them as they traverse points of transition. Be it across key stages, subject changes, institution changes or into a whole new sector of the labour market, CEIAG practitioners are often the face of the possibilities on offer during the preparation phase of a transition. For the young person, this often means moving from a place of comfort, where the rules and expectations (and short cuts) are known and familiar, into a space with new rules, new people and new codes of expected behaviour.

This is where “trust” becomes a vital factor. If the CEIAG practitioner is valued by the young person as a “trusted” source, then the preparation work can aid the transition from the initial considerations and research through to the choices and decisions needed to overcome the worry of uncertainty. That is why this graphic

is so applicable to CEIAG work with young people and one that I’ve thought about following a few recent CEIAG news events.

A number of recent surveys have clearly reported just how much influence parents/guardians have over the career and transition decisions of young people, despite their lack of current knowledge of educational pathways and up-to-date labour market information.

For CEIAG practitioners working in schools, the message here is that the practitioner should be positioning themselves as a “trusted” source to both parents and young people.

That requires time and work in building relationships. At the recent Education Select Committee, Ian Mearns reiterated his belief that Careers Advisers from outside schools are best placed to ensure impartiality when offering IAG to young people, as the incentives to keep learners within their own organisations are simply too strong. To support this, he referred to recent Careers & Enterprise data that seems to indicate that schools with Sixth Forms offer weaker Careers provision to their learners. What this model of IAG finds more challenging to achieve than in-house Advisers, though, is the time and presence required to build relationships, and so the “trust” needed to actually influence the decision-making of young people and parents/guardians.

We all know the worth of the Gatsby Benchmarks, but one of the most significant indicators of the impact of a school’s CEIAG programme is the amount of trust that parents and pupils have in their Careers Leader.