Face 2 Face IAG

CEIAG in Post 16 providers – a survey

Over the years of writing this blog, the annual omnibus survey from the DfE has always offered a useful insight into the scale of CEIAG provision across the country. Until now, I hadn't realised that the DfE also undertakes a Post 16 version of the survey, the most recent release of which includes plenty of interesting information about the scale and types of provision offered by providers.

[Image: FE omnibus survey]

The first point to make about the results is that respondents appear to come from a wide variety of providers across the FE landscape (Table A1: 421 responses), and overall it’s heartening to see just how widespread the range of CEIAG work is across the Post 16 stage.

[Image: FE omnibus survey chart 1]

The rise in employer encounters since 2017 was noted by the CEC, which is looking for signs of the impact of its work.

The figures that most surprised me, though, come from the breakdown of types of provision by type of institution.

[Image: FE omnibus survey chart 2]

My assumption would be that FE Colleges offer more employer encounters for students than Sixth Forms attached to schools. Employer engagement is a central tenet of the mission of FE providers and the qualifications they offer and, in my experience at least, the range and scale of employer engagement there is much more frequent and in-depth than what you would expect in a school with a small sixth form, but that does not seem to be the case here.

The other interesting data point is the scale of difference between students at different providers participating in university visits, but this comes with a word of warning. There is some confusion across the document in the way this question is worded: fig 10.1 phrases it “All students considering university have had at least two visits to universities” while fig 10.2 uses “University applicants have had at least two visits to universities.” The difference appears subtle, but for an FE College, which will have a significant proportion of its student population studying qualifications at Level 2 and below, the wording of this question could elicit significantly different results from respondents.

Elsewhere in the survey, it is heartening to see CEIAG provision taking centre stage in respondents’ thinking when detailing their “activities to encourage students to have high aspirations or to help them achieve their potential.”

[Image: FE omnibus survey chart 3]

Careers Teams in Sixth Forms, FE Colleges, UTCs & Studio Schools would be involved in the organisation or delivery of all of those types of provision in some way. Leaving aside the continual mischaracterisation of disadvantaged young people as having “low aspirations,” when research shows that they have high aspirations but lack the tools and the social and cultural capital to enact them (pdf), this data shows Post 16 Careers Leaders how best to frame their offer to explain its value to Senior Leaders. The areas where offering provision would bring the most benefit can be found in the responses to the next question, “Barriers faced by post-16 institutions in raising aspiration within the student population.”

[Image: FE omnibus survey chart 4]

Many of these are structural barriers (e.g. the cost of continuing education, socio-economic factors), but others are barriers that Careers Teams can help tackle with clear messaging: for example, using Martin Lewis’ campaign materials to tackle some of the myths around Higher Education tuition fees and assuage student fears over the impact of these costs, or offering to play a central role in parental engagement events and activities.

Wide-scale tracking of CEIAG provision is valuable for noting how the impacts of policy or changes in accountability focus ripple through the system, and these annual surveys from the DfE are an important data point for achieving this. Another survey that may interest you, or benefit from your involvement, is the CEC survey of Careers Leaders in schools, which will hopefully report interesting data on the workforce change that the Careers Strategy and the DfE Careers Guidance for schools have influenced, so get your response sent if this is applicable to you. A similar survey for FE Careers Leaders is planned for later this year.


The Destinations data still isn’t there for the Gatsby pilot

It has now been three and a half academic years since the North East Gatsby pilot kicked off with the aim of implementing the Gatsby career benchmarks in a range of education providers. The Pilot was funded by the Foundation to the tune of £9,000 per school or college, plus the central support offered by an appointed Pilot Lead.

Any impacts and lessons of the pilot should be key indicators for both the CEC and the DfE as they monitor the value of the Gatsby benchmarks in improving CEIAG provision. The fact that the Pilot is already being replicated (with lower funding levels) across the country as the CEC rolls out its Careers Hubs programme should only reinforce this need.

To learn these lessons, iCEGS have been commissioned to conduct an evaluation of the pilot which aims to

document a systematic attempt by education providers to implement all eight Benchmarks and establish what impacts might result from the implementation of the Benchmarks.

and, last month, an interim report was published giving an update on the findings of the evaluation.

The interim report finds that the schools and colleges involved in the Pilot self-reported that they did expand or improve their CEIAG provision in line with the Benchmarks.

[Image: iCEGS interim report chart]

This finding is not itself cross-referenced against any external accountability measure of the schools’ or colleges’ CEIAG provision, such as Ofsted reports. On a positive note though, the report does show that improvement (again, with the self-reported caveat) is possible across all types of education establishment. It seems that the clarification of CEIAG offered by the Benchmarks and the categorisation of the success criteria can be moulded to work across the variety of providers in the English education system, which is a major positive for policy makers looking for whole-sector improvement strategies.

The report also finds that students across the providers responded with “significantly higher career readiness scores,” which is an important variable to measure but, of the potential impacts, not the one I would imagine holds the most sway with policy makers. For that to be the case, further work would be needed here to show a link between higher career readiness scores and actual employment and earnings outcomes for young people, much like the now very well-known employer engagement research from the Education & Employers taskforce.

The report also notes that, during the Pilot evaluation period, the schools involved reported an increase in the number of A–C GCSE grades, but that it is not possible to draw any conclusions from this as there is no historical data to judge trends against and no causal ties to the Gatsby pilot can be established. The success rates of FE College learners are not mentioned, nor is any impact on the attendance statistics of pupils in the area.

The interim report then praises the role of the Pilot area facilitator in

Creating a community of shared knowledge, networks and practice

which is clearly a benefit that a supported Pilot scheme could enable, but one that would cause many professionals working in Local Authority skills development to wonder how it differs from their own work, which, in my experience at least, usually includes a termly forum where school and college leaders or careers leaders can meet.

Lessons

Perhaps the part of the report most useful for other Hub Leads and Careers Leaders is the Emergent Challenges section (page 14). The fact that FE Colleges will struggle with Benchmark 8 (Personal Guidance) won’t be news to any practitioner grappling with the requirement to offer this highly individual service within current funding envelopes, but the issue of tracking provision against the Benchmarks is one which I think the CEC does need to be on top of. Its Tracker tool has the basis to offer much support in this area, but the wide range of approaches providers take to inputting activities will soon cause reporting problems. I’ve seen examples of a school inputting every careers lunchtime drop-in session it runs, meaning huge numbers of repetitive provisions being added which boost the total number of activities in a Benchmark.

Destinations

This then leaves the elephant in the room. The destinations data of the actual students participating in all of this provision is not mentioned in this interim report, but this is due to the timescales involved and it will be referenced in the final report. Many of the pupils attending those providers during the pilot will not have left yet and, for those that have, only one year’s worth of data (2017) has been published by the DfE. I blogged here about that 2017 data and the lack of perceivable impact on those positive destination indicators, so it will be interesting to see what the final report concludes with a researcher’s eye looking at a more complete data picture.

Conclusion

At the moment the Pilot evidence shows that providers are reporting that their provision has grown and their institutional behaviours have changed because of the Gatsby Benchmarks, and that pupils are more confident about their career readiness. These are small rewards for the sheer scale of system change that has occurred with the formation of the CEC and its subsequent policies (including Hubs, Careers Leaders and employer encounters); the evidence for actual outcomes for students is still lacking. What this does prove to policy makers, though, is that system change in education is possible (without much funding) if the provision aims are formalised and grouped into ‘benchmark’ style targets. It seems that schools and colleges, despite their protestations to the contrary, do like to be measured on some of the things they do.


An important destinations distinction

In October 2018 the DfE published “Destinations Data: Good Practice guide for schools” which, as DfE guidance documents go, is a snappy publication that sets out how schools and colleges should approach collecting destination information from their learners, the duties placed on Local Authorities in this area, where this information is then published and how it can be used to adapt provision.

The important section that I wanted to highlight for this post is the definition of “Destinations data” vs “Destinations Measures”, a distinction I had never considered before. I will now endeavour to adhere to these definitions in future posts and discussions about destinations, and I hope that other practitioners join me in sticking to them.

  • What is Destinations data?

[Image: DfE definition of Destinations data]

  • What are Destinations Measures?

[Image: DfE definition of Destinations Measures]

This is important because, as the Gatsby benchmarks and the Careers Strategy gain momentum and Ofsted continue to inspect CEIAG provision in schools, positive destination data will become more of a badge of honour for schools keen to show they are taking careers work seriously. Differences could then arise between what a school claims as its destination data and what is published by the DfE and included in its performance tables, as the school’s data may rely on leavers’ intended destinations while the DfE data looks back at sustained destinations.

In fact this has already happened with UTCs, who have long claimed that extremely positive destination data is a significant benefit of their model of education, only to recently have their claims undermined by the more robust and historically confirmed DfE Destination Measures. As the DfE Measures record

the number of students who have been in a sustained destination for six months in the year after finishing key stage 4 or 16- 18 study (from October to March, or any six consecutive months for apprenticeships). The headline accountability measure at both key stage 4 and 16-18 study is the proportion of students staying in education or employment for at least two terms

they will be a much better reflection of the actual destinations of learners.

It is important that schools do not solely use their own data to evaluate their CEIAG provision but use Destination Measures as well; comparison between the two may also highlight useful factors (for example, if many learners were intending to secure apprenticeships but then did not, or if learners from disadvantaged backgrounds were struggling to progress). It is also vital that Ofsted inspectors brief themselves on the historical trends in a school’s Destination Measures before an inspection, which may show steady progress in leavers securing more apprenticeships or other positive and sustained destinations, something that would reflect well on the school’s careers work.

So, from this point on – Destinations data = a school’s intended or checked leaver destination figures. Destination Measures = the DfE published figures.

Going for a job interview in the States around Halloween has its own special challenges

“An actual interview going on at the office. We take Halloween very seriously.”

[Image: interview panel in Halloween costumes]

Some of the comments:

I too had an interview on halloween last year. Yes, the interviewers, all four, were wearing costumes ranging from sorceress to tin man.

I interviewed 3 years ago at a company during their halloween dress up day. The people interviewing me were dressed up as Forrest Gump and Lt Dan.

I’m honestly not certain what the Careers profession can do to prepare clients for this!

via r/funny

Finding yourself

[Image: “finding yourself” motivational quote]

From Reddit Get Motivated

It’s an appealing piece of writing and will ring true for many reflecting on their career journey, but I’m not sure I completely agree with it in terms of careers theory.

It certainly chimes with the growth and exploration phases of Super’s five life and career development stages, but it seems to start from the position that the individual is returning to a fixed, preordained destination that was always within them. It is a position that retreats from Dweck’s (rightly much-criticised) Mindset work.

The idea is positioned closely to Gottfredson’s Circumscription and Compromise theory, which includes the self-reflection and actualisation of the quote, but Gottfredson also promotes the need for the individual to be exposed to a wider range of experiences to aid the reflection process.

The idea in the image is closer to Parsons’ “trait and match” view of the client as an immovable object who, before navigating the labour market along their favoured journey and perhaps even finding a satisfying destination, takes the time to undergo a process of deeper self-realisation to understand their aptitudes and interests. How experience has shaped these traits is not discussed, only that they are present and can be tested. This theory has its critiques in respect of its assumptions about the stability of the labour market and the stability of the traits of the client. The quote in the image proposes the latter and discounts the influence of experience gained along the journey, and for that reason it loses value for me.


Using images & visual starters in Career Guidance

With a h/t to @CareersResearch, I found these examples from Katherine Jennick, about her practice of using visual starters and images in her 1:1 IAG work with her cohort of Key Stage 3 & 4 clients, very interesting.

The discussion with Liane Hambly (from around 17 minutes in the video below) is an excellent CPD resource and, I would imagine, a very useful resource for any leaders of the Level 6 Career Guidance and Development Diploma. Give it a watch.


In 2018/2019, UCAS will be charging schools & colleges for a poorer service

One of the regular annual outgoings from a school or college’s CEIAG budget is the set of fees to access the various services and registrations on the UCAS advisers website. For education institutions there is no choice but to use UCAS to administer their learners’ Higher Education applications, and this is reflected in the zero charge to become a registered UCAS centre. Where the charges from UCAS do start to rack up, though, is with the extra services on offer to track the progress of the offers, replies and acceptances your learners make. These are useful tools for tracking the destinations of learners, the offers they received and how your institution compares to competitors, but they come with an individual or packaged price tag.

[Image: UCAS Adviser Track fees]

Paying for a service that helps write destination reports and offer a better service to learners is perfectly reasonable. What will cause consternation to those paying for these services from the 2019 application cycle is the fact that the data they rely upon may be incomplete.

Advisers signing into the 2019 portal will be greeted with this:

[Image: UCAS 2019 adviser portal sign-in notice]

Which, as I asked UCAS,

means that, from now on, any reports offered by UCAS may be based on incomplete data as learners may not have opted in to share their post application progress with their centre.

Of course, GDPR is an important piece of legislation that has fundamentally reframed the way individuals regard the use of their data both on and off the internet, and UCAS cannot ignore it. What it seems they are willing to ignore, though, is that they will be charging educational institutions a fee for what will be, in effect, a poorer service and product. They also seem oblivious to the potential knock-on customer service effect this will have on learners: many will approach the source of IAG in their school or college post-application, only for the Adviser to have no way of checking their application unless the applicant signs in to UCAS Apply/Track themselves. I can see this significantly increasing the number of calls to UCAS support lines as school-based IAG advisers find themselves unable to offer much post-application IAG because they cannot see the learner’s application.

Schools & colleges should be aware of this change and will have to do their best to encourage their learners to opt in to sharing their post-application progress, but this will only go so far. Many learners complete their form in their own time, away from school or college, and so will go through the terms & conditions section without an Adviser present.

For Careers Leaders in Colleges, writing their Higher Education destinations reports next summer will be much more of a headache than in previous years.