
The CEC is heading into tricky strategic waters


Since its inception, I would hope that this blog has been viewed as demanding but fair to the Careers and Enterprise Company. While some of their early work seemed more suited to the corporate sphere than to the transparent world of the public sector, they have since been given a wider remit by the Government, weathered what has been (in my view) some grandstanding but empty criticism from Robert Halfon, and expanded their offer to schools and colleges through Careers Hubs, online tools and other funding streams. With this context, my position is that the sector should welcome the DfE funding careers work and tasking the CEC with looking at a fuller variety of careers provision, rather than just the original remit of facilitating employer encounters. The DfE Guidance for Schools and Colleges has done much to focus attention and to add impetus and importance to the CEC in the minds of school and college senior leaders. This work should be continued and built upon further. My fear for the future, though, is that the CEC is having to stray into tricky political waters.

Targets

As an indicator of their increased transparency, the CEC now publishes its annual grant funding letter. This sets out the clear targets and expectations of the DfE for the CEC and indicates the funding allocated to each strand of work.

The DfE has determined those targets to have value, and some of the data points around training, allocation of funds and sharing of best practice seem sensible for system improvement, but overall these outcomes are very technical and input-based. Key performance indicators such as “55% of schools and colleges in the Wave 1 Careers Hubs fully achieving Gatsby Benchmark 6” or “70,000 young people reached in Wave 1 Opportunity Areas by August 2019” are admirable in their specificity and adherence to the Gatsby research, but they also bring a danger for the CEC, as they are ultimately lacking in both political and public impact.

Since its inception the CEC has received over £95 million (£24.3 million of that for 2019/20). The issue here is not the figures themselves (careers work needs funding) but what happens when Ministers have to justify previous and future expenditure. The DfE will need to present outcomes for this flagship policy both to political audiences (at Education Select Committees and in Parliament) and to the public, through achievements that, they hope, will resonate with voters at election time.

We have already seen the CEC struggle to articulate their progress and achievements at two Education Select Committee sessions, where the questions focused on the need to prove outcomes for students while Claudia Harris and Christine Hodgson’s answers relied on data showing the input provision that had been enacted. In Parliament, and in previous speeches by Ministers, there has been confusion over the aims of the CEC. This mismatch between the expectation of delivery and what is achieved is what will prove tricky for the CEC to manage.

The need for a compelling message

In 2016 I attended a session at an Education and Employers research conference where two ex-DfE civil servants spoke about the need to distil research and outcomes down to the simplest, most concise summary possible so that Ministers can digest and cascade it. They did not quite advocate Trumpian levels of “put in as many pictures as possible”, but their reasons as to why the “4 or more employer engagements” research broke through so successfully are worth the attention of the CEC when considering promoting their work to MPs and the public.

The narrative battle

However, the CEC is tasked with showing progress against those very technical key performance indicators in their grant funding letter. Previously they achieved this through annual State of the Nation reports, but they have now released data which goes further by showing progress against the Gatsby Benchmarks broken down to Local Enterprise Partnership level. This shows a contrasting picture across the country, with the top-performing areas made up largely of coastal and economically disadvantaged communities, while the bottom is made up almost exclusively of affluent counties.

This (with the caveat that Compass is self-reported data) is a positive picture, indicating a large swell of change in CEIAG provision levels for young people. Unfortunately, it does not translate to the mantra of keeping your outcomes simple and easily understood. Compare that positive picture based on the Gatsby Benchmarks, and the accompanying TES article from Anne Milton, with other policy and research data released in the very same week as the CEC LEP-level data. First came the Impetus Youth Jobs Report, which utilised the LEO dataset:

In March 2017 (the latest date we can analyse using the data we have access to) 26% of disadvantaged young people were NEET, compared to 13% of their better-off peers. This is the equivalent of around 78,000 additional disadvantaged NEETs aged 18-24. Looking at the same data from the opposite end of the lens, 26% of NEETs were from disadvantaged backgrounds, despite being only 16% of the population

and that

A disadvantaged young person is about 50% more likely to be NEET in the North East compared to London

This was soon followed by the 2018-19 State of the Nation report from the Social Mobility Commission. The key findings are stark and easily summarised:

  • The better off are nearly 80% more likely to end up in professional jobs than those from a working-class background.
  • Even when people from disadvantaged backgrounds land a professional job, they earn 17% less than their privileged colleagues.
  • Social mobility has remained virtually stagnant since 2014. Four years ago, 59% of those from professional backgrounds were in professional jobs, rising to 60% last year.
  • In 2014 only 32% of those from working-class backgrounds got professional jobs, rising marginally to 34% last year.
  • Those from working-class backgrounds earn 24% less a year than those from professional backgrounds; even when they get a professional job they earn 17% less than more privileged peers.
  • By age 6 there is a 14% gap in phonics attainment between children entitled to free school meals and those more advantaged.
  • By age 7 the gap has widened to 18% in reading, 20% in writing and 18% in mathematics.
  • Only 16% of pupils on free school meals attain at least 2 A levels by age 19, compared to 39% of all other pupils.
  • Twice the number of disadvantaged 16 to 18 year olds are at further education colleges compared to sixth forms, and this segregation within the education system has risen by 1.2% since 2013.
  • Student funding for 16 to 19 year olds has fallen 12% since 2011-12, and is now 8% lower than for secondary schools (11 to 15 year olds), leading to cuts to the curriculum and student support services that harm disadvantaged students.
  • Graduates who were on free school meals earn 11.5% less than others 5 years after graduating.

The accompanying coverage resonated through articles across the media (some examples here and here) and gave enough political leverage for it to be raised at PMQs.

It’s worth reminding ourselves that these two reports and the CEC publication are talking about the very same disadvantaged communities, yet through very different lenses. Of course, the CEC is reviewing current trends in provision, which may not have an impact on outcomes for pupils for many years, while adhering to reporting against its key performance indicators. Translating those KPIs when explaining the positive outcomes of their work to audiences without a CEIAG specialism is a huge hurdle for the CEC, as they have to:

  1. explain what the Benchmarks are
  2. explain why they are good to achieve
  3. show that they are helping schools and colleges achieve them
  4. then outline the impact on positive outcomes for students in those disadvantaged communities.

I fear this means that achieving positive traction with politicians and the public will be extremely difficult.

Politically tricky waters

The political future of the country is currently highly unpredictable, but the CEC must be conscious of the need to persuade future Governments (of any colour ribbon) of their value. Labour’s education policy of a National Education Service is outlined in broad strokes, without clarity on the need for, or role of, a CEC-type organisation. But whichever party is in power to make decisions on funding, they will not make those decisions merely on research and evidence, but on research, evidence and outcomes that have been successfully communicated. If the CEC continues to constrain themselves to communicating the value of their work only in line with their key performance indicators, they will soon find themselves outmanoeuvred by those able to use other statistics and research to paint a much more negative picture of the current state of CEIAG provision in disadvantaged communities, undermining any positive progress made.


Careers apps

Back in December 2018, the DfE announced two winners of a £300,000 funding pot for digital IAG tools that would help young people make more informed decisions about university, in regard to both provider and subject. Now, in April 2019, both of the tools have launched.

Offering IAG through digital platforms to young people has a mixed track record (we’ll always have the memory of Plotr) but practitioners know they can be a fundamental resource to use with your client base.

The two apps take different approaches to how they inform and advise their users with the Higher Education data now available. It’s worth saying that both platforms are (at the time of writing) still in beta testing, so improvements in design, layout and usability will be ongoing, and any judgments should be made with that in mind. The first app, ThinkUni, is described as a “personalised digital assistant” bringing together data on universities, courses and financial outcomes that are easy to explore and compare, and comes from a team with a good track record of offering social-mobility-enhancing schemes through the Brilliant Club. The current site looks fairly basic, with text drop-down boxes asking users about their preferences for university study (city or campus, course reputation or earnings outcomes, etc.).

[Screenshot: ThinkUni]

On first impressions, there isn’t much to be impressed with here. The very first question assumes that the user knows what subject they want to study, relying on a baseline that simply isn’t there for a lot of young people, and the site then assumes that the user will be studying A Levels, so widening participation also doesn’t seem to be a concern (I’m sure that other Level 3 qualifications will be incorporated into the site at some point, but soft-launching without them isn’t a good look). The “digital assistant” selling point is also overplayed, with the course suggestion results being much the same as would result from a search using advanced filters on the UCAS search facility. If the user already knows their views on subject, location, course type etc. to input, then why not just go, or be directed, to the source site? Currently, the “assistant” part of ThinkUni seems extremely inactive.

The other competition winner comes from The Profs, a team who have previously built a professional-tutor-finding platform, and is a much more interactive-looking experience. “The Way Up” game tasks users with picking an avatar and making choices on learning and earning routes at each step of their journey.

[Screenshot: The Way Up game]

This approach develops a greater sense of ownership of the process and the results, as the user is able to modify the route to reflect their own interests while still following the linear structure of the game. The interface isn’t the most aesthetically astounding you’ll see, and I also thought that some of the presented LMI was easy to miss on screen but, once you notice it, the format does incorporate a significant amount of LMI data into each stage. I also think that the biggest learning gain for young people using the platform might not be regarding their career choice or route, but the realistic balance to be found when budgeting monthly incomings and outgoings.

As a format for simulated learning, turn-based point-and-click games were also used back in the days of late-2000s Aimhigher university visits, when one of the regular activities was a web-based game that allowed secondary school students to take control of a new university student avatar and make choices about their study, work and social life. The implications of those choices were displayed in a character health chart which valued balance above partying too hard or studying too much. The user was able to see the realistic choices on offer and the consequences of those choices, and reflect on how they would react in that possible environment. So the format isn’t new, but the inclusion of the LMI and HE data is.

The “Way Up Game” is designed to have the widest possible capture point, so that it includes career routes and choice options for lots of young people. At the more specific and detailed end of the simulation market, flight and even truck-driving simulations are PC games that can require high-end computers to run, given the amount of detail their fan base demands, while still offering career learning opportunities. More accessible versions of this format can be found in sector-skills-funded apps such as Construction Manager from the CITB. Allowing users to take charge of a construction business, hire employees, pitch for contracts and then take on those jobs, all presented within a Sims-type graphical interface, makes for an engaging career learning experience. Place these alongside digital diagnostic tools and digital communication tools and there is a rich variety of online CEIAG resources.

Research

Research evidence on the value of digital and online IAG experiences offers some guidance to both of the creative teams on what could help their products have the impact they are looking for with users.

Two excellent summaries of research in this area are the CEC What Works edition “Careers Websites” and this recent webinar from Tristram Hooley, “Approaches to online guidance”.

Neither of the two apps offers any links to expanding social networks or sharing results, so building users’ social capital does not seem to be on the agenda.

The CEC document references research from Dunwell et al (2014), which evaluated the MeTycoon careers game and found that

87% of participants said playing the game had given them new career ideas and 66% said they had shared or discussed the game with friends.

The format of “The Way Up Game” more closely matches MeTycoon, so those developers will be hoping for that level of impact with their users. The ThinkUni platform perhaps gains research backing with its slight nod towards the user involving CEIAG professionals in the findings from using the site. The CEC summary states:

The use of careers websites should be integrated into schools’ careers education provision, and may be more effective for pupils when use, at least initially, is mediated and supported by careers and education professionals.

Once the user has contemplated their suggestions, the final screen of ThinkUni suggests:

[Screenshot: ThinkUni’s assistant prompt]

This is only a very slight prompt, though. The user is not asked, for example, if they wish to email their results to a named individual, which could be a CEIAG professional or school tutor. Perhaps both developers would benefit from designing accompanying session plans that could enable teachers and CEIAG practitioners to use the apps in group sessions and build upon the learning experiences of the young people in the room. A further step could even be to incorporate “nudge” techniques by communicating with both user and professional, so conversations could occur to see if further research tasks have been undertaken by the user. As it stands, neither of the platforms requires the involvement of CEIAG professionals in the learning journey of the user.

This failure to build in the involvement of practitioners places both of the apps well behind more detailed digital offers such as Start Profile. This program combines user-led personalisation and exploration of career routes, with LMI drawn from LMI for All, and the ability for practitioner oversight and involvement. As this iCeGS evaluation of Start concludes:

Start builds on much of what the existing evidence base tells us about the efficacy of online products. It brings together information and advice for a young person and allows them to personalise their learning journey. It offers a blended learning technology in which the school can connect the online learning to classroom based career learning. It builds on longstanding career assessment practices by building a personal profile of work preferences, qualities, skills and interests and using this to match users to jobs and learning opportunities based on their suitability and how available those jobs are in the labour market.

Differences do remain, though, between Start Profile and these two new apps in their data sources. LMI for All utilises a range of sources (detailed on page 10 here), but they (and so Start Profile) do not seem to include data from the Office for Students on HE access, continuation, attainment and progression.

By side-stepping CEIAG professionals, both apps are purely user-focused offers, but this could still have a positive impact. The CEC Moments of Choice research concluded that young people desire the presentation of careers data that:

[Graphic: the Moments of Choice requirements for careers data]

and it would be fair to conclude that both apps achieve at least seven of those requirements to varying degrees. Young people can access the data in a method that is convenient to them, when they require it, be safe in the knowledge that it is using reliable sources, receive suggested future actions and be able to personalise it. Only the involvement of influencers is missing.

International comparisons

These formats for offering HE-focused CEIAG learning are also available in other countries. For example, Australia has Campus Quest, which offers users two games: Campus Quest, based on a student attending a university campus, and E-Study Quest, based on a student studying from home.

The graphical interface is slightly more interesting than those of both of the new UK apps, and in particular the 3D presentation is more eye-catching than “The Way Up” game’s.

Value

For the DfE to offer funding, policy makers must hope that any resulting resources will add value to the marketplace of existing CEIAG digital products, either by successfully filling a niche or by building upon existing products. For me, the two apps currently (still at testing stage, remember) do neither, and they also choose to set aside a proportion of the research in this area. It may be more politically satisfying for the DfE to achieve a new CEIAG platform through this process, but questions should be asked about whether a more worthy platform could have been achieved through the adaptation of existing products, and about how any resulting products are able to fit into, adapt to and positively shape the current CEIAG landscape supporting young people.

CEIAG in Post 16 providers – a survey

Over the years of writing this blog, the annual omnibus survey from the DfE has always offered a useful insight into the reality of the scale of CEIAG provision across the country. Until now, I did not realise that they also undertake a post-16 version of the survey, the most recent release of which includes plenty of interesting information about the scale and types of provision on offer by providers.

[Image: FE omnibus survey]

The first point to make about the results is that respondents do appear to come from a wide variety of providers across the FE landscape (Table A1: 421 responses), and overall it’s heartening to see just how widespread the range of CEIAG work is across the post-16 stage.

[Chart: FE omnibus survey, figure 1]

The rise in employer encounters since 2017 was noted by the CEC, who are looking for signs of the impact of their work.

The figures that most surprised me, though, come from the split of type of provision by type of institution.

[Chart: FE omnibus survey, figure 2]

My assumption would be that FE Colleges would offer more employer encounters for students than sixth forms attached to schools. Employer engagement is a central tenet of the mission of FE providers and the qualifications they offer. In my experience at least, the range and scale of employer engagement is much more frequent and in-depth than what you would expect in a school with a small sixth form, but that seems not to be the case here. The other interesting data point is the scale of difference between the students at different providers participating in university visits, but this comes with a word of warning. There is some confusion across the document in the way this question is worded: fig 10.1 phrases it “All students considering university have had at least two visits to universities” while 10.2 uses “University applicants have had at least two visits to universities.” These differences appear subtle, but for an FE College, which will have a significant proportion of its student population studying qualifications at Level 2 and below, the wording of this question could elicit significantly different results from respondents.

Elsewhere in the survey, it is heartening to see CEIAG provision taking centre stage in respondents’ thinking when detailing their “activities to encourage students to have high aspirations or to help them achieve their potential.”

[Chart: FE omnibus survey, figure 3]

Careers teams in sixth forms, FE Colleges, UTCs and Studio Schools would be involved in the organisation or delivery of all of those types of provision in some way. Leaving aside the continual misattribution of “low aspirations” to disadvantaged young people, when research shows that they have high aspirations but lack the tools and the social and cultural capital to enact them (pdf), this data shows post-16 Careers Leaders how best to frame their offer to explain its value to senior leaders. The potential areas in which provision would bring benefit can be found in the responses to the next question, “Barriers faced by post-16 institutions in raising aspiration within the student population.”

[Chart: FE omnibus survey, figure 4]

Many of these are structural barriers (e.g. the cost of continuing education, socio-economic factors), but there are also barriers which careers teams can help tackle with clear messaging: for example, using Martin Lewis’s campaign materials to tackle some of the myths around Higher Education tuition fees and assuage student fears over the impact of these costs, and offering to play a central role in parental engagement events and activities.

Wide-scale tracking of CEIAG provision is valuable for noting the impacts that policy, or changes in accountability focus, can ripple through the system, and these annual surveys from the DfE are an important data point for achieving this. Another survey that may interest you, or benefit from your involvement, is the CEC survey of Careers Leaders in schools, which will hopefully report interesting data on the workforce change that the Careers Strategy and the DfE Careers Guidance for schools have influenced, so get your response sent in if this is applicable to you. A similar survey for FE Careers Leaders is planned for later this year.

 

The Destinations data still isn’t there for the Gatsby pilot

It has now been three and a half academic years since the North East Gatsby pilot kicked off with the aim of implementing the Gatsby career benchmarks in a range of education providers. The pilot was funded by the Foundation to the tune of £9,000 per school or college, plus the central support offered by an appointed pilot lead.

Any impacts and lessons of the pilot should be key indicators for both the CEC and the DfE as they monitor the value of the Gatsby benchmarks in improving CEIAG provision. The fact that the Pilot is already being replicated (with lower funding levels) across the country as the CEC rolls out its Careers Hubs programme should only reinforce this need.

To learn these lessons, iCeGS has been commissioned to conduct an evaluation of the pilot which aims to

document a systematic attempt by education providers to implement all eight Benchmarks and establish what impacts might result from the implementation of the Benchmarks.

and, last month, an interim report was published giving an update on the findings of the evaluation.

The interim report finds that the schools and colleges involved in the pilot self-reported that they did expand or improve their CEIAG provision according to the Benchmarks:

[Chart: iCeGS interim report]

This finding is not itself cross-referenced against any external accountability measure of the schools’ or colleges’ CEIAG provision, such as Ofsted reports. On a positive note, though, the report does show that improvement (again, with the self-reported caveat) is possible across all types of education establishment. It seems that the clarification of CEIAG offered by the Benchmarks, and the categorisation of the success criteria, can be moulded to work across the variety of providers in the English education system, which is a major positive for policy makers looking for whole-sector improvement strategies.

The report also finds that students across the providers responded with “significantly higher career readiness scores”, which is an important variable to measure but, of the potential impacts, not the one that I would imagine holds the most sway with policy makers. For that, further work would need to be done to show a link between higher career readiness scores and actual employment and earnings outcomes for young people, much like the now very well-known employer engagement research from the Education & Employers taskforce.

The report also notes that, during the pilot evaluation period, the schools involved reported an increase in the number of A-C GCSE grades, but that it is not possible to draw any conclusions from this, as there is no historical data to judge trends against and no causal ties to be found to the Gatsby pilot. The success rates of FE College learners are not mentioned, nor is any impact on the attendance statistics of pupils in the area.

The interim report then praises the role of the Pilot area facilitator in

Creating a community of shared knowledge, networks and practice

which is clearly a benefit that a supported pilot scheme could enact, but one that would cause many professionals working in Local Authority skills development to wonder how it differs from their own work which, in my experience at least, usually includes a termly forum where school and college leaders or careers leaders can meet.

Lessons

Perhaps the part of the report most useful for other Hub Leads and Careers Leaders is the Emergent Challenges section (page 14). The fact that FE Colleges will struggle with Benchmark 8 (Personal Guidance) won’t be news to any practitioner grappling with the requirement to offer this highly individual service within current funding envelopes, but the issue of tracking provision against the Benchmarks is one which I think the CEC does need to be on top of. Their Tracker tool has the basis to offer much support in this area, but the wide range of approaches to how providers input activities will soon cause reporting problems. I’ve seen examples of a school inputting every careers lunchtime drop-in session they run, meaning huge numbers of repetitive provisions being added which boost the total number of activities in a Benchmark.
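One way a tracking tool could guard against this kind of inflation is to count distinct provision types per Benchmark rather than raw entries. A minimal sketch of the idea, assuming a simple (benchmark, activity name) record format — this is hypothetical, not the Tracker’s actual schema:

```python
def distinct_activity_count(entries):
    """Count distinct activity types per Benchmark, so a repeated entry
    (e.g. a weekly lunchtime drop-in logged 52 times) counts once rather
    than inflating the Benchmark's activity total."""
    distinct = {}
    for benchmark, activity in entries:
        # Normalise the activity name so trivial variants collapse together
        distinct.setdefault(benchmark, set()).add(activity.strip().lower())
    return {benchmark: len(acts) for benchmark, acts in distinct.items()}

# Hypothetical input: one school logs the same drop-in every week of the year
entries = [("BM1", "Careers lunchtime drop-in")] * 52 + [
    ("BM1", "Careers fair"),
    ("BM5", "Employer talk"),
]
print(distinct_activity_count(entries))  # {'BM1': 2, 'BM5': 1}
```

Raw entry counts would report 53 activities for Benchmark 1 here; counting distinct types reports 2, which is closer to what a Hub Lead actually wants to know.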

Destinations

This then leaves the elephant in the room: the destinations data of the actual students participating in all of this provision. It is not mentioned in this interim report, due to the timescales involved, but it will be referenced in the final report. Many of the pupils attending those providers during the pilot will not have left them yet and, for those that have, only one year’s worth of data (2017) has been published by the DfE. I blogged about that 2017 data here, and the lack of (positive) impact perceivable in those positive destination indicators, so it will be interesting to see what the final report concludes with a researcher’s eye looking at a more complete data picture.

Conclusion

At the moment the pilot evidence shows that providers are reporting that their provision has grown and their institutional behaviours have changed because of the Gatsby Benchmarks, and that pupils are more confident about their career readiness. These are small rewards for the sheer scale of system change that has occurred with the formation of the CEC and its subsequent policies (including Hubs, Careers Leaders and employer encounters). The evidence for actual outcomes for students is still lacking. What this is proving to policy makers, though, is that system change in education is possible (without much funding) if the provision aims are formalised and grouped into ‘benchmark’-style targets. It seems that schools and colleges, despite their protestations to the contrary, do like to be measured on some of the things they do.

 

The 2017 student destinations of the original Gatsby pilot group

With the recent release by the DfE of the 2016/17 Destinations Data, I thought it would be a useful exercise to look at the data for those institutions that were involved in the original Gatsby Benchmarks pilot, to see how that improvement in CEIAG provision is affecting student outcomes.

All of the 2017 Destination Data used for this post is sourced from the DfE KS5 & KS4 tables (revised editions) here. Any 2015 Destination Data is sourced from the DfE KS5 & KS4 tables for that cohort which can be found here.

In the original North East pilot, which started in September 2015, 16 providers (including 3 Further Education providers) used the Gatsby Benchmarks to assess and plan their own provision. With the support of a LEP-appointed area lead and £9,000 of central funding for each institution, they made significant progress in improving their CEIAG offer against the Benchmarks.

In 2015, 50% of the schools and colleges in the pilot achieved no benchmarks, but after two years of hard work over 85% now reach between six and eight benchmarks.

I’ve taken the Destinations Data for those institutions from the DfE tables above and put them in their own Excel table (with the national and regional North East figures), which you can download here > gatsby providers destinations

You can also compare that Data against the trends in nationwide Destinations Data in table 1 in the accompanying report to the 2017 release.

[Table: national destinations data]

Destinations Data

Each year’s Destinations Data is a snapshot of a cohort of leavers, so it is always wise to a) not draw too definitive a set of conclusions and b) place it in the context of regional and historical Destinations Data if possible. In my table above I have also included the regional figures from 2015 and 2017.

You will also have your own personal approach to using Destinations Data as a tool. I think that (with the above caveats) it is useful for judging the impact of CEIAG work. If a school is enabling leavers to progress into sustained destinations that cover the variety of routes, and perhaps even buck regional or national trends, then I am much more convinced by the efficacy of that school or college’s CEIAG provision.

So we can see that, for 2017 KS4 leavers, the Gatsby schools were under-performing on overall sustained destinations against both 2017 regional and national averages. In fact, the schools’ achieved average of 89% in a positive sustained destination has been left behind nationally since the 2012/13 leavers cohort (table 1). The percentage of KS4 leavers securing an apprenticeship (5.8%) is a touch above the national average, but only in line with the regional average and below the 2015 regional average of 8%. Perhaps the effects of the Apprenticeship Levy, and the lag that has incurred on young people securing apprenticeships, are shown here. Elsewhere, the destination-not-sustained average of 9.5% is higher than both the regional and national averages (excluding alternative provision providers) and the 2015 regional figure. The percentage of learners moving on to Further Education or sixth-form providers is varied, and can depend heavily on the locally available institutions, and the offers, that students can travel to, so not much value can be drawn from those data points.

At KS5, the three institutions involved offer a more mixed story. (It is worth noting at the outset the clear size differences between the institutions involved: Bishop Auckland College had only 60 KS5 leavers in the data, while Sunderland College included 1,082.) A figure of 79% for the Gatsby group transitioning into any positive sustained destination is below both regional and national averages, while the 9% of learners moving into apprenticeships is above both regional and national comparison rates. The greatest distinction can be found in the destination not sustained results, as an average of 16% of students not achieving a sustained destination is well above regional and national averages.

Conclusions

With the roll-out of both the Gatsby Benchmarks (as part of the Careers Strategy and the DfE school and college guidance) and the Hub structure across much of the country, I would expect that most officials within the DfE would want to see the green shoots of a more sustained and significant impact on positive student destinations in the original pilot area. These may yet come, as the 2017 Destinations Data only covers the second cohort of school leavers to exit KS4 or KS5 since the start of the pilot area’s Gatsby journey. But the desire for improvement in CEIAG provision must come with goals. Benchmarks are either a method of standardising the types of provision that have an impact on outcomes or they are not. All CEIAG practitioners (and, I would guess, researchers) are aware of the difficulty of capturing the value of CEIAG work; so much happens in a young person’s life that can have an impact on the journey they take. But if we really do believe that CEIAG can have a positive impact on those young people, that comes with the responsibility of accepting that some metrics will be valued by policy makers. Currently, one of those metrics isn’t moving.

An important destinations distinction

In October 2018 the DfE published “Destinations Data: Good Practice guide for schools” which, as DfE guidance documents go, is a snappy publication. It sets out how schools and colleges should approach collecting destination information from their learners, the duties that fall on Local Authorities in this area, where this information is then published and how it can be used to adapt provision.

The important section that I wanted to highlight for this post was the distinction between “Destinations data” and “Destinations Measures”, which I had never considered before. I will now endeavour to adhere to these definitions in future posts and discussions about destinations, and I hope that other practitioners join me in sticking to them.

  • What is Destinations data?

destinations1

  • What are Destinations Measures?

destinations2

This is important because, as the Gatsby Benchmarks and the Careers Strategy gain momentum and Ofsted continues to inspect CEIAG provision in schools, positive destination data will become more of a badge of honour for schools keen to show they are taking careers work seriously. Differences could then arise between what a school claims as its Destinations data and what is published by the DfE and included in its performance tables, as the school data may rely on leavers’ intended future destinations while the DfE data looks back at sustained destinations.

In fact, this has already happened with UTCs, which have long claimed that their extremely positive destination data shows a significant benefit of their model of education, only to recently have those claims undermined by the more robust and historically confirmed DfE Destinations Measures. As the DfE Measures record

the number of students who have been in a sustained destination for six months in the year after finishing key stage 4 or 16- 18 study (from October to March, or any six consecutive months for apprenticeships). The headline accountability measure at both key stage 4 and 16-18 study is the proportion of students staying in education or employment for at least two terms

they will be a much better reflection of the actual destinations of learners.

It is important that schools do not solely use their own data to evaluate their CEIAG provision but also use Destinations Measures, as comparison between the two may highlight useful factors (for example, if many learners were intending to secure apprenticeships but then did not, or if learners from disadvantaged backgrounds were struggling to progress). It is also vital that Ofsted inspectors brief themselves on the historical trends in a school’s Destinations Measures before an inspection; these may show steady progress in leavers securing more apprenticeships or other positive and sustained destinations, which would reflect well on the school’s careers work.

So, from this point on: Destinations data = a school’s intended or checked leaver destination figures. Destinations Measures = the DfE published figures.

The media and career choice

I would imagine that most careers professionals working with young people have seen the impact that media has on their perceptions of work and jobs. From the surge of interest in forensics in the late 2000s, as shows such as CSI and Criminal Minds hit the height of their popularity and produced a swell of applicants to Criminology degrees in the UK and the United States, to the sudden boom in applications to the US Navy that followed Top Gun in the 1980s, it seems that popular media does have an influence on the career choices of individuals.

These anecdotal examples are also supported by research. In 2017, Konon & Kritikos showed that positive media representations of entrepreneurs increase the probability of self-employment and decrease the probability of salaried work, while a paper from Hoag, Grant & Carpenter (2017) concluded that individuals who consume and are exposed to a wide range of news media were more likely to choose journalism as their major. A 2014 article in the Popular Culture Studies Journal (Tucciarone) reported a research project looking at representations of the advertising industry and found that even negative or exaggerated portrayals of an industry can have an enticing effect on viewers.

One research participant explained: “I think any type of portrayal, even if exaggerated a bit, is better than being completely blind about what goes on in an advertising agency. By watching various depictions of the industry and the careers, I am able to decide if I would even want to take ad courses and be involved with such an industry.”

We also have recent survey data that may point to the impact media consumption can have on young people’s career choices. The 2018 DfE Omnibus survey of pupils and their parents/carers might give some hints, as it includes responses on which sources of information young people rated as offering helpful careers IAG.

dfe omnibus survey sources of careers information

“Any other source” is a term there that I am sure is a catch-all for a wide range of sources, but I would suspect that “the media” (in all its guises for young people, so including streaming services) is present. I’ve previously posted on the use of vloggers to attempt to capture a young audience and introduce them to a broader range of careers, but more traditional narrative media, just streamed by young people on demand, may still play a large part. The BBC itself has found that young people watch more Netflix in a week than all of the BBC TV services. Just how binge-worthy shows such as Chef’s Table or Better Call Saul are influencing young people’s views on hospitality or law careers remains to be seen, while free-to-air channels can still find success with formats that see celebrities trying out different job roles.


Again, the positive/negative view of the role, or even the realism of the portrayal (I suspect few of the other midwives on Ms Willis’ ward will also find the time to design a range of homeware), may not matter. As the research participant quoted above notes, all it can take is for the job to be introduced to the viewer for the spark of interest to ignite.