
What Exams 2020 tells us about public policy

This year’s exam process has been tumultuous to say the least. After a Gavin Williamson interview on Radio 4 during which he failed to explain the most basic steps of the 2020 process, it seemed to me a good idea to look at what the exams fiasco can tell us about public policy and the lines of accountability between the organisations that govern us.

Let me remind us of the sequence of events that have led Ofqual and the DfE to this point.

18th March – The Department for Education announces that all schools are to close except for the children of key workers and that the summer exam series is to be cancelled

20th March – The DfE announces plans to still award grades for 2020 qualifications and that they have instructed Ofqual to devise a method to achieve this aim

26th March – Ofqual releases a holding statement explaining that they are working on achieving the task set for them and will publish updates soon

15th April – Ofqual launches a public consultation on their plans. They receive over 12,600 responses

15th April – The Chief Regulator, Sally Collier, writes to Gavin Williamson updating him on their progress. Collier writes:

“We note your direction that we should adopt an approach to standardise those assessment grades across centres. In line with our statutory objectives, it is vital to adopt an approach that is consistent, fair and maintains standards as far as is possible. We are therefore launching today a consultation that, among other things, seeks views on the principles that we propose should underpin the approach we take to the standardisation of centre assessment grades. The outcomes of our consultation will inform the choices we make, as we have due regard to your direction that, as far as is possible, standards should be maintained and the distribution of grades follows a similar pattern to that in previous years.”

(A similar response regarding Vocational qualifications follows on the 9th April)

A quick look at Ofqual’s statutory objectives that Collier refers to shows us that they would have to abide by these instructions.

22nd May – Ofqual publish an analysis of the responses to the consultation. This reiterates their guiding principles:

“Our aims are:
• to ensure students can receive grades in these qualifications this summer so they can progress to the next stages of their lives without further disruption
• that the grades will be as valued as those of any other year
• that the approach will be fair”

The consultation includes questions and responses on the practical issues of the differing approaches to collecting Centre Assessed Grades (CAGs), where the lines of malpractice should be drawn and, most relevant to what subsequently happened, how the standardisation process should work. Ofqual found the majority of respondents agreed or strongly agreed with their suggestion that the fairest way of standardising grades would be to take into account the past performance of the centre.

One of the questions directly relates to the DfE instruction to keep grades in line with past performance.

(The splits in the types of respondents for each response are interesting. Schools are fairly split across the disagree/agree line, Employers & Exam Officers were more weighted towards agree, while Teacher Union representatives were much more likely to strongly disagree)

22nd May – At the same time, Ofqual also release a press release and accompanying infographics to explain their plan. This includes:

“To make sure grades are as fair as possible, exam boards will standardise centre assessment grades using a statistical model which will include the expected national outcomes for this year’s students, the prior attainment of students at each school and college (at cohort, not individual level), and previous results of the school or college.

Because these arrangements have had to be put in place very quickly due to the Coronavirus (COVID-19) pandemic, it would have been impossible to provide school and college staff with national training to support them in making standardised judgements. As such, it is highly likely that all centres will see some adjustment, in at least one subject, to their centre assessment grades, however carefully they have made their judgements. Such adjustments are in the interests of fairness to all students because they will ensure, as far as possible, that individual centres have not been too severe or too generous in comparison with other centres.”

9th August – The Ofqual Chair has a piece in the Sunday newspapers explaining the process of awarding grades this year

13th August – A Level & Level 3 Vocational results day, and huge amounts of criticism and unrest over individuals’ results are reported

13th August – Ofqual release a 319-page report on how they approached the standardisation of awarding grades in 2020

17th August – the DfE confirms that students will receive their CAGs instead of the Ofqual awarded grades

20th August – GCSE and Level 2 results day

20th August – Ofqual release an updated guide (including the changes around mock grades) for teachers, students, parents and carers

25th August – Sally Collier, Ofqual Chief Regulator, resigns from her post

26th August – DfE permanent secretary, Jonathan Slater, is asked to step down from his post.

The fallout

The criticism of the process Ofqual embarked upon was fierce in the days following the A Level results, and in the fallout the nuances of what the process had resulted in were lost. Those critical of the process showed that independent schools had been rewarded with higher grades. Those who defended the process showed that students from higher socio-economic backgrounds were actually “downgraded” more at A*/A than those from lower socio-economic backgrounds. The nuance of why this had happened (smaller teaching groups, more prevalent at independent schools, are impossible to standardise, so the CAGs were left alone) failed to find sympathy or traction in the debate. In the end, the stories of those individual young people who received grades lower than their teachers expected them to achieve overwhelmed any other message.

The public mood was that these young people had suffered an unfair conclusion to an unfair process through no fault of their own. The Ofqual response, that overall grades for low SES students were up, that overall more disadvantaged young people were progressing onto university and that their model replicated (again, overall) what would happen in a regular year with some improvement, did not wash. The given reasons for some students’ lowered grades, the difficulty of standardising small cohorts and the (approved by consultation, remember) desire to anchor a centre’s results to past years, all paled under the weight of the lack of agency that an individual student held over their own future. The failure of the Government to a) foresee the actual implications for individuals of their policy decision and b) have an appeal process in place ready to deal with the fallout then led to a vacuum of direction, filled by the U-turn of adopting CAGs, a solution which, unfortunately, only potentially shifts the unfairness onto other students.
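To see the mechanism behind those individual downgrades concretely, here is a toy sketch of my own. It is emphatically not Ofqual’s actual model (which also factored in national outcomes and cohort prior attainment); it just shows the core idea of ranking a centre’s students and forcing their grades onto the centre’s historical distribution, leaving small cohorts untouched. The grade scale, cohort threshold and numbers are assumptions for illustration only.

```python
# Toy illustration only - NOT Ofqual's actual 2020 model. It shows the basic
# idea of anchoring a centre's grades to its historical grade distribution.
GRADES = ["A*", "A", "B", "C", "D", "E", "U"]  # best to worst
SMALL_COHORT = 15  # hypothetical threshold; smaller cohorts keep their CAGs

def standardise_centre(cags, historical_dist):
    """cags: list of (student, grade) pairs; historical_dist: grade ->
    proportion of that grade in the centre's previous results."""
    if len(cags) < SMALL_COHORT:
        return dict(cags)  # too few students to standardise: CAGs stand
    # Rank students best-first (centres also submitted rank orders in 2020)
    ranked = sorted(cags, key=lambda sg: GRADES.index(sg[1]))
    awarded, i = {}, 0
    for grade in GRADES:
        # award each grade to as many students as the centre's history allows
        quota = round(historical_dist.get(grade, 0) * len(ranked))
        for student, _ in ranked[i:i + quota]:
            awarded[student] = grade
        i += quota
    for student, cag in ranked[i:]:  # any rounding leftovers keep their CAG
        awarded.setdefault(student, cag)
    return awarded

# A centre whose teachers submitted five A grades but whose history supports
# only two sees three students moved down, however sound the judgements were.
cags = [(f"s{n}", "A") for n in range(5)] + [(f"s{n}", "B") for n in range(5, 20)]
print(standardise_centre(cags, {"A": 0.10, "B": 0.50, "C": 0.40}))
```

In this toy run, three of the five teacher-assessed A grades become Bs purely because of the centre’s history, which is exactly the kind of individual downgrade that drove the fallout.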

Public policy

This whole episode has shown the power that stories of the individual still hold over public policy making. Ofqual were instructed by Government to achieve a hugely difficult task with clear directions. Whether they should have been or not is a moot point; once instructed, they were bound to do this and achieve it in a manner that ensured fairness between year cohorts. They relied on published research to guide them. They asked the public, stakeholders and experts, who replied in large numbers and guided their direction of travel and decisions. They told the media, the public and Government repeatedly what they were planning to do and what it would likely lead to, and yet still, when their solution began to impact individuals, the political pressure grew so quickly that the policy soon crumbled. Yet it is those who tried to enact the policy set for them who are no longer in their positions. Those who determined the course of action from the outset remain.

Meanwhile, for those in positions of advocacy wanting to shape public policy, Exams 2020 has been a frantic reminder that data, research and results in the aggregate can be trumped by the stories of individuals.

The CEC is heading into tricky strategic waters


Since its inception, I would hope that this blog has been viewed as being demanding but fair to the Careers and Enterprise Company. While some of their early work seemed more suited to the corporate sphere than the transparent world of the public sector, they have since been given a wider remit by the Government, weathered what has been (in my view) some grandstanding but empty criticism from Robert Halfon and expanded their offer to schools and colleges through Careers Hubs, online tools and other funding streams. With this context, my position is that the sector should welcome that the DfE is funding careers work and tasking the CEC with looking at a fuller variety of careers provision rather than just the original remit of facilitating employer encounters. The DfE Guidance for Schools and Colleges has done much to focus attention and add impetus and importance to the CEC in the minds of School and College Senior Leaders. This work should be continued and built upon further. My fear for the future, though, is that the CEC is having to stray into tricky political waters.

Targets

As an indicator of their increased transparency, the CEC now publishes its annual grant funding letter. This sets out the clear targets and expectations of the DfE for the CEC and indicates the funding allocated to each strand of work.

The DfE has determined those targets to have value, and some of the data points around training, allocation of funds and sharing of best practice seem sensible for system improvement, but overall these outcomes are very technical and input-based. Key performance indicators such as “55% of schools and colleges in the Wave 1 Careers Hubs fully achieving Gatsby Benchmark 6” or “70,000 young people reached in Wave 1 Opportunity Areas by August 2019” are admirable in their specificity and adherence to the Gatsby research but also bring a danger for the CEC as they are ultimately lacking in both political and public impact.

Since its inception the CEC has received over £95 million (with £24.3 million of that for 2019/20). The issue here is not the figures themselves (careers work needs funding) but what happens when Ministers have to justify previous and future expenditure. The DfE will need to present outcomes for this flagship policy both to political audiences (at Education Select Committees and in Parliament) and to the public through achievements that, they hope, will resonate with voters at election times.

We have already seen the CEC struggle to articulate their progress and achievements at two Education Select Committee sessions, where the questions focused on the need to prove outcomes for students while Claudia Harris and Christine Hodgson’s answers relied on data showing the input provision that had been enacted. In Parliament and in previous speeches by Ministers, there has been confusion over the aims of the CEC. This mismatch between expectation of delivery and what is achieved is what will prove tricky for the CEC to manage.

The need for a compelling message

In 2016 I attended a session at an Education and Employers research conference where two ex-DfE civil servants spoke about the need to distill research and outcomes down to the simplest, most concise summary possible so that Ministers can digest and cascade it. They did not quite advocate Trumpian levels of “put in as many pictures as possible” but their reasons as to why the “4 or more employer engagements” research broke through so successfully are worth the attention of the CEC when considering promoting their work to MPs and the public.

The narrative battle

However, the CEC is tasked with showing progress against those very technical key performance indicators in their grant funding letter. Previously they achieved this through annual State of the Nation reports but they have now released data which goes further by showing progress against the Gatsby Benchmarks broken down to Local Enterprise Partnership level. This shows a

contrasting picture across the country, with the top performing areas made up of largely coastal and economically disadvantaged communities, while the bottom is made up almost exclusively of affluent counties.

This (with the caveat of noting that Compass is self-reported data) is a positive picture, indicating a large swell of change in the levels of CEIAG provision for young people. Unfortunately, this does not translate to the mantra of keeping your outcomes simple and easily understood. Compare that positive picture based on the Gatsby Benchmarks, and the accompanying TES article from Anne Milton, with other policy and research data released in the very same week as the CEC LEP-level data. First came the Impetus Youth Jobs Report, which utilised the LEO dataset

In March 2017 (the latest date we can analyse using the data we have access to) 26% of disadvantaged young people were NEET, compared to 13% of their better-off peers. This is the equivalent of around 78,000 additional disadvantaged NEETs aged 18-24. Looking at the same data from the opposite end of the lens, 26% of NEETs were from disadvantaged backgrounds, despite being only 16% of the population

and that

A disadvantaged young person is about 50% more likely to be NEET in the North East compared to London

This was soon followed by the 2018-19 State of the Nation report from the Social Mobility Commission. The key findings are stark and easily summarised:

  • the better off are nearly 80% more likely to end up in professional jobs than those from a working-class background
  • even when people from disadvantaged backgrounds land a professional job, they earn 17% less than their privileged colleagues
  • social mobility has remained virtually stagnant since 2014; four years ago, 59% of those from professional backgrounds were in professional jobs, rising to 60% last year
  • in 2014 only 32% of those from working class backgrounds got professional jobs, rising marginally to 34% last year
  • those from working class backgrounds earn 24% less a year than those from professional backgrounds; even if they get a professional job they earn 17% less than more privileged peers
  • by age 6 there is a 14% gap in phonics attainment between children entitled to free school meals and those more advantaged
  • by age 7 the gap has widened to 18% in reading, 20% in writing and 18% in mathematics
  • only 16% of pupils on free school meals attain at least 2 A levels by age 19, compared to 39% of all other pupils
  • twice the number of disadvantaged 16 to 18 year olds are at further education colleges compared to sixth-forms, and this segregation within the education system has risen by 1.2% since 2013
  • student funding for 16 to 19 year olds has fallen 12% since 2011 to 2012, and is now 8% lower than for secondary schools (11 to 15 year olds), leading to cuts to the curriculum and student support services that harm disadvantaged students
  • graduates who were on free school meals earn 11.5% less than others 5 years after graduating

The accompanying coverage resonated through articles across the media (some examples here and here) and gave enough political leverage for it to be raised at PMQs.

It’s worth reminding ourselves that these two reports and the CEC publication are talking about the very same disadvantaged communities, yet through very different lenses. Of course, the CEC is reviewing current trends in provision, which may not have an impact on outcomes for pupils for many years, while adhering to reporting against its key performance indicators. Translating those KPIs when explaining the positive outcomes of their work to audiences without a CEIAG specialism is a huge hurdle for the CEC as they have to:

  1. explain what the Benchmarks are
  2. explain why they are good to achieve
  3. show that they are helping schools and colleges achieve them
  4. then outline the impact on positive outcomes for students in those disadvantaged communities.

I fear this means that achieving positive traction with politicians and the public will be extremely difficult.

Politically tricky waters

The political future of the country is currently in a highly unpredictable place, but the CEC must be conscious of the need to persuade future Governments (of any colour ribbon) of their value. Labour’s Education Policy of a National Education Service is outlined in broad strokes without clarity on the need for, or role of, a CEC-type organisation. But whichever party is in power to make decisions on funding, they will not make those decisions merely based on research and evidence but on research, evidence and outcomes that have been successfully communicated. If the CEC continues to constrain themselves to only communicating the value of their work in line with their key performance indicators, then they will soon find themselves outmanoeuvred by those able to use other statistics and research to paint a much more negative picture of the current state of CEIAG provision in disadvantaged communities and undermine any positive progress made.

Careers apps

Back in December 2018, the DfE announced two winners of a £300,000 funding pot open for digital IAG tools that would help young people make more informed choices about University, in regard to both provider and subject. Now, in April 2019, both of the tools have launched.

Offering IAG through digital platforms to young people has a mixed track record (we’ll always have the memory of Plotr) but practitioners know such platforms can be a fundamental resource to use with their client base.

The two apps take different approaches to how they inform and advise their users with the Higher Education data now available. It’s worth saying that both platforms are (at the time of writing) still in beta testing, so improvements in design, layout and usability will be ongoing and any judgments should be made with that in mind. The first app, ThinkUni, is described as a “personalised digital assistant bringing together data on universities, courses and financial outcomes that are easy to explore and compare” and is from a team that has a good track record in offering social mobility enhancing schemes through the Brilliant Club. The current site looks fairly basic, with text drop-down boxes asking users about their preferences for University study (city or campus, course reputation or earnings outcomes etc).

think uni1

On first impressions, I found there isn’t much to be impressed with here. The very first question assumes that the user knows what subject they want to study, relying on a baseline that simply isn’t there for a lot of young people, and the site then assumes that the user will be studying A Levels, so widening participation also doesn’t seem to be a concern (I’m sure that other Level 3 qualifications will be incorporated into the site at some point but soft launching without them isn’t a good look). The “digital assistant” selling point is also overplayed, with the course suggestion results being much the same as would result from a search using advanced filters on the UCAS search facility. If the user already knows their views on subject, location, course type etc. to input, then why not just go or be directed to the source site? Currently, the “assistant” part of ThinkUni seems extremely inactive.

The other competition winner comes from The Profs, a team who have previously built a professional tutor finding platform, and is a much more interactive-looking experience. “The Way Up” game tasks users to pick an avatar and make choices on learning and earning routes at each step of their journey.

way up game1

This approach develops a greater sense of ownership in the process and the results, as the user is able to modify the route to reflect their own interests while still following the linear structure of the game. The interface isn’t the most aesthetically astounding you’ll see, and I also thought that some of the presented LMI was easy to miss on-screen but, once you notice it, the format does incorporate a significant amount of LMI data into each stage. I also think that the biggest learning gain for young people using the platform might not be regarding their career choice or route but the realistic balance to be found when budgeting monthly incomings and outgoings.

As a format for simulated learning, turn-based, point-and-click games were also used back in the days of late-2000s Aimhigher University visits, when one of the regular activities was a web-based game that allowed secondary school students to take control of a new University student avatar and make choices for their study, work and social life. The implications of those choices were displayed in a character health chart which valued balance above partying too hard or studying too much. The user was able to see the realistic choices on offer and the consequences of those choices and reflect on how they would react in that possible environment. So the format isn’t new, but the inclusion of the LMI and HE data is.

The “Way Up Game” is designed to have the widest possible capture point so that it includes career routes and choice options for lots of young people. At the more specific and detailed end of the simulation market, flight and even truck driving simulations are PC games that can require high-level computers to run with the amount of detail their fan base demands, while still offering career learning opportunities. More accessible versions of this format can be found in sector-skills-funded apps such as Construction Manager from the CITB. Allowing users to take charge of a construction business, hire employees, pitch for contracts and then take on those jobs, all presented within a SIMS-type graphical interface, makes for an engaging career learning experience. Place these alongside digital diagnostic tools and digital communication tools and there is a rich variety of online CEIAG resources.

Research

Research evidence on the value of digital and online IAG experiences offers some guidance to both of the creative teams on what could help their products have the impact they are looking for with users.

Two excellent summaries of research in this area are the CEC What Works edition: “Careers Websites” and this recent webinar from Tristram Hooley “Approaches to online guidance”

Neither of the two apps offers any links to expanding social networks or sharing results, so building users’ social capital does not seem to be on the agenda.

The CEC document references research from Dunwell et al (2014) which evaluated the MeTycoon careers game and found that

87% of participants said playing the game had given them new career ideas and 66% said they had shared or discussed the game with friends.

The format of “The Way Up Game” more closely matches MeTycoon, so those developers will be hoping for that level of impact with their users. The ThinkUni platform perhaps gains research backing through its slight nod towards the user involving CEIAG professionals in the findings from using the site. The CEC summary states:

The use of careers websites should be integrated into schools’ careers education provision, and may be more effective for pupils when use, at least initially, is mediated and supported by careers and education professionals.

Once the user has contemplated their suggestions, the final ThinkUni screen suggests

think uni assistant1

This is only a very slight prompt though. The user is not asked, for example, if they wish to email their results to a named individual, such as a CEIAG professional or school tutor, so perhaps both developers would benefit from designing accompanying session plans that could enable teachers/CEIAG practitioners to use the apps in group sessions and build upon the learning experiences of the young people in the room. A further step could even be to incorporate “nudge” techniques by communicating to both user and professional so conversations could occur to see if further research tasks have been undertaken by the user. Neither of the platforms requires the involvement of CEIAG professionals in the learning journey of the user.

This failure to build in the involvement of practitioners places both of the apps well behind more detailed digital offers such as Start Profile. This program combines user-led personalisation and exploration of career routes with LMI drawn from LMI for All and the ability for practitioner oversight and involvement. As this iCeGS evaluation of Start concludes

Start builds on much of what the existing evidence base tells us about the efficacy of online products. It brings together information and advice for a young person and allows them to personalise their learning journey. It offers a blended learning technology in which the school can connect the online learning to classroom based career learning. It builds on longstanding career assessment practices by building a personal profile of work preferences, qualities, skills and interests and using this to match users to jobs and learning opportunities based on their suitability and how available those jobs are in the labour market.

Differences do remain, though, between Start Profile and these two new apps in their data sources. LMI for All utilises a range of sources (detailed on page 10 here) but they (and so Start Profile) do not seem to include data from the Office for Students on HE access, continuation, attainment and progression.

By side-stepping CEIAG professionals, both apps are purely user-focused offers, but this could still have a positive impact. The CEC Moments of Choice research concluded that young people desire the presentation of careers data that:

moments choice1

and it would be fair to conclude that both apps achieve at least 7 of those requirements to varying degrees. Young people can access the data in a method that is convenient to them, when they require it, be safe in the knowledge that it is using reliable sources, receive suggested future actions and be able to personalise it. Only the involvement of influencers is missing.

International comparisons

These formats for offering HE focused CEIAG learning are also available in other countries. For example, Australia has Campus Quest which offers users two games, Campus Quest based on a student attending a University campus and E-Study Quest based on a student studying from home.

The graphical interface is slightly more interesting than both of the new UK apps and, in particular, the 3D presentation is more eye-catching than “The Way Up” game.

Value

For the DfE to offer funding, policy holders must hope that any resulting resources will add value to the marketplace of existing CEIAG digital products, either through successfully filling a niche or building upon existing products. For me, currently the two apps (still at testing stage, remember) do neither, and they also choose to set aside a proportion of the research in this area. It may be more politically satisfying for the DfE to achieve a new CEIAG platform through this process, but questions should be asked as to whether a more worthy platform could have been achieved through the adaptation of existing products, and how any resulting products are able to fit into, adapt and positively shape the current CEIAG landscape supporting young people.

CEIAG in Post 16 providers – a survey

Over the years of writing this blog, the annual omnibus survey from the DfE has always offered a useful insight into the reality of the scale of CEIAG provision across the country. Until now I did not realise that they also undertake a Post 16 version of the survey, the most recent release of which includes plenty of interesting information about the scale and types of provision on offer by providers.

fe omnibus survey

The first point to make about the results is that respondents do appear to come from the wide variety of providers across the FE landscape (Table A1: 421 responses), and overall it’s heartening to see just how widespread the range of CEIAG work is across the Post 16 stage.

fe omnibus survey 1

The rise in employer encounters since 2017 was noted by the CEC, which was looking for signs of the impact of its work.

The figures that most surprised me, though, come from the split of provision type by institution type.

fe omnibus survey 2

My assumption would be that FE Colleges would be offering more employer encounters for students than Sixth Forms attached to schools. Employer engagement is a central tenet of the mission of FE providers and the qualifications they offer. In my experience at least, the range and scale of employer engagement is much more frequent and in-depth than what you would expect in a school with a small sixth form, but that seems not to be the case here. The other interesting data point is the scale of difference between the students at different providers participating in University visits, but this comes with a word of warning. There is some confusion across the document in the way this question is worded; fig 10.1 phrases it “All students considering university have had at least two visits to universities” while 10.2 uses “University applicants have had at least two visits to universities.” These differences appear subtle, but for an FE College which will have a significant proportion of its student population studying qualifications at Level 2 and below, the wording of this question could elicit significantly different results from respondents.

Elsewhere in the survey, it is heartening to see CEIAG provision taking centre stage in respondents’ thinking when detailing their “activities to encourage students to have high aspirations or to help them achieve their potential.”

fe omnibus survey 3

Careers Teams in Sixth Forms, FE Colleges, UTCs & Studio Schools would be involved in the organisation or delivery of all of those types of provision in some way. Leaving aside the continual misappropriation of disadvantaged young people as having “low aspirations,” when research shows that they have high aspirations but lack the tools and social and cultural capital to enact those aspirations (pdf), this data shows Post 16 Careers Leaders how best to frame their offer to explain its value to Senior Leaders. The potential areas in which to offer provision that would bring benefit can be found in the responses to the next question, “Barriers faced by post-16 institutions in raising aspiration within the student population.”

fe omnibus survey 4

Many of these are structural barriers (e.g. the cost of continuing education, socio-economic factors) but there are also barriers which Careers Teams can help tackle with clear messaging: for example, using Martin Lewis’ campaign materials to tackle some of the myths around Higher Education tuition fees and assuage student fears over the impact of these costs, and offering to play a central role in parental engagement events and activities.

Wide-scale tracking of CEIAG provision is valuable for noting the impacts that policy, or changes in accountability focus, can ripple through the system. These annual surveys from the DfE are an important data point for achieving this. Another survey that may interest you or benefit from your involvement is the CEC survey of Careers Leaders in schools, which will hopefully report interesting data on the workforce change that the Careers Strategy and DfE Careers Guidance for schools has influenced, so get your response sent if this is applicable to you. A similar survey for FE Careers Leaders is planned for later this year.


The Destinations data still isn’t there for the Gatsby pilot

It has now been three and a half academic years since the North East Gatsby pilot kicked off with the aim of implementing the Gatsby career benchmarks in a range of education providers. The pilot was funded by the Foundation to the tune of £9,000 per school or college, plus the central support offered by an appointed Pilot Lead.

Any impacts and lessons of the pilot should be key indicators for both the CEC and the DfE as they monitor the value of the Gatsby benchmarks in improving CEIAG provision. The fact that the Pilot is already being replicated (with lower funding levels) across the country as the CEC rolls out its Careers Hubs programme should only reinforce this need.

To learn these lessons, iCeGS have been commissioned to conduct an evaluation of the pilot which aims to

document a systematic attempt by education providers to implement all eight Benchmarks and establish what impacts might result from the implementation of the Benchmarks.

and, last month, an interim report was published giving an update on the findings of the evaluation.

The interim report finds that the schools and colleges involved in the Pilot self-reported that they did expand or improve their CEIAG provision according to the Benchmarks

icegs report1

This finding is not itself cross-referenced against any external accountability measure of the schools’ or colleges’ CEIAG provision, such as Ofsted reports. On a positive note though, the report does show that improvement (again, with the self-reported caveat) is possible across all types of education establishment. It seems that the clarification of CEIAG offered by the Benchmarks and the categorisation of the success criteria can be moulded to work across the variety of providers in the English education system, which is a major positive for policy makers looking for whole-sector improvement strategies.

The report also finds that students across the providers responded with “significantly higher career readiness scores”, which is an important variable to measure but, of the potential impacts, not the one that I would imagine holds the most sway with policy makers. For this to be the case, further work would need to be done to show a link between higher career readiness scores and actual employment and earnings outcomes for young people, much like the now very well-known employer engagement research from the Education & Employers taskforce.

The report also notes that, during the pilot evaluation period, the schools involved reported an increase in the number of A-C GCSE grades, but that it is not possible to draw any conclusions from this as there is no historical data to judge trends against and no causal ties to be found to the Gatsby pilot. The success rates of FE College learners are not mentioned, nor is any impact on attendance statistics of pupils in the area.

The interim report then praises the role of the Pilot area facilitator in

Creating a community of shared knowledge, networks and practice

which is clearly a benefit that a supported pilot scheme could enact, but one that would cause many professionals working in Local Authority skills development to wonder how it differs from their own work which, in my experience at least, usually includes a termly forum where school and college leaders or careers leaders can meet.

Lessons

Perhaps the part of the report most useful for other Hub Leads and Careers Leaders is the Emergent Challenges section (page 14). The fact that FE Colleges will struggle with Benchmark 8 (Personal Guidance) won’t be news to any practitioner grappling with the requirements to offer this highly individual service within current funding envelopes, but the issue of tracking provision against the Benchmarks is one which I think the CEC do need to be on top of. Their Tracker tool has the basis to offer much support in this area, but the wide range of approaches in how providers input activities will soon cause reporting problems for them. I’ve seen examples of a school inputting every Careers lunchtime drop-in session they run, meaning huge numbers of repetitive provisions being added which boost the total number of activities in a Benchmark.

Destinations

This then leaves the elephant in the room. The destinations data of the actual students participating in all of this provision is not mentioned in this interim report, but this is due to the timescales involved and it will be referenced in the final report. Many of the pupils attending those providers during the pilot will not have left them yet and, for those that have, only one year’s worth of data (2017) has been published by the DfE. I blogged here about that 2017 data and the lack of (positive) impact perceivable in those destination indicators, so it will be interesting to see what the final report concludes with a researcher’s eye looking at a more complete data picture.

Conclusion

At the moment the pilot evidence shows that providers are reporting that their provision has grown and their institutional behaviours have changed because of the Gatsby Benchmarks, and that pupils are more confident about their career readiness. These are small rewards for the sheer scale of system change that has occurred with the formation of the CEC and its subsequent policies (including Hubs, Careers Leaders and employer encounters). The evidence for actual outcomes for students is still lacking. What this is proving to policy makers, though, is that system change in education is possible (without much funding) if the provision aims are formalised and grouped into ‘benchmark’ style targets. It seems that schools and colleges, despite their protestations to the contrary, do like to be measured on some of the things they do.


The 2017 student destinations of the original Gatsby pilot group

With the recent release by the DfE of the 2016/17 Destinations Data, I thought it would be a useful exercise to look at the data for those institutions that were involved in the original Gatsby Benchmarks pilot to see how that improvement in CEIAG provision is affecting student outcomes.

All of the 2017 Destination Data used for this post is sourced from the DfE KS5 & KS4 tables (revised editions) here. Any 2015 Destination Data is sourced from the DfE KS5 & KS4 tables for that cohort which can be found here.

In the original North East pilot, which started in September 2015, 16 providers (including 3 Further Education providers) used the Gatsby Benchmarks to assess and plan their own provision. With the support of a LEP-appointed area lead and £9,000 central funding for each institution, they made significant progress in improving their CEIAG offer against the Benchmarks.

In 2015, 50% of the schools and colleges in the pilot achieved no benchmarks, but after two years of hard work over 85% now reach between six and eight benchmarks.

I’ve taken the Destinations Data for those institutions from the DfE tables above and put them in their own Excel table (with the national and North East regional figures) which you can download here > gatsby providers destinations

You can also compare that Data against the trends in nationwide Destinations Data in table 1 in the accompanying report to the 2017 release.

national destinations data

Destinations Data

Each year’s Destinations Data is a snapshot of a cohort of leavers, so it is always wise to a) not draw too definitive a set of conclusions and b) place it in the context of regional and historical Destinations Data if possible. In my table above I have also included the regional figures from 2015 and 2017.

You will also have your own personal approach to using Destination Data as a tool. I think that (with the above caveats) it is useful for judging the impact of CEIAG work. If a school is enabling leavers to progress into sustained destinations that cover the variety of routes, and perhaps even buck regional or national trends, then I am much more convinced of the efficacy of that school or college’s CEIAG provision.

So we can see that for 2017 KS4 leavers, the Gatsby schools were under-performing on overall sustained destinations against both 2017 regional and national averages. In fact, the schools’ achieved average of 89% in a positive sustained destination has been left behind nationally since the 2012/13 leavers cohort (table 1). The percentage of KS4 leavers (5.8%) securing an Apprenticeship is a touch above the national average but only in line with the regional average and below the 2015 regional average of 8%. Perhaps the effects of the Apprenticeship Levy, and the lag it has incurred on young people securing apprenticeships, are shown here. Elsewhere, the destination-not-sustained average of 9.5% is higher than both the regional and national averages (excluding alternative provision providers) and the 2015 regional figure. The percentage of learners moving on to Further Education or Sixth Form providers is varied and can depend heavily on which institutions are locally available for students to travel to, so not much value can be drawn from those data points.

At KS5 the three institutions involved offer a more mixed story. (It is worth noting at the outset the clear size differences between the institutions involved: Bishop Auckland College had only 60 KS5 leavers in the data while Sunderland College included 1,082.) A percentage of 79% for the Gatsby group transitioning into any positive sustained destination is below both regional and national averages, while 9% of learners moving into apprenticeships is above both regional and national comparison rates. The greatest distinction can be found in the destination-not-sustained results, as an average of 16% of students not achieving a sustained destination is well above regional and national averages.
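Given those size differences, any headline figure for the group should be weighted by cohort size rather than being a simple average across the three institutions. A minimal sketch of that calculation, assuming the downloadable table above is exported to CSV with hypothetical column names:

```python
import pandas as pd

# Minimal sketch (hypothetical file and column names): a cohort-weighted
# sustained-destination rate for the pilot providers, for fairer comparison
# against the DfE regional and national benchmarks.
df = pd.read_csv("gatsby_providers_destinations.csv")
# assumed columns: 'provider', 'cohort_size', 'sustained_pct'
weighted = (df["sustained_pct"] * df["cohort_size"]).sum() / df["cohort_size"].sum()
unweighted = df["sustained_pct"].mean()
# A 60-leaver college and a 1,082-leaver college should not count equally,
# so the weighted figure is the fairer summary of the group.
print(f"Pilot group, cohort-weighted: {weighted:.1f}%")
print(f"Pilot group, simple average:  {unweighted:.1f}%")
```

Any gap between the two printed figures shows how much a small college can distort an unweighted average of the group.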

Conclusions

With the roll-out of both the Gatsby Benchmarks (as part of the Careers Strategy and DfE school and college guidance) and the Hub structure across much of the country, I would expect that most officials within the DfE would want to see the green shoots of a more sustained and significant impact on positive student destinations in the original pilot area. These may yet come, as the 2017 Destinations Data is only looking at the second cohort of school leavers to exit KS4 or KS5 since the start of the pilot area’s Gatsby journey. But the desire for improvement in CEIAG provision must come with goals. Benchmarks are either a method of standardising the types of provision that have an impact on outcomes or they’re not. All CEIAG practitioners (and, I would guess, researchers) are aware of the difficult nature of capturing the value of CEIAG work; so much happens in a young person’s life that can have an impact on the journey they take. But if we all really believe that CEIAG can have a positive impact on those young people, that belief comes with the responsibility of accepting that some metrics will be valued by policy makers. Currently, one of those metrics isn’t moving.

An important destinations distinction

In October 2018 the DfE published “Destinations Data: Good Practice guide for schools” which, as DfE guidance documents go, is a snappy publication that sets out how schools and Colleges should approach collecting destination information from their learners, the duties placed on Local Authorities in this area, where this information is then published and how it can be used to adapt provision.

The important section that I wanted to highlight for this post was the definition of “Destinations data” vs “Destinations Measures”, a distinction I had never considered before. I will now endeavour to adhere to these definitions in future posts and discussions about destinations, and would hope that other practitioners join me in sticking to them.

  • What is Destinations data?

destinations1

  • What are Destinations Measures?

destinations2

This is important because, as the Gatsby benchmarks and the Careers Strategy gain momentum and Ofsted continue to inspect CEIAG provision in schools, positive destination data will become more of a badge of honour for schools keen to show they are taking Careers work seriously. Differences could then arise between what a school claims is their Destinations data and what is published by the DfE and included in their performance tables, as the school data may rely on leavers’ future intended destinations while the DfE data looks back at sustained destinations.

In fact this has already happened with UTCs, who have long claimed that extremely positive destination data is a significant benefit of their model of education, only to recently have their claims undermined by the more robust and historically confirmed DfE Destination Measures. As the DfE Measures record

the number of students who have been in a sustained destination for six months in the year after finishing key stage 4 or 16-18 study (from October to March, or any six consecutive months for apprenticeships). The headline accountability measure at both key stage 4 and 16-18 study is the proportion of students staying in education or employment for at least two terms

they will be a much better reflection of the actual destinations of learners.
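To make the quoted rule concrete, here is a hypothetical helper (my own sketch, not DfE code) that applies the two sustained-destination tests: October to March for education and employment, or any six consecutive months for apprenticeships.

```python
# Hypothetical helper illustrating the DfE "sustained" rule quoted above.
def is_sustained(months_active, route):
    """months_active: set of month numbers (1-12) the student was in the
    destination; route: 'education', 'employment' or 'apprenticeship'."""
    if route == "apprenticeship":
        # Any six consecutive months in the destination year (Oct-Sep) counts
        ordered = [10, 11, 12, 1, 2, 3, 4, 5, 6, 7, 8, 9]
        streak = 0
        for m in ordered:
            streak = streak + 1 if m in months_active else 0
            if streak >= 6:
                return True
        return False
    # Education and employment must span October to March inclusive
    return {10, 11, 12, 1, 2, 3}.issubset(months_active)

print(is_sustained({10, 11, 12, 1, 2, 3}, "education"))    # True
print(is_sustained({1, 2, 3, 4, 5, 6}, "apprenticeship"))  # True
print(is_sustained({10, 11, 1, 2, 3, 4}, "employment"))    # False (gap in Dec)
```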

It is important that schools do not solely use their own data to evaluate their CEIAG provision and that they use Destination Measures as well, as comparison between the two may highlight useful factors (for example, if many learners were intending to secure apprenticeships but then did not, or if learners from disadvantaged backgrounds were struggling to progress). It is also vital that Ofsted inspectors brief themselves on the historical trends in a school’s Destination Measures before an inspection, which may show steady progress in leavers securing more apprenticeships or other positive and sustained destinations that would reflect well on the school’s Careers work.

So, from this point on – Destinations data = a school’s intended or checked leaver destination figures. Destination Measures = the DfE published figures.

The media and career choice

I would imagine that most Careers professionals working with young people have seen the impact that media has on their perceptions of work and jobs. From the surge of interest in forensics in the late 2000s, as shows such as CSI and Criminal Minds hit the height of their popularity and resulted in a swell of applicants to Criminology degrees in the UK and the United States, to the sudden boom in applications to the US Navy that followed Top Gun in the 1980s, it seems that popular media does have an influence on the career choice of individuals.

These anecdotal examples are also supported by research. In 2017 Konon & Kritikos showed that positive media representations of entrepreneurs increase the probability of self-employment and decrease the probability of salaried work, while this paper from Hoag, Grant & Carpenter (2017) concluded that individuals who consume and have exposure to a wide range of news media were more likely to choose journalism as their major. A 2014 article in the Popular Culture Studies Journal (Tucciarone) described a research project looking at representations of the advertising industry and found that even negative or exaggerated portrayals of an industry can have an enticing effect on viewers

One research participant explained: “I think any type of portrayal, even if exaggerated a bit, is better than being completely blind about what goes on in an advertising agency. By watching various depictions of the industry and the careers, I am able to decide if I would even want to take ad courses and be involved with such an industry.”

We also have recent survey data that may point to the impact media consumption can have on young people’s career choices. The 2018 DfE Omnibus survey of pupils and their parents/carers might give some hints, as it includes responses on what sources of information young people rated as offering helpful careers IAG

dmucmhvwwaei1mq

“Any other source” is a term there that I am sure is a catch-all for a wide range of sources, but I would suspect that “the media” (in all its guises for young people, so including streaming services) is present. I’ve previously posted on the use of vloggers to attempt to capture a young audience and introduce them to a broader range of careers, but more traditional narrative media, just streamed by young people on demand, may still play a large part. The BBC itself has found that young people watch more Netflix in a week than all of the BBC TV services. Just how binge-worthy shows such as Chef’s Table or Better Call Saul are influencing young people’s views on hospitality or law careers remains to be seen, while free-to-air channels can still find success with formats that see celebrities trying out different job roles.


Again, the positive/negative view of the role or even the realism of the portrayal (I suspect few of the other midwives on Ms Willis’ ward will also be finding the time to design a range of homeware) may not matter. As the research participant quoted above notes, all it can take is the job being introduced to the viewer for the spark of interest to ignite.

The calls for a “UCAS – Apprenticeships” portal

Over the years I have been keeping up to date with CEIAG policy and news, and a recurring recommendation in Careers reports and speeches has been that Government should establish or encourage a UCAS-style portal (let’s call it AAS – Apprenticeship Application Service) through which young people (or anyone, I assume) could apply for an Apprenticeship vacancy. Its promoters believe that this will encourage more young people to apply for and gain apprenticeships, and it has resurfaced in the recent Education Select Committee report “The apprenticeships ladder of opportunity: quality not quantity”

We recommend that the Government introduces a proper UCAS-style portal for technical education to simplify the application process and encourage progression to further training at higher levels. (Paragraph 89)

It has also been raised by Gerald Kelly & Partners in their report “Not for them: Why aren’t teenagers applying for apprenticeships?”, which surveyed young people and found that

While almost two-thirds (63%) say if they could apply for apprenticeships using an UCAS-style format they would

While the Social Mobility Commission under Alan Milburn called for

a UCAS-style body to give young people better information about which apprenticeships are available and what career prospects they could lead to

Vocational and Technical education supporters such as the Edge Foundation also promote

 A well designed portal could explain each option in detail and give advice on how and where to apply. The portal would also make signing up for apprenticeships easier and more managed, as this can currently be a lengthy process and students taking GCSEs already have a lot to focus on.

and opinion pieces have called for a “one stop shop” website to be designed.

UCAS is a monopoly service, but it gains buy-in and brand reach beyond education because it offers a consistency of service year on year. The dates of the application cycle are clearly predetermined and the format of a learner’s application is set, no matter whether the learner is applying to the highest-tariff Russell Group universities or for a Foundation Degree at the local FE College; the application form is the same. The institutions in receipt of these applications may add their own requirements after the application form is submitted and before making an offer decision (such as an interview or portfolio assessment), but those institutions all still use that initial form and stick to communal deadlines. The application deadline for Oxbridge, Veterinary, Dentistry & Medicine may be sooner than the main application deadline but, within those categories, there is still agreement across all of the institutions offering those courses on a common deadline.

Would a UCAS-style portal for Apprenticeships achieve the same goals, and how would it be different to the already established “Find An Apprenticeship”?

  1. Timing and deadlines

Employers can hire apprentices throughout the year

apprenticeship starts sept 2018

so there isn’t much agreement on common deadlines. You can see from the graph that the trends do show an increase in starts at the end and beginning of the academic year, as (mostly larger) employers have moved their recruitment cycles to capture school and college leavers and to start the off-the-job training component of the apprenticeship in line with the academic year, yet a common deadline is still nowhere to be seen. Whereas UCAS applicants are now clear on the common deadline, and Advisers are able to structure application advice towards that deadline, the proposals for any AAS system do not seem to envisage that employers could only advertise apprenticeship vacancies in certain periods of the year, so individual employer deadlines would still apply. The 2016 Employer Perspectives Survey (p 113) shows that around 18% of all UK institutions offer apprenticeships, so this would still mean a multitude of deadlines to hit and for advisers to be aware of.

2. Employer control over applications

Much of the Government rhetoric over the reform of the Apprenticeship system through the introduction of standards and the levy has been built around the theme of placing employers “at the heart” of apprenticeship training. Presumably this also includes allowing employers to determine their own apprenticeship recruitment processes. Currently employers can list their apprenticeship vacancies on the “Find An Apprenticeship” site (plus their own sites or third-party sites such as “Get My First Job”) and support and advice is offered on how to recruit, but the employer remains in charge of the process. Sometimes an employer will choose to use the more generic application questions and form contained within the Find An Apprenticeship site

Such as this mock application

or require applicants to apply through their own website

site management apprenticeship

This seems to be a flexibility required by employers. The recruitment process an SME will need to source a suitable applicant for a Level 2 vacancy will be very different to the procedure a multinational corporation will undertake on their annual recruitment of a multitude of apprenticeship standards at higher levels. So forcing a common application form onto all employers offering apprenticeships also seems beyond the reach of an AAS.

3. Age of applicants & references

Higher Education applicants of all ages use UCAS to apply, but it would be fair to say that the majority of HE starters are of a school or college leaving age.

ucas stats

This is not true of those starting apprenticeships

apprenticeship starts

where the majority of current starters from the applicant pool would not be in education to receive support from an Adviser. Of course, the very point of the AAS would be to increase the number of younger applicants, but the site would have to accommodate and be user-friendly for applicants of all ages, whether in education or not.

4. Numbers of applicants

All of the reports suggesting an AAS do so in the commendable hope that it would increase the number of young people applying for, and so starting, apprenticeships. With its title, the Gerald Kelly report is particularly flagrant in its acceptance that young people aren’t applying for apprenticeships. This is strange because, as I’ve posted about previously, the DfE no longer publishes the data showing apprenticeship applications by age, only starts. Misappropriating the number of Apprenticeship starts by age as an indicator of the number of applications by age ignores the historic data we do have, which showed that young people already apply for apprenticeships in far greater numbers than the number of vacancies posted. For an AAS portal to be truly warranted, the data on applications by age needs to be regularly shared by the DfE.

5. Differences from Find An Apprenticeship

In the reports linked above, AAS recommendations seemingly come without reference to the Find An Apprenticeship website, which already exists, or, if they do acknowledge it, they are unclear about what differences the proposed UCAS-style Apprenticeships portal would have. Find An Apprenticeship already allows people to search on a common site for all apprenticeships, research opportunities laid out in a standard format and, in some cases, complete an application through the same site. As I have shown, just establishing a new portal with aspirations to be more like UCAS fails to acknowledge or offer solutions to the fundamental differences between the Apprenticeship and Higher Education processes and routes, which would leave any new portal looking and performing much the same way as the current Find An Apprenticeship already does.

An AAS portal also offers a suggested quick fix which fails to address the central issue. The Gatsby Benchmarks have shown us what works in CEIAG provision. It is time- and cost-intensive provision, as Apprentices themselves acknowledge and Gatsby has evidenced, but it is that support which would really enable young people in greater numbers to strive for and successfully secure Apprenticeships.

The CEC in front of the Education Select Committee May 2018 – not the one-sided thrashing you were led to believe

Link to the Education Select Committee Video here:

https://parliamentlive.tv/event/index/90b1eb8a-1eca-40c2-8916-0956c5cce7a0

So far in its existence (at least to those of us in the Careers community who don’t work for it) it has seemed that the Careers and Enterprise Company (CEC) was the golden child, arrived to save careers work for young people in England. In central funding terms, they are essentially the only show in town as they scale up their pilot work, and their communications, PR and branding have been a breath of fresh air in a sector that (if I may) has always been behind the curve in shaping its own public perception. This period of cosy positivity ended, though, with a bruising session for the CEC in front of Robert Halfon and his Education Select Committee. The trade press reported the session in typically combative framing and the CEC did itself no favours with a poorly judged call for social media support afterwards.

The Select Committee (well, the 7 of its 11 members present) seemed aghast at a number of areas of the CEC’s work and track record:

  • that the CEC had spent £900,000 on research publications, money that had not been spent on the front line
  • that the CEC was not yet able to report on the destinations impact of the provision its work had funded
  • that its board meeting minutes were not made public
  • that the long-mooted Enterprise Passport had been put “on hold” despite it being one of the three main strands of the CEC’s original remit
  • that funding pots supposedly dedicated to providing provision for disadvantaged areas were not being totally allocated to those areas
  • that Enterprise Co-ordinators and other central, senior roles were paid salaries significantly above comparable school-based roles

Some of these criticisms hold an element of truth, but what was also apparent from the session was (yet again) just how woefully ignorant of the Careers landscape (and, by extension, the work of the CEC) the MPs were.

Of course, it is only fair for MPs to ask for the utmost transparency and compliance when investigating the value gained from the spending of taxpayers’ money, and a focus on the actual impact (rather than merely the quantity) of provision is something you might have read about on this blog back in July 2017. Funding from Government comes with strings attached and must be accounted for, so taking the CEC to task for not being clear on the destination data of the pupils receiving CEIAG provision it funded is to be expected. What was not expected was just how difficult it was for the MPs to grasp that this destination data was:

a) only part of the impact feedback, with evaluations, further social mobility measures, employer feedback, skills shortage data and so on also to be taken into account

b) not going to be ready yet, as many of the young recipients of CEC-funded provision were probably still in school at this moment – Mr Halfon seemed unable to comprehend this fairly simple point

and

c) extremely difficult to collect and place comparative value on, as the inputs (the types of CEIAG provision) are varied and delivered by a multitude of different providers funded by the CEC

It was also astonishing to see Emma Hardy, the MP for Hull West, at one moment criticise the CEC for not publishing pupil-level destination data to show the impact of their work, only to then harangue them for not funding grassroots organisations such as National Careers Week, which also does not publish or collect pupil-level destination data. NCW is a fine organisation, but it is not a provider of provision; it is a banner organisation whose launch events and social media exposure allow others to brand their own work. Its own reporting reflects this, with the number of tweets and resource downloads indicating a successful impact rather than the actual outcomes of young people. Moments such as this highlighted a complete lack of mastery of the Select Committee brief from some of the Members, and this only continued throughout the session.

Trudy Harrison was the most clueless of the bunch, at times advocating that the CEC should only be judged on the hugely reductive measure of rising or falling youth unemployment in an area in which it funds provision, and showing her utter unpreparedness for the session by repeatedly asking what a “Cold Spot” was. In the end, I admired Claudia Harris’ restraint as the Member for Copeland asked for definitions, for clarifications and to be sent information that was published on the CEC website back in October 2015 and forms a fundamental basis for all of the organisation’s subsequent work.

(I also enjoyed Lucy Powell noting that the advertised circa £80k CEC Director of Education role is “more than we get paid”, considering that an MP’s current salary is very close at £77,379 and Mrs Powell also enjoys income from a number of rental properties according to the Register of MPs’ Financial Interests)

Despite the general ignorance of the line of questioning, some important points were raised. The fact that the Enterprise Passport is “on hold”, to use Christine Hodgson‘s phrase, is of note, but it was a pity that the MPs did not have the forensic insight to ask how much had been spent on this project to date. The figures for the number of applications for funding the CEC received should also have caused a greater swell of interest. For the original £5m funding pot, they received over ten times as much (£50m) in applications, which shows that there could be vastly more CEIAG work happening with young people if only the funding were there. Again, the MPs did not pick up on this huge appetite for provision that is currently going unmet.

As the session progressed, both Hodgson and Claudia Harris struggled gamely and mostly unsuccessfully to overcome the MPs’ preordained views. At times this was the fault of the two CEC representatives, as they struggled to recall funding amounts or the specific data that would have helped their push-back and made them appear more in charge of their remit. This was clearly apparent as they struggled to articulate the processes and structure of the bidding for, and allocation of, both the Personal Guidance funds and the Career Hubs monies. This was not helped by Robert Halfon confusing his brief over the remit of the two distinct pots of money, but also by the failure of Harris to explain why bidding processes had been designed with certain methodologies and whether the £5m allocated for disadvantaged young people was definitively going to be spent on disadvantaged young people. The promises that current schemes (Compass and the 2019 publication of destination data for pupils involved in CEC-funded activities) would soon bear fruit also failed to appease the Committee. The central point remains, though: it is clearly fair for Select Committees to ask for clarity on expenditure and impact, and the CEC, with its multitude of funding pots and provision schemes, certainly dropped the ball in explaining this coherently.

Equally, though, dissatisfaction arose because the roles of the CEC still seem undefined to the MPs who oversee it. Despite Hodgson’s appeals to the contrary that their DfE grant letter provides a clear remit, throughout the session the CEC was tasked by different Members with being a provider of CEIAG provision, an umbrella organisation channelling funding to organisations on the front line, a research-intensive body such as the Education Endowment Foundation finding only what does and doesn’t work (somehow, despite their earlier criticisms of too high a research budget), all of those things, or some mixture of them.

Perhaps, through no fault of its own, by the time of the CEC’s creation the marketplace it hopes to shelter under its umbrella and stakeholders’ perceptions of CEIAG provision had grown so distinct and varied that bringing all of the partner organisations and oversight bodies together will prove a much harder task than anyone imagined. It’s not that everybody isn’t yet singing from the same hymn sheet; it’s that, despite the huge research investment, the debate over which hymn sheet to use is still happening.