
CEIAG in Post 16 providers – a survey

Over the years of writing this blog, the annual omnibus survey from the DfE has always offered a useful insight into the real scale of CEIAG provision across the country. Until now, I did not realise that the DfE also undertakes a Post 16 version of the survey, the most recent release of which includes plenty of interesting information about the scale and types of provision on offer by providers.

[Image: FE omnibus survey]

The first point to make about the results is that respondents appear to come from a wide variety of providers across the FE landscape (Table A1: 421 responses), and overall it’s heartening to see just how widespread the range of CEIAG work is across the Post 16 stage.

[Image: FE omnibus survey 1]

The rise in employer encounters since 2017 was noted by the CEC, which was looking for signs of the impact of its work.

The figures that most surprised me, though, come from the breakdown of type of provision by type of institution.

[Image: FE omnibus survey 2]

My assumption would be that FE Colleges offer more employer encounters for students than Sixth Forms attached to schools. Employer engagement is a central tenet of the mission of FE providers and the qualifications they offer and, in my experience at least, the range and scale of that engagement is much more frequent and in-depth than what you would expect in a school with a small sixth form, but that does not seem to be the case here. The other interesting data point is the scale of difference between students at different providers participating in University visits, but this comes with a word of warning. There is some confusion across the document in the way this question is worded; fig 10.1 phrases it “All students considering university have had at least two visits to universities” while 10.2 uses “University applicants have had at least two visits to universities.” These differences appear subtle, but for an FE College, which will have a significant proportion of its student population studying qualifications at Level 2 and below, the wording of this question could elicit significantly different results from respondents.

Elsewhere in the survey, it is heartening to see CEIAG provision taking centre stage in respondents’ thinking when detailing their “activities to encourage students to have high aspirations or to help them achieve their potential.”

[Image: FE omnibus survey 3]

Careers Teams in Sixth Forms, FE Colleges, UTCs & Studio Schools would be involved in the organisation or delivery of all of those types of provision in some way. Leaving aside the continual mischaracterisation of disadvantaged young people as having “low aspirations” when research shows that they have high aspirations but lack the tools and the social and cultural capital to enact them (pdf), this data shows Post 16 Careers Leaders how best to frame their offer to explain its value to Senior Leaders. The areas where offering provision would bring the most benefit can be found in the responses to the next question, “Barriers faced by post-16 institutions in raising aspiration within the student population.”

[Image: FE omnibus survey 4]

Many of these are structural barriers (e.g. the cost of continuing education, socio-economic factors), but there are also barriers that Careers Teams can help tackle with clear messaging: for example, using Martin Lewis’ campaign materials to dispel some of the myths around Higher Education tuition fees and assuage student fears over the impact of these costs, and offering to play a central role in parental engagement events and activities.

Wide-scale tracking of CEIAG provision is valuable for noting how the impacts of policy or changes in accountability focus ripple through the system, and these annual surveys from the DfE are an important data point for achieving this. Another survey that may interest you or benefit from your involvement is the CEC survey of Careers Leaders in schools, which will hopefully report interesting data on the workforce change that the Careers Strategy and the DfE Careers Guidance for schools have influenced, so get your response sent in if this is applicable to you. A similar survey for FE Careers Leaders is planned for later this year.

 


The Destinations data still isn’t there for the Gatsby pilot

It has now been three and a half academic years since the North East Gatsby pilot kicked off with the aim of implementing the Gatsby career benchmarks in a range of education providers. The Pilot was funded by the Foundation to the tune of £9,000 per school or college, plus the central support offered by an appointed Pilot Lead.

Any impacts and lessons of the pilot should be key indicators for both the CEC and the DfE as they monitor the value of the Gatsby benchmarks in improving CEIAG provision. The fact that the Pilot is already being replicated (with lower funding levels) across the country as the CEC rolls out its Careers Hubs programme should only reinforce this need.

To learn these lessons, iCEGS have been commissioned to conduct an evaluation of the pilot which aims to

document a systematic attempt by education providers to implement all eight Benchmarks and establish what impacts might result from the implementation of the Benchmarks.

and, last month, an interim report was published giving an update on the findings of the evaluation.

The interim report finds that the schools and colleges involved in the Pilot self-reported that they did expand or improve their CEIAG provision according to the Benchmarks:

[Image: iCEGS interim report chart]

This finding is not itself cross-referenced against any external accountability measure of the schools’ or colleges’ CEIAG provision, such as Ofsted reports. On a positive note though, the report does show that improvement (again, with the self-reported caveat) is possible across all types of education establishment. It seems that the clarity the Benchmarks bring to CEIAG and their categorisation of success criteria can be moulded to work across the variety of providers in the English education system, which is a major positive for policy makers looking for whole-sector improvement strategies.

The report also finds that students across the providers responded with “significantly higher career readiness scores”, which is an important variable to measure but, of the potential impacts, not the one I would imagine holds the most sway with policy makers. For that to change, further work would need to be done to show a link between higher career readiness scores and actual employment and earnings outcomes for young people, much like the now very well-known employer engagement research from the Education & Employers taskforce.

The report also notes that, during the Pilot evaluation period, the schools involved reported an increase in the number of A-C GCSE grades, but that it is not possible to draw any conclusions from this as there is no historical data to judge trends against and no causal ties to the Gatsby pilot can be established. The success rates of FE College learners are not mentioned, nor is any impact on the attendance statistics of pupils in the area.

The interim report then praises the role of the Pilot area facilitator in

Creating a community of shared knowledge, networks and practice

which is clearly a benefit that a supported Pilot scheme could enact, but one that would cause many professionals working in Local Authority skills development to wonder how it differs from their own work which, in my experience at least, usually includes a termly forum where school and college leaders or careers leaders can meet.

Lessons

Perhaps the part of the report most useful for other Hub Leads and Careers Leaders is the Emergent Challenges section (page 14). The fact that FE Colleges will struggle with Benchmark 8 (Personal Guidance) won’t be news to any practitioner grappling with the requirement to offer this highly individual service within current funding envelopes, but the issue of tracking provision against the Benchmarks is one which I think the CEC do need to be on top of. Their Tracker tool has the potential to offer much support in this area, but the wide range of approaches to how providers input activities will soon cause reporting problems for them. I’ve seen examples of a school inputting every Careers lunchtime drop-in session they run, meaning huge numbers of repetitive provisions being added which inflate the total number of activities in a Benchmark.

Destinations

This then leaves the elephant in the room. The destinations data of the actual students participating in all of this provision is not mentioned in this interim report, but this is due to the timescales involved and it will be referenced in the final report. Many of the pupils attending those providers during the pilot will not have left them yet and, for those that have, only one year’s worth of data (2017) has been published by the DfE. I blogged about that 2017 data here and the lack of (positive) impact perceivable in those positive destination indicators, so it will be interesting to see what the final report concludes with a researcher’s eye on a more complete data picture.

Conclusion

At the moment the Pilot evidence shows that providers are reporting that their provision has grown and their institutional behaviours have changed because of the Gatsby Benchmarks, and that pupils are more confident about their career readiness. These are small rewards for the sheer scale of system change that has occurred with the formation of the CEC and its subsequent policies (including Hubs, Careers Leaders and employer encounters). The evidence for actual outcomes for students is still lacking. What this does prove to policy makers, though, is that system change in education is possible (without much funding) if the provision aims are formalised and grouped into ‘benchmark’ style targets. It seems that schools and colleges, despite their protestations to the contrary, do like to be measured on some of the things they do.

 

The 2017 student destinations of the original Gatsby pilot group

With the recent release by the DfE of the 2016/17 Destinations Data, I thought it would be a useful exercise to look at the data for those institutions involved in the original Gatsby Benchmarks pilot to see how that improvement in CEIAG provision is affecting student outcomes.

All of the 2017 Destination Data used for this post is sourced from the DfE KS5 & KS4 tables (revised editions) here. Any 2015 Destination Data is sourced from the DfE KS5 & KS4 tables for that cohort which can be found here.

In the original North East pilot, which started in September 2015, 16 providers (including 3 Further Education providers) used the Gatsby Benchmarks to assess and plan their own provision. With the support of a LEP-appointed area lead and £9,000 of central funding for each institution, they made significant progress in improving their CEIAG offer against the Benchmarks.

In 2015, 50% of the schools and colleges in the pilot achieved no benchmarks, but after two years of hard work over 85% now reach between six and eight benchmarks.

I’ve taken the Destinations Data for those institutions from the DfE tables above and put them in their own Excel table (with the national and regional North East figures) which you can download here > gatsby providers destinations

You can also compare that Data against the trends in nationwide Destinations Data in table 1 in the accompanying report to the 2017 release.

[Image: national destinations data]

Destinations Data

Each year’s Destinations Data is a snapshot of a cohort of leavers, so it is always wise to a) not draw too definitive a set of conclusions and b) place it in the context of regional and historical Destinations Data where possible. In my table above I have also included the regional figures from 2015 and 2017.

Your own personal approach to using Destinations Data as a tool will also come into play. I think that (with the above caveats) it is useful for judging the impact of CEIAG work. If a school is enabling leavers to progress into sustained destinations that cover the variety of routes, and perhaps even buck regional or national trends, then I am much more convinced by the efficacy of that school or college’s CEIAG provision.

So we can see that, for 2017 KS4 leavers, the Gatsby schools were under-performing on overall sustained destinations against both 2017 regional and national averages. In fact, the schools’ achieved average of 89% in a positive sustained destination had already been surpassed nationally by the 2012/13 leavers cohort (table 1). The percentage of KS4 leavers securing an Apprenticeship (5.8%) is a touch above the national average but only in line with the regional average, and below the 2015 regional average of 8%. Perhaps the effects of the Apprenticeship Levy, and the lag it has caused in young people securing apprenticeships, are shown here. Elsewhere, the destination not sustained average of 9.5% is higher than both the regional and national averages (excluding alternative provision providers) and the 2015 regional figure. The percentage of learners moving onto Further Education or Sixth Form providers is varied and can depend heavily on the locally available institutions that students can travel to and their offer, so not much value can be drawn from those data points.

At KS5, the three institutions involved offer a more mixed story. (It is worth noting at the outset the clear size differences between the institutions involved: Bishop Auckland College had only 60 KS5 leavers in the data while Sunderland College included 1,082.) The Gatsby group’s figure of 79% transitioning into any positive sustained destination is below both regional and national averages, while the 9% of learners moving into apprenticeships is above both regional and national comparison rates. The greatest distinction can be found in the destination not sustained results, as an average of 16% of students not achieving a sustained destination is well above regional and national averages.

Conclusions

With the roll-out of the Gatsby Benchmarks as part of the Careers Strategy and the DfE school and College guidance, and of the Hub structure across much of the country, I would expect that most officials within the DfE would want to see the green shoots of a more sustained and significant impact on positive student destinations in the original pilot area. These may yet come, as the 2017 Destinations Data is only looking at the second cohort of school leavers to exit KS4 or KS5 since the start of the pilot area’s Gatsby journey. But the desire for improvement in CEIAG provision must come with goals. Benchmarks are either a method of standardising provision types that has an impact on outcomes or they’re not. All CEIAG practitioners (and, I would guess, researchers) are aware of the difficulty of capturing the value of CEIAG work; so much happens in a young person’s life that can have an impact on the journey they take. But if we really do believe that CEIAG can have a positive impact on those young people, that comes with the responsibility of accepting that some metrics will be valued by policy makers. Currently, one of those metrics isn’t moving.

An important destinations distinction

In October 2018 the DfE published “Destinations Data: Good Practice guide for schools” which, as DfE guidance documents go, is a snappy publication that sets out how schools and Colleges should approach collecting destination information from their learners, the duties placed on Local Authorities in this area, where this information is then published and how it can be used to adapt provision.

The important section I wanted to highlight for this post is the definition of “Destinations data” vs “Destinations Measures”, a distinction I had never considered before, will now endeavour to adhere to in future posts and discussions about destinations, and would hope other practitioners join me in sticking to.

  • What is Destinations data?

[Image: definition of Destinations data]

  • What are Destinations Measures?

[Image: definition of Destinations Measures]

This is important because, as the Gatsby Benchmarks and the Careers Strategy gain momentum and Ofsted continue to inspect CEIAG provision in schools, positive destination data will become more of a badge of honour for schools keen to show they are taking Careers work seriously. Differences could then arise between what a school claims as its destination data and what is published by the DfE and included in its performance tables, as the school’s data may rely on leavers’ future intended destinations while the DfE data looks back at sustained destinations.

In fact, this has already happened with UTCs, who have long claimed that extremely positive destination data is a significant benefit of their model of education, only to recently have their claims undermined by the more robust and historically confirmed DfE Destination Measures. As the DfE Measures record

the number of students who have been in a sustained destination for six months in the year after finishing key stage 4 or 16-18 study (from October to March, or any six consecutive months for apprenticeships). The headline accountability measure at both key stage 4 and 16-18 study is the proportion of students staying in education or employment for at least two terms

they will be a much better reflection of the actual destinations of learners.

It is important that schools do not solely use their own data to evaluate their CEIAG provision but use the Destination Measures as well, as comparison between the two may also highlight useful factors (for example, if many learners were intending to secure apprenticeships but then did not, or if learners from disadvantaged backgrounds were struggling to progress). It is also vital that Ofsted inspectors brief themselves on the historical trends in a school’s Destination Measures before an inspection, which may show steady progress in leavers securing more apprenticeships or other positive and sustained destinations that would reflect well on the school’s Careers work.

So, from this point on – Destinations data = a school’s intended or checked leaver destination figures. Destination Measures = the DfE published figures.

The media and career choice

I would imagine that most Careers professionals working with young people have seen the impact that media has on their perceptions of work and jobs. From the surge of interest in forensics in the late 2000s, as shows such as CSI and Criminal Minds hit the height of their popularity and prompted a swell of applicants to Criminology degrees in the UK and the United States, to the sudden boom in applications to the US Navy that followed Top Gun in the 1980s, it seems that popular media does have an influence on the career choices of individuals.

These anecdotal examples are also supported by research. In 2017, Konon & Kritikos showed that positive media representations of entrepreneurs increase the probability of self-employment and decrease the probability of salaried work, while a paper from Hoag, Grant & Carpenter (2017) concluded that individuals who consume and have exposure to a wide range of news media were more likely to choose journalism as their major. A 2014 article in the Popular Culture Studies Journal (Tucciarone) described a research project looking at representations of the advertising industry and found that even negative or exaggerated portrayals of an industry can have an enticing effect on viewers:

One research participant explained: “I think any type of portrayal, even if exaggerated a bit, is better than being completely blind about what goes on in an advertising agency. By watching various depictions of the industry and the careers, I am able to decide if I would even want to take ad courses and be involved with such an industry.”

We also have recent survey data that may point to the impact media consumption can have on young people’s career choices. The 2018 DfE Omnibus survey of pupils and their parents/carers might give some hints, as it includes responses on which sources of information young people rated as offering helpful careers IAG.

[Image: DfE Omnibus survey chart – sources of helpful careers information]

“Any other source” is surely a catch-all for a wide range of sources, but I would suspect that “the media” (in all its guises for young people, so including streaming services) is present. I’ve previously posted on the use of vloggers to attempt to capture a young audience and introduce them to a broader range of careers, but more traditional narrative media, just streamed by young people on demand, may still play a large part. The BBC itself has found that young people watch more Netflix in a week than all of the BBC TV services. Just how binge-worthy shows such as Chef’s Table or Better Call Saul are influencing young people’s views on hospitality or law careers remains to be seen, while free-to-air channels can still find success with formats that see celebrities trying out different job roles.

 

Again, the positive/negative view of the role, or even the realism of the portrayal (I suspect few of the other midwives on Ms Willis’ ward will also be finding the time to design a range of homeware), may not matter. As the research participant quoted above notes, all it can take is the job being introduced to the viewer for the spark of interest to ignite.

The calls for a “UCAS – Apprenticeships” portal

Over the years I have been keeping up to date with CEIAG policy and news, a recurring recommendation in Careers reports and speeches has been that Government should establish or encourage a UCAS-style portal (let’s call it AAS – Apprenticeship Application Service) through which young people (or anyone, I assume) could apply for an Apprenticeship vacancy. Its promoters believe that this will encourage more young people to apply for and gain apprenticeships, and it has resurfaced in the recent Education Select Committee report “The apprenticeships ladder of opportunity: quality not quantity”:

We recommend that the Government introduces a proper UCAS-style portal for technical education to simplify the application process and encourage progression to further training at higher levels. (Paragraph 89)

It has also been raised by Gerald Kelly & Partners in their report “Not for them: Why aren’t teenagers applying for apprenticeships?”, which surveyed young people and found that

While almost two-thirds (63%) say if they could apply for apprenticeships using an UCAS-style format they would

The Social Mobility Commission under Alan Milburn, meanwhile, called for

a UCAS-style body to give young people better information about which apprenticeships are available and what career prospects they could lead to

Vocational and Technical education supporters such as the Edge Foundation also promote

 A well designed portal could explain each option in detail and give advice on how and where to apply. The portal would also make signing up for apprenticeships easier and more managed, as this can currently be a lengthy process and students taking GCSEs already have a lot to focus on.

and opinion pieces have called for a “one stop shop” website to be designed.

UCAS is a monopoly service, but it gains buy-in and brand reach beyond education because it offers consistency of service year on year. The dates of the application cycle are clearly predetermined and the format of a learner’s application is set: no matter whether the learner is applying to the highest-tariff Russell Group Universities or for a Foundation Degree at the local FE College, the application form is the same. The institutions in receipt of these applications may add their own requirements after the application form is submitted and before making an offer decision (such as an interview or portfolio assessment), but those institutions all still use that initial form and stick to communal deadlines. The application deadline for Oxbridge, Veterinary, Dentistry & Medicine may be sooner than the main application deadline but, within those categories, there is still agreement on a common deadline across all of the institutions offering those courses.

Would a UCAS-style portal for Apprenticeships achieve the same goals, and how would it be different to the already established “Find An Apprenticeship”?

1. Timing and deadlines

Employers can hire apprentices throughout the year

[Image: apprenticeship starts, September 2018]

so there isn’t much agreement on common deadlines. You can see from the graph that the trends do show an increase in starts at the end and beginning of the academic year, as (mostly larger) employers have moved their recruitment cycles to capture school and college leavers and to start the off-the-job training component of the apprenticeship in line with the academic year, yet a common deadline is still nowhere to be seen. Whereas UCAS applicants are clear on the common deadline and Advisers are able to structure application advice towards it, the proposals for any AAS system do not seem to envisage that employers could only advertise apprenticeship vacancies in certain periods of the year, so individual employer deadlines would still apply. As the 2016 Employer Perspectives Survey (p 113) shows, around 18% of all UK institutions offer apprenticeships, so this would still mean a multitude of deadlines to hit and for advisers to be aware of.

2. Employer control over applications

Much of the Government rhetoric over the reform of the Apprenticeship system through the introduction of standards and the levy has been built around the theme of placing employers “at the heart” of apprenticeship training. Presumably this also includes allowing employers to determine their own apprenticeship recruitment processes. Currently, employers can list their apprenticeship vacancies on the “Find An Apprenticeship” site (plus their own sites or third-party sites such as “Get My First Job”), and support and advice is offered on how to recruit, but the employer remains in charge of the process. Sometimes an employer will choose to use the more generic application questions and form contained within the Find An Apprenticeship site

Such as this mock application

or require applicants to apply through their own website

[Image: site management apprenticeship vacancy]

This seems to be a flexibility required by employers. The recruitment process an SME needs to source a suitable applicant for a Level 2 vacancy will be very different from the procedure a multinational corporation undertakes in its annual recruitment across a multitude of apprenticeship standards at higher levels. Forcing a common application form onto all employers offering apprenticeships therefore also seems beyond the reach of an AAS.

3. Age of applicants & references

Higher Education applicants of all ages use UCAS to apply, but it would be fair to say that the majority of HE starters are of school or college leaving age.

[Image: UCAS statistics]

This is not true of those starting apprenticeships

[Image: apprenticeship starts by age]

where the majority of current starters would not be in education to receive support from an Adviser. Of course, the very point of the AAS would be to increase the number of younger applicants, but the site would have to accommodate and be user-friendly for applicants of all ages, whether in education or not.

4. Numbers of applicants

All of the reports suggesting an AAS do so in the commendable hope that it would increase the number of young people applying for, and so starting, apprenticeships. With its title, the Gerald Kelly report is particularly flagrant in its acceptance that young people aren’t applying for apprenticeships. This is strange because, as I’ve posted previously, the DfE no longer publishes the data showing apprenticeship applications by age, only starts. Using the number of Apprenticeship starts by age as an indicator of the number of applications by age ignores the historic data we do have, which showed that young people already apply for apprenticeships in far greater numbers than the number of vacancies posted. For an AAS portal to be truly warranted, the data on applications by age needs to be regularly shared by the DfE.

5. Differences from Find An Apprenticeship

In the reports linked above, AAS recommendations seemingly come without reference to the Find An Apprenticeship website which already exists or, if they do acknowledge it, they are unclear about what differences the proposed UCAS-style Apprenticeships portal would have. Find An Apprenticeship already allows people to search a common site for all apprenticeships, research opportunities laid out in a standard format and, in some cases, complete an application through the same site. As I have shown, just establishing a new portal with aspirations to be more like UCAS fails to acknowledge or offer solutions to the fundamental differences between the Apprenticeship and Higher Education processes and routes, which would leave any new portal looking and performing much the same way as the current Find An Apprenticeship already does.

An AAS portal also offers a suggested quick fix which fails to address the central issue. The Gatsby Benchmarks have shown us what works in CEIAG provision. This is time- and cost-intensive provision, as Apprentices themselves acknowledge

and as Gatsby has evidenced, but it is that support that would really enable young people in greater numbers to strive for and successfully secure Apprenticeships.

 

 

The CEC in front of the Education Select Committee May 2018 – not the one-sided thrashing you were led to believe

Link to the Education Select Committee Video here:

https://parliamentlive.tv/event/index/90b1eb8a-1eca-40c2-8916-0956c5cce7a0

So far in its existence (at least to those of us in the Careers community who don’t work for it), it has seemed that the Careers and Enterprise Company (CEC) was the golden child, arrived here to save careers work for young people in England. Central funding wise, they are essentially the only show in town as they scale up their pilot work, and their communications, PR and branding have been a fresh breeze of modern professionalism in a sector that (if I may) has always been behind the curve in shaping its own public perception. This period of cosy positivity ended, though, with a bruising session for the CEC in front of Robert Halfon and his Education Select Committee. The trade press reported the session in typically combative framing and the CEC did itself no favours with a poorly judged call for social media support afterwards.

The Select Committee (well, the 7 of its 11 members present) seemed aghast at a number of areas of the CEC’s work and track record:

  • that the CEC had spent £900,000 on research publications, monies that had not been spent on the front line
  • that the CEC was not yet able to report on the destinations impact of the provision that their work had funded
  • that their board meeting minutes were not made public
  • that the long-mooted Enterprise Passport had been put “on hold” despite it being one of the three main strands of the CEC’s original remit
  • that funding pots supposedly dedicated to provision for disadvantaged areas were not being totally allocated to those areas
  • that the CEC was paying Enterprise Co-ordinators and other central, senior roles salaries significantly above comparable school-based roles

Some of these criticisms hold an element of truth but what was also apparent from the session was (yet again) just how woefully ignorant of the Careers landscape (and by extension the work of the CEC) the MPs were.

Of course, it is only fair for MPs to ask for the utmost transparency and compliance when investigating the value gained from the spending of taxpayers’ money, and beginning to focus on the actual impact (rather than merely the quantity) of provision is something you might have read about on this blog back in July 2017. Funding from Government comes with strings attached and must be accounted for, so taking the CEC to task for not being clear on the destination data of the pupils receiving CEIAG provision funded by the CEC is to be expected. What was not expected was just how difficult it was for the MPs to grasp that this destination data was:

a) only part of the impact feedback, with evaluations, further social mobility measures, employer feedback, skill shortage data etc. also to be taken into account

b) not going to be ready yet, as many of the young recipients of CEC-funded provision were probably still in school at this moment – Mr Halfon seemed unable to comprehend this fairly simple point

and

c) extremely difficult to collect and place comparative value on, as the inputs (the types of CEIAG provision) are varied and delivered by a multitude of different providers funded by the CEC

It was also astonishing to see Emma Hardy, the MP for Hull West, at one moment criticise the CEC for not publishing pupil-level destination data to show the impact of their work, only then to harangue them for not funding grassroots organisations such as National Careers Week, who also do not publish or collect pupil-level destination data. NCW are a fine organisation but they are not providers of provision; they are a banner organisation whose launch events and social media exposure allow others to brand their own work. Their own reporting reflects this, with the number of tweets and resource downloads indicating a successful impact rather than the actual outcomes of young people. Moments such as this highlighted a complete lack of mastery of the Select Committee brief from some of the Members, and this was only to continue throughout the session.

Trudy Harrison was the most clueless of the bunch, at times advocating that the CEC should only be judged on the hugely reductive measure of rising or falling youth unemployment in an area in which they are funding provision, and showing her utter unpreparedness for the session by repeatedly asking what a “Cold Spot” was. In the end I admired Claudia Harris’ restraint as the Member for Copeland asked for definitions, clarifications and to be sent information that was published on the CEC website back in October 2015 and forms a fundamental basis for all of the organisation’s subsequent work.

(I also enjoyed Lucy Powell noting that the advertised circa £80k CEC Director of Education role is “more than we get paid”, considering that an MP’s current salary is very close at £77,379 and Mrs Powell also enjoys income from a number of rental properties according to the Register of MPs’ Financial Interests.)

Despite the general ignorance of the line of questioning, some important points were raised. The fact that the Enterprise Passport is “on hold”, to use Christine Hodgson‘s phrase, is of note, but it is a pity that the MPs did not have the forensic insight to ask how much had been spent on this project to date. The figures for the number of applications for funding the CEC received should also have caused a greater swell of interest. For the original £5m funding pot, they received over 10 times that amount (£50m) in applications, which just shows that there could be vastly more CEIAG work happening with young people if only the funding were there. Again, the MPs did not pick up on this huge appetite for provision that is currently going unfulfilled.

As the session progressed, both Hodgson and Claudia Harris struggled gamely and mostly unsuccessfully to overcome the MPs’ preordained views. At times this was the fault of the two representatives of the CEC, as they struggled to recall funding amounts or specific data that would have helped their push-back and made them appear more in charge of their remit. This was clearly apparent as they struggled to articulate the processes and structure of the bidding for and allocation of both the Personal Guidance funds and the Careers Hubs monies. This was not helped by Robert Halfon confusing his brief over the remit of two distinct pots of money, but also by the failure of Harris to explain why bidding processes had been designed with certain methodologies and whether the £5m allocated for disadvantaged young people was definitively going to be spent on disadvantaged young people. The promises that current schemes (Compass and the 2019 publication of destination data of pupils involved with CEC-funded activities) would soon bear fruit also failed to appease the Committee. The central point remains though: it is clearly fair for Select Committees to ask for clarity on expenditure and impact, and the CEC, with their multitude of funding pots and provision schemes, certainly dropped the ball in explaining this coherently.

Equally though, dissatisfaction arose because the roles of the CEC still seem undefined to the MPs who oversee them. Despite Hodgson’s appeals to the contrary that their DfE grant letter provides a clear remit, throughout the session the CEC was variously tasked by different Members with being a provider of CEIAG provision, an umbrella organisation channelling funding to organisations on the front line, a research-intensive body such as the Education Endowment Foundation concerned only with finding what does and doesn’t work (somehow, despite the Members’ earlier criticisms of too high a research budget), all of those things, or even some mixture of them.

Perhaps, through no fault of its own, by the time of the CEC’s creation the marketplace it hopes to shelter under its umbrella, and stakeholders’ perceptions of CEIAG provision, had grown so distinct and varied that bringing all of the partner organisations and oversight bodies together will prove a much harder task than imagined. It’s not that everybody isn’t yet singing from the same hymn sheet; it’s that, despite the huge research investment, the debate over which hymn sheet to use is still happening.