Face 2 Face IAG

Compass Plus

Coming soon to your school computer that takes a solid 30 minutes to boot up in the morning is a new website service from the Careers & Enterprise Company called Compass Plus.

A revamp and expansion of their Compass self-evaluation tool and Tracker provision-recording tool, this new method of planning, recording and evaluating your Careers provision could bring a number of benefits throughout the system. Currently being presented around the country after a period of development and testing, the tool will initially help school-based Careers Leaders only.

compass plus 1

For ease of understanding, imagine that the CEC had taken that spreadsheet with your activities throughout the year mapped against the Benchmarks, reached into your drawer and added the student registers you scribbled down at those activities and also pinched your other Excel sheet with your employer contact details and the stack of slowly returning student destination forms on your desk and put all of that into an online tool that you could share with colleagues – that’s Compass Plus.

The improvements this could offer a Careers Leader are clear. Rather than completing cumbersome spreadsheets, your activities can be uploaded against the Benchmarks, immediately updating your Compass completion scores for both planned and completed activities. The ability to integrate with your school’s MIS so that activities are recorded at a pupil level is potentially a huge positive for both Leaders and Careers Advisers working in schools.

compass plus 3

The CEIAG record of an individual student will be there for a Personal Guidance session to build upon, or for a Leader to tailor registers for future events so that all learners have access to CEIAG.

Your employer engagement could also benefit from the ability to store your employer contact details so that all of your colleagues can access them for their own activities, plus the soon-to-come Tracker Enterprise, which will allow Enterprise Co-ordinators to add details of providers and employers who can then connect with your activities and opportunities for engagement.

compass plus 4

The CEC is managing the on-boarding process quite tightly

compass plus 2

which is an effort to manage the demand pipeline so that the technology copes with the growth in usage. The potential for Plus to work with other systems such as Unifrog or Start Profile is also exciting.

For the CEC, the rollout of this type of system is logical as it will also offer benefits to their data collection. Currently, Compass evaluations are based on Leaders’ judgement without the link to activity to evidence those claims. The structure of Compass Plus (a school’s Gatsby Benchmark compliance rating being automatically generated from actual provision and activities recorded at student level) provides a much stronger evidence base on which a school will self-evaluate, and strengthens how the CEC collates that data across LEP, regional and national pictures for publications such as their annual State of the Nation report. This tool should make it clear to Enterprise Advisers and Co-ordinators (and perhaps in time even Ofsted) that 39 weekly Careers lunchtime drop-in sessions attended by the same 4 students isn’t a Careers programme that meets the standard required. With this in mind, I could envisage, even expect, that some schools which had previously scored highly against the Benchmarks, even those achieving full compliance, would see lower Benchmark scores when using the Compass Plus tool.

As a Careers Leader working in FE, much of the Compass Plus tool struck me as processes that already happen in FE. Student-level activity recording already happens in most FE Colleges, not only for Careers activities but also for other enrichment provision. Those with helpful data teams will have their own versions of the reporting ability that Compass Plus offers, able to show how many students have attended a Personal Guidance interview or been present at X number of employer encounters. Whenever the FE version is developed and tested, the CEC might find that Colleges are much more reluctant to give up systems they have designed to achieve similar aims.

There is an FAQ on the CEC site

https://www.careersandenterprise.co.uk/compass-plus-faqs

and I would encourage all secondary school Careers Leaders to set up their on-boarding as soon as possible.

https://www.careersandenterprise.co.uk/schools-colleges/compass-plus


The numbers in the Careers Hubs Benchmark 8 progress stats are pretty wild

Launched in September 2018 with 20 Hubs across the country (plus the original North East pilot area), the Careers & Enterprise Company is now expanding this policy with another 20 Hubs. When launched, I was positive about the structure of support they would be able to offer local areas and could see the rationale behind expanding the North East pilot, but was concerned that the funding model those schools and colleges enjoyed was not also being replicated. The initial wave of Hubs covers locales across the country:

  1. Black Country – 36 schools and colleges
  2. Bucks Careers Hub – 21
  3. Cornwall – 40
  4. Cumbria – 40
  5. Greater Manchester – number of schools & colleges involved not clear
  6. Heart of the South West – 40
  7. Humber – 26
  8. Lancashire – number not clear
  9. Leeds City Region – 35
  10. Leicester – 20
  11. Liverpool City Region – 34
  12. New Anglia – 32
  13. North East – 40 (plus 10 colleges?)
  14. Solent – 32
  15. South East – ?
  16. Stoke – 20
  17. Swindon – 40
  18. Tees Valley – 35
  19. West of England – 25
  20. Worcestershire – 40
  21. York – 35

The CEC says the total number of schools and colleges involved is 710.

As we reach the end of the first academic year of their existence, the CEC claims that schools and Colleges in those Hubs are progressing faster towards meeting the Gatsby benchmarks than schools and colleges not located in Hubs and large proportions of them are already meeting a number of the Benchmarks.

 

This shows rapid improvement in the percentage of Hub schools and Colleges reporting that they are fully meeting Gatsby Benchmarks. Within those figures, though, a truly eye-opening amount of work must be happening.

d6r-s2mw4aab-8f

Let’s take one benchmark in particular – Benchmark 8, Personal Guidance. The claim from the CEC is that 61% of Hub schools and colleges are reporting that they are fully meeting this Benchmark.

The School Guidance for this Benchmark is clear that, to achieve it, every pupil should have a guidance interview with a Careers Adviser by the age of 16 and, if the school has a Sixth Form, another if required by the age of 18.

gatsby8

In the Sixth Forms & Colleges Guidance, the wording is slightly different, to take into account that students can complete Entry, Level 1, Level 2 or Level 3 study programmes at different ages up to 19, so the age of the student isn’t the limiting factor, just as long as the IAG interview occurs during the learner’s study programme.

fegatsby8

But the aim remains the same; every young person gets a 1:1 Careers meeting with a qualified professional.

Across the 710 schools and colleges in the Hubs it’s hard to find published figures for the exact number of schools and the exact number of dedicated Post 16 providers (I’ve included the total number of providers for each Hub above where I could find it) but, whatever those figures are, the CEC is now claiming that 61% of Hub providers are fully meeting Benchmark 8. This is extraordinary in itself, but what I find even more remarkable is that 56% of those providers were reporting that they were already fully compliant with Benchmark 8 back in July 2018, before the Hubs started. That is a very high level of provision in terms of pupil numbers.

DfE data shows that, on average, there are 948 pupils in a secondary school.

Across the 20 Hubs, let’s say, conservatively, that 700 of the 710 participants are secondary schools. That gives a total school pupil population of 663,600.

That leaves around 10 Sixth Forms or Colleges (in reality, it’s likely that these Post 16 providers make up a greater number) and these providers can vary tremendously in size. For example, Sunderland College has around 4,800 full-time learners, while Sixth Form Colleges have, on average, 1,823 and School Sixth Forms are even smaller at 202 students on average.

Sunderland College were part of the North East pilot Hub so I’ll include their learners but be conservative on the other participants and say the rest are smaller Sixth Form Colleges. That would result in a total of 21,207 Post 16 learners included in the Benchmark 8 figures in the pilot.

So the total number of students covered by the Hubs = 684,807 pupils (although this is likely to be larger)

If 61% of providers are now reporting fully meeting Benchmark 8 then that’s approximately 417,732 young people in those 20 areas who have had a Careers interview. In July 2018, before the Hubs started, around 383,492 (56%) of young people were having a Careers interview. This is a huge amount of Careers and guidance provision occurring in those localities.
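For transparency, this back-of-the-envelope estimate can be reproduced in a few lines. Every input below is one of this post’s stated assumptions (average roll sizes, the conservative 700/10 school-to-college split), not an official CEC or DfE figure:

```python
# Rough estimate of pupils covered by Benchmark 8 reporting across the Hubs.
# All inputs are this post's assumptions, not official CEC/DfE figures.

AVG_SECONDARY_ROLL = 948            # DfE average secondary school size
secondary_schools = 700             # conservative split of the 710 providers
post16_providers = 10

school_pupils = secondary_schools * AVG_SECONDARY_ROLL    # 663,600

# Post 16: Sunderland College (~4,800 learners) plus nine
# average-sized Sixth Form Colleges (1,823 learners each)
post16_learners = 4_800 + (post16_providers - 1) * 1_823  # 21,207

total = school_pupils + post16_learners                   # 684,807

meeting_now = round(total * 0.61)        # 61% reporting full Benchmark 8
meeting_july_2018 = round(total * 0.56)  # 56% before the Hubs started

print(total, meeting_now, meeting_july_2018)
```

The simplification here (as in the post) is treating the percentage of providers as if it were a percentage of pupils, which the real distribution of school sizes would not quite bear out.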

There should be huge lessons for those practitioners in the rest of the country to learn from these figures.

  1. What was the practice and structure already in place that allowed those 56% of providers to already meet every one of their students for a Careers interview? This is especially notable considering that Hub areas were chosen specifically in response to the CEC’s own cold spots research, which was meant to indicate a dearth of Careers provision.

cec 2

There should be learning opportunities here for the CEC as well, as their Personal Guidance fund is another pot of money looking to support innovative practice in this area of CEIAG. Their publication in the “What works series: Personal Guidance” shows, though, that there are not many shortcuts to providing provision in this area, and how time- and cost-intensive Personal Guidance is by its very nature.

personal guidacne1

In a 948-roll secondary school, a Year 11 cohort would equal around 190 pupils. Seeing 5 of those pupils a day for a Careers interview would take 38 days, or over 7.5 weeks, so this is a significant staffing allocation, and that is just one year group. As a practitioner in an FE College with around 3,000 full-time students attending, I am another Careers Leader looking for ways to offer a guidance service that meets all of the quality points above but is also flexible enough to maximise capacity.
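The staffing arithmetic is worth making explicit, since it scales linearly with cohort size. A minimal sketch, assuming (as above) five interviews per working day and a five-day week:

```python
# Adviser time needed to give one year group a Personal Guidance interview.
# Assumptions from this post: ~190 pupils in a Year 11 cohort,
# 5 interviews per working day, 5 working days per week.
cohort = 190
interviews_per_day = 5

days_needed = cohort / interviews_per_day   # 38 adviser days
weeks_needed = days_needed / 5              # 7.6 working weeks

print(f"{days_needed:.0f} adviser days, about {weeks_needed:.1f} weeks")
```

Run the same sums for a 3,000-student FE College and the scale of the Benchmark 8 resourcing problem becomes obvious.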

Hopefully the CEC is learning from those providers in the Hub areas how, despite ranking low on the Cold Spot metrics, over half of them were previously able to achieve Benchmark 8.

2. How does that level of provision compare to providers outside of Hub areas?

Other sources offer insights but not directly comparable data. The most recent DfE omnibus survey (surveying pupils in Years 8 to 11) reports that 47% of (under 16) pupils say they experienced a face-to-face guidance session

face to face iag

while the 2019 Youth Employment Survey (3,008 young people aged 14-24) reports that 67% of young people had an interview with a Careers Adviser.

face to face youth iag

The most recent CEC State of the Nation report shows that 48% of all schools and colleges completing Compass reported that they were fully meeting Benchmark 8 in their first submission but this figure has risen to 55.4% on second rating.

personal guidacne2

So the Hub areas were already starting from a higher base than the rest of the country before the Hubs had even started.

3. Is this stable and should a new Benchmark 8 rating be submitted by the provider every year?

As Deirdre Hughes asks here for Benchmarks 5 & 6, her question is equally applicable to Benchmark 8.

 

 

Each academic year will bring new students for a school or college to work with and many things (loss of staff, internal restructures, expanding school roll) could result in a provider not maintaining their 100% compliance with Benchmark 8. Could the percentage of providers meeting Benchmark 8 in a Hub area fall as well as rise?

4. What changes have led to the increase in capacity to offer more Careers interviews, or attain greater take-up of them, since the Hubs started?

Is it more schools and colleges dedicating more staffing towards this provision or something else?

It will be interesting to see how the new Hubs add to the lessons the CEC is learning over the next academic year, and whether the rate of progress against the Benchmarks continues, particularly in areas which require high resource allocation.

Careers apps

Back in December 2018, the DfE announced two winners of a £300,000 funding pot open for digital IAG tools that would help young people make more informed decisions about their University choice, both in regard to provider and subject. Now, in April 2019, both of the tools have launched.

Offering IAG through digital platforms to young people has a mixed track record (we’ll always have the memory of Plotr) but practitioners know they can be a fundamental resource to use with your client base.

The two apps take different approaches to how they inform and advise their users with the Higher Education data now available. It’s worth saying that both platforms are (at the time of writing) still in beta testing, so improvements in design, layout and usability will be ongoing, and any judgements should be made with that in mind. The first app, ThinkUni, is described as a “‘personalised digital assistant’ bringing together data on universities, courses and financial outcomes that are easy to explore and compare” and is from a team that has a good track record in offering social mobility enhancing schemes through the Brilliant Club. The current site looks fairly basic, with text drop-down boxes asking users about their preferences on University study (city or campus, course reputation or earnings outcomes etc.).

think uni1

On first impressions, I found there isn’t much to be impressed with here. The very first question assumes that the user knows what subject they want to study, so relies on a baseline that simply isn’t there for a lot of young people, and the site then assumes that the user will be studying A Levels, so widening participation also doesn’t seem to be a concern (I’m sure that other Level 3 qualifications will be incorporated into the site at some point, but soft launching without them isn’t a good look). The “digital assistant” selling point is also overplayed, with the course suggestion results being much the same as would result from a search using advanced filters on the UCAS search facility. If the user already knows their views on subject, location, course type etc. to input, then why not just go, or be directed, to the source site? Currently, the “assistant” part of ThinkUni seems extremely passive.

The other competition winner comes from The Profs, a team who have previously built a professional tutor finding platform, and is a much more interactive-looking experience. “The Way Up” game tasks users to pick an avatar and make choices on learning and earning routes at each step of their journey.

way up game1

This approach develops a greater sense of ownership of the process and the results, as the user is able to modify the route to reflect their own interests while still following the linear structure of the game. The interface isn’t the most aesthetically astounding you’ll see, and I also thought that some of the presented LMI was easy to miss on-screen but, once you notice it, the format does incorporate a significant amount of LMI data into each stage. I also think that the biggest learning gain for young people using the platform might not be regarding their career choice or route but the realistic balance to be found when budgeting monthly incomings and outgoings.

As a format for simulated learning, turn-based, point-and-click games were also used back in the days of late-2000s Aimhigher University visits, when one of the regular activities was a web-based game that allowed secondary school students to take control of a new University student avatar and make choices for their study, work and social life. The implications of those choices were displayed in a character health chart which valued balance above partying too hard or studying too much. The user was able to see the realistic choices on offer and the consequences of those choices, and reflect on how they would react in that possible environment. So the format isn’t new, but the inclusion of the LMI and HE data is.

The “Way Up Game” is designed to have the widest possible capture point so that it includes career routes and choice options for lots of young people. At the more specific and detailed end of the simulation market, flight and even truck driving simulations are PC games that can require high-spec computers to run with the amount of detail their fan base demands, while still offering career learning opportunities. More accessible versions of this format can be found in sector skills funded apps such as Construction Manager from the CITB. Allowing users to take charge of a construction business, hire employees, pitch for contracts and then take on those jobs, all presented within a Sims-style graphical interface, makes for an engaging career learning experience. Place these alongside digital diagnostic tools and digital communication tools and there is a rich variety of online CEIAG resources.

Research

Research evidence on the value of digital and online IAG experiences offers some guidance to both of the creative teams on what could help their products have the impact they are looking for with users.

Two excellent summaries of research in this area are the CEC What Works edition: “Careers Websites” and this recent webinar from Tristram Hooley “Approaches to online guidance”

Neither of the two apps offers any links to expanding social networks or sharing results, so building users’ social capital does not seem to be on the agenda.

The CEC document references research from Dunwell et al (2014), which evaluated the MeTycoon careers game and found that

87% of participants said playing the game had given them new career ideas and 66% said they had shared or discussed the game with friends.

The format of “The Way Up Game” more closely matches MeTycoon, so those developers will be hoping for that level of impact with their users. The ThinkUni platform perhaps gains research backing with its slight nod towards the user involving CEIAG professionals in the findings from using the site. The CEC summary states:

The use of careers websites should be integrated into schools’ careers education provision, and may be more effective for pupils when use, at least initially, is mediated and supported by careers and education professionals.

Once the user has contemplated their suggestions, the final screen of ThinkUni suggests

think uni assistant1

This is only a very slight prompt though. The user is not asked, for example, if they wish to email their results to a named individual, which could be a CEIAG professional or school tutor. Perhaps both developers would benefit from designing accompanying session plans that could enable teachers and CEIAG practitioners to use the apps in group sessions and build upon the learning experiences of the young people in the room. A further step could even be to incorporate “nudge” techniques by communicating to both user and professional, so conversations could occur to see if further research tasks have been undertaken by the user. Neither of the platforms requires the involvement of CEIAG professionals in the learning journey of the user.

This failure to build in the involvement of practitioners places both of the apps well behind more detailed digital offers such as Start Profile. This program combines user-led personalisation and exploration of career routes, with LMI drawn from LMI for All, and the ability for practitioner oversight and involvement. As this ICEGS evaluation of Start concludes

Start builds on much of what the existing evidence base tells us about the efficacy of online products. It brings together information and advice for a young person and allows them to personalise their learning journey. It offers a blended learning technology in which the school can connect the online learning to classroom based career learning. It builds on longstanding career assessment practices by building a personal profile of work preferences, qualities, skills and interests and using this to match users to jobs and learning opportunities based on their suitability and how available those jobs are in the labour market.

Differences do remain, though, between Start Profile and these two new apps in their data sources. LMI for All utilises a range of sources (detailed on page 10 here) but they (and so Start Profile) do not seem to include data from the Office for Students on HE access, continuation, attainment and progression.

By side-stepping CEIAG professionals, both apps are purely user-focussed offers, but this could still have a positive impact. The CEC Moments of Choice research concluded that young people desire the presentation of careers data that:

moments choice1

and it would be fair to conclude that both apps achieve at least 7 of those requirements to varying degrees. Young people can access the data in a method that is convenient to them, when they require it, be safe in the knowledge that it is using reliable sources, receive suggested future actions and be able to personalise it. Only the involvement of influencers is missing.

International comparisons

These formats for offering HE focused CEIAG learning are also available in other countries. For example, Australia has Campus Quest which offers users two games, Campus Quest based on a student attending a University campus and E-Study Quest based on a student studying from home.

The graphical interface is slightly more interesting than in both of the new UK apps; in particular, the 3D presentation is more eye-catching than “The Way Up” game’s.

Value

For the DfE to offer funding, policy makers must hope that any resulting resources will add value to the marketplace of existing CEIAG digital products, either through successfully filling a niche or building upon existing products. For me, currently, the two apps (still at testing stage, remember) do neither, and they also choose to set aside a proportion of the research in this area. It may be more politically satisfying for the DfE to achieve a new CEIAG platform through this process, but questions should be asked whether a more worthy platform could have been achieved through the adaptation of existing products, and how any resulting products are able to fit into, adapt and positively shape the current CEIAG landscape supporting young people.

CEIAG in Post 16 providers – a survey

Over the years of writing this blog, the annual omnibus survey from the DfE has always offered a useful insight into the reality of the scale of CEIAG provision across the country. Until now, I did not realise that they also undertake a Post 16 version of the survey, the most recent release of which includes plenty of interesting information about the scale and types of provision on offer by providers.

fe omnibus survey

The first point to make about the results is that respondents do appear to come from the wide variety of providers across the FE landscape (Table A1: 421 responses) and, overall, it’s heartening to see just how widespread the range of CEIAG work is across the Post 16 stage.

fe omnibus survey 1

The rise in employer encounters since 2017 was noted by the CEC, which is looking for signs of impact from its work.

The figures that most surprised me, though, come from the split of provision type by institution type

fe omnibus survey 2

My assumption would be that FE Colleges would be offering more employer encounters for students than Sixth Forms attached to schools. Employer engagement is a central tenet of the mission of FE providers and the qualifications they offer. In my experience at least, the range and scale of employer engagement is much more frequent and in-depth than what you would expect in a school with a small sixth form, but that seems not to be the case here. The other interesting data point is the scale of difference between the students at different providers participating in University visits, but this comes with a word of warning. There is some confusion across the document in the way this question is worded; fig 10.1 phrases it “All students considering university have had at least two visits to universities” while 10.2 uses “University applicants have had at least two visits to universities.” These differences appear subtle, but for an FE College, which will have a significant proportion of its student population studying qualifications at Level 2 and below, the wording of this question could elicit significantly different results from respondents.

Elsewhere in the survey, it is heartening to see CEIAG provision taking centre stage in respondents’ thinking when detailing their “activities to encourage students to have high aspirations or to help them achieve their potential.”

fe omnibus survey 3

Careers Teams in Sixth Forms, FE Colleges, UTCs & Studio Schools would be involved in the organisation or delivery of all of those types of provision in some way. Leaving aside the continual mischaracterisation of disadvantaged young people as having “low aspirations,” when research shows that they have high aspirations but lack the tools and the social and cultural capital to enact them (pdf), this data shows Post 16 Careers Leaders how best to frame their offer to explain its value to Senior Leaders. The potential areas in which to offer provision that would bring benefit can be found in the responses to the next question, “Barriers faced by post-16 institutions in raising aspiration within the student population.”

fe omnibus survey 4

Many of these are structural barriers (e.g. the cost of continuing education, socio-economic factors), but there are also barriers which Careers Teams can help tackle with clear messaging: for example, using Martin Lewis’ campaign materials to tackle some of the myths around Higher Education tuition fees and assuage student fears over the impact of these costs, and offering to play a central role in parental engagement events and activities.

Wide-scale tracking of CEIAG provision is valuable for noting how the impacts of policy, or of changes in accountability focus, can ripple through the system. These annual surveys from the DfE are an important data point for achieving this. Another survey that may interest you, or benefit from your involvement, is the CEC survey of Careers Leaders in schools, which will hopefully report interesting data on the workforce change that the Careers Strategy and DfE Careers Guidance for schools has influenced, so get your response sent if this is applicable to you. A similar survey for FE Careers Leaders is planned for later this year.

 

The Destinations data still isn’t there for the Gatsby pilot

It has now been three and a half academic years since the North East Gatsby pilot kicked off with the aim of implementing the Gatsby career Benchmarks in a range of education providers. The Pilot was funded by the Foundation to the tune of £9,000 per school or college, plus the central support offered by an appointed Pilot Lead.

Any impacts and lessons of the pilot should be key indicators for both the CEC and the DfE as they monitor the value of the Gatsby benchmarks in improving CEIAG provision. The fact that the Pilot is already being replicated (with lower funding levels) across the country as the CEC rolls out its Careers Hubs programme should only reinforce this need.

To learn these lessons, iCEGS have been commissioned to conduct an evaluation of the pilot which aims to

document a systematic attempt by education providers to implement all eight Benchmarks and establish what impacts might result from the implementation of the Benchmarks.

and, last month, an interim report was published giving an update on the findings of the evaluation.

The interim report finds that the schools and colleges involved in the Pilot self-reported that they did expand or improve their CEIAG provision according to the Benchmarks

icegs report1

This finding is not itself cross-referenced against any external accountability measure of the schools’ or colleges’ CEIAG provision, such as Ofsted reports. On a positive note though, the report does show that improvement (again, with the self-reported caveat) is possible across all types of education establishment. It seems that the clarification of CEIAG offered by the Benchmarks, and the categorisation of the success criteria, can be moulded to work across the variety of providers in the English education system, which is a major positive for policy makers looking for whole-sector improvement strategies.

The report also finds that students across the providers responded with “significantly higher career readiness scores”, which is an important variable to measure but, of the potential impacts, not the one that would hold the most sway with policy makers, I would imagine. For that to be the case, further work would need to be done here to show a link between higher career readiness scores and actual employment and earning outcomes for young people, much like the now very well-known employer engagement research from the Education & Employers taskforce.

The report also notes that, during the Pilot evaluation period, the schools involved reported an increase in the number of A-C GCSE grades, but that it is not possible to draw any conclusions from this as there is no historical data to judge trends against and no causal ties to be found to the Gatsby pilot. The success rates of FE College learners are not mentioned, nor is any impact on attendance statistics of pupils in the area.

The interim report then praises the role of the Pilot area facilitator in

Creating a community of shared knowledge, networks and practice

which is clearly a benefit that a supported Pilot scheme could enable, but one that would cause many professionals working in Local Authority skills development to wonder how it differs from their own work which, in my experience at least, usually includes a termly forum where school and college leaders or Careers Leaders can meet.

Lessons

Perhaps the part of the report most useful for other Hub Leads and Careers Leaders is the Emergent Challenges section (page 14). The fact that FE Colleges will struggle with Benchmark 8 (Personal Guidance) won’t be news to any practitioner grappling with the requirements to offer this highly individual service within current funding envelopes, but the issue of tracking provision against the Benchmarks is one which I think the CEC do need to be on top of. Their Tracker tool has the basis to offer much support in this area, but the wide range of approaches in how providers input activities will soon cause reporting problems for them. I’ve seen examples of a school inputting every Careers lunchtime drop-in session they run, meaning huge numbers of repetitive provisions being added, which boosts the total number of activities in a Benchmark.

Destinations

This then leaves the elephant in the room. The destinations data of the actual students participating in all of this provision is not mentioned in this interim report, but this is due to the timescales involved, and it will be referenced in the final report. Many of the pupils attending those providers during the Pilot will not have left them yet and, for those that have, only one year’s worth of data (2017) has been published by the DfE. I blogged about that 2017 data here, and the lack of (positive) impact perceivable in those positive destination indicators, so it will be interesting to see what the final report concludes with a researcher’s eye looking at a more complete data picture.

Conclusion

At the moment the Pilot evidence shows that providers are reporting that their provision has grown and their institutional behaviours have changed because of the Gatsby Benchmarks, and that pupils are more confident about their career readiness. These are small rewards for the sheer scale of system change that has occurred with the formation of the CEC and its subsequent policies (including Hubs, Careers Leaders and employer encounters). The evidence for actual outcomes for students is still lacking. What this does prove to policy makers, though, is that system change in education is possible (without much funding) if the provision aims are formalised and grouped into ‘benchmark’ style targets. It seems that schools and colleges, despite their protestations to the contrary, do like to be measured on some of the things they do.

 

An important destinations distinction

In October 2018 the DfE published “Destinations Data: Good Practice guide for schools” which, as DfE guidance documents go, is a snappy publication that sets out how schools and colleges should approach collecting destination information from their learners, the duties placed on Local Authorities in this area, where this information is then published and how it can be used to adapt provision.

The important section that I wanted to highlight for this post was the distinction between “Destinations data” and “Destinations Measures”, one I had never considered before. I will endeavour to stick to these definitions in future posts and discussions about destinations, and would hope that other practitioners join me in doing so.

  • What is Destinations data?

[Image: the guide’s definition of “Destinations data”]

  • What are Destinations Measures?

[Image: the guide’s definition of “Destinations Measures”]

This is important because, as the Gatsby Benchmarks and the Careers Strategy gain momentum and Ofsted continue to inspect CEIAG provision in schools, positive destination data will become more of a badge of honour for schools keen to show they are taking Careers work seriously. Differences could then arise between what a school claims as its Destinations data and what is published by the DfE and included in its performance tables, as the school’s data may rely on leavers’ intended future destinations while the DfE data looks back at sustained destinations.

In fact this has already happened with UTCs, who have long claimed that extremely positive destination data is a significant benefit of their model of education, only to have those claims recently undermined by the more robust and historically confirmed DfE Destination Measures. As the DfE Measures record

the number of students who have been in a sustained destination for six months in the year after finishing key stage 4 or 16-18 study (from October to March, or any six consecutive months for apprenticeships). The headline accountability measure at both key stage 4 and 16-18 study is the proportion of students staying in education or employment for at least two terms

they will be a much better reflection of the actual destinations of learners.

It is important that schools do not rely solely on their own data to evaluate their CEIAG provision but also use Destination Measures, as comparison between the two may highlight useful factors (for example, if many learners were intending to secure apprenticeships but then did not, or if learners from disadvantaged backgrounds were struggling to progress). It is also vital that Ofsted inspectors brief themselves on the historical trends in a school’s Destination Measures before an inspection, which may show steady progress in leavers securing more apprenticeships or other positive and sustained destinations which would reflect well on the school’s Careers work.

So, from this point on: Destinations data = a school’s intended or checked leaver destination figures. Destination Measures = the DfE published figures.

Going for a job interview in the States around Halloween has its own special challenges

“An actual interview going on at the office. We take Halloween very seriously.”


Some of the comments:

I too had an interview on halloween last year. Yes, the interviewers, all four, were wearing costumes ranging from sorceress to tin man.

I interviewed 3 years ago at a company during their halloween dress up day. The people interviewing me were dressed up as Forrest Gump and Lt Dan.

I’m honestly not certain what the Careers profession can do to prepare clients for this!

via r/funny