Month: May 2014

The hidden pill for schools to swallow in the CEIAG guidance

The updated Careers Guidance has been out for a few weeks now, which is long enough for it to be read, digested and (in some cases) spat back out by those with an interest in these things. The initial media coverage concentrated on the clear desire in the document(s) for schools to be much more proactive in their approaches and collaborations with the business community, to provide the much-vaunted and much-discussed “inspiration” that will illuminate the clear routes ahead of young people on their paths to success. Or something.

What gained less attention was the inclusion of instructions for schools which, arguably, could require a greater amount of change from them.

The original guidance, published in March 2013, contained the Duty including the highlighted sentence below:

while the expanded and updated Guidance in 2014 contains this whole, much more detailed, section:

The difference between the two excerpts could not be clearer in the detail covered or the expectation placed on schools. Or, to be more precise, the expectation placed on Careers leads in schools. We can no longer hide from the fact that we are at the forefront of the growth of the marketplace for students at 14, and our requirements to spread IAG may cause disquiet and unease among colleagues and ripples through our local educational landscape. I would imagine, in most schools, it’s something that needs airing with all of our Senior Leadership teams explicitly and soon.

The issues Studio Schools and UTCs have previously encountered with enrolling students have already been noticed by both the national press and the Ministerial team writing the cheques. In response, some of these individual schools have been pushing their marketing boat out with focused, local campaigns, supported by a national presence with substantial PR nous which heralds the positive employability skills gained by their alumni. In some areas, this marketing push hasn’t gone smoothly and, I must admit, I’m surprised there hasn’t been more coverage of localised political shenanigans resulting from these transitions (if I’ve missed any, please let me know in the comments). If I were the Head of a newly or soon-to-be-opened Studio School or UTC, I would be sending that second image above to the Heads of all my local secondary schools with an offer to come in and run an assembly. Of course, not all of those offers would result in collaborative work, but schools that refuse or ignore those requests are on much shakier ground should Ofsted arrive and ask the questions they should be asking.

There will be Careers Leads in schools who may be reading this and feel content in the knowledge that a UTC or Studio School is not due to open near their patch. They would be wallowing in the relief I feel when speaking to colleagues in schools with Sixth Form provision about the long-running and well-known battles over introducing other routes to students at the 16 transition point. Well, I’d hesitate to feel totally at ease just yet because, included in that second image, is the line “opportunities for 14-year-old enrolment at local Colleges”, and with the funding squeeze being felt by post-16 providers it’s not difficult to imagine many more of them looking into establishing provision at 14 to shore up both funding and subsequent enrolment at Level 3. This is an issue coming all our ways.


How do you solve a problem like STEM?

Some interesting thoughts and a teased report about IAG and students at UTCs are towards the end of this post from the AQA Policy Blog. For a lot of youngsters, just choosing GCSE subjects is a daunting task because of the ‘career’ importance they can (sometimes unjustifiably) place upon them, so moving school at this point to pursue a set career path must be an even bigger decision. The research on the IAG that has been there (or not) to aid those choices will be very interesting and may, perhaps, highlight another side of the story to the marketing focus on the employability benefits and desires of those students, which is part of the drive to raise enrolment numbers at UTCs.

AQA Policy Blog

16.05.14 There is a significant concern in the UK right now about a STEM “skills gap”, resulting from a failure to provide young people with the necessary STEM skills and knowledge in school required to succeed at university or in employment. A note from the Parliamentary Office of Science and Technology in 2013 suggested that 42% of employers reported difficulties recruiting “STEM-proficient staff”. In attempting to tackle this, the government has developed numerous initiatives to encourage STEM subject take-up, but until now, few have focused specifically on improving STEM progression routes through school and into HE or employment. The question remains: what can be done to plug the gaps in the STEM pipeline?

One new idea is the “Your Life” scheme, a “business and entrepreneur led” campaign, which launched last week and hopes to bring together business, educators, civil society and government with the aim of growing the…


The stories told and not told by school Destination data

A common theme throughout all of the recent commentary on the state of CEIAG in schools has been that the publication of Destination statistics for all schools is a ‘good thing.’ In the modern world, the argument goes, transparency of outcomes for schools should not just rely on qualifications gained by students but on the stability and suitability of the progress those students then make in their first steps beyond the school gates.

With this in mind, I wanted to post something concentrating purely on the Destination data of my own school’s leavers, to show how this does and does not offer insight when looking at figures at school level.

I’ll be using 4 sets of Destination data to give some context.

Firstly, there is the data currently on the DfE performance tables website. This relates to our 2009/10 leavers.

Second, is the data for the 2010/2011 cohort that is due to be published on the performance table site in June.

So, what to notice between those two? The trend in our numbers to FE seems to be falling while the numbers to Sixth Form College are rising; Apprenticeships are steady and “Destinations not sustained” are falling. The FE and Sixth Form trends have the biggest swing in numbers, so they could tell the story of a more definitive trajectory. The Apprenticeship and Not Sustained numbers are pleasing, but I’m wary of hanging out the bunting because, as you can see from the second table, the numbers of students involved are small. One or two students either way and those percentages alter significantly.
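To put some numbers on that small-cohort point, here is a minimal sketch. The cohort counts below are entirely invented for illustration, not our school’s actual figures:

```python
def destination_percentages(counts):
    """Convert raw destination counts into percentages of the whole cohort."""
    total = sum(counts.values())
    return {route: round(100 * n / total, 1) for route, n in counts.items()}

# Invented counts for illustration only -- a year group of 170 leavers.
cohort = {"FE": 60, "Sixth Form": 100, "Apprenticeship": 5, "Not sustained": 5}
print(destination_percentages(cohort)["Apprenticeship"])   # 2.9

# Move just two students from "Not sustained" into "Apprenticeship"...
shifted = {"FE": 60, "Sixth Form": 100, "Apprenticeship": 7, "Not sustained": 3}
print(destination_percentages(shifted)["Apprenticeship"])  # 4.1
```

A jump from 2.9% to 4.1% looks like a meaningful improvement on a performance table, yet it is the work of just two students changing category.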

A hugely important factor to bear in mind is that this data is based not on a snapshot but on an extended time period. As the guidance tells us, participation is determined as enrolment for “the first two terms (defined as October to March) of the year after the young person left KS4″, and not sustained destinations are defined as “young people who had participated at an education destination during the academic year but did not complete the required six months participation.” There is much to commend in the longer-term measurement used here, which more thoroughly tests a school’s CEIAG legwork in suitably placing its students post-KS4. A negative consequence of this more considered approach, though, is the sheer amount of time that has to be allowed before publication to let the students travel through the system. The most recent set of data above covers students who left us three years ago. Three years can be a lifetime of change in a school, with new initiatives, new curriculum, staff turnover, Leadership changes, new priorities and events, so to use this data to judge that school in the here and now seems a little redundant.
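The “sustained” rule quoted above can be sketched as a simple check. This is my own simplification of the October-to-March definition in the guidance, not the DfE’s actual methodology:

```python
# October through March: the six months of required participation.
REQUIRED_MONTHS = (10, 11, 12, 1, 2, 3)

def is_sustained(months_enrolled):
    """True if a leaver was enrolled in every month from October to March.

    months_enrolled: collection of month numbers (1-12) in which the
    young person was recorded as participating at a destination.
    """
    return all(m in months_enrolled for m in REQUIRED_MONTHS)

# Enrolled for the full period -> a sustained destination.
print(is_sustained({10, 11, 12, 1, 2, 3}))  # True

# Started in October but dropped out after December -> not sustained.
print(is_sustained({10, 11, 12}))           # False
```

The point the sketch makes is that a leaver has to clear the whole six-month bar; enrolling in September and drifting away at Christmas still counts against the school, which is exactly why the measure takes so long to publish.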

The third set of data, for our 2011/2012 cohort, is from our Local Authority, who, alongside their Youth Service partners, work their way through enrolment lists, phone calls and house visits to get all of the stats which the DfE then utilise in future.

The first thing to notice is that some of the Destination terms are not the same, which immediately causes issues in comparison. Compared to the first two sets of data, the trend away from FE routes and towards Sixth Form (not differentiated here between School Sixth Form and Sixth Form College) reduces but continues. The NEET category (not known in the DfE data) is pleasing again (with the same caveat as above), while the Part Time Education numbers are odd and appear towards the larger end of the local spread (more about this below), but they lead to another concern: any conclusions we draw are only as sound as the data collection and entry job that went before them.

The biggest difference between the data sets is that the Local Authority data is a snapshot taken on the 1st of November 2012, just a few short weeks after the GCSE results. If published then, the immediacy of this data could give interested parties such as Ofsted or parents much more reactive numbers on which to judge local secondary schools, but this immediacy could also cause problems. Any snap measurement could offer a warped view of a reality that would produce very different data if captured on a different date (were the statistics exactly the same on the 2nd of November?) and could fail to highlight gradual drop-out as those learners went through the first term of their KS5 routes. To combat this and to show trends, the Authority repeat the exercise the following April with the same year group, and the results of this follow-up snapshot for the 2012 leavers are in the columns on the right below.

Clearly the largest change between November and April is that the Part-time Education number now reads zero and the number of Apprenticeships has jumped by the same number, to 12. How much of this change can be attributed to data entry decisions, or to the steady progress of our leavers securing Apprenticeships in the year since leaving school, would only be known to those with local knowledge of our alumni. It’s a tale not told in the stats.

So, what can we learn from all this data?

1) The considered publication timeframe on the DfE performance tables has both good and bad sides for judging school performance

2) When you drill down to school level, the numbers of actual students moving from category to category can be small enough that just a few students fluctuating between them significantly impacts the percentages

and that

3) Trends in destination growth or reduction for different routes can only be properly identified with multiple data sets over a longer period

If Ofsted and stakeholders such as parents are to get the most out of Destination data in its current form, a considered and measured view, and a desire to understand the stories behind the figures, really will be required.