The Great Grading Scandal Engineering Challenge

This guest blog has been kindly provided by Dr Dennis Sherwood of Silver Bullet Machine, an intelligent innovation consultancy, who was a speaker at the first of this year’s Recruitment & Admissions Forum series of webcasts.


Calling all engineers!

Engineers love solving problems, and are very good at it. So this blog poses a real problem – a problem that has eluded solution for at least a decade, and one that does much damage every year. You are invited to think of a solution – or indeed more than one – and post your thoughts either in the comments on this page or in the thread on the Engineering Academics Network page on LinkedIn.

The problem – the Great Grading Scandal

Every year, about 6 million GCSE, AS and A level grades are awarded in England. And every year, about 1.5 million of those grades are wrong – about half too high, half too low. That’s, on average, 1 wrong grade in every 4. In this context, “wrong” means “the originally-awarded grade would be changed if the script were to be re-marked by a senior examiner, whose mark, and hence grade, is deemed by Ofqual, the exam regulator, to be ‘definitive’” – or, in more everyday language, ‘right’.

But when a student is informed “Physics, Grade B”, the student is more likely to think “Oh dear, I didn’t do as well as I had hoped” than “the system got it wrong – the grade should have been an A”. So there are relatively few appeals: for example, in 2019 in England, there were 343,905 appeals, resulting in 69,760 grade changes, when in fact, as I have just mentioned, nearly 1.5 million grades were wrong. Exam grades are therefore highly unreliable, but very few people know. That’s what I call the “Great Grading Scandal”.

The evidence – Ofqual’s research

Ofqual’s November 2018 report, Marking Consistency Metrics – An update, presents the results of a study in which whole cohorts of GCSE, AS and A level scripts, in each of 14 subjects, were marked twice, once by an ordinary examiner and once by a senior examiner. For each subject, Ofqual could then determine the percentage of originally-awarded grades that were confirmed by the senior examiner, so determining a measure of the reliability of that subject’s grades. Since this research involved whole cohorts, the results are unbiased – unlike studies based on appeals, which tend to be associated with scripts marked just below grade boundaries.

If grades were fully reliable, 100% of the scripts in each subject would have their original grades confirmed. In fact, Ofqual’s results ranged from 96% for Maths to 52% for the combined A level in English Language and Literature. Physics grades are about 88% reliable; Economics, about 74%; Geography, 65%; History, 56%. The statement “1 grade in 4 is wrong” is an average, and masks the variability by subject, and also by mark within subject (in all subjects, any script marked at or very close to a grade boundary has a probability of about 50% of being right – or indeed wrong).
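That roughly 50% figure for boundary scripts is easy to verify for yourself. Here is a minimal Monte Carlo sketch – my own illustration, not Ofqual’s method – assuming the original mark and the re-mark are independent uniform draws within ± f of the script’s “true” mark, with invented numbers for the boundary and the fuzziness:

```python
import random

def confirm_probability(true_mark, boundary, fuzz, trials=100_000):
    """Estimate the probability that the originally-awarded grade survives
    a re-mark, modelling both marks as independent uniform draws in
    true_mark +/- fuzz."""
    confirmed = 0
    for _ in range(trials):
        m = random.uniform(true_mark - fuzz, true_mark + fuzz)       # original examiner
        m_star = random.uniform(true_mark - fuzz, true_mark + fuzz)  # senior re-mark
        # the grade is confirmed if both marks fall on the same side of the boundary
        if (m >= boundary) == (m_star >= boundary):
            confirmed += 1
    return confirmed / trials

print(confirm_probability(true_mark=60, boundary=60, fuzz=3))  # at the boundary: about 0.5
print(confirm_probability(true_mark=70, boundary=60, fuzz=3))  # well clear of it: 1.0
```

A script sitting exactly on a boundary is equally likely to land either side under both markings, so only about half of such grades survive a re-mark – and that holds however small f is.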

The cause – “fuzzy” marks

Why are there so many erroneous grades? The answer is not “sloppy marking”, although that does not help. The answer is attributable to a concept familiar to every engineer reading this: measurement uncertainty. Except for the most narrowly defined questions, one examiner might give a script 64, and another 66. Neither examiner has made any mistakes; both marks are legitimate. We all know that.

In general, a script marked m is a sample from a population in the range m ± f, where f is the measure of the subject’s “fuzziness” – a measure that, unsurprisingly, varies by subject, with Maths having a smaller value of f, and History a larger value.

Ofqual’s current policies 

This fundamental fact is not recognised by Ofqual. Their policy for determining grades – a policy that is current and has been in place for years – is to map the mark m given to a script by the original examiner onto a pre-determined grade scale. And their policy for appeals is that if a script is re-marked m*, then the originally awarded grade is changed if m* corresponds to a grade different from that determined by the original mark m.

Ofqual policies therefore assume that the originally-given mark m and the re-mark m* are precise measurements. In fact, they are not. That’s the problem.
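The consequence of treating fuzzy marks as precise can be sketched with a toy cohort simulation. Everything here is an illustrative assumption rather than Ofqual’s actual model: hypothetical grade boundaries every 10 marks, and both the original mark and the re-mark drawn independently and uniformly within ± f of a latent “definitive” mark:

```python
import random

# Hypothetical grade boundaries on a 100-mark paper (illustrative only)
BOUNDARIES = [40, 50, 60, 70, 80]

def grade(mark):
    """Map a mark onto the pre-determined grade scale (0 = lowest)."""
    return sum(mark >= b for b in BOUNDARIES)

def fraction_changed(fuzz, cohort=50_000, seed=1):
    """Fraction of originally-awarded grades that a fair re-mark would change,
    modelling both marks as independent uniform draws in true_mark +/- fuzz."""
    rng = random.Random(seed)
    changed = 0
    for _ in range(cohort):
        true_mark = rng.uniform(20, 95)                # latent 'definitive' mark
        m = true_mark + rng.uniform(-fuzz, fuzz)       # original examiner's mark
        m_star = true_mark + rng.uniform(-fuzz, fuzz)  # senior examiner's re-mark
        if grade(m) != grade(m_star):
            changed += 1
    return changed / cohort

print(fraction_changed(fuzz=1))  # small fuzziness, Maths-like: only a few per cent change
print(fraction_changed(fuzz=6))  # larger fuzziness, History-like: roughly a quarter change
```

With f of around 6 marks and grades 10 marks wide, roughly a quarter of the grades in this toy cohort change on re-marking – the same order of magnitude as the “1 grade in 4” figure above – even though neither examiner has made a single mistake.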

Your challenge

Your challenge is to identify as many alternatives as you can for one or both of these policies such that your solutions:

  1. recognise that the original mark m is not a precise measurement, but rather of the form m ± f, where the fuzziness f is a constant for each subject (not dependent, for example, on the mark m) and which, for the purposes of this challenge, is assumed to be known, and
  2. result in assessments, as shown on candidates’ certificates, that have a high probability (approaching 100%) of being confirmed, not changed, as the result of a fair re-mark m*, thereby ensuring that the first-awarded assessment is reliable.

Genuinely, we want to hear your thoughts either in the comments on this page or in the thread on the Engineering Academics Network page on LinkedIn.

Click here for more details about the forthcoming webcasts in the EPC Recruitment and Admissions Forum Series and to book your place.

New “Engenius Film” added…

Dr Emma Carter of the University of Sheffield was one of the two winners of the EPC's 2014 Engaging in Engineering public engagement grant awards. The latest of her series of films, developed as a resource for schoolchildren, their teachers and advisers, talks about "Designing your own racing car..."

Make sure you subscribe to her RSS feed to be the first to hear about new films as they’re published…

 

She’s also written a guide to making your own films. Check it out here.

For more information and help on support for recruitment activity, please go to our toolkit.

 

 



2014/15 undergraduate applications to engineering up 10.4%

Chart: applications position as at June 2014

The UCAS data released on 11 July 2014 shows that the number of undergraduate applications to UK universities for 2014/15 entry received by the June deadline was up 4.1% compared with the same point in 2013/14. Applications received from now on will be dealt with in the clearing system. But it’s even better news for engineering, with applications up 10.4% compared with 2013/14 – one of the three biggest percentage increases (after Technologies at 12.8% and Computer Science at 12.1%) – and up 17.1% compared with 2012/13. Applications from UK and EU students were particularly strong.

The proportion of engineering applications as a percentage of all applications is on the increase too.  In 2010/11, engineering accounted for 5% of all applications – for 2014/15 it’s 5.7%.

EPC members can click on the chart to view the underlying data – analysed for UK, EU and non-EU applicants. Non-members may read the UCAS briefing at this link.

Protected: New discussion board for Recruitment Working Group

This content is password protected.