Are the OfS proposals on quality and standards good for the sector?

In this blog, Chief Executive Johnny Rich provides a critical commentary on the Office for Students (OfS) proposals on quality and standards – potentially one of the most important changes to its practice since its inception. 


These are his personal opinions, offered to help the community think through the possible implications for themselves; they do not necessarily reflect the views of the EPC or its members.


We’ve posted a summary of the OfS consultation paper here and the OfS’s own documents can be found here. 

We are conducting a survey, so that the EPC can give a representative response. Please contribute your thoughts.

(Rhys A. via Flickr, CC-BY-2.0)

The Department for Education has got hold of the idea that ‘low quality’ courses are rife in UK higher education. It is determined that they must be rooted out. It’s hard to argue with a policy of tackling shortcomings – that makes it good politics, without necessarily being good policy. 

The problem is that there’s not actually sound evidence to support this idea of low quality. What do we even mean by it? 

Until recently the terminology was ‘low value’, but a newfound, pandemic-inspired appreciation for low-paid nurses made it seem too reductive to focus on graduate salaries – and highlighted how problematic it is to define value at all. So the DfE now prefers to talk about ‘quality’, but the definition is no clearer. 

Never mind, that’s not the Government’s problem. The English regulator can worry about that. That’s why we now have a potentially far-reaching consultation from the Office for Students (OfS) on quality and standards and what, in future, the OfS proposes to do to hunt down low quality and quash it with “tough new measures”. 

As the consultation was launched, Universities UK boldly stepped up to defend the honour of courses everywhere by announcing a new charter on quality. Well, not actually a new charter, but a working group to consider one. I fear, however, that the possibility of a charter is like trying to slay this dragon with a plastic butter knife.

“This is full-on chemotherapy to treat a few bunions.”

So what’s so bad about the proposals? Surely, if there is any low quality, we should all want to cut it out? And if there’s not, there’s nothing to fear? Sadly, the cure is not harmless. This is full-on chemotherapy to treat a few bunions. 

It would be complacent to imagine there are not many courses and whole institutions that could be improved, and there are many in HE who say we won’t win friends by being closed-minded to criticism. I agree. Indeed, the EPC regularly engages in activities to support enhancement of teaching, learning and student experience. 

But we don’t do anyone any favours by not being rigorous about critics’ motives and the evidential basis for change. Every international comparison suggests we have one of the best HE systems in the world, and calling for “tougher measures” as if there were a regulation deficit has more to do with being seen to do something than with doing something necessary or even justified. 

This isn’t about fixing an actual problem. It’s about pandering to backward-looking diehards who see universities as too open, full of ‘Mickey Mouse’ courses with too many students. 

No one could call engineering a Mickey Mouse course, though, so perhaps we needn’t worry? Well, even if we didn’t care about colleagues and students in other disciplines, rather than improve HE, the proposals are likely to narrow fair access, stunt social mobility and protect elitism while cementing the goal of higher education as basically a job conveyor belt. That’s not good for anyone.

Let’s just start by mentioning the timing. As EPC members know all too well, in recent months it’s been really tough to deliver high-quality courses, so choosing this moment to launch a major consultation on delivering high-quality courses is, to say the least, insensitive. It is likely to distract from the very delivery the OfS wants to improve, and any current data used to inform the consultation will come from the most outlying wildernesses of every bell curve. 

OfS proposes gifting itself greater powers to assess Higher Education Institutions (HEIs) more closely, including at subject level, and to apply sanctions for performance metrics (on continuation, completion and progression) that it deems unsatisfactory. These sanctions could include fines and even deregistration. A full summary of the OfS proposals is here.

“There are no reliable metrics of success, only proxy measures – and when you let proxies do your decision-making, you get people gaming the data and you get unintended consequences.”

There are many obvious concerns about this. For a start, metrics are a very blunt instrument. There are no reliable metrics of success, only proxy measures – and when you let proxies do your decision-making, you get people gaming the data and you get unintended consequences. (Ironically, this is exactly the argument that the Government has recently been deploying about the National Student Survey, which it has suddenly decided is the cause of supposed dumbing down. Again, there’s no actual evidence of this. Nevertheless, the OfS is currently reviewing that too.)

OfS wants to take a more metric-based approach in order to reduce bureaucratic load, which is fair enough, but if you want data to tell you anything useful, you really need to understand the context. No two HEIs are alike and the same numbers can tell very different stories. 

The consultation does explicitly acknowledge that context needs to be considered, but that context will explicitly exclude anything to do with socioeconomic disadvantage or the other protected characteristics of the students (disability, ethnicity, etc.). OfS intends to impose non-negotiable “numerical baselines” – i.e. cut-offs for universities with outlying data – whatever the reason.

Some unis and courses will end up being ‘low quality’ for a host of reasons to do with their intake rather than anything that they might actually be doing wrong. Quite the opposite: trying hard to do the right thing will open them up to sanctions. 

For example, dropout rates are higher among students with extra financial or social challenges, and bias in recruitment practice disadvantages certain graduates. So if students are from lower socioeconomic or minority ethnic backgrounds, or they are disabled or they are returners to study, their course might look ‘low quality’ while actually the prospects of those students (compared to not having achieved that degree) have been greatly improved.

BTEC students, for instance, have far higher non-continuation rates on engineering courses than students with A level maths and physics. When they do graduate, they face higher hurdles in gaining employment because they may not have the connections, the extra-curricular brownie points and the right accent. Is it really fair for the OfS to hold an HEI that helps these students establish fulfilling lives to the same standards as a university with nearly half of its intake with straight As from private schools?

HEIs could also be penalised for being based in parts of the country with lower employment rates or for drawing students from the locality who might want to stay in their home region post-graduation. Social mobility should not have to mean geographic mobility. To many students a positive outcome means worthwhile employment in their home region rather than maximising their income by moving away. 

This is not only a fair choice for them to make, it’s a really positive choice for the Government’s goal of levelling up regions by creating high-skilled employment in disadvantaged areas. Penalising universities that support this is counterproductive.  

“Were this year’s graduates ‘low quality’? Or is it just that they graduated into the worst labour market for decades?”

Outcomes data will also be subject to the vagaries of economic circumstances. Were this year’s graduates ‘low quality’? Or is it just that they graduated into the worst labour market for decades? These effects can happen locally too, which means they affect individual universities and subjects. For example, if a big local employer exits a region, there may be a knock-on effect for local courses and graduates.

Employment effects take time to show up in the data – a lag of several years if you want to get a reliable picture. By design, the metrics will identify only those stables where horses have long since bolted. By the time problems show up in the data, the HEI will have known about it for a while and may well have either improved or closed a course if it was genuinely deficient. “Tougher measures” won’t support this in any way, but they might close courses that have turned around.

Positive employment outcomes will also need to show themselves quickly. HEIs won’t want to encourage enterprising students to start businesses that may take a few years to mature, earn money for their founders and create wider jobs and prosperity. Because that would be ‘low quality’.

And, of course, the focus on continuation and employment penalises any subject that attracts students who are studying for love of learning rather than for the sake of optimising their employment outcomes.

It also penalises those courses that allow students to do anything other than join a course, stay the duration, and graduate. None of the hop-on-hop-off flexibility that the Government has been urging in other policy initiatives and which the evidence says is needed.

“By definition, some HEIs and subjects will always be less ‘successful’ than others according to the metrics.”

Worst of all, depending on how the ‘tougher measures’ are applied, it is statistically inevitable that a recklessly wielded axe will cut off healthy branches. By definition, some HEIs and subjects will always be less ‘successful’ than others according to the metrics. 

There will always be a bottom of the pile to be removed. Someone will always need to be penalised to justify the quality assurance process. Being in the relegation zone in the Premier League doesn’t mean yours is a bad team; it simply means your team has performed relatively less well amongst the very best. Sadly, the sanction for HEIs and courses will not be relegation, but elimination.  

If the comparison of what constitutes a ‘low quality’ course is made at subject level, rather than across all HE courses, then some departments that have good metrics compared to other subjects will be made to suffer. For example, an engineering course that is ‘low quality’ only in comparison to other engineering courses may be sanctioned. 

If, on the other hand, the comparison is made at HEI level, then certain subject areas will be the victims because their outcomes do not translate easily into highly paid workplaces. Heads I win, tails you lose. 

Ultimately, this is likely to encourage universities to be risk-averse in their admissions, effectively raising the bar for any students that don’t look like those who have been successful in the past, closing down the opportunities until only those least in need of a break can get a look-in.

Even if all these proposals were a good idea – and you may have gathered I don’t think they are – this level of oversight by the regulator might not even be legal. I am sure the OfS has consulted its lawyers carefully, but it’s hard to square this with what was intended by the Higher Education & Research Act (HERA). 

HERA’s progress through Parliament saw more amendments than almost any other bill ever laid before the Lords and, among those that the Government was forced, reluctantly, to accept was the legal protection of the institutional autonomy of universities. The Lords could see that one day the regulator would be asked to overstep the mark and tried to set protections in stone. These proposals would undermine those protections, undermine that autonomy and – ironically – undermine the very standards of high-quality education for all who can benefit from it that they seek to improve.


Do you agree? Do you disagree? Are there other important points to make? How might this affect engineering courses? Please respond to our survey here.

The OfS consultation on quality and standards in a nutshell

The Office for Students has just launched a consultation on one of the most important changes to its practice since its inception. What does it say? We’ve summarised the key takeaways. We have also published a personal perspective on the wisdom of the proposals by the EPC Chief Executive.

In 2017, the Higher Education & Research Act (HERA) dissolved HEFCE, which was a funding body, and replaced it with the OfS which began work the following year as the regulator of higher education in England. In the process it subsumed the remaining activities of HEFCE and OFFA (the Office for Fair Access). 

Since then, some of OfS’s main activities have included establishing a register of approved higher education institutions and signing off on the ‘Access and Participation Plans’ of those institutions that want to be able to claim funding via the Student Loans Company. 

The OfS’s regulation of HE quality and standards has been through signalling and recognisable processes, mostly farmed out under a contract with the QAA. There have been a few interventions from OfS on grade inflation, unconditional offers and TEF, but these haven’t been accompanied by significant new regulatory controls. 

Although OfS does have powers in case of failure (and it has used them by rejecting the registration of a few institutions), its light-touch approach was in keeping with the spirit of HERA, which, during its difficult passage through the Lords, was amended to include an explicit commitment to the autonomy of higher education institutions (HEIs) over their admissions and the education they deliver. 

But now the OfS is consulting on what it calls “tougher minimum standards”, with the threat of fines and even deregistration for HEIs that don’t meet them. These powers, it is proposed, would be exercised not merely at an institutional level but at a subject level too, which, in effect, might allow OfS to exert direct or indirect pressure on an HEI to close a department whose metrics look like underperformance. 

The EPC will be responding to this consultation on behalf of members and we’re keen to hear what you think. We will be inviting members’ views through a survey shortly. (Come back here for the link.) To help you, we’ve provided the following summary of the proposals.

So what are the proposals? There are four areas:

1. “Define ‘quality’ and ‘standards’ more clearly for the purpose of setting the minimum baseline requirements for all providers”

‘Quality’ will be defined in metric terms. This is, it is said, intended to reduce the regulatory burden. The metrics will relate to five areas: access and admissions; course content, structure and delivery; resources and academic support; successful outcomes; secure standards. 

The inclusion of ‘access’ does not mean wider participation targets, but rather admitting students who “have the capability and potential to successfully complete their course”. OfS has been explicit in saying that it “is not acceptable for providers to use the proportion of students from disadvantaged backgrounds they have as an excuse for poor outcomes”. In other words, they are rejecting the idea that non-academic circumstances or lower prior attainment might be mitigating circumstances for lower (according to the metrics) student outcomes. The argument put forward is that using the greater challenges of certain students as an “excuse” would “risk baking disadvantage into the regulatory system”.

The goalposts will be different for new HE institutions, because they can’t be judged on track record.

OfS will also set ‘standards’ for higher education – that is any courses beyond A level or equivalent (so that means drawing higher apprenticeships and other programmes into a unified quality framework). These standards will involve “sector-recognised” definitions of achievement – in other words, OfS intends to establish common standards for degree grades.

2. “Set numerical baselines for student outcomes and assess a provider’s absolute performance in relation to these”

OfS would impose “a numerical baseline”: this is intended to be a cliff edge for outcomes metrics, namely continuation to second year, course completion and progression into graduate-level work or further study. (There’s also a reference to employer satisfaction, but as there are no measures for that, it’s only an aside.) If you fall off the cliff, there’s a range of sanctions (see below) including fines or even deregistration of the institution.

What will matter is absolute – not relative – data. There is a reference to considering the context, but this is more to do with what may have changed rather than a profile of the student body. Unequivocally, the consultation paper states, “We would not set lower regulatory requirements for providers that recruit students from underrepresented groups, or with protected characteristics.” The idea is to spell out “more challenging” minimum standards that students can expect. 

Further consultation will be conducted around the exact metrics.

3. “Clarify the indicators and approach used for risk-based monitoring of quality and standards”

As the metrics used for the baseline concern things that have already happened, the OfS proposes to keep an eye on potential risks in institutions by monitoring other metrics, and to be clear about which metrics those are. Among those mentioned are admissions data (offers, grades achieved, student demographics), student complaints, National Student Survey results, other regulators’ and PSRBs’ activities, TEF, and the outcomes metrics as above. It should be noted, by the way, that the NSS is currently under a separate OfS review and that we’ve been awaiting the publication of an independent Review of TEF for the DfE for nearly two years (a review which is believed to be critical).

There may be some extra data gathering and reporting for universities, but the intention is to minimise the need for unnecessary interference in the long run by identifying risks before they become problematic outcomes. 

4. “Clarify our approach to intervention and our approach to gathering further information about concerns about quality and standards”

This proposal sets out what might be called a precautionary approach to intervention. In other words, the OfS makes it clear they would be willing to step in to investigate or gather evidence in the case of a feared risk of an institution failing to meet quality thresholds. 

It also sets out their available “enforcement” actions: impose conditions on an institution in order for it to continue to be registered; issue a fine; suspend some of the privileges of being registered (such as access to student loan funding for fees or OfS public grants); remove an institution’s degree-awarding powers or its right to use ‘University’ in its title; deregistration.

Please note: This precis is intended as guidance only. The aim has been to summarise the proposals objectively while providing some interpretation of their implications. Necessarily this involves some subjective inference and the omission of details. We advise referring to the OfS’s own consultation documents for the full details. Also, if you feel we have interpreted any proposals wrongly or unfairly, or left out critical details, please let us know and we can make changes to this summary as needed.

The Great Grading Scandal Engineering Challenge

This guest blog has been kindly provided by Dr Dennis Sherwood of Silver Bullet machine, an intelligent innovation consultancy, who was a speaker at the first of this year’s Recruitment & Admission Forum series of webcasts.


Calling all engineers!

Engineers love solving problems, and are very good at it. So this blog poses a real problem, a problem that has eluded solution for at least a decade, and a problem that does much damage every year. You are invited to think of a solution – or indeed more than one – and either post your thoughts in the comments on this page or in the thread on the Engineering Academics Network page on LinkedIn.

The problem – the Great Grading Scandal

Every year, about 6 million GCSE, AS and A level grades are awarded in England. And every year, about 1.5 million of those grades are wrong – about half too high, half too low. That’s, on average, 1 wrong grade in every 4. In this context, “wrong” means “the originally-awarded grade would be changed if the script were to be re-marked by a senior examiner, whose mark, and hence grade, is deemed by Ofqual, the exam regulator, to be ‘definitive’” – or, in more everyday language, ‘right’. 

But when a student is informed “Physics, Grade B”, the student is more likely to think “Oh dear, I didn’t do as well as I had hoped”, rather than “the system got it wrong – the grade should have been an A”. So there are very few appeals: for example in 2019 in England, there were 343,905 appeals resulting in 69,760 grade changes, when in fact, as I have just mentioned, nearly 1.5 million grades were wrong.  Exam grades are therefore highly unreliable, but very few people know. That’s what I call the “Great Grading Scandal”.
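The mismatch between the scale of the error and the scale of the appeals is easy to verify. A quick back-of-the-envelope check in Python, using only the figures quoted above:

```python
# Rough scale of the problem, using the figures quoted in this blog
total_grades = 6_000_000   # GCSE, AS and A level grades awarded per year in England
wrong_grades = 1_500_000   # estimated number wrong at first award
appeals = 343_905          # appeals in England, 2019
changed = 69_760           # grade changes resulting from those appeals

print(f"share of grades wrong at first award:     {wrong_grades / total_grades:.0%}")
print(f"share of appeals that changed a grade:    {changed / appeals:.0%}")
print(f"share of all wrong grades actually fixed: {changed / wrong_grades:.1%}")
```

Roughly a fifth of appeals succeed, yet fewer than one in twenty of the wrong grades ever get corrected, because so few students appeal in the first place.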

The evidence – Ofqual’s research

Ofqual’s November 2018 report, Marking Consistency Metrics – An update, presents the results of a study in which whole cohorts of GCSE, AS and A level scripts, in each of 14 subjects, were marked twice, once by an ordinary examiner and once by a senior examiner.  For each subject, Ofqual could then determine the percentage of the originally-awarded grades for each subject that were confirmed by a senior examiner, so determining a measure of the reliability of that subject’s grades. Since this research involved whole cohorts, the results are unbiased – unlike studies based on appeals, which tend to be associated with scripts marked just below grade boundaries.

If grades were fully reliable, 100% of the scripts in each subject would have their original grades confirmed. In fact, Ofqual’s results ranged from 96% for Maths to 52% for the combined A level in English Language and Literature. Physics grades are about 88% reliable; Economics, about 74%; Geography, 65%; History, 56%. The statement “1 grade in 4 is wrong” is an average, and masks the variability by subject, and also by mark within subject (in all subjects, any script marked at or very close to a grade boundary has a probability of about 50% of being right – or indeed wrong).

The cause – “fuzzy” marks

Why are there so many erroneous grades? The answer is not because of “sloppy marking”, although that does not help. The answer is attributable to a concept familiar to every engineer reading this: measurement uncertainty. Except for the most narrowly defined questions, one examiner might give a script 64, and another 66. Neither examiner has made any mistakes; both marks are legitimate. We all know that.

In general, a script marked m is a sample from a population in the range m ± f, where f is the measure of the subject’s “fuzziness” – a measure that, unsurprisingly, varies by subject, with Maths having a smaller value for f, and History a larger value.
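To see how fuzziness translates into unreliable grades, here is a minimal simulation sketch. The grade boundaries and the uniform ± f error model are both assumptions for illustration only; it estimates the chance that two equally legitimate markings of the same script land in different grades:

```python
import random

def grade(mark, boundaries):
    """Map a mark onto a grade index: grade i is awarded at boundaries[i] and above."""
    for i, b in enumerate(boundaries):
        if mark >= b:
            return i
    return len(boundaries)

def flip_probability(true_mark, fuzz, boundaries, trials=100_000, seed=0):
    """Estimate the chance that two independent fuzzy marks of the same script
    produce different grades, each mark drawn uniformly from true_mark +/- fuzz."""
    rng = random.Random(seed)
    flips = 0
    for _ in range(trials):
        m1 = true_mark + rng.uniform(-fuzz, fuzz)
        m2 = true_mark + rng.uniform(-fuzz, fuzz)
        if grade(m1, boundaries) != grade(m2, boundaries):
            flips += 1
    return flips / trials

boundaries = [70, 60, 50, 40]   # hypothetical boundaries, highest grade first
print(flip_probability(65, 3, boundaries))   # mid-band script: grade is stable
print(flip_probability(60, 3, boundaries))   # script on a boundary: a coin toss
```

Away from a boundary the grade barely moves; on a boundary it is essentially 50/50, which matches the observation above that boundary scripts have about a 50% chance of being right.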

Ofqual’s current policies 

This fundamental fact is not recognised by Ofqual. Their policy for determining grades – a policy that is current and has been in place for years – is to map the mark m given to a script by the original examiner onto a pre-determined grade scale. And their policy for appeals is that if a script is re-marked m*, then the originally awarded grade is changed if m* corresponds to a grade different from that determined by the original mark m.

Ofqual policies therefore assume that the originally-given mark m and the re-mark m* are precise measurements. In fact, they are not. That’s the problem.
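The cost of treating a fuzzy mark as precise can be sketched in the same spirit. Assuming, purely for illustration, uniform ± f noise around the senior examiner’s “definitive” mark, hypothetical grade boundaries, and a uniform spread of definitive marks, the simulated reliability of first-awarded grades falls as the subject’s fuzziness grows:

```python
import random

def grade(mark, boundaries=(70, 60, 50, 40)):
    """Map a mark onto a grade index using hypothetical descending boundaries."""
    for i, b in enumerate(boundaries):
        if mark >= b:
            return i
    return len(boundaries)

def reliability(fuzz, scripts=200_000, seed=1):
    """Fraction of originally-awarded grades that a definitive re-mark would
    confirm, when the original mark is the definitive mark +/- uniform fuzz."""
    rng = random.Random(seed)
    confirmed = 0
    for _ in range(scripts):
        definitive = rng.uniform(20, 90)                 # senior examiner's mark
        original = definitive + rng.uniform(-fuzz, fuzz)  # ordinary examiner's mark
        if grade(original) == grade(definitive):
            confirmed += 1
    return confirmed / scripts

print(f"fuzz=1: {reliability(1):.0%} of grades confirmed")  # Maths-like subject
print(f"fuzz=8: {reliability(8):.0%} of grades confirmed")  # essay-like subject
```

With small fuzz almost every grade survives a definitive re-mark; with essay-subject levels of fuzz a large minority of grades change – the same pattern, though not the exact figures, that Ofqual’s study found.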

Your challenge

Your challenge is to identify as many alternatives as you can for one or both of these policies such that your solutions:

  1. recognise that the original mark m is not a precise measurement, but rather of the form m ± f, where the fuzziness f is a constant for each subject (and not dependent, for example, on the mark m, and which, for the purposes of this challenge, is assumed to be known), and
  2. result in assessments, as shown on candidates’ certificates, that have a high probability (approaching 100%) of being confirmed, not changed, as the result of a fair re-mark m*, thereby ensuring that the first-awarded assessment is reliable.

Genuinely, we want to hear your thoughts either in the comments on this page or in the thread on the Engineering Academics Network page on LinkedIn.

Click here for more details about the forthcoming webcasts in the EPC Recruitment and Admissions Forum Series and to book your place.

Does accreditation help or hinder innovation?

In advance of the EPC’s forthcoming live webcast, one of the panellists, Prof Sean Wellington, considers whether the requirements of accreditation help foster new approaches to engineering higher education.


Academic accreditation of engineering degrees is a well-established feature of UK higher education. It is seen as a valuable ‘kite mark’ for degree providers operating in a marketised higher education system and confers some benefits for graduates who wish to seek professional registration. However, academic accreditation has both costs and benefits. 

Prof Sean Wellington
Professor Sean Wellington FIET PFHEA is Pro Vice-Chancellor and Executive Dean of the Faculty of Science and Technology at Middlesex University. A past Chair of the IET Academic Accreditation Committee, Sean has a particular interest in engineering education and the professional formation of Engineers. He chaired the Engineering Council Working Group that developed AHEP Edition 4 and is a member of the Accreditation Review Working Group.

Some costs are obvious, such as the staff time required to prepare for an accreditation visit and possibly a fee payable to the Professional Engineering Institution (PEI). The degree provider (the university) also has to abide by the ‘rules of the game’. This is where things can get complicated because there are several sets of rules in play.

The Engineering Council handbook for academic accreditation is a permissive document that defines output standards for the various types of accredited degree through learning outcomes, but it does not define how the learning outcomes are taught or assessed. That standard, the Accreditation of Higher Education Programmes (AHEP), also outlines the requirements and process for academic accreditation.

Additionally, there are the documented policies and procedures of the different PEIs licensed by the Engineering Council to accredit degree programmes, and finally the unwritten custom and practice of the PEI and the interpretation and application of the written and unwritten ‘rules’ by a particular accreditation visit panel.

PEIs are encouraged not to define rules beyond the AHEP standard. However, many choose to do so: for example, requiring major group or individual projects, perhaps with a specified credit weighting, specific curriculum content or the use of formal written examinations. The Engineering Council has licensed some 35 PEIs to accredit degree programmes, and many higher education providers work with several PEIs who may have different (and even antagonistic) approaches. These differences are particularly noticeable where units concerned with distinct engineering specialisms have been integrated into larger multidisciplinary engineering schools or departments.

Universities, when required to navigate different PEI requirements, may be forgiven for taking a defensive approach. Visit panels represent another unknown, since the outcome of the engagement is heavily dependent on the individual and collective judgement of the panel members. These panel members, normally unpaid volunteers, do vitally important work. However, relatively few of the PEIs that accredit degree programmes operate at the scale necessary to support a dedicated staff team for academic accreditation, and the training and support for volunteers is somewhat variable. Panel members may also lack familiarity with new approaches to teaching, learning and assessment.

There is a long tradition of scholarship and innovation in engineering higher education, so change is possible. For accreditation to be conferred, a degree provider must convince the PEI that their approach is equivalent to established practice, and PEIs have different ‘red lines’ that limit what can be achieved. This has the potential to inhibit new thinking; equally, professional accreditation can be used as a convenient defence mechanism by those unwilling or reluctant to embrace change.

It should also be possible to use the accreditation process to share innovative practice, particularly where this can help address issues of general concern to the sector. Many PEIs identify and record good practice in their accreditation visit reports; however, such practice is not widely shared or celebrated. A mechanism to share innovative practice might involve AdvanceHE and connect with existing awards such as CATE and NTF.

The Engineering Council has responded to concerns expressed by higher education providers and sector bodies – including the EPC – by initiating a review of accreditation. I believe we need to retain the strengths of the current system but reduce unnecessary and unhelpful differences in approach. There are real and perceived barriers to innovation, however AHEP Edition 4, to be launched in September 2020, is quite clear –

Higher Education providers are encouraged to develop innovative degree programmes in response to industry needs and the Engineering Council does not favour any particular approach to teaching, learning or assessment. The key consideration is that all graduates from an accredited degree programme must meet all of the prescribed learning outcomes. Assessment should be designed to minimise opportunities for students to commit academic misconduct, including plagiarism, self-plagiarism and contract cheating.

We must not lose our willingness to innovate. For example, our recent experience of remote teaching and assessment, forced on us by the COVID-19 crisis, can shape long-term changes to our teaching, learning and assessment practice that will benefit students. To this end, we should work with the Engineering Council and the PEIs to support the current accreditation review and ensure unnecessary barriers to innovation are removed.


The live webcast ‘Accreditation & Innovation’ will be held at 2pm on 14th July 2020. Registration is free to EPC members, but booking is essential. This webcast is part of the New Approaches to Engineering Higher Education series, held in partnership with the IET. Recordings from the webcast series are available on the recent events page.

New approaches to Engineering Higher Education

BEST PRACTICE IN ENGINEERING EDUCATION

Featured Articles

The EPC has been working with the Institution of Engineering and Technology (IET), academics and industry for over two years to encourage and support changes to how students are taught to become engineers. 

During this time, we have seen innovative courses that encourage project work, include industry whenever they can and work on increasing diversity.

Our recent conference (November 2019) showcased a wide range of case studies from universities, and the people within them, who have led the changes to create innovative and forward-thinking degree courses.

We have, with the IET, pulled together these case studies into conference proceedings that focus not only on the changes that have been made but also on how they were achieved. Download the proceedings here.

In addition to the case studies, the day also included poster presentations from universities in the process of making some equally innovative changes to their engineering courses. The papers for these can be seen here:

Canterbury Christ Church University
University of Hertfordshire
Imperial College London
University of Sheffield
NMiTE
TEDI – London
University of Strathclyde

More information on our work to date is available here.

Guest blog: ‘If you were an engineer, what would you do?’

By Dr Susan Scurlock MBE – CEO of Primary Engineer

If you are one of the 125,000+ passengers per day heading through Gatwick South this summer, you may just spot your university’s Leaders Award prototype on the huge hoarding showcase.

Thanks to 49,000 school children aged between 3 and 19, 33 regional funders, three new national funders – Facebook, Network Rail and Gatwick Airport – and 19 university supporters (not forgetting the EPC’s support!) Primary Engineer is delighted to announce its ‘Wall of Fame 19’.

Gatwick Airport has today (August 13th) launched a three-week exhibition of winners of the Primary Engineer Leaders Award ‘If you were an engineer, what would you do?’. The intention is to profile the university builds from this and previous years and invite a popular vote from the 2.6 million+ passengers walking through the terminal during the exhibition at the busiest time of year.

‘Wall of Fame 19’ showcases 11 inspirational prototypes of inventions designed by pupils from across the country and built by engineering students and technicians from universities in every UK region. Three working prototypes will be displayed – the Bicycle Sucker (built by Kingston University), the SMA Jacket (built by UCLan) and the Flat Pack Wind Turbine (built by Glasgow Caledonian University).

The Primary Engineer Leaders Award – “If you were an engineer, what would you do?” – links both primary and secondary schools with engineering professionals from across the sectors.  The competition promotes engineering to young people, with a 50/50 gender split for entries, and allows them to find the ‘engineer within’ by designing solutions to problems they have identified.

Primary Engineer is a not for profit educational organisation. Its approach brings engineering and engineers into primary and secondary classrooms and curricula; inspiring children, pupils and teachers through continued professional development, whole class projects, and the competition.

Dr. Susan Scurlock, MBE, founder of Primary Engineer said: “This exhibition at one of the most important travel hubs in the UK is testament to the commitment of commercial organisations, schools and universities who are all doing their bit to help pupils tap into their inner engineer. Each year I am astounded by the designs by pupils, some as young as 3, as they identify problems to solve which are important to them and in turn inspire engineers to build their solutions. We started by asking engineers to inspire children and have found that children inspire engineers. Perfect!” 

You don’t need to be passing through Gatwick to vote. The voting page is available at www.leadersaward.com/walloffame19/ and will feature each drawing and a photograph of each invention from this year and, in a separate section, an opportunity to vote for previous years’ builds – we are looking to identify two winning builds. Please do vote and tweet “I have voted for my favourite design #walloffame19 @leadersaward!”.

New! DATA BLOG: Grade inflation?

Earlier this month, the OfS published a new release of degree classification data, concluding that the growing proportion of first and upper second class degrees awarded cannot be fully explained by factors linked with degree attainment. Specifically, the new analysis finds that in 2017-18, 13.9 percentage points’ worth of first-class degree attainment is unexplained by changes in the graduate population since 2010-11, an increase of 2.4 percentage points from the unexplained attainment in 2016-17. So there we have it – grade inflation.

So, we’ve fished some unfiltered HESA data out of our archives, updated it, and looked at the distributions between first, second and third-class honours in engineering. And it seems that engineering paints a very different (worse?) picture than the sector as a whole. We award a notably higher proportion of firsts and, at a glance, a commensurately lower proportion of second-class honours. The proportion of third-class honours/passes awarded has come into line with that for all subjects over recent years. It varies by engineering discipline, but nowhere is the proportion of firsts lower than for all subjects.

You might think, then, that high-level degree awards in engineering (firsts plus upper seconds) were nothing to write home about. But in 2016/17, at 77.3%, the proportion of high-level degree awards in engineering was one percentage point higher than for all subjects (and the difference has fluctuated around the one percentage point mark for the past ten years).

A simplified index plot, where 1 (the central y axis) represents all subjects, shows that the propensity to award a first in engineering is consistently greater than for all subjects (the longer the bar, the greater the over-representation). The over-representation of firsts in engineering has shown a notable reduction over the past ten years and, at 1.4, was at its lowest yet in 2017/18. The over-representation of third-class honours in engineering visible from 2007/08 to 2015/16 has now been eliminated. You can see from this analysis that the over-representation of firsts is in fact greater than the combined under-representation of 2:1s and 2:2s.
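For readers who want to reproduce the idea, here is a minimal sketch of the index behind the plot. The shares used are hypothetical placeholders chosen to illustrate the 1.4 value quoted above; they are not the underlying HESA figures.

```python
# Representation index as used in the plot: the share of a given degree
# classification in engineering divided by the share across all subjects.
# A value of 1 means engineering matches the sector; >1 means over-representation.
def representation_index(share_engineering: float, share_all_subjects: float) -> float:
    return share_engineering / share_all_subjects

# Hypothetical example: if 28% of engineering degrees were firsts against
# 20% sector-wide, the index would come out at 1.4.
print(round(representation_index(0.28, 0.20), 2))
```

The same function applies to any classification (firsts, 2:1s, thirds), which is how a single index plot can show over- and under-representation side by side.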

So, what does this tell us? That the rise in higher degree classifications doesn’t apply to engineering? The number of high-level degrees in engineering has increased from 10,180 in 2007/08 to 18,690 in 2017/18, an increase of 83.6%. Proportionally, this has risen from 62.7% of all degree awards in engineering to 77.3%. That’s just marginally less proportional growth than the 14.9 percentage point rise for all subjects. But we are making progress.
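The arithmetic behind those figures can be checked directly; the counts and shares below are the ones quoted in the text.

```python
# High-level (first + 2:1) engineering awards, per the HESA-derived figures above
awards_2007_08 = 10_180
awards_2017_18 = 18_690

growth = (awards_2017_18 - awards_2007_08) / awards_2007_08
print(f"Growth in awards: {growth:.1%}")  # → Growth in awards: 83.6%

# Shares of all engineering degree awards, as quoted in the text
share_2007_08, share_2017_18 = 62.7, 77.3
print(f"Rise: {share_2017_18 - share_2007_08:.1f} percentage points")  # → Rise: 14.6 percentage points
```

The 14.6 percentage point rise for engineering sits just below the 14.9 point rise for all subjects, which is the comparison the paragraph above is making.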

Here’s the rub: who’s to say that rises in high-level degree classifications (which, sector-wide, cannot be explained by the data readily available – not my data) are necessarily a problem per se, or that they signal grade inflation? There are many reasons – not accounted for in the OfS statistical models – for degree outcome uplift, not least the massive expansion of student numbers in the last 20 years (leading to a less socially constrained pool of students); greater awareness of student support needs; the increased cost of higher education to students; more incentivised and focused students; and improved teaching in both schools and universities. Further, there is evidence that market forces; course enrolments; progression rules (e.g. progression from BEng to MEng requires achievement of marks for the first two or three years of study suggesting a minimum 2:1 standard, and therefore likely transfer of the best students away from the BEng); and the marking processes adopted by different subject areas impact the proportion of upper degrees between subjects.

The evidence of improvement in teaching (and the development of pedagogy in UK universities) is much stronger than the evidence for grade inflation. As a discipline, this is what we must celebrate. Higher education (HE) is the gold standard in the delivery of engineering skills in the UK and has a strong international standing and reputation.

Let’s face it, the assumption that institutions need to account for grade inflation rather than educational improvement is perverse. Instead, let’s talk about and encourage innovation in teaching, learning and assessment, precisely what our New Approaches to Engineering Higher Education initiative (in partnership with the IET) aims to do. Earlier this year we launched six case study examples for each of the six new approaches, evidencing that the required changes can be achieved – are already being achieved – and we now want other institutions who have been inspired to come up with new approaches of their own to showcase their work at a New Approaches conference at the IET in November. More details will be circulated shortly.

Attribution: EPC analysis of HESA Student Qualifiers Full Person Equivalent (FPE) using Heidi Plus Online Analytics service.

Blending arts and sciences: gimmick or necessity?

The two cultures of arts and sciences are like oil and water but, asks Prof Mehmet Karamanoglu, could they be mixed? Indeed, perhaps it’s essential that we get them to learn from each other?


The higher education sector has been battling with the issue of introducing ‘creativity’ into engineering education for decades, as if creativity did not already exist in engineering programmes.

Many institutions in the UK have tried to address this by creating collaborative programmes between departments of Engineering and Art & Design. The academic programme often sits in an Engineering department with modules from the Art & Design department, but less so the reverse. 

Over the past 30 years, I have seen such projects come and go and the end result has been the same – not a positive experience for students or staff involved. It goes without saying that there are also issues in the use of the terminology – we often talk about ‘Arts and Sciences‘, but what we really mean is ‘Design and Engineering‘. 

In an attempt to explain why such collaborations have not been successful, we often put this down to the fact that the two areas have their own cultures. This gives rise to the term you now see used by the media and politicians, the ‘Two Cultures’, although the term has been used in academic debate for decades, ever since C. P. Snow’s 1959 lecture of that name.

To look at this more closely, first we need to understand the obstacles that get in the way. Let’s call these two cultures, Camp A and Camp S.

Some key characteristics:

  • Camp A has a monopoly on the word ‘creative’ and no other camp can use it.
  • Camp S does not associate itself with the word ‘creative’ even though it practices it daily to solve problems. 
  • Camp A hates structures and rules, an inherent part of its often rebellious makeup.
  • Camp S cannot operate without structures and rules – operates systematically and hates change.
  • Camp A is territorial even within itself. Not really happy to share resources. Each of its constituents operates in an autonomous mode.
  • Camp S is territorial externally but unified within itself. 
  • Camp A are divergent thinkers, hate constraints, often not interested in the end result but the journey it takes and the experience of that journey. The destination is often irrelevant.
  • Camp S applies constraints too soon and arrives at a destination but may miss vital opportunities along the way. It operates too rigidly.
  • Camp A practices team teaching, often with contradictory views among its members.
  • Camp S operates in solo mode – one class, one master.
  • Camp A showcases its work and teaches through teams of staff. Each team owns its programme and has its own work space.
  • Camp S keeps its work to itself and does not show off.

Barriers to making the two camps work together:

  • Financial barriers – budgets that are devolved to individual camps are a key obstacle and will lead to effort being spent on counting pennies rather than producing useful work.
  • Each camp having its own physical facilities – this ends in duplication of resources, with neither as good as it ought to be.
  • Lack of trust, value and respect in each other’s way of working.
  • Each camp retaining their work environments and students visiting each camp for their studies.
  • If this is an academic programme, as the approaches are so different, this will cause serious confusion for students; they will end up as academic schizophrenics.

My personal experience to crack this issue:

  • Do not force the two camps to come together artificially. It is akin to making an academic emulsion but with far worse side effects. So many try to create joint ventures or programmes, but blending the two cultures from two separate entities does not work as they always preserve their inherent make-up. Short term success is possible, but it is not sustainable. It relies heavily on individual personalities which often clash and so the success does not last. 
  • The only successful way that has stood the test of time is to grow a single but a mixed-culture camp from scratch. In the camp you will need staff with Camp A and Camp S characteristics, but the critical point is that they belong to the same camp.
  • There are no financial barriers – it is a single camp with a single budget. In fact, take the staff cost out of the camp’s budget to the next layer up and what is left is not worth arguing about.
  • There are no mine-and-yours physical resource issues. It is all ours.
  • Most critically, Camp A and Camp S type staff will depend on each other to survive, learn to get on together and accept that there are different ways to do things for both. In other words, accept, value and respect each other.
  • The mixed-camp needs to be given time to evolve and this will take a while. The more urgent the survival becomes, the sooner the integration will happen. Once established, the new camp develops its own culture.

Having been through such an experience myself in 1996 at Middlesex University, it took four years to realise that operating as two separate camps would not work, so I started from scratch. Now, nearly two decades down the road from setting up the Design Engineering Department, there is no looking back, but I’ll probably always remain a recovering engineer.

To return to my opening point, it is not that we wanted to introduce ‘creativity’ into our engineering programmes; rather, it was actually about changing our practice and our way of doing things in order to acknowledge the evolving nature of the discipline, which has become practice-based. It was this that led to the creation of what I call the three pillars of practice-based learning in this new camp:

  • A curriculum model that recognises the appropriate teaching, learning and assessment approaches needed;
  • A physical environment that supports the pedagogy adopted;
  • Staff resources that can embrace the pedagogy adopted and operate within the environment created.

Prof Mehmet Karamanoglu is Professor of Design Engineering and Head of the Department of Design Engineering and Mathematics at Middlesex University, London.

Augar arrives

EPC Chief Executive, Johnny Rich reports on the long-awaited Review of Post-18 Education Funding in England and the possible implications for engineering in HE.

At over 200 pages and featuring 50 recommendations, the Augar Review will take some time to chew, let alone digest and (to follow the nutritional metaphor perhaps a couple of steps too far) turn into a burst of energy or perhaps a pile of waste. However, at the time of writing, the report has now been out for one day, so here’s my quick take on some of the most important points for EPC members.

The fee cut: As has been widely reported and trailed before publication, the Review recommends a cut in the headline tuition fee from £9,250 to £7,500. Obviously, for most engineering departments, that’s way below the per student cost of delivery.

However, the Review also recommends that the total investment in the HE sector remains the same – topped up by teaching grants – albeit frozen for the next few years. It argues that this will be manageable because there is a demographic uplift in the number of 18-year-olds coming until 2025. The increased economies of scale should mitigate the freeze. The comfort is a little cold though. There are potential drops in international and EU students following the reputational fallout from Brexit (even if Brexit itself never happens) and, as the Review points out, too many universities are basing their finances on projections of growth of which at least some must, arithmetically, prove to be over-optimistic.

The Review does not envisage that top-up grants are evenly spread. Courses with good employment outcomes – measured, for the most part, in terms of salaries – would receive far bigger top-ups than those that result in less easily measured value. This appears to be good news for Engineering, which is specifically cited as a discipline where there are skills shortages and costs are recognised as high, and bad news for Creative Arts subjects which get a lot of stick for producing a lot of graduates without clear earnings premiums.

But it’s not as simple as that. Unless the top-up for Engineering is high enough to reflect the additional cost of teaching, we may have a situation where cheaper courses can still yield a margin on the basis of lower fees, but expensive ones not only cannot contribute to institutional overheads, but they can’t even pay for themselves. The commercial pressure will be to axe the expensive courses and do exactly the opposite of what the Review hopes to achieve.

Levels 4 and 5: Large parts of the Review report are devoted to a raft of measures to better support Further Education, including capital investment, access to loan-style tuition funding for level 4 and 5 qualifications on a par with the basic annual ticket price for degrees (£7,500), and a lifelong learning account (equivalent to the cost of four years of university study) allowing students to build up qualifications throughout their lives in modular chunks.

The Review does more to break down distinctions between HE and FE institutions rather than build them up, so, for universities that already offer qualifications at different levels, or those that decide to, there are opportunities here to build a diverse and financially sustainable offering.

Interim qualifications: Part of the drift away from seeing a level 6 (degree-level) qualification as the gold standard of post-18 education is the recommendation that university degrees should all include an interim qualification after the first or second year. The idea is to combat drop-outs – or at least to combat the stigma attached to dropping out without anything to show for it but debt.

It’s hard to think of significant objections to this recommendation, so universities need to start thinking about how it will work. For Engineering courses, it raises a number of particularly thorny issues. Would an interim qualification be accredited? How would this work in an integrated masters course?

Disadvantaged students: As well as topping up fees for expensive courses, the Review proposes a significant shift of top-up funds towards institutions that admit more students from disadvantaged backgrounds.

The reason for this is presented not merely as social engineering, but in recognition of the fact that, statistically speaking, for a host of reasons, it costs more to teach these students than their more affluent peers.

How you define ‘disadvantage’ is discussed and, while not completely shredding the POLAR metrics, the Review clearly thinks alternatives may be better. There is no recognition of the fact that underrepresentation in HE takes different forms in different disciplines.

Engineering has particular challenges attracting women, BAME students and those from lower socio-economic groups. It has less of a problem attracting state-educated males than most subjects. Whatever intersectional measures of disadvantage are used may have unintended repercussions for Engineering. As with the threat of reduced fees, this well-intentioned recommendation may create reasons to axe Engineering courses and departments to massage the numbers of a university as a whole.

Foundation courses: In a move to support students from under-represented groups, some Engineering departments have introduced Foundation years as preparation for a full degree. The Review recommends that these be dropped altogether in favour of Access to HE diplomas, which currently are funded at a lower level. In other words, they want to stop universities from using Foundation years to ‘game’ an extra year of higher funding.

In a report where the arguments are usually clear and well evidenced (even if they don’t always reach the right solution), this recommendation seems unfounded and – I put my hands up – I just don’t understand what it achieves, given that I would have thought Access to HE courses would, under the Review’s other proposals, now attract the same funding as Foundation years. Meanwhile, it shuts down an access route to Engineering that some universities have found is a useful way of ensuring degree success for some students – such as those with BTECs or lower attainment in, say, maths or physics.

Entry requirements: Before the publication of the Review, there was a lot of kite-flying (not least from Education Secretary Damian Hinds) about the possibility of a de facto cap on student numbers by saying that only those with the equivalent of three Ds or above at A level would qualify for financial support.

There are very few students studying Engineering with entry grades that low. Those that are have usually gained their place on the basis of some particular exception. This exemplifies the problem with this policy: the few students it would have blocked are just the ones where investment in their education might have yielded the biggest difference to their prospects.

That’s presumably why the Review has not come out fully in favour of the idea. Yesterday, the Universities Minister Chris Skidmore tweeted his delight that it had “never featured” in the report. Given the section titled “A minimum entry threshold” on p99, the whole of the next page and a half devoted to discussing how such a threshold might be contextualised and then recommendation (3.7) on the next page, I’d say “never featured” is a bit of an overstatement.

Still, for now, that idea has gone away. Instead, universities are fairly firmly warned to put their recruitment business in order or else. Low offers must only be used judiciously and if ‘conditional unconditional’ offers aren’t curbed, then the Review has spelt out that the Government should step in. (Whether, under the Higher Education & Research Act, it has the power to do so without legislation is doubtful though.)


That’s just a few takeaways. No doubt I will kick myself for forgetting to mention dozens of others, but I will update EPC members further as the debate progresses.

One thing to add though is a comment on the status of these recommendations. The Augar Review is a high-profile independent report to the DfE as part of a government review. It is not a White Paper (i.e. a plan for legislation). It is not even a Green Paper (a consultation document). It is just a series of considered ideas based on trying to come up with good solutions rather than politically motivated ones.

There is every possibility the Review could be ignored, not least because Theresa May – principal sponsor of the exercise – is about to become a rather embarrassing footnote in political history. She put Damian Hinds in post and, although he’s one of the few Tory MPs who seems not to have designs on becoming prime minister, there’s no guarantee he will hang around in his job long enough to put the recommendations into action.

Putting them into action is easier said than done. Some of the recommendations would require legislation and whenever bills relating to student finance come to the Houses of Parliament their path tends to be rockier than a quarry dump-pile. Moreover, bear in mind party politics is so chaotic at the moment that the only vote anyone has dared put before the Commons for the past few weeks was on the anodyne issue of wild animals in circuses (although that is an apt metaphor).

All of this is why yesterday’s launch of the Review was introduced by Mrs May herself. She wanted to send a clear message to her successor that they should see this through. It’s her last-ditch attempt at scribbling something, anything, on her CV under the heading of ‘achievements in role’.

The leadership contenders may or may not adopt these ideas. The chances of them engaging with them in detail are slim, but there are two main reasons they will want to do something, even if it’s not this.

Firstly, doing nothing is almost not an option because the Office for National Statistics ruled in December last year that the current accounting mechanism for student loans must change to reflect more accurately what they actually cost the public purse. This means we are entering the political bartering of a Comprehensive Spending Review with higher education costing tens of billions more than planned in terms of the public deficit. It’s all an accounting con, but it matters in terms of perceptions and economic confidence.

Secondly, Labour’s pitch at the 2017 election to axe fees altogether was seen as a major cause of the supposed ‘youthquake’ of support that wiped out May’s majority. Politically, it would be hard for any new Conservative leader to go into the next election – which could happen by accident at almost any time – without any response whatsoever to Labour’s offer.

That said, despite a lot of good reasoning and a host of suggestions at least some of which are very sensible, it’s hard to see how anything in the Augar Review is the vote-winning miracle that polls suggest the Conservatives need right now. After all, if £9,250 a year was off-putting, £7,500 with a more regressive repayment mechanism isn’t exactly anyone’s idea of a bargain.

Teaching students to learn for themselves

Dr Sunny Bains, author of a new book on emerging technologies, examines how to support students to make use of the technical literature and to look beyond it.

The best engineers can be thrown in at the deep end of a new problem and research their way out. That’s part of the ethos of combining conventional academic courses with more practical, project-based learning. 

This approach forces students to discover constraints and compromises for themselves, optimizing their solutions as well and as creatively as they can, rather than solving well-constructed questions with tractable answers. Often, they do this work as part of a group. 

Deep-end problem-based learning ticks a lot of boxes: teamwork, creativity, critical thinking, application of technical skills, and so on.

Unfortunately, what we choose to teach students formally before we launch them into these projects is often insufficient. 

Yes, they’re trained in the deep technical skills that we think they’ll need, and (if they’re lucky) even some of the transferable ones. But what we don’t normally teach them is how to systematically and thoroughly research a topic.

More specifically, we don’t teach them where to look for answers to questions. Partly, this is because we are academics: to us the answer is usually a technical paper, possibly a book, and we’re so used to looking for these that we don’t think twice about it.

But to use technical literature first you need to be able to search for and find what you need effectively. Even if you do find the papers you think you’re looking for, you may not yet have the expertise to read them. This is especially, but not exclusively, true for undergraduates. Further, once you’re in industry, journals and proceedings aren’t going to alert you to what your competition (possibly start-ups in stealth mode) are up to. 

If I had to prioritize, my top three suggestions for helping students to research a new subject would be as follows: keywords, the technical press, and patents. Although you might think that the current generation (which grew up with the iPhone, never mind the internet) would be more expert at finding material on the web than we were, that’s far from true. Just a few minutes teaching them some basics can go a long way.

Keywords are key

First, we all know that keywords are critical to all kinds of searches, including searches of the technical literature, but what students don’t realize is how creative you have to be in using them. Very similar ideas often have different names in different fields, and searching for the wrong terms can miss much of the most important information.

Students need to know to gather lots of different keywords from the various sources, and then to search for them in different combinations to find the information they need.

Journals and magazines

Next, students should know that not all useful information has to be of the highly technical variety. A good way of getting into a new field is to find news coverage that’s readable but still contains specialist information. This might be in publications aimed at an industry (like Water and Wastewater Treatment), a society (like E&T Magazine), or even a popular-science title like Wired.

A good place to start for articles like this is Engineering Inspiration, a website we set up at UCL (and free for all) that brings together interesting technical articles from across the web (we have 50K+ articles online to date). Reading enough of this kind of material can do wonders to set the context for a project: with the constraints and values of the industry coming through in every story.

Patently clear

Finally, patents (which are now freely available to search on the web) are a great source of information because they cover a lot of technology that is too commercially sensitive to be published in other forums. 

It’s true that they’re completely unreadable, but by following the breadcrumbs of who has filed what patent it’s possible to figure out who is doing roughly what. With a little imagination, engineers can pull together clues based on what the inventors did before the patent and who they’re working with now, and so make an educated guess about what is in the pipeline.

Of course, there are many more sources to look at: conference programmes can be even more informative than proceedings; books (remember books?) can be hugely helpful if used well; and people can provide insights and feedback that no written source ever could…

The main thing is not to assume that students will somehow learn their research skills by osmosis. We forget how much we take for granted after a lifetime of information-gathering: by giving our students just a little bit of formal instruction on how to do this critical task, we can make them hugely more productive.

Dr Sunny Bains (see sunnybains.com) is the author of Explaining the Future: How to Research, Analyze, and Report on Emerging Technologies. She teaches engineering and physical sciences students at University College London.