New! DATA BLOG: Grade inflation?

Earlier this month, the OfS published a new release of degree classification data, concluding that the growing proportion of first and upper second-class degrees awarded cannot be fully explained by factors linked with degree attainment. Specifically, the new analysis finds that in 2017-18, 13.9 percentage points’ worth of first-class degree attainment is unexplained by changes in the graduate population since 2010-11, an increase of 2.4 percentage points from the unexplained attainment in 2016-17. So we have it – grade inflation.

So, we’ve fished some unfiltered HESA data out of our archives, updated it, and looked at the distribution of first, second and third-class honours in engineering. And it seems that engineering paints a very different (worse?) picture than the sector as a whole. We award a notably higher proportion of firsts and, at a glance, a commensurately lower proportion of second-class honours. The proportion of third-class honours/pass awards has come into line with all subjects over recent years. It varies by engineering discipline, but nowhere is the proportion of firsts lower than for all subjects.

You might think, then, that high-level degree awards in engineering (firsts plus upper seconds) were nothing to write home about. But in 2016/17, at 77.3%, the proportion of high-level degree awards in engineering was one percentage point higher than for all subjects (and the difference has fluctuated around the one percentage point mark for the past ten years).

A simplified index plot, where 1 (the central y axis) represents all subjects, shows that the propensity to award a first in engineering is consistently greater than for all subjects (where the longer the bar, the greater the over-representation). The over-representation of firsts in engineering has shown a notable reduction over the past ten years and, at 1.4, was at its lowest yet in 2017/18. The over-representation of third-class honours in engineering visible from 2007/08 to 2015/16 has now been eliminated. You can see from this analysis that the over-representation of firsts is in fact greater than the combined under-representation of 2:1s and 2:2s.
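
For the curious, the index itself is easy to reproduce: it is simply the share of a classification within engineering divided by the share within all subjects, so 1 means parity. A minimal sketch (the input shares below are hypothetical, chosen only to land on the 1.4 figure quoted for firsts in 2017/18):

```python
# Hypothetical shares, chosen only to reproduce the 1.4 figure quoted
# above for firsts in engineering in 2017/18 - not the real HESA data.
def representation_index(share_engineering: float, share_all_subjects: float) -> float:
    """Propensity index: 1.0 means parity with all subjects, and the
    further above 1.0, the greater the over-representation."""
    return share_engineering / share_all_subjects

print(round(representation_index(0.42, 0.30), 2))  # 1.4
```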

So, what does this tell us? That the rise in higher degree classifications doesn’t apply to engineering? The number of high-level degrees in engineering has increased from 10,180 in 2007/08 to 18,690 in 2017/18, an increase of 83.6%. Proportionally, this has risen from 62.7% of all degree awards in engineering to 77.3% – a rise of 14.6 percentage points, just marginally less than the 14.9 percentage point rise for all subjects. But we are making progress.
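
The arithmetic can be checked in a few lines from the published figures:

```python
# Reproducing the arithmetic above from the published award counts.
awards_2007, awards_2017 = 10_180, 18_690    # high-level engineering awards
growth_pct = (awards_2017 - awards_2007) / awards_2007 * 100
print(f"{growth_pct:.1f}%")                  # 83.6%

share_2007, share_2017 = 62.7, 77.3          # % of all engineering awards
print(f"{share_2017 - share_2007:.1f}pp")    # 14.6pp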

Here’s the rub: who’s to say that rises in high-level degree classifications (which, sector-wide, cannot be explained by the data readily available – not my data) are necessarily a problem per se, or that they signal grade inflation? There are many reasons – not accounted for in the OfS statistical models – for degree outcome uplift, not least the massive expansion of student numbers in the last 20 years (leading to a less socially constrained pool of students); greater awareness of student support needs; the increased cost of higher education to students; more incentivised and focused students; and improved teaching in both schools and universities. Further, there is evidence that market forces; course enrolments; progression rules (e.g. progression from BEng to MEng requires achievement of marks for the first two or three years of study suggesting a minimum 2:1 standard, and therefore likely transfer of the best students away from the BEng); and the marking processes adopted by different subject areas all impact the proportion of upper degrees between subjects.

The evidence of improvement in teaching (and the development of pedagogy in UK universities) is much stronger than the evidence for grade inflation. As a discipline, this is what we must celebrate. Higher education (HE) is the gold standard in the delivery of engineering skills in the UK and has a strong international standing and reputation.

Let’s face it, the assumption that institutions need to account for grade inflation rather than educational improvement is perverse. Instead, let’s talk about and encourage innovation in teaching, learning and assessment, precisely what our New Approaches to Engineering Higher Education initiative (in partnership with the IET) aims to do. Earlier this year we launched six case study examples for each of the six new approaches, evidencing that the required changes can be achieved – are already being achieved – and we now want other institutions who have been inspired to come up with new approaches of their own to showcase their work at a New Approaches conference at the IET in November. More details will be circulated shortly.

Attribution: EPC analysis of HESA Student Qualifiers Full Person Equivalent (FPE) using Heidi Plus Online Analytics service.

Bid to host EPC Congress in 2020 or 2021

DEADLINE FOR SUBMISSIONS: 19th June 2019

Proposals are invited from higher education Engineering departments to host the Engineering Professors’ Council Annual Congress in 2020 or 2021.

‘Hosting the 2018 Engineering Professors’ Council Congress was a great way to showcase the University’s work to a wide range of experts in the field as well as to the professional bodies in engineering. Our staff and students gained a lot from explaining their approach to engineering education and research, and we were also able to explore new collaborations to broaden the reach of our engineering activities. We were delighted to welcome the EPC to Harper Adams and hope that other universities taking the opportunity to act as the venue for the Congress will gain as much from the experience as we have.’
David Llewellyn, Vice-Chancellor, Harper Adams University (hosts of the 2018 Annual Congress) 

The Annual Congress is the flagship event in the EPC calendar, an opportunity for engineering academics from across the UK to come together to explore policy and practice and to network.

Download guidelines.

Download the form for submitting a proposal.

The Congress usually takes place in April or May and lasts two days with a reception on the evening before the Congress formally starts.

Each year, Congress is hosted by a different institution:

  • 2016: The University of Hull hosted Congress as a prestigious addition to its preparations as European City of Culture. 
  • 2017: Coventry University hosted, taking the opportunity to demonstrate the city’s close association with transport engineering and manufacturing. 
  • 2018: Harper Adams University displayed its cutting-edge status as a leading centre of agricultural engineering including automated farming and a range of off-road vehicles. 
  • 2019: UCL is hosting this year’s Congress, where its proximity to the seat of Government has allowed an amazing line-up of high-profile speakers on a range of policy issues at a time of historic challenges. 

The host institution nominates a Congress Convenor who will become a member of the EPC Board for up to three years (2019-21 for the 2020 Convenor; 2020-22 for the 2021 Convenor) and who, with guidance from the EPC executive team, will lead the organisation of the Congress, including determining the themes and scope for the Congress, and the speakers and events. 

We are inviting bids to act as host for either of the next two years. You can specify one year or the other or apply without choosing a year. We will not select the same host for both years.

Download guidelines.

Download the form for submitting a proposal.


To submit a proposal, complete the form here and email it to Johnny Rich, Chief Executive, at j.rich@epc.ac.uk by 19th June 2019. Johnny can also be contacted at the same address or by phone on 078-1111 4292 to discuss any aspect of Congress or the proposal process. 


What is expected from the host

The host institution (host) would be expected to provide:

  • an academic of suitable standing to act as Convenor and other staff resource as necessary to assist planning the Congress;
  • suitable function rooms such as a lecture theatre and smaller break-out rooms, as well as space for networking;
  • catering for the Congress;
  • possibly accommodation, particularly for early career staff delegates to the Congress, who may be provided free accommodation in student residences;
  • management of the Congress during the event;
  • financial accountability in accordance with the financial arrangements (see below).

There will be some support from the EPC executive, but it is advisable to ensure that the host can provide conference support staff as the smooth running of the Congress will primarily be the Convenor’s responsibility.

The Congress usually attracts up to 100 delegates, but the numbers have grown in recent years and the host should be able to provide for 150.


Selection process

The process for selection as host involves submission of your proposal to the EPC Board, which will conduct a vote. The basis for its decision is entirely at its discretion, but it will take into account issues such as the nominated Convenor, the suitability of the facilities, the arrangements for costs, the geographical suitability (although the EPC is keen not always to be restricted to big centres of population), the suggested activities such as the Congress Dinner venue and other attractions, and other arrangements to ensure the smooth running of the Congress.

The host institution must be a member of the EPC. We would particularly welcome joint proposals from separate institutions to host jointly, such as two engineering departments at separate universities in the same city.


Financial arrangements

The suggestion for the financial arrangement between the EPC and the host forms part of the proposal. The EPC will seek to minimise its risk and, if possible, would like to generate a surplus from the event to contribute to its own in-house costs in running the Congress. However, the quality of the event and its appeal to members will be of greater weight in selecting the host institution.

That said, it may be helpful to provide as guidance the following arrangement that has been used in the past. The EPC would hope that the host would aim to meet at least this arrangement:

Costs may be divided into three categories as follows:

  • ‘External costs’: i.e. costs that will genuinely have to be met, such as catering, external venue hire, student ambassadors, etc. The EPC would guarantee all these external costs and, if necessary, would pay them up-front. In any case, the EPC would be liable for these costs.
  • ‘Internal costs’: such as staff who are already employed by the host. The host would guarantee these costs and, in the event that registration income was insufficient to meet them, the host would be liable for them.
  • ‘Internal fees’: where the only cost to the host is a notional price that it sets internally – room hire, for instance. Once the two types of costs above have been met from revenue, 75% of any remainder may be used to defray the host’s internal fees and the other 25% will be due to the EPC to defray our internal costs and fees. After the host’s internal fees have been met, any surplus would be split equally.
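
To make the waterfall concrete, here is one reading of that past arrangement as a sketch function. This is our interpretation for illustration, not EPC policy, and the shortfall and liability rules for external and internal costs are deliberately left out:

```python
def split_remainder(remainder: float, internal_fees: float) -> tuple[float, float]:
    """One reading of the arrangement above (our interpretation, not EPC
    policy): 75% of the remainder defrays the host's internal fees (25%
    to the EPC) until the fees are met, then any surplus splits equally.
    Returns (host_amount, epc_amount); shortfalls are not modelled."""
    threshold = internal_fees / 0.75          # remainder needed to clear the fees
    if remainder <= threshold:
        return 0.75 * remainder, 0.25 * remainder
    surplus = remainder - threshold           # left over once fees are met
    return internal_fees + 0.5 * surplus, 0.25 * threshold + 0.5 * surplus

print(split_remainder(12_000, 6_000))  # (8000.0, 4000.0)
```

Under this reading, a £12,000 remainder against £6,000 of internal fees would return £8,000 to the host and £4,000 to the EPC.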

The proposal should make it clear whether the host proposes to manage the bookings process and receive the registration fees or would prefer this to be handled by the EPC. If the host receives the fees, after the Congress it will be expected to provide a full account of income and expenditure (outlining the categories of expense as above, if that model is used). If the EPC receives the fees, the host may invoice the EPC for costs in accordance with the agreement. In either case, the host will be expected to agree with the EPC a full budget for the Congress at the earliest opportunity (and before substantial Congress planning) and would not be entitled to incur costs on behalf of the EPC outside the agreed budget without separate agreement.

While the host will be responsible for setting the registration fees and packages for delegates, these must be agreed in advance with the EPC. They should not represent an increase of more than 10% on the equivalent packages for the previous year. A significant number of places for early careers staff (not more than 5 years in an academic post) should be made available at the lowest possible rate (including, ideally, some complimentary places).

In some years, the host has acted as a major sponsor of the event contributing to the costs or not passing on some or all of the costs it incurs. Any such support would be acknowledged and the EPC will seek to support the host’s objectives in sponsoring Congress. Any other sponsorship revenue will normally be retained by the EPC or used to offset the costs of running the Congress.

What is Engineering? Subject coding: HECoS, JACS and engineering, an unofficial guide

If you follow the HE data environment, or even just the policy headlines, you’ll probably have noticed that a new subject coding system – the Higher Education Classification of Subjects (HECoS) – has now been fully implemented. HECoS replaces the Joint Academic Coding System (JACS) shared by UCAS and HESA and commonly used across the sector; the detail behind what official statistics consider subject, subject group, subject line, or discipline. Luckily for us, Engineering features distinctly in both JACS and HECoS.

But first, some more techy background. The HECoS vocabulary (currently refined to version 9, despite appearing in open HESA data for the first time in the latest, 2019/20, data series) is confusing for most, and a minefield for the uninitiated. Although the codes are randomly generated and have no inherent meaning in themselves, for the purposes of analysis each code is grouped into subject areas at a few levels of detail – the Common Aggregation Hierarchy (CAH). And although we are advised that CAH can be applied against both the old (JACS) and new (HECoS) coding frames with caution, disappointingly this does not allow for consistent analysis.

So, what does this mean for Engineering? In short, you will continue to be able to see patterns of application and acceptance (UCAS), enrolment and other student population data (HESA) for the engineering labels you recognise, plus a new bioengineering classification:

  • mechanical engineering
  • production and manufacturing engineering
  • aeronautical and aerospace engineering
  • naval architecture
  • bioengineering, medical and biomedical engineering
  • civil engineering
  • electrical and electronic engineering
  • chemical, process and energy engineering
  • others in engineering

Aside from the change in the order in which they’re typically presented, aerospace becomes aeronautical and aerospace engineering; electronic and electrical becomes electrical and electronic engineering; and bioengineering, medical and biomedical engineering gets its own line. Plus, for general engineering, you now need to think in terms of engineering (non-specific).

But this is not just semantics. Quite apart from the change in culture and practice in course coding at source, some fine jiggery pokery means apparently like-for-like comparisons are not so. Not least, the new bioengineering, medical and biomedical engineering courses have come from elsewhere, including elsewhere in engineering. Combinations within engineering have also been (more accurately) absorbed.

And at a discipline-by-discipline level:

  • general engineering exports courses to medicine, physics, geography and architecture, not to mention those that the new engineering (non-specific) imports from other subjects.
  • mechanical engineering passes numerous course codes across to production and manufacturing engineering as well as (naval) architecture and physics.
  • electronic and electrical engineering notably redistributes robotics and cybernetics to production and manufacturing engineering and virtual reality engineering to computing. It also helps to populate the new bioengineering classification.
  • civil engineering and aerospace engineering are truer to form but send a few JACS codes off into other engineering disciplines (and physics for aerospace).
  • production and manufacturing engineering exports nothing (but remember it’s quite an importer from other engineering disciplines at least).
  • chemical, process and energy engineering appears at a glance to be least touched by the changes.

Of course, this is a summary, not a detailed mapping. The takeaway is that, despite what the CAH titles may suggest, these are not like-for-like mappings and are not comparable. To this end, the chart below shows UCAS accepted applicant data for 2019/20 and 2020/21 by CAH3 in engineering and its equivalent JACS3. There are clear differences between the CAH and JACS 2-year pairings, whilst the 2-year trends for most are broadly similar, albeit less pronounced by CAH.

Click on the chart to expand

The chart and underlying data / trends are also provided in a spreadsheet we’ve prepared to help you map in detail, should you want to do so.

A list of HECoS CAH codes (at levels 1 and 3) aligned to each JACS subject group can be found in the Summary JACS to CAH pivot worksheet. The reverse mapping is provided as Summary CAH to JACS pivot. The full version 1.2 HECoS Lookup – identifying each JACS subject group, course code and label (which relates to this season’s HESA data series but expires at the end of July) by HECoS code, label, and CAH1, 2 and 3 – is also provided, including summaries of their mapping category and relation (see the remaining tabs for mapping and definitions).
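
If you prefer to work with the mapping programmatically rather than in the pivot worksheets, the shape of the problem looks something like this. All codes below are placeholders for illustration, not real JACS or HECoS codes; the genuine, many-to-many lookup lives in the spreadsheet:

```python
# Placeholder codes only - the real JACS/HECoS lookup is in the spreadsheet.
jacs_to_hecos = {
    "H3XX": ["HEC-A", "HEC-B"],   # one JACS code splitting across two HECoS codes
    "H7XX": ["HEC-C"],
}
hecos_to_cah3 = {
    "HEC-A": "CAH-eng-1",
    "HEC-B": "CAH-eng-2",
    "HEC-C": "CAH-eng-2",
}

def cah3_groups_for_jacs(jacs_code: str) -> set[str]:
    """All CAH3 groups a JACS code feeds into - more than one element
    here is exactly why the comparisons are not like-for-like."""
    return {hecos_to_cah3[h] for h in jacs_to_hecos.get(jacs_code, [])}

print(sorted(cah3_groups_for_jacs("H3XX")))  # ['CAH-eng-1', 'CAH-eng-2']
```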

Further support documentation is available on the HESA website.

Data Blog: Become an expert in UCAS engineering data in ten steps

Spoiler: there is no data in this data blog! Instead, we bring you a mixed media UCAS engineering data masterclass to share what we’ve learned about the tools available this year while looking to analyse it.

You may already know that you can access engineering data using MS Power BI at discipline level on the UCAS website. If not, let me excite you.

You can quickly and easily produce headline tables and charts filtering UCAS applications and acceptances profiles for engineering, drilling down into a host of variables including the cohort’s gender, age and where they are from.

Below are EPC’s engineering focused instructions, coupled with a brief video tutorial to walk you through visually.

Don’t worry, an analysis will follow. In the meantime, if you discover any more UCAS self-service details, options or top tips, please do post a comment below.

And if you’d like to be involved in the development of the interactive data analysis tools planned for EPC online, please contact us.

Click to watch this 12 minute guide

Masterclass step-by-step guide

1. Go to UCAS.com and scroll down to Data and Analysis. Select Undergraduate statistics and reports and then End of cycle data resources. Alternatively, go directly via this link.

2. By selecting either Acceptances, Applications or Offers you can filter acceptances and main scheme applications, offers and offer rates for engineering.

Top tip: These all seem to lead to the same place, where there is a check box to choose from again in the top right hand corner.

Top tip: You can filter the chart by engineering but for better detail in linked charts and tables, leave the filter on all and click on the engineering colour in the key or in the chart itself.

3. Once you’ve homed in on engineering, you can filter or drill down by

  • Domicile
  • Age group
  • Gender

Top tip: We couldn’t find an export or copy functionality, so if you want to copy a whole crosstab into another document or report, you may need to resort to the downloadable datasets (see step 7 below).

4. By selecting Unconditional offers you can view unconditional offers (18-year-olds) by type of offer (direct unconditional, conditional unconditional, other unconditional or conditional component) and proportion.

Top tip: This includes English, Welsh and Northern Irish applicants only.

5. Technical notes and definitions are available above in the help section of the dashboard.

Top tip: You won’t get far this year without deciding whether to identify engineering using JACS3 – available at discipline (detailed subject, sometimes known as subject line) level for 2007-2020 – or its replacement subject coding scheme, HECoS (detailed subject, sometimes known as CAH3), which is available for 2019-20 only. These aren’t comparable, and the latest HESA data is only available by HECoS. If you want to know the details, a quick engineering guide and an unofficial engineering mapping spreadsheet are available here.

6. If all of this is too much, EPC members can download the headline applications and acceptances data from the EPC website.

Top tip: This is a password protected members page. If you are an EPC member and don’t know your password, please contact us.

7. Or, if you’ve got the bug, even more data is available (for home students) if you’re prepared to download some datasets. A full list of datasets, variables and combinations available can be found here.

Top tip: This is also your reference guide if you want to understand which of the many datasets you need to download to undertake your own analysis.

8. Using the datasets, you can filter applications (including applications type) and acceptances (including acceptance route) by engineering by:

  • Ethnicity
  • Disability
  • POLAR4
  • IMD
  • UK region
  • Provider region
  • School type

Top tip: If you want to access headline engineering data for all domiciles (not UK only), select provider region.

But not by combinations of those together. The variables you can analyse multivariately are:

  • Domicile
  • Gender
  • Age group

Top tip: This dataset is one of several which exceeds MS Excel’s row limits making rookie analysis tricky. Remember, for a basic look at distributions, you don’t need to download the dataset as it can be explored via the UCAS website.
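
For reference, a modern Excel worksheet caps out at 1,048,576 rows, which is why some of these datasets won’t open cleanly. One workaround, sketched below, is to stream the CSV and aggregate as you go rather than loading it whole (the file name and column name here are assumptions, not the real UCAS schema):

```python
import csv

EXCEL_ROW_LIMIT = 1_048_576  # maximum rows per worksheet in modern Excel

def count_by(path: str, column: str) -> dict[str, int]:
    """Stream a CSV one row at a time and tally values in `column`,
    so files beyond the Excel row limit never need to fit in a sheet."""
    counts: dict[str, int] = {}
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[row[column]] = counts.get(row[column], 0) + 1
    return counts

# e.g. count_by("ucas_dataset.csv", "Subject")  # hypothetical file/column names
```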

Top tip: It’s pretty quick and easy to use the online UCAS tool to check your subject totals tally back to the published figures. Note though that some of the overall totals across all subjects published by UCAS vary a little between their outputs, probably due to their rounding policy.

9. You can also consider main scheme offers by discipline, and unconditional offers for engineering as a whole.

10. There is other data at all-subject level you might find useful including Clearing plus, 18-year-old population estimates, post-result grade increases, and entry rates.

Phew! Well done for getting to the end. Any queries? Do feel free to contact us.

Policy summary, February 2021

It has been a busy start to the year for HE policy and politics, despite the roll-out of a third UK lockdown at the start of the year – significantly impacting campus presence and face-to-face teaching and, across all UK administrations, cancelling level 3 exams again this summer.

After numerous delays, the government FE white paper has now been published, alongside an interim response to the Augar Review and the Pearce review of the TEF. Summaries of these and other “live” policies are outlined below.

  • Skills for Jobs FE white paper

The Skills for Jobs white paper presents the government’s post-compulsory skills agenda, setting out plans to boost quality, parity of esteem and take-up of higher technical qualifications at levels four and five. “Kitemarked” qualifications will be approved by the Institute for Apprenticeships and Technical Education (IfATE) based on the institute’s employer-led standards for higher apprenticeships. The triangulation will be completed through full alignment to T levels, enabling progression from T levels to HTQs.

The range of options at post-16 and post-compulsory level will be showcased by a modest injection of funding into careers advice, through improvements to the national careers service website and further rollout of local careers hubs.

From 2023, funding for non-kitemarked qualifications will be reduced and a new IfATE/OFS system for assessing quality beyond initial qualification approval will be applied to all technical education providers. This will include apprenticeships, which are also targeted for expansion, through funding for smaller employers to offer apprenticeships, greater ease for larger employers to transfer their apprenticeship levy funds, and the publication of salary returns data for apprenticeships.

The lifetime skills guarantee and lifelong loan entitlement, announced by the Prime Minister last September and intended to allow more flexible use of student loan entitlement over a lifetime, will be implemented from 2025. Aligned with this is a signal of future funding to support the development of more modular, flexible higher education provision and credit transfer in 2021-22.

Prior to this (in summer 2021) there will be funding for a further eight Institutes of Technology charged with offering high quality higher technical STEM provision in all areas of England.

  • TEF report

Dame Shirley Pearce’s Independent Review of the Teaching Excellence and Student Outcomes Framework (TEF) called for clarity of purpose (and name) and improvements to its metrics and their statistical application, transparency, relevance and balance (read burden). Pearce recommended greater granularity within four aspects – Teaching and Learning Environment, Student Satisfaction, Educational Gains, Graduate Outcomes – and a more nuanced rating system.

Pearce noted the need for broader input metrics, accounting for regional differences. Within Educational Gains, she noted an ambition for each university to demonstrate how, within their own particular mission, they articulate and measure the educational outcomes and learning that they aim to provide for their students.

A subject-level exercise was also recommended for inclusion in the provider-level assessment to inform ratings at provider rather than subject level.

  • Government’s response to the TEF report

The Government “mostly agreed” with the Review’s high-level recommendations and has responded by abandoning the subject level TEF exercise. Instead, they have asked OfS to develop a “revised and invigorated” provider-level TEF which will run not on a one-year cycle, but every four to five years, with the first group of assessments completed and published by 2022. Where the government didn’t agree with the recommendations was in its insistence that the TEF’s secondary purpose was to inform student choice. Furthermore, emphasis on ‘Student Satisfaction’ was rejected in favour of ‘Student Academic Experience’.

Unsurprisingly, driving out low quality provision permeated the Government’s vision for the new TEF (name unchanged) within a wider quality regime which will “apply across all providers, not just those at the lower end (where the OfS is consulting on plans to introduce a more rigorous quality baseline)” – see below. Four award levels will replace the existing bronze, silver and gold, where the new bottom category will capture those providers failing to show sufficient evidence of excellence and who need to improve the quality of their provision. The introduction of Limiting Factors is mooted, such that a provider should not achieve a high TEF rating if it has poor student outcomes.

A consultation on future iterations of TEF is expected in due course, including measures beyond just earnings (including a reliable measure of educational gain), taking account of regional variations and flexible modes of study. There is a useful ONS Evaluation of the statistical elements of TEF which might guide this, at least in part.

In case you missed it, the OfS published the findings from the second subject-level pilot of the TEF in 2018-19 to coincide with the publication of the Pearce Review into the TEF.

  • Government’s (holding) response to the Augar Review

The Augar Review was the 2019 review of post-18 education and funding. For a summary in relation to engineering, see the EPC blog. In their much-delayed response to Augar, the Government stopped short of any serious funding reforms, instead shoehorning these into further reforms to the higher education system to be consulted on in spring 2021, ahead of the Comprehensive Spending Review. The current freeze on maximum fee levels – and the threat of a huge cut in Home undergraduate fees – remains until then.

There is some recycling of policy in the Skills for Jobs white paper (see above): the lifelong loan entitlement, local skills improvement plans, the rollout of approvals for higher technical qualifications, and signalled plans for incentivising more modular and flexible delivery all apply across higher education.

The Government also outlined its plan to realign teaching grant funding towards national priorities (through the introduction of a bid basis) including STEM, healthcare and specific labour market needs (see below).

  • Teaching Grant

The Teaching Grant letter announces an £85 million increase to the amount allocated through the main “high-cost subject funding” method for high-cost and “strategically important” subjects, including engineering. The London weightings in student premium and T funding will be ended from 2021-22, which is a big hit for London universities, particularly the big multidisciplinary ones, which won’t benefit from an increase for small and specialist providers.

The budget for Uni Connect falls from £60 million to £40 million, with the savings going on £5 million for student hardship and £15 million for mental health. Finally, capital funding for providers will be distributed through a bidding competition rather than a formula method, and students from the Crown Dependencies will be subject to home fee status and counted for funding purposes.

  • Quality and standards

Although there is, as yet, no formal response from the Office for Students on the recent quality and standards consultation, the Government’s will to exert power over metricised HE “underperformance” permeates the policies of the day. Within these, the OfS is asked to roll questions of standalone modular provision into its thinking on the development of the quality regime.

We are also promised a consultation on “further reforms” to the higher education system in spring 2021 which, along with “other matters”, may pick up on some of the missing-in-action proposals from Augar et al., including the future of foundation years, reforms to student finance and minimum entry requirements. Hopefully all ahead of a final decision on quality and standards.

Meanwhile, it’s clear that the sector – which has pretty much unanimously called for the Quality Code to be retained – recognises that the proposed approach fatally undermines the government’s other levelling-up and social mobility agendas.

  • Post qualification admissions

Following the flurry of reviews of university admissions by UCAS, Universities UK, the Office for Students and DfE late in 2020, the latest (DfE) consultation is aimed principally at when students receive and accept university offers (not the wider assessment, admission or policy agendas).

The consultation presents two options, both predicated on removing teacher predictions from the system altogether in favour of actual exam results. The first, “post-qualification applications and offers”, creates a longer application window by moving results dates forward to the end of July, and higher education term dates back to the first week of October. The second, “pre-qualification applications with post-qualification offers and decisions”, would mean applications being made during term-time (as now) but offers being made after results day.

DfE recognises that courses which require additional entrance tests, auditions and/or interviews will also need to be accommodated in either system, somehow (cue the consultation).

The EPC is currently considering its response. DfE’s consultation runs until mid-May.

  • Brexit

The Turing Scheme – a replacement for the Europe-wide Erasmus+ now that its door is closed following the UK’s departure from the EU – was launched by Gavin Williamson earlier this month. Alongside this, the government has updated its International Education Strategy with a commitment to increase the amount generated from education exports, such as fees and income from overseas students and English language teaching abroad, to £35 billion a year, and sustainably recruit at least 600,000 international students to the UK by 2030.

The Turing Scheme is the UK’s global programme to study and work abroad (see the scheme’s website). EPC research (to be published shortly), conducted in partnership with UCL Engineering Education, highlighted many of the benefits of engagement in European student and staff exchange.

  • Free Speech proposals

The government has published proposals on academic freedom and freedom of speech as follows:

  1. Legislate for a Free Speech and Academic Freedom Champion to be appointed as a member of the OfS board with responsibility to champion free speech and investigate alleged breaches of registration conditions related to freedom of speech and academic freedom.
  2. Legislate to require a new OfS registration condition on free speech and academic freedom.
  3. Explore further the option of strengthening the section 43 duty to include a duty on HEPs to ‘actively promote’ freedom of speech (where section 43 relates to the 1986 Education (no 2) Act).
  4. Legislate to extend the strengthened section 43 duty to cover SUs directly.
  5. Set clear minimum standards for the code of practice required under section 43.
  6. Introduce a statutory tort that would give private individuals a right of redress for loss suffered as a result of a breach of section 43.
  7. Introduce wider and enhanced contractual protections for academic freedom.

Professional recognition post-Brexit

Is my professional title still valid in the EU? Will my combination of academic qualifications and professional experience still count post Brexit? What does the information on recognition of professional registration in the EU on the Engineering Council website mean for me? Here’s the simplified version…

Now we have left the EU, the EU legislation adopted by all Member States (called the MRPQ Directive), which sets out obligations to mutually recognise each other’s professional qualifications, no longer applies to the UK.

Under the Trade and Co-operation Agreement there is a mechanism for professions to negotiate a Mutual Recognition Agreement between the UK and all 27 Member States. This would effectively replace the Directive and put in place new legislation. (For EEA/Swiss professionals who want to gain the UK professional titles, there is already a new piece of UK legislation that replaces the Directive.)

In the meantime (during what is likely to be a lengthy and difficult negotiation process), UK professionals who want recognition in an EU country will have their applications treated like those from any other non-EU country. The EU professional title can still be awarded, but it may take longer, and the application process may be slightly different. The Engineering Council has advised that, in practice, many EU countries do not require the professional title in order to work (just as in the UK).

Membership of organisations such as ENAEE and FEANI is unchanged by Brexit, as they are European Higher Education Area associations. ENAEE is particularly important for the academic community, as it means that we will continue to demonstrate that our engineering degrees meet the European standard (EUR-ACE).

Going forward, it would be helpful to know whether there is member appetite to engage with the Trade and Co-operation Agreement mechanism on behalf of professional engineering, or whether there are better ways to achieve the same objective.

Trusted Research: How safe is your research?

Overview

This was a discussion on the security precautions that academic institutions should take to prevent intellectual property being leaked to competitors or foreign governments, with the aim of preventing research and technologies from being used for immoral or unethical ends by external actors while still encouraging – and enhancing – collaboration both nationally and internationally.

Important links

CPNI’s advice on security best practices: https://www.cpni.gov.uk/managing-my-asset/leadership-in-security/board-security-passport

CPNI’s advice on what to include within a security considerations assessment: https://www.cpni.gov.uk/security-considerations-assessment

CPNI’s Think Before You Link campaign: https://www.cpni.gov.uk/security-campaigns/think-you-link

CPNI’s Trusted research guidance for academia or industry: https://www.cpni.gov.uk/trusted-research

Any of CPNI’s guidance can be rebranded to fit your company/university branding.

Game of Pawns, FBI website: https://www.fbi.gov/video-repository/newss-game-of-pawns/view

Engineering Council’s guidance on security: https://www.engc.org.uk/standards-guidance/guidance/guidance-on-security/

Proposed National Security and Investment Bill 2019-21: https://services.parliament.uk/bills/2019-21/nationalsecurityandinvestment.html

Key Points

Research, security-mindedness and transparency: David Sweeney, Executive Chair of UKRI, and Kelly Pullin, Head of Strategic Coordination at UKRI

Challenges:

  • Supporting universities and researchers to navigate the complex regulatory and ethical landscape while still encouraging international collaboration.
  • Ensuring the continued success of UK universities in research and innovation systems along with the success of international education in general.
  • Offering necessary protections and freedoms to institutions and companies in an environment of privacy concerns and cybersecurity challenges without compromising security.

Context:

  • The National Security and Investment Bill and understanding its potential implications.
  • The collective objective to protect UK research integrity and credibility.

Introductory points:

Nothing is risk-free; the aim is to mitigate the risks to academic research and collaboration as far as possible. Universities and their industry partners need support in order to achieve this. However, we need to recognise research organisations’ autonomy and treat academic freedom as a necessity.

What is already in place?

  • Wealth of expertise in security across the sector
  • Guidance and information readily available
  • We try to be as informed as we can
  • There is access to respective agencies and links have been formed with government

What more is needed?

  • Greater coordination and sharing of information as appropriate – with an awareness of tensions with other policies
  • Transparency around challenges and threats
  • Clarity around expectations and requirements
  • Establishing strengthened and sensible processes to help academics navigate the security landscape

Current support:

  • From government: the Department for International Trade (DIT), the Foreign and Commonwealth Office (FCO), the National Cyber Security Centre (NCSC) and the Centre for the Protection of National Infrastructure (CPNI)
  • UKRI
  • Other organisations include UUK/UUKi, ARMA, OECD and Jisc

By Ewan Radford, an undergraduate in Integrated Masters in Electronic Engineering with Space Systems at the University of Surrey.

Emerging Stronger: Lasting impact from crisis innovation

Overview

What have we learned about learning under lockdown? How can we use the experience of trying to deliver high-quality engineering degree programmes to strengthen our teaching in future? This live webcast discussed the positive impacts of Covid-19 on university teaching and explored solutions to the unique challenges of online learning, with a focus on lessons learnt which could potentially be carried forward post-Covid.

Important links

You can download the ‘Emerging Stronger’ report by Prof Bev Gibbs and Dr Gary C Wood at http://bit.ly/EmergingStronger

If you have any ideas or experiences with improving the online learning experience, or ideas on evaluation, you can share your 500-600 word case study at https://tinyurl.com/ES-Submit-2021, until 31st March 2021.

Catch up with the recorded webcast

Episode 1 including Keynote by Sir Michael Barber and first panel session (see details below).

Episode 2 including second panel session and summary (see details below).

Key points

Keynote (Rec 1 00:05:50): Sir Michael Barber, Chair of the OfS Board

Sir Michael congratulated the university sector for its resilience, vigour, innovative capacity and dedication in these challenging times. He outlined five thoughts around Emerging Stronger and potential innovations in pedagogy.

  1. Promoting social and economic value. While the OfS doesn’t pick and choose between subjects, high-cost subjects, including engineering, are highly prized by Government and the OfS for their contribution to our society and our economy through teaching future generations from an increasing breadth of backgrounds. (Watch the upcoming T grant consultation for some potentially good news for engineering.) What more can engineering do beyond what it already does to shape our economy and society in the future?
  2. Experimentation in course design. Engineering already demonstrates a diversity of models of teaching and learning, including degree apprenticeships. What is the right mix of applied and theoretical teaching and how should this change over time? What are the range of models that we want?
  3. Strengthening pedagogy through innovation and digital teaching. The EPC is in a strong position to build on what has already been developed and to move the sector forwards with a great diversity of innovative pedagogy models. It’s not too late to respond to the consultation on this. (Look out for the OfS report to be published in February or March). What can be done to prepare for the next academic year and the longer term?
  4. Contributing to the future. What is the role of engineering faculties within universities in spreading opportunities and economic growth more equitably? In the past, great engineering breakthroughs have been brought about through crisis; engineering can play a leading role in building a better future. What might that be?
  5. Innovations in engineering. Awesome technological developments will lead us to new opportunities. The future is bright for engineering.

Emerging Stronger Paper Overview (Rec 1 00:31:30): Prof Bev Gibbs, Chief Academic Officer at NMiTE, and Dr Gary C Wood, Head of Sheffield Engineering Leadership Academy

Emerging Stronger 2020 was about sharing early ideas and the immediate response. There was little opportunity for evaluation due to the short timeframe available.

The major themes investigated in Emerging Stronger 2020 were Assessment, Student collaboration and professional skills, Practical work, Employability, and Student Partnership.

Emerging Stronger 2021 will focus more on student experience and evaluating the ideas presented in Emerging Stronger 2020.

Large Cohort Team-Based Learning (TBL) (Rec 1 00:42:40): Alexander Lunt, Lecturer, University of Bath

Historically, passive Q&A tutorials have been ineffective and under-utilised. Team-based learning improves engagement and interaction.

It is useful to set online assessment before in-person meetings so that students can practise. This allowed students to bring questions to the in-person meeting, increasing engagement.

Internet of Things Student Placements (Rec 1 00:49:30): Ceri Batchelder, Royal Society Entrepreneur in residence and SELA Board Member, University of Sheffield and Mo-Anna Tucker, SELA Graduate, University of Sheffield, and Intern, Recycling Technologies

Offering placements online enabled more places to be offered, and online placements can offer more flexible working options. Internet of Things (IoT) environment sensors could be accessed from students’ homes over the internet, allowing the placement to be conducted remotely.

Mental Health and Wellbeing of Engineering Students (Rec 1 00:57:30): Jo-Anne Tait, Academic Strategic Lead, Robert Gordon University

There are high rates of mental illness amongst engineering students. Online teaching exacerbated problems, and the available support is less accessible. The mental health of staff and students should be given greater priority.

Promoting lecture and examples sheet interactivity (Rec 2 00:01:05): David Fletcher, Professor of Railway Engineering, University of Sheffield

Adding a question-asking function to online slides is useful, as it allows students to ask questions about the content anonymously. This helps international students who may not be confident in their English, along with anyone nervous about asking questions in person. Answers to all questions submitted were given in the next lecture.

Online Project Based Learning: Engineering Design (Rec 2 00:07:15): Daniel Beneroso, Assistant Professor, University of Nottingham

Online refresher lectures, along with reviewing progress against key milestones, are useful. Assigning each student a mentor helps ensure that they stay on track. A strong support structure is necessary.

Student Partnership in Learning Design (Rec 2 00:14:50): Ryan Grammenos, Senior Teaching Fellow, UCL

The team attempted to use three different platforms to run a project remotely, as no single platform did everything needed. Students were overwhelmed and unhappy with this approach; they felt lost and did not know where to start. It was a struggle to communicate expectations, and staff had forgotten what it was like to be a first year.

By Ewan Radford, an undergraduate in Integrated Masters in Electronic Engineering with Space Systems at the University of Surrey.

Guest blog: EPC Hammermen student prize

Congratulations to Rachel Beel, Glasgow Caledonian University, and to Dan Hicks, University of Brighton, joint winners of the 2020 EPC Hammermen Student Award. The Hammermen Award is an annual prize, presented in association with the Hammermen of Glasgow, to celebrate engineering students’ excellence. This year’s award received an unprecedented number of submissions and five finalists competed for the coveted prize at the EPC Congress. To see Rachel and Dan’s posters and multi-media pitches, as well as the other short and longlisted applicants, visit the EPC events microsite.

We asked Rachel to tell us about her win and this is what she said:

“My name is Rachel Beel and I was one of the winners of this year’s Hammermen Award for my poster on the link between academia and industry.

My poster highlighted how my work placement in industry ended up driving me towards my final thesis topic. The skills and experience I had gained in the workplace were built upon by researching and completing my final thesis on The Design, Manufacture and FEA of a Thin-Walled Pressure Vessel. My placement was at Pacson Valves in Dundee, where I worked in the engineering office on 2D and 3D drawings and models, alongside pressure retaining calculations and some supply chain work. The main skills gained from this were the ability to apply a design code to calculations correctly and to figure out how to do so effectively. I took it upon myself to use three different design codes in my thesis: the ASME BPVC, BS EN 13445 and PD 5500.

My poster also highlighted the struggles I had faced as a student applying design codes, and how certain sections and attributes would be left to ‘the experience of the engineer’, along with a few others. However, despite these struggles and a global pandemic, I still managed to get a First-Class Honours degree from home.

My experience of the virtual congress and meeting the other finalists was great. Hearing about everyone else’s posters and what they were currently working on was very interesting and actually gave me little hope I’d manage to win, since I seemed to be the only one at my level – just finishing my fourth year and graduating. If you’re thinking of entering, you definitely should, even if you don’t think you’ll get anywhere with it. I never thought I’d be long-listed, never mind a finalist or winning it, so you never know what will happen! By next year’s congress I am hoping to finally get a graduate job, as COVID-19 has cancelled most, if not all, opportunities for me to get one right now. I have, however, managed to put the prize money towards a new iPad Air that will keep me entertained in the meantime, to draw and do some art commission work before hopefully moving forward in my engineering career.”

For information on other Hammermen awards, please visit their website.

EPC engineering enrolments survey results 2020/21

The EPC engineering enrolments survey gives us an early annual temperature check of the health of HE undergraduate and postgraduate engineering enrolments.

Sector pressures undoubtedly impacted the 2021 survey. The EPC reduced our focus to the changes experienced in new enrolments in engineering this autumn (plus changes in deferral and attrition behaviours) leaving us without underlying distributions to share with you. Our members faced unprecedented barriers to engagement this year, including competing deadlines and priorities; timing issues; complexities and structural changes resulting from Covid-19; and sensitivity and caution around their position.

Despite these compromises, we have together created a survey sample of:

  • 35 EPC member universities
  • covering both undergraduate and PG cohorts
  • with well distributed responses across engineering (175 distinct disciplines)
  • representing all countries and regions of the UK
  • and a typical institution type profile of previous annual EPC enrolment surveys.

Furthermore, we know our annual survey typically broadly reflects published engineering HE trends (many months after we share our findings). We tend to capture:

  • similar domicile profiles (roughly 3 in 4 undergrads in our survey are UK domiciled and approximately 2 in every 3 postgraduates are international)
  • a high proportion of overseas postgraduate enrolments, especially within the Russell Group
  • and the popularity of Mechanical (undergraduate) and Electrical, electronic and computer engineering (postgraduate).

We hope we have judged this year’s particularly challenging balance of burden, valid coverage and utility in a way which both instils your confidence and continues to offer useful insight. We are grateful to those of you who assured us of the survey’s continued high value during the collection process, and we thank you for recognising both its strengths and weaknesses.

Enrolments compared with 2019-20

In total, the undergraduate level of change tended towards growth, with over one-third of responses (reported by discrete discipline level) recording new enrolments at more than 10% higher – in fact, many of you reported even greater growth, citing the central policy change days into confirmation (Centre Assessed Grades) as an impetus for unexpected growth. This is supported by the pattern of home undergraduate enrolment growth, including increases of more than 10% which were higher among home enrolments than overall. Given that home undergraduates represent approximately ¾ of the undergraduate population, we can be relatively assured that 2020 was a healthy year for home undergraduate engineering enrolments. (Figure 1.)

What’s more, both EU and non-EU overseas changes in enrolment are pretty uniform, with reported levels about the same as last year dominating the mid-point with substantial range. This suggests stability in the overseas engineering undergraduate sector again this year, despite earlier fears in the sector of the impact of Covid-19 on international travel. (Figure 1.)

However, we observe less stability at postgraduate level, with non-EU enrolments showing an overall decline; over half of postgraduate engineering disciplines surveyed reported a drop in enrolments of more than 10%. Around 40% of postgraduate engineering disciplines surveyed reported a drop in EU enrolments. These differences are important as approximately 2 in every 3 engineering postgraduates are international, suggesting the overseas engineering market may have shrunk this year in this context. (Figure 1.)

Overall, home engineering postgraduates are reported to have increased, by more than 10% in over 40% of responses. Despite this, nearly 1 in 3 responses recorded enrolments to be more than 10% lower in this cohort, indicating a mixed bag within the engineering postgraduate sector. (Figure 1.) Further analysis shows that Mechanical engineering is of note at postgraduate level, clearly bucking the general decline overall in 2020. The postgraduate engineering superstar, Electrical, electronic and computer engineering appears relatively stable. (Figure 2.) This is not so at undergraduate level, where this discipline bucks the growth trend (Figure 3).

By university type, the survey reveals that the greatest decline in postgraduate engineering enrolments is skewed towards Russell Group members, who see higher international enrolments overall. Conversely, Russell Group universities dominate the increasing trends in undergraduate enrolments in engineering, with 60% of distinct disciplines at Russell Group members reporting undergraduate growth, compared with about half that proportion at their non-Russell Group counterparts. (Figure 4.)

Regional analysis highlights a clear North / South divide at postgraduate level; decline is more pronounced than growth everywhere north of and including the Midlands. London and Wales show the greatest relative growth. (Figure 5.)

At undergraduate level, the greatest enrolments stability (in terms of “about the same” responses) is in Wales, although we witness slightly greater decline than growth there. However, the proportion of respondents reporting a decline in new enrolments of greater than 10% is broadly in line with the other regions; this is a remarkably consistent measure at undergraduate level, with the exception only of London and the East / South East. The Midlands shows the greatest proportional decline in undergraduate enrolments overall; London witnesses the greatest relative growth. (Figure 6.)
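For readers who want to produce this kind of summary from their own enrolment returns, banded categorical responses can be tallied by region in a few lines of Python. This is a hypothetical sketch only: the region names, bands and counts below are invented for illustration and do not reproduce the EPC survey data.

```python
# Hypothetical illustration: tally banded enrolment-change responses by region.
# All data below is invented and does not reflect the actual survey results.
from collections import Counter

# Each tuple: (region, banded change in new enrolments vs the prior year)
responses = [
    ("London", ">10% higher"),
    ("London", ">10% higher"),
    ("Wales", "About the same"),
    ("Wales", "About the same"),
    ("Midlands", ">10% lower"),
    ("Midlands", ">10% lower"),
    ("North", ">10% lower"),
]

# Group responses by region, counting how often each band was reported
by_region = {}
for region, band in responses:
    by_region.setdefault(region, Counter())[band] += 1

# Print a per-region summary of band frequencies
for region, counts in sorted(by_region.items()):
    total = sum(counts.values())
    summary = ", ".join(f"{band}: {n}/{total}" for band, n in counts.items())
    print(f"{region}: {summary}")
```

The same tallies can then be expressed as proportions per region, which is the form the figures in this report take.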

Deferrals

We sought data on deferrals in the context, initially, of reports that applicants might be deferring in greater numbers due to the uncertainty resulting from the pandemic. However, as the cycle progressed, it became clear that deferrals were also being driven by universities struggling to cope with the impact of the unprecedented shifting of results during confirmation in 2020 (qualifying a high number of conditional firms). The data we received was not extensive but did suggest an overall tendency towards more deferrals than last year, particularly among postgraduates (specifically international postgraduates) and Non-EU and home undergraduates. (Figure 7.)

Attrition

The data we received on students not returning / dropping out was patchy and ambiguous, not least because of delayed processes, exceptional delayed starts this year and more general confusion around (non-)returning students and how to count them. However, the data we did receive does suggest an increase in dropout, although some qualitative reports of increased progression as a result of pandemic pressures were also received.

Context

Of course, we know that recruitment and selection happens in the context of institutional recruitment and admissions strategies and internal strategic pressures in any year. For example, some of the fluctuations in numbers will be influenced not by current market forces but by changing course profiles and conscious decisions to adjust cohort size.

In the devolved countries, in particular, we were told that these were also influenced by government policy and funding initiatives. Other policy factors cited included managing the impact of the central government changes to the summer exam series results and the economic context in which student decision-making occurs.

This year, many members were mindful of the impacts of Covid-19 on overseas partnerships, geography and travel limitations, and shared a wealth of administrative workarounds at play including delayed starts, arrivals and registrations and exceptional late intakes.

Sector behaviour was also cited as a factor in enrolment trends; there was a perception of some pre-result unconditional offer making and, initially, unusual confirmation at lower than advertised entry requirements.

Further information

In addition to the images published above, the slide deck from the EPC Recruitment and Admissions Forum launch is available to download for all EPC members.

Are the OfS proposals on quality and standards good for the sector?

In this blog, Chief Executive Johnny Rich provides a critical commentary on the Office for Students (OfS) proposals on quality and standards – potentially one of the most important changes to its practice since its inception. 


These are his personal opinions to help stimulate the community into synthesising for themselves the possible implications and they do not necessarily reflect the views of the EPC nor its members.


We’ve posted a summary of the OfS consultation paper here and the OfS’s own documents can be found here. 

We are conducting a survey, so that the EPC can give a representative response. Please contribute your thoughts.

(Rhys A. via Flickr, CC-BY-2.0)

The Department for Education has got hold of the idea that ‘low quality’ courses are rife in UK higher education. It is determined that they must be rooted out. It’s hard to argue with a policy of tackling shortcomings – that makes it good politics, without necessarily being good policy.

The problem is that there’s not actually sound evidence to support this idea of low quality. What do we even mean by it? 

Until recently the terminology was ‘low value’, but a pandemic-related newfound appreciation for low-paid nurses made it seem too reductive to focus too much on graduate salaries and that highlighted how problematic it is to define value. So the DfE now prefers to talk about ‘quality’ but the definition is no clearer. 

Never mind, that’s not the Government’s problem. The English regulator can worry about that. That’s why we now have a potentially far-reaching consultation from the Office for Students (OfS) on quality and standards and what, in future, the OfS proposes to do to hunt down low quality and quash it with “tough new measures”. 

As the consultation was launched, Universities UK boldly stepped up to defend the honour of courses everywhere by announcing a new charter on quality. Well, not actually a new charter, but a working group to consider one. I fear, however, that the possibility of a charter is like trying to slay this dragon with a plastic butter knife.

“This is full-on chemotherapy to treat a few bunions.”

So what’s so bad about the proposals? Surely, if there is any low quality, we should all want to cut it out? And if there’s not, there’s nothing to fear? Sadly, the cure is not harmless. This is full-on chemotherapy to treat a few bunions. 

It would be complacent to imagine there are not many courses and whole institutions that can be improved and there are many in HE who say we won’t win friends by being close-minded to criticism. I agree. Indeed, the EPC regularly engages in activities to support enhancement of teaching, learning and student experience. 

But we don’t do anyone any favours by not being rigorous about critics’ motives and the evidential basis for change. Every international comparison suggests we have one of the best HE systems in the world and calling for “tougher measures” as if there’s a regulation deficit is more to do with doing something than doing something necessary or even justified. 

This isn’t about fixing an actual problem. It’s about pandering to backward-looking diehards that see unis as too open, full of ‘Mickey Mouse’ courses with too many students. 

No one could call engineering a Mickey Mouse course, though, so perhaps we needn’t worry? Well, even if we didn’t care about colleagues and students in other disciplines, rather than improve HE, the proposals are likely to narrow fair access, stunt social mobility and protect elitism, while cementing the idea of higher education as basically a job conveyor belt. That’s not good for anyone.

Let’s just start by mentioning the timing. As EPC members know all too well, in recent months it’s been really tough to deliver high-quality courses, so choosing this moment to launch a major consultation on delivering high-quality courses is, to say the least, insensitive. In all likelihood, it may distract from the very delivery that the OfS wants to improve, and any current data used to inform the consultation is going to come from the most outlying wildernesses of every bell curve.

OfS proposes gifting itself greater powers to assess Higher Education Institutions (HEIs) more closely, including at subject level, and to apply sanctions for performance metrics (on continuation, completion and progression) that it deems unsatisfactory. These sanctions could include fines and even deregistration. A full summary of the OfS proposals is here.

“There are no reliable metrics of success, only proxy measures – and when you let proxies do your decision-making, you get people gaming the data and you get unintended consequences.”

There are many obvious concerns about this. For a start, metrics are a very blunt instrument. There are no reliable metrics of success, only proxy measures – and when you let proxies do your decision-making, you get people gaming the data and you get unintended consequences. (Ironically, this is exactly the argument that the Government has recently been deploying about the National Student Survey, which it has suddenly decided is the cause of supposed dumbing down. Again, there’s no actual evidence of this. Nevertheless, the OfS is currently reviewing that too.)

OfS wants to take a more metric-based approach in order to reduce bureaucratic load, which is fair enough, but if you want data to tell you anything useful, you really need to understand the context. No two HEIs are alike and the same numbers can tell very different stories. 

The consultation does explicitly acknowledge that the context needs to be considered, but the context will explicitly exclude anything to do with the socioeconomic disadvantage or other protected characteristics of the students (disability, ethnicity, etc.). OfS intends to impose non-negotiable “numerical baselines” – i.e. cut-offs for universities with outlying data – whatever the reason.

Some unis and courses will end up being branded ‘low quality’ for a host of reasons to do with their intake rather than anything that they might actually be doing wrong. Quite the opposite: trying too hard to do the right thing will open them up to sanctions.

For example, dropout rates are higher among students with extra financial or social challenges, and bias in recruitment practice disadvantages certain graduates. So if students are from lower socioeconomic or minority ethnic backgrounds, or they are disabled or they are returners to study, their course might look ‘low quality’ while actually the prospects of those students (compared to not having achieved that degree) have been greatly improved.

BTEC students, for instance, have far higher non-continuation rates on engineering courses than students with A level maths and physics. When they do graduate, they face higher hurdles in gaining employment because they may not have the connections, the extra-curricular brownie points and the right accent. Is it really fair for the OfS to hold an HEI that helps these students establish fulfilling lives to the same standards as a university with nearly half of its intake with straight As from private schools?

HEIs could also be penalised for being based in parts of the country with lower employment rates or for drawing students from the locality who might want to stay in their home region post-graduation. Social mobility should not have to mean geographic mobility. To many students a positive outcome means worthwhile employment in their home region rather than maximising their income by moving away. 

This is not only a fair choice for them to make, it’s a really positive choice for the Government’s goal of levelling up regions by creating high-skilled employment in disadvantaged areas. Penalising universities that support this is counterproductive.  

“Were this year’s graduates ‘low quality’? Or is it just that they graduated into the worst labour market for decades?”

Outcomes data will also be subject to the vagaries of economic circumstances. Were this year’s graduates ‘low quality’? Or is it just that they graduated into the worst labour market for decades? These effects can happen locally too, which means they affect individual universities and subjects. For example, if a big local employer exits a region, there may be a knock-on effect for local courses and graduates.

Employment effects take time to show up in the data – a lag of several years if you want to get a reliable picture. By design, the metrics will identify only those stables where horses have long since bolted. By the time problems show up in the data, the HEI will have known about it for a while and may well have either improved or closed a course if it was genuinely deficient. “Tougher measures” won’t support this in any way, but they might close courses that have turned around.

Under these metrics, positive employment outcomes need to show themselves quickly. HEIs won’t want to encourage enterprising students to start businesses that may take a few years to mature, earn money for their founders and create wider jobs and prosperity. Because that would be ‘low quality’.

And, of course, the focus on continuation and employment penalises any subject that attracts students who are studying for love of learning rather than for the sake of optimising their employment outcomes.

It also penalises those courses that allow students to do anything other than join a course, stay the duration, and graduate. None of the hop-on-hop-off flexibility that the Government has been urging in other policy initiatives and which the evidence says is needed.

“By definition, some HEIs and subjects will always be less ‘successful’ than others according to the metrics.”

Worst of all, depending on how the ‘tougher measures’ are applied, it is statistically inevitable that a recklessly wielded axe will cut off healthy branches. By definition, some HEIs and subjects will always be less ‘successful’ than others according to the metrics. 

There will always be a bottom of the pile to be removed. Someone will always need to be penalised to justify the quality assurance process. Being in the relegation zone in the Premier League doesn’t mean yours is a bad team; it simply means yours is a team whose performance has been relatively less good amongst the very best. Sadly, the sanction for HEIs and courses will not be relegation, but elimination.

If the comparison of what constitutes a ‘low quality’ course is made at subject level, rather than across all HE courses, then some departments that have good metrics compared to other subjects will be made to suffer. For example, an engineering course that is ‘low quality’ in comparison to other engineering courses may be sanctioned.

If, on the other hand, the comparison is made at HEI level, then certain subject areas will be the victims because their outcomes do not translate easily into highly paid work. Heads I win, tails you lose.

Ultimately, this is likely to encourage universities to be risk-averse in their admissions, effectively raising the bar for any students who don’t look like those who have been successful in the past, and closing down opportunities until only those least in need of a break can get a look-in.

Even if all these proposals were a good idea – and you may have gathered I don’t think they are – this level of oversight by the regulator might not even be legal. I am sure the OfS has consulted its lawyers carefully, but it’s hard to square this with what was intended by the Higher Education & Research Act (HERA). 

HERA’s progress through Parliament saw more amendments than almost any other bill ever laid before the Lords and, among those that the Government was forced, reluctantly, to accept was legal protection for the institutional autonomy of universities. The Lords could see that one day the regulator would be asked to overstep the mark and tried to set protections in stone. These proposals would undermine those protections, undermine that autonomy and – ironically – undermine the very standards of high-quality education for all who can benefit from it that they seek to improve.


Do you agree? Do you disagree? Are there other important points to make? How might this affect engineering courses? Please respond to our survey here.

The OfS consultation on quality and standards in a nutshell

The Office for Students has just launched a consultation on one of the most important changes to its practice since its inception. What does it say? We’ve summarised the key takeaways. We have also published a personal perspective on the wisdom of the proposals by the EPC Chief Executive.

In 2017, the Higher Education & Research Act (HERA) dissolved HEFCE, which was a funding body, and replaced it with the OfS which began work the following year as the regulator of higher education in England. In the process it subsumed the remaining activities of HEFCE and OFFA (the Office for Fair Access). 

Since then, some of OfS’s main activities have included establishing a register of approved higher education institutions and signing off on the ‘Access and Participation Plans’ of those institutions that want to be able to claim funding via the Student Loans Company. 

The OfS’s regulation of HE quality and standards has been through signalling and recognisable processes, mostly farmed out under a contract with the QAA. There have been a few interventions from OfS on grade inflation, unconditional offers and TEF, but these haven’t been accompanied by significant new regulatory controls. 

Although OfS does have powers in case of failure (and it has used them by rejecting the registration of a few institutions), its light-touch approach was in keeping with the spirit of HERA, which, during its difficult passage through the Lords, was amended to include an explicit commitment to the autonomy of higher education institutions (HEIs) over their admissions and the education they deliver.

But now the OfS is consulting on what it calls “tougher minimum standards”, with the threat of fines and even deregistration for HEIs that don’t meet them. These powers, it is proposed, will be exercised not merely at an institutional level, but at a subject level too, which, in effect, might allow OfS to exert direct or indirect pressure on an HEI to close a department whose metrics look like underperformance.

The EPC will be responding to this consultation on behalf of members and we’re keen to hear what you think. We will be inviting members’ views through a survey shortly. (Come back here for the link.) To help you, we’ve provided the following summary of the proposals.

So what are the proposals? There are four areas:

1. “Define ‘quality’ and ‘standards’ more clearly for the purpose of setting the minimum baseline requirements for all providers”

‘Quality’ will be defined in a metric way. This is, it is said, intended to reduce the regulatory burden. The metrics will relate to five areas: access and admissions; course content, structure and delivery; resources and academic support; successful outcomes; secure standards. 

The inclusion of ‘access’ does not mean wider participation targets, but rather admitting students who “have the capability and potential to successfully complete their course”. OfS has been explicit in saying that it “is not acceptable for providers to use the proportion of students from disadvantaged backgrounds they have as an excuse for poor outcomes”. In other words, they are rejecting the idea that non-academic circumstances or lower prior attainment might be mitigating circumstances for lower (according to the metrics) student outcomes. The argument put forward is that using the greater challenges of certain students as an “excuse” would “risk baking disadvantage into the regulatory system”.

The goalposts will be different for new HE institutions, because they can’t be judged on track record.

OfS will also set ‘standards’ for higher education – that is any courses beyond A level or equivalent (so that means drawing higher apprenticeships and other programmes into a unified quality framework). These standards will involve “sector-recognised” definitions of achievement – in other words, OfS intends to establish common standards for degree grades.

2. “Set numerical baselines for student outcomes and assess a provider’s absolute performance in relation to these”

OfS would impose “a numerical baseline”: this is intended to be a cliff edge for outcomes metrics, namely continuation to second year, course completion and progression into graduate-level work or further study. (There’s also a reference to employer satisfaction, but as there are no measures for that, it’s only an aside.) If you fall off the cliff, there’s a range of sanctions (see below) including fines or even deregistration of the institution.

What will matter is absolute – not relative – data. There is a reference to considering the context, but this is more to do with what may have changed rather than a profile of the student body. Unequivocally, the consultation paper states, “We would not set lower regulatory requirements for providers that recruit students from underrepresented groups, or with protected characteristics.” The idea is to spell out “more challenging” minimum standards that students can expect. 

Further consultation will be conducted around the exact metrics.

3. “Clarify the indicators and approach used for risk-based monitoring of quality and standards”

As the metrics used for the baselines are about things that have happened in the past, the OfS proposes to keep an eye on potential risks in institutions by monitoring other metrics and being clear about which metrics those are. Among those mentioned are admissions data (offers, grades achieved, student demographics), student complaints, National Student Survey results, other regulators’ and PSRBs’ activities, TEF, and the outcomes metrics as above. It should be noted, by the way, that the NSS is currently under a separate OfS review and we’ve been awaiting the publication of an independent review of TEF for DfE for nearly two years (which is believed to be critical).

There may be some extra data gathering and reporting for universities, but the intention is to minimise the need for unnecessary interference in the long-run by identifying risks before they become problematic outcomes. 

4. “Clarify our approach to intervention and our approach to gathering further information about concerns about quality and standards”

This proposal sets out what might be called a precautionary approach to intervention. In other words, the OfS makes it clear that it would be willing to step in to investigate or gather evidence where it fears an institution risks failing to meet quality thresholds.

It also sets out the OfS’s available “enforcement” actions: imposing conditions on an institution in order for it to continue to be registered; issuing a fine; suspending some of the privileges of being registered (such as access to student loan funding for fees or OfS public grants); removing an institution’s degree-awarding powers or its right to use ‘University’ in its title; deregistration.

Please note: This precis is intended as guidance only. The aim has been to summarise the proposals objectively while providing some interpretation of their implications. Necessarily this involves some subjective inference and the omission of details. We advise referring to the OfS’s own consultation documents for the full details. Also, if you feel we have interpreted any proposals wrongly or unfairly, or left out critical details, please let us know and we can make changes to this summary as needed.