In this blog, Chief Executive Johnny Rich provides a critical commentary on the Office for Students (OfS) proposals on quality and standards – potentially one of the most important changes to its practice since its inception. These are his personal opinions to help stimulate the community into synthesising for themselves the possible implications and they do not necessarily reflect the views of the EPC nor its members.
The Department for Education has got hold of the idea that ‘low quality’ courses are rife in UK higher education. It is determined that they must be rooted out. It’s hard to argue with a policy of tackling shortcomings – that makes it good politics, without necessarily being good policy.
The problem is that there’s no sound evidence that this low quality actually exists. What do we even mean by it?
Until recently the terminology was ‘low value’, but a pandemic-related newfound appreciation for low-paid nurses made it seem too reductive to focus on graduate salaries – and highlighted how problematic it is to define value at all. So the DfE now prefers to talk about ‘quality’, but the definition is no clearer.
Never mind, that’s not the Government’s problem. The English regulator can worry about that. That’s why we now have a potentially far-reaching consultation from the Office for Students (OfS) on quality and standards and what, in future, the OfS proposes to do to hunt down low quality and quash it with “tough new measures”.
As the consultation was launched, Universities UK boldly stepped up to defend the honour of courses everywhere by announcing a new charter on quality. Well, not actually a new charter, but a working group to consider one. I fear, however, that the possibility of a charter is like trying to slay this dragon with a plastic butter knife.
“This is full-on chemotherapy to treat a few bunions.”
So what’s so bad about the proposals? Surely, if there is any low quality, we should all want to cut it out? And if there’s not, there’s nothing to fear? Sadly, the cure is not harmless. This is full-on chemotherapy to treat a few bunions.
It would be complacent to imagine there are not many courses and whole institutions that can be improved, and there are many in HE who say we won’t win friends by being closed-minded to criticism. I agree. Indeed, the EPC regularly engages in activities to support enhancement of teaching, learning and student experience.
But we don’t do anyone any favours by not being rigorous about critics’ motives and the evidential basis for change. Every international comparison suggests we have one of the best HE systems in the world, and calling for “tougher measures” as if there’s a regulation deficit has more to do with being seen to act than with doing anything necessary or even justified.
This isn’t about fixing an actual problem. It’s about pandering to backward-looking diehards who see universities as too open, full of ‘Mickey Mouse’ courses with too many students.
No one could call engineering a Mickey Mouse course, though, so perhaps we needn’t worry? Well, even if we didn’t care about colleagues and students in other disciplines, rather than improve HE, the proposals are likely to narrow fair access, stunt social mobility and protect elitism while cementing the role of higher education as basically a job conveyor belt. That’s not good for anyone.
Let’s start by mentioning the timing. As EPC members know all too well, in recent months it’s been really tough to deliver high-quality courses, so choosing this moment to launch a major consultation on delivering high-quality courses is, to say the least, insensitive. It may well distract from the very delivery that the OfS wants to improve, and any current data that may be used to inform the consultation is going to be from the most outlying wildernesses of every bell curve.
OfS proposes gifting itself greater powers to assess Higher Education Institutions (HEIs) more closely, including at subject level, and to apply sanctions for performance metrics (on continuation, completion and progression) that it deems unsatisfactory. These sanctions could include fines and even deregistration. A full summary of the OfS proposals is here.
“There are no reliable metrics of success, only proxy measures – and when you let proxies do your decision-making, you get people gaming the data and you get unintended consequences.”
There are many obvious concerns about this. For a start, metrics are a very blunt instrument. There are no reliable metrics of success, only proxy measures – and when you let proxies do your decision-making, you get people gaming the data and you get unintended consequences. (Ironically, this is exactly the argument that the Government has recently been deploying about the National Student Survey, which it has suddenly decided is the cause of supposed dumbing down. Again, there’s no actual evidence of this. Nevertheless, the OfS is currently reviewing that too.)
OfS wants to take a more metric-based approach in order to reduce bureaucratic load, which is fair enough, but if you want data to tell you anything useful, you really need to understand the context. No two HEIs are alike and the same numbers can tell very different stories.
The consultation does explicitly acknowledge that the context needs to be considered, but the context will explicitly exclude anything to do with socioeconomic disadvantage or the protected characteristics of the students (disability, ethnicity, etc). OfS intends to impose non-negotiable “numerical baselines” – i.e. cut-offs for universities with outlying data – whatever the reason.
Some unis and courses will end up being ‘low quality’ for a host of reasons to do with their intake rather than anything that they might actually be doing wrong. Quite the opposite: trying too hard to do the right thing will open them up to sanctions.
For example, dropout rates are higher among students with extra financial or social challenges, and bias in recruitment practice disadvantages certain graduates. So if students are from lower socioeconomic or minority ethnic backgrounds, or they are disabled or they are returners to study, their course might look ‘low quality’ while actually the prospects of those students (compared to not having achieved that degree) have been greatly improved.
BTEC students, for instance, have far higher non-continuation rates on engineering courses than students with A level maths and physics. When they do graduate, they face higher hurdles in gaining employment because they may not have the connections, the extra-curricular brownie points and the right accent. Is it really fair for the OfS to hold an HEI that helps these students establish fulfilling lives to the same standards as a university where nearly half the intake arrives with straight As from private schools?
HEIs could also be penalised for being based in parts of the country with lower employment rates or for drawing students from the locality who might want to stay in their home region post-graduation. Social mobility should not have to mean geographic mobility. To many students a positive outcome means worthwhile employment in their home region rather than maximising their income by moving away.
This is not only a fair choice for them to make, it’s a really positive choice for the Government’s goal of levelling up regions by creating high-skilled employment in disadvantaged areas. Penalising universities that support this is counterproductive.
“Were this year’s graduates ‘low quality’? Or is it just that they graduated into the worst labour market for decades?”
Outcomes data will also be subject to the vagaries of economic circumstances. Were this year’s graduates ‘low quality’? Or is it just that they graduated into the worst labour market for decades? These effects can happen locally too, which means they affect individual universities and subjects. For example, if a big local employer exits a region, there may be a knock-on effect for local courses and graduates.
Employment effects take time to show up in the data – a lag of several years if you want to get a reliable picture. By design, the metrics will identify only those stables where horses have long since bolted. By the time problems show up in the data, the HEI will have known about it for a while and may well have either improved or closed a course if it was genuinely deficient. “Tougher measures” won’t support this in any way, but they might close courses that have turned around.
Positive employment outcomes, meanwhile, will need to show themselves quickly to count. HEIs won’t want to encourage enterprising students to start businesses that may take a few years to mature, earn money for their founders and create wider jobs and prosperity. Because that would be ‘low quality’.
And, of course, the focus on continuation and employment penalises any subject that attracts students who are studying for love of learning rather than for the sake of optimising their employment outcomes.
It also penalises those courses that allow students to do anything other than join a course, stay the duration, and graduate – none of the hop-on-hop-off flexibility that the Government has been urging in other policy initiatives and which the evidence says is needed.
“By definition, some HEIs and subjects will always be less ‘successful’ than others according to the metrics.”
Worst of all, depending on how the ‘tougher measures’ are applied, it is statistically inevitable that a recklessly wielded axe will cut off healthy branches. By definition, some HEIs and subjects will always be less ‘successful’ than others according to the metrics.
There will always be a bottom of the pile to be removed. Someone will always need to be penalised to justify the quality assurance process. Being in the relegation zone in the Premier League doesn’t mean yours is a bad team; it simply means yours is a team whose performance has been relatively less good amongst the very best. Sadly, the sanction for HEIs and courses will not be relegation, but elimination.
If the comparison of what constitutes a ‘low quality’ course is made at subject level, rather than across all HE courses, then some departments with good metrics compared to other subjects will be made to suffer. For example, an engineering course that is ‘low quality’ only in comparison to other engineering courses may be sanctioned, even if its metrics outperform most other disciplines.
If, on the other hand, the comparison is made at HEI level, then certain subject areas will be the victims because their outcomes do not translate easily into highly paid employment. Heads I win, tails you lose.
Ultimately, this is likely to encourage universities to be risk-averse in their admissions, effectively raising the bar for any students who don’t look like those who have been successful in the past, closing down opportunities until only those least in need of a break can get a look-in.
Even if all these proposals were a good idea – and you may have gathered I don’t think they are – this level of oversight by the regulator might not even be legal. I am sure the OfS has consulted its lawyers carefully, but it’s hard to square this with what was intended by the Higher Education & Research Act (HERA).
HERA’s progress through Parliament saw more amendments than almost any other bill ever laid before the Lords and, among those that the Government was forced, reluctantly, to accept was the legal protection of the institutional autonomy of universities. The Lords could see that one day the regulator would be asked to overstep the mark and tried to set protections in stone. These proposals would undermine those protections, undermine that autonomy and – ironically – undermine the very standards of high-quality education, for all who can benefit from it, that they seek to improve.
Do you agree? Do you disagree? Are there other important points to make? How might this affect engineering courses? Please respond to our survey here.