Reflections on REF

As the dust settles on REF 2021, former EPC President, Prof Sarah Spurgeon, looks back at the process from her insider perspective as an Engineering sub-panel member and interdisciplinary advisor.

REF 2021 is finally over, and the results are out. Whether we were involved as individual researchers, collating submissions for our institutions, or working directly on the assessment panels and sub-panels, we all have a moment to take a breath before the next cycle commences.

Across the whole REF process, the increased selectivity of outputs, combined with the requirement to submit all staff with significant responsibility for research, affected every submission. For Engineering, this meant that the category A headcount rose from 5,279 in 2014 to 7,750 in 2021, an increase of 47%. For the Engineering sub-panel, REF 2021 saw another significant and specific change: the four individual Engineering sub-panels of 2014 (12. Aeronautical, Mechanical, Chemical and Manufacturing Engineering; 13. Electrical and Electronic Engineering, Metallurgy and Materials; 14. Civil and Construction Engineering; and 15. General Engineering) were combined into a single sub-panel (12. Engineering) for REF 2021. This led to a significant decrease both in the number of impact case studies submitted across Engineering (from 621 to 483, -22%) and in the number of submissions (from 138 to 88, -36%).

A key driver for the move to a single sub-panel was to give a single picture of the health of the discipline of Engineering across the UK. This picture was harder to obtain previously because, depending on whether an HEI constituted Engineering as a unified unit or as separate disciplinary units, teams in the same disciplinary area might have been submitted either to one of the more themed sub-panels or to the General Engineering sub-panel. REF 2021 certainly succeeded in this aim, facilitating a unified perspective of Engineering impacts and environments across whole institutions.

Who can fail to be impressed by statistics showing such a high level of overall quality of Engineering research, with 91% of outputs assessed as at least internationally excellent, over 87% of the volume-weighted impact results judged to have very considerable or outstanding reach and significance, and over 85% of the volume-weighted environment results judged to demonstrate vitality and sustainability conducive to producing research of internationally excellent or world-leading quality?

Interdisciplinary research

The assessment of interdisciplinary research (IDR) has been another area of change for the 2021 exercise. An underpinning principle of REF is that all types of research and all forms of research output across all disciplines shall be assessed on a fair and equal basis, including interdisciplinary and collaborative research.

The independent review of REF 2014, chaired by Lord Stern, noted concerns within the research community regarding IDR, which was perceived to have been disadvantaged at assessment. The lower proportion of IDR submitted, relative to what might have been expected, suggested low confidence within the community that IDR would be treated fairly. It is the case, after all, that the Units of Assessment (UoAs) around which REF assessment is built are entirely discipline-based. Lord Stern's review recommended that structures should be implemented to support the submission and assessment of IDR in REF 2021.

Consequently, Professor Dame Athene Donald was appointed to chair what was called the REF Interdisciplinary Research Advisory Panel (IDAP). This was an excellent choice, as Dame Athene Donald is well known for her interdisciplinary research contributions as well as, of course, for her absolute commitment to inclusion. IDAP perceived that the main issues for the assessment of interdisciplinary research arise around output assessment: interdisciplinary work underpinning impact case studies was perceived to have had no issues in REF 2014, while the environment statement provides an opportunity to discuss a unit's approach to interdisciplinary research without including specific metrics. IDAP produced a clear definition of what interdisciplinary research is, and indeed what it is not, and how it might be distinguished from a simple case of cross-referral.

Champions or advisors

A further strand of work related to how the actual assessment should be carried out. Lord Stern had recommended champions, but IDAP felt that this did not convey the right message: interdisciplinary work should not need championing by individuals so much as to be judged on a level playing field.

IDAP therefore recommended that each main panel should have at least one member with interdisciplinary research experience and that each sub-panel should recruit at least two people identified as interdisciplinary advisors; for the Engineering sub-panel, I was one of the appointed interdisciplinary advisors. IDAP also decided that, to support these main panel and sub-panel interdisciplinary advisors, networks should be established with which IDAP would interact. It was felt that this network would provide a forum for discussion and support during the process of interdisciplinary research assessment.

These recommendations were made in 2019, and I can confirm that the network proposal worked extremely well initially. I had the opportunity to take part in network meetings across all main panels to discuss interdisciplinary research and to explore the boundaries between disciplines, while developing a network of peers across a diverse community of units of assessment to support the assessment of interdisciplinary research in the Engineering sub-panel.

Unfortunately, Covid hit and this networking could no longer continue as planned. Covid regulations did not facilitate networking in the same way, and all of those involved in the research assessment were busy not just with the assessment but frequently also with day jobs that had become increasingly complex. So the aspiration with regard to networking around IDR, which showed so much promise initially, was greatly reduced in scope. I hope it will be able to develop to maturity in any future exercises.

IDAP stated that, for the purposes of REF, the definition of interdisciplinary research is that which is understood to achieve outcomes (including new approaches) that could not be achieved within the framework of a single discipline. Interdisciplinary research features significant interaction between two or more disciplines and may move beyond established disciplinary foundations in applying or integrating research approaches from other disciplines.

Originality, significance and rigour

As for any output, interdisciplinary outputs were assessed against the generic criteria of originality, significance and rigour. In addition, IDAP provided guidance to sub-panels that originality and significance can be identified in one, some, or all of the constituent parts brought together in the work, or in their integration; they do not need to be demonstrated across all contributing areas or fields. This is important. It means that the originality does not need to lie in, for example, an engineering method, but can lie in the application of an established method to a new domain.

In terms of observations on submissions to the Engineering sub-panel with respect to interdisciplinarity, the range of research disciplines was wide and there was extensive evidence of interdisciplinarity. Indeed, as we might have expected, there was a welcome increase in the number of interdisciplinary outputs submitted. It was notable that there was inconsistent use of the IDR flag by HEIs: some used the flag extremely liberally, whereas others didn't flag any outputs as IDR. Some units may have left the decision about the IDR flag with the individual academics who wrote particular outputs, which again didn't favour a consistent approach.

Accompanying statements

The 100-word statement that can accompany any output submitted to the Engineering sub-panel is a useful mechanism not just to assist panels in seeing the significance of an output but also to contextualise interdisciplinary research.

For REF 2021, our Engineering sub-panel considered that the nature of the discipline was such that the significance of an output may not be fully evident within the output itself. It therefore invited factual information to be provided (maximum 100 words) that could include, for example, additional evidence about how an output had gained recognition, impacted the state of the art, led to further developments or been applied. Some HEIs did not take advantage of this opportunity. If the 100-word statement continues into future exercises, I would encourage all colleagues to use it and to follow the rules for its use; otherwise the information will be ignored.

Calibration

Calibration across sub-panel members, across different sub-panels and across main panels is an important factor in the research assessment process. For outputs, impact case studies and environment statements, extensive calibration took place, and this calibration was particularly important for interdisciplinary research.

In all cases the main panel calibration sample included items from across all the sub-panels, including a selection of interdisciplinary outputs. These items were then also included in our own sub-panel calibration exercises.

This really ensured there was a shared understanding of standards for assessment across the process. There is a general view that the material covered by all the sub-panels in main panel B is becoming ever more interdisciplinary, and calibration of this kind makes it far easier to recognise quality across multiple disciplinary boundaries.

An Engineering taxonomy

In terms of output allocation, the Engineering sub-panel used an extensive taxonomy that was published in the sub-panel's working methods. For those of us involved in REF assessment on the Engineering sub-panel, I think mention of this taxonomy will raise a smile for some time to come.

It came about for the very best of reasons. The sub-panel started by bringing together the areas described across the original four sub-panels from REF 2014 and then sought to merge this information into a coherent single taxonomy. The taxonomy was then taken back out to the Engineering community for consultation, which caused it to grow again.

With hindsight, and probably as would be expected for a first attempt, this taxonomy wasn't optimal, and I'm sure it's something that will be looked at for future Engineering research assessments.

It became the means by which outputs were allocated and, indeed, the means by which interdisciplinary boundaries were identified. Sub-panel members took on different elements from within the taxonomy to define their own expertise, whilst submitting units identified elements from the taxonomy to associate with each output. This information was used to allocate outputs to individuals algorithmically, ensuring that outputs with interdisciplinary aspects were allocated to assessors with the right expertise.
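The sub-panel's actual allocation method is not reproduced here, but the general idea of matching the taxonomy elements attached to an output against the elements declared by sub-panel members can be illustrated with a minimal Python sketch. The data structures, scoring and load cap below are hypothetical and are not the algorithm the sub-panel actually used.

from collections import Counter

# Hypothetical sketch only: each output carries a set of taxonomy codes chosen
# by the submitting unit, and each sub-panel member declares the codes that
# describe their expertise. Outputs go to the members whose declared expertise
# overlaps most, with a simple cap to spread the assessment load.

def allocate(outputs, members, assessors_per_output=2, max_load=200):
    """outputs: {output_id: set of codes}; members: {name: set of codes}."""
    load = Counter()
    allocation = {}
    for output_id, codes in outputs.items():
        # Rank members by overlap with the output's taxonomy codes,
        # skipping anyone already at the load cap.
        ranked = sorted(
            (m for m in members if load[m] < max_load),
            key=lambda m: len(codes & members[m]),
            reverse=True,
        )
        chosen = ranked[:assessors_per_output]
        allocation[output_id] = chosen
        for m in chosen:
            load[m] += 1
    return allocation

# Toy example with invented codes and anonymous members.
outputs = {"output1": {"control", "robotics"}, "output2": {"biomedical", "materials"}}
members = {"member_a": {"control", "power"}, "member_b": {"robotics", "control"}, "member_c": {"biomedical"}}
print(allocate(outputs, members))

In practice, any real allocation also has to handle conflicts of interest and workload balancing far more carefully than this simple overlap score suggests.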

I think most of us would agree that interdisciplinarity is inherent in Engineering, and it was felt, and found, that the sub-panel had the appropriate membership and processes in place to enable us to assess most of the interdisciplinary work robustly, using the taxonomy to ensure outputs were allocated to the right individual sub-panellists.

The fact that we had the right expertise is reinforced by sub-panel data showing that only 442 outputs were cross-referred out of sub-panel 12 from a total of 18,282 outputs (less than 3%). In fact, 324 of these outputs were cross-referred to other sub-panels in response to a direct request from the submitting HEI, and only a further 120 (less than 1% of the total number of outputs) were cross-referred to provide additional guidance for the assessment of interdisciplinary research.

Reflections

The outcomes for Engineering as reported earlier were fantastic. Interdisciplinary research was found to have become mainstream in our sub-panel as we would expect. Perhaps disappointingly, there is no specific data on interdisciplinary research as a result of the sub-panel assessment; issues with inconsistent use of the interdisciplinary flag by the higher education institutions were unhelpful in this regard.

The taxonomy, with further work, could be another approach to delivering this in the future. The sub-panel received outputs spanning the full breadth of Engineering, ranging from the highly theoretical to the highly applied, but we observed that a high proportion of the outputs submitted were theoretical. Does this indicate that interdisciplinary research outputs are still perceived as a greater risk in terms of assessment when compared with a strong theoretical paper with a robust 100-word statement supporting wider application in potentially interdisciplinary contexts?

I've commented on the decrease in the number of submissions and the number of impact case studies. For the Engineering community, we should continue to ask whether the changes for REF 2021 have inadvertently had a negative impact on smaller units.

For REF 2021 the minimum number of impact case studies to be submitted per unit of assessment was two. In general, the number of impact case studies required is one case study plus one further case study for each 14.99 FTE in a UoA. This rule applies for the first 105 FTE returned in a UoA; after that, one further case study is required per 50 FTE returned. The word limit for the environment statement is given in Table 1. The size of the units of assessment submitted to the Engineering sub-panel varied from fewer than 10 FTE to more than 400.

A hypothetical unit with 10 FTE would need to submit two impact case studies and would have 8,000 words to describe its research environment. Another hypothetical unit with 300 FTE would need to submit 12 case studies and would have 21,600 words to describe its research environment. So, for an increase in staff numbers of 30x, the required number of impact case studies increases by 6x and the research environment word allowance by 2.7x.

This has the potential unintended consequence of giving larger UoAs a better chance to be selective than smaller UoAs have. This may be a consequence of an assessment that seeks to assess, among other things, vitality and sustainability. Further analysis is certainly required of the impact of this nonlinearity in reporting requirements versus UoA size, which I am sure the EPC will be looking at in greater depth in the coming period.

Number of category A FTE staff     Word limit for environment template
1-19.99                            8,000
20-29.99                           8,800
30-39.99                           9,600
40-49.99                           10,400
50-69.99                           11,200
70 or more                         12,000, plus 800 further words per additional 20 FTE

Table 1 Word limits for the unit-level environment template
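To make the nonlinearity concrete, the rules described above and in Table 1 can be written down as a short Python sketch. The handling of partial 20 FTE and 50 FTE increments is an assumption on my part, chosen so that the sketch reproduces the worked examples given earlier (10 FTE giving two case studies and 8,000 words; 300 FTE giving 12 case studies and 21,600 words); the official guidance should always be consulted for the exact band boundaries.

import math

def required_case_studies(fte: float) -> int:
    # One case study plus one further case study per (started) 15 FTE for the
    # first 105 FTE, then one further per (started) 50 FTE, minimum of two.
    n = 1 + math.ceil(min(fte, 105) / 15)
    if fte > 105:
        n += math.ceil((fte - 105) / 50)
    return max(2, n)

def environment_word_limit(fte: float) -> int:
    # Bands taken from Table 1; above 70 FTE, 800 further words are added for
    # each additional (started) block of 20 FTE.
    bands = [(19.99, 8000), (29.99, 8800), (39.99, 9600), (49.99, 10400), (69.99, 11200)]
    for upper, words in bands:
        if fte <= upper:
            return words
    return 12000 + 800 * math.ceil((fte - 70) / 20)

for fte in (10, 300):
    print(fte, required_case_studies(fte), environment_word_limit(fte))
# Prints 10 2 8000 and 300 12 21600, matching the worked example above.

Treat this as an illustration of how the reporting burden scales with unit size rather than as a submission-planning tool.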

Another area for the future, which the EPC has already highlighted in its response to the Future Research Assessment Programme consultation, is the need for REF submission systems to evolve to support the identification of interdisciplinary research and its allocation for assessment, particularly for large sub-panels like Engineering. There are good algorithmic and data analysis approaches to help with this, and such innovations may help us to collect robust data, particularly on interdisciplinary outcomes.

In conclusion, we should acknowledge that the colleagues on our Engineering sub-panel did a fabulous job for our community in what were unusually difficult circumstances. We can all have confidence in the outcomes.

Professor Sarah Spurgeon OBE FREng is Head of Department of Electronic and Electrical Engineering at University College London, a member of the EPC Research, Innovation and Knowledge Transfer (RIKT) Committee, and a Sub-panel 12 Engineering Member and Interdisciplinary adviser for REF 2021.
