This was roundly confirmed by the call for evidence stage of the review, which garnered 153 responses and is summarised on HEFCE's website. It is also apparent that commentary, debates and controversies about metrics will be with us for some time, whether in the press, on social media or elsewhere; see, for instance, #HEFCEmetrics.
A focus on equality and diversity
A noticeable majority (57 per cent) of those who responded to the review’s call for evidence were negative about increasing the use of metrics.
Common among their concerns was that a further use of metrics could disadvantage under-represented groups: early-career researchers, women, those with disabilities, and black and minority ethnic (BME) academics.
These anxieties have also surfaced through other projects examining the use of metrics. Therefore, to explore the issue in greater depth, the review made equality and diversity the focus of an event held at the University of Sheffield earlier this month.
The workshop – ‘Metrics for all?’ – drew contributions from 45 people, from the academic community, learned societies, research offices, sector bodies, equality and diversity offices, sector press and data providers.
A human touch
What issues did the discussion provoke? A prominent theme was the need to humanise the debate; we need to consider the likely effects of different metrics and assessment structures on the diverse groups of people involved.
Metrics used to assess and measure HE should examine a range of levels and scales, whether institutions, departments or individuals. Moreover, context is paramount – methodologies and ethical considerations should be adapted to suit the scale of enquiry.
Due attention should also be paid to ubiquitous implicit biases, and it is evident that further research on such biases and their effects needs to be done.
Due consideration should be paid to the possible unintended consequences of research assessment regimes, in terms of the potential to change behaviours across different academic populations and scales of enquiry. For instance, metrics can become measures of performance more broadly, affecting higher education institution hiring practices, which may then have profound effects on research careers.
For individual researchers, metrics can unduly shape the character of academic practice. Metrics, in other words, become targets that early-career researchers think they have to meet.
The recent report from the Nuffield Council on Bioethics on the culture of scientific research has underlined some of these concerns.
Differences between disciplines
The issue of context extends to subjects. As disciplines evolve, metrics also need to adapt.
Potential and expected academic career trajectories vary between disciplines. For instance, in many arts and humanities subjects, academics are often employed on teaching-only contracts prior to achieving tenure; the recent Oakleigh Report provides some useful insights on this point. There is also considerable variation between disciplines in terms of expected achievement and academic age.
It is also important to consider which academics are more likely to get involved with inter- or multi-disciplinary projects, and what their associated outputs might look like. We need to consider the sorts of indicators or metrics that could assist rather than obstruct interdisciplinary working, and the kinds of equality and diversity issues that might arise.
Some areas of research seldom make it into the top journals, and this could have knock-on effects for equality and diversity that also need to be borne in mind.
Where does this leave us?
Some potential ways forward were suggested by participants, but first, a word of caution: in exploring options for research assessment, we should not fall into a caricatured comparison of idealised peer review versus a one-dimensional view of metrics.
- ‘Baskets’ of indicators
If metrics are used, ‘baskets’ of indicators should be developed. These should include qualitative and quantitative measures, crafted to suit the contexts under scrutiny. In choosing a ‘diversity of metrics’, biases should be explored and particular care taken to avoid selections which are biased in similar directions.
- Embedding equality and diversity
Within higher education institutions, equality and diversity considerations are often linked more closely to HR than to research. They should instead be better embedded within research integrity, processes and policy, which academics are more likely to understand and embrace with ease.
- Early-career researchers
Greater efforts could be made to develop metrics that more readily identify the value of early-career researchers, whether by assessing the value, quality and significance of the work they have undertaken to date, or by better indicating their future potential.
- Weighting by age
Some asked whether metrics could be weighted by academic age to mitigate potential effects and biases (for example, in research income accrued).
- Involving the sector
Any decisions about the potential use of metrics should be made with the involvement of representatives from the sector and with a keen eye on equality and diversity concerns.
Lessons can be learnt from REF2014, where clear equality and diversity guidance benefited (early-career) researchers and those with complex circumstances.
Training should be provided to assessors and researchers, adapted appropriately to career level. This would help everyone to gain an understanding of the equality and diversity implications involved. There is evidently a place for organisations such as the Equality Challenge Unit and Vitae to assist with this, but across the HE sector, clear leadership and support at the highest levels are crucial if these matters are to be taken seriously.
The review of metrics continues. However, recognising equality and diversity as we take things forward is paramount.
A future workshop focused on metrics and the arts and humanities takes place on 16 January in Warwick. Please note, however, that the event is now fully booked.
Initial recommendations from the review group will be made at the end of March, and a final report is planned for publication in Summer 2015.