Can you prove the impact of your research?

For those involved with the development of REF2014 impact case studies, one of the most common challenges was how to show evidence of research impact effectively. Many had participated in engagement and knowledge exchange activities, but it seemed that very few had captured or documented their activities, or kept evidence of their impact.

At events with REF impact officers across the country, the challenge of gathering evidence came up time and again. We worked through the retrospective capture of evidence, but we also knew that we had to start thinking about how to capture evidence of impact more broadly, not least so that we would be better prepared for a future REF exercise.

Now, over two years since institutions submitted their REF returns, we feel that we should have learnt lessons. We should, as a sector, have the expertise to demonstrate the impact of our research. But it’s not that simple – it’s still an issue with which colleagues wrestle.

What to collect

The most common question I’ve faced is ‘What should I collect?’ The only answer I have to this is, ‘it depends on what you want to know or show’.

This, I think, goes to the heart of why it can be particularly challenging for universities to provide guidance on gathering evidence of impact – there aren’t simple answers that can be widely used. What can be gathered will depend on the project, the hoped-for impact, the relationship with relevant groups and communities, and the time and money available.

When I’ve tried to provide support for evidencing impact, my approach has been to create frameworks of questions for researchers and academics to consider, to help them identify what is most useful.

When to collect

When to gather evidence is also challenging. We might hope, ideally, to collate evidence throughout the life of a project, but often its impact, and so the evidence that something has changed, is not available until many years later.

So pursuing the evidence is a commitment. It means keeping track of developments, and maintaining the relationships that may be required to access the evidence. Of course, not all of an individual’s or group’s research will lead to impact. In which case, how should they identify what has the most potential? What should be the focus for any follow-up? For how long should we keep following up with partners in the hope of impact?

Research relationships

The questions of what to gather and when are tied to the relationships with those the research affects or influences. Impact relies on someone else taking the research produced in our universities and doing something with it.

This means that institutions don’t own any change or benefit that might follow. Private companies operate in a competitive environment and can be reluctant to share information, especially if there isn’t a clear purpose or reason for the institution holding the information.

Often, policy documents don’t properly reference the academics or the papers that have informed the content of the policy. So tracing the link from research to policy can be challenging. How much should we push for proper attribution or evidence from policy partners? Or should we be satisfied knowing that our research is impactful and influential, even if we can’t then demonstrate it well to others?

Personally, I also think that we should focus on high-quality engagement, based on high-quality research. Collecting good evidence and evaluating it to see what we have achieved is important. But it shouldn’t limit or direct how we engage and to what standard.

In particular, we don’t want to end up with the tail wagging the dog, where we only engage where we think we can get good evidence.

How we can support impact

So how do we, as institutions, as people who support academics and researchers to have impact, tackle these challenges? The principles for me are:

  • Proportionality
    Capture what can be captured, and accept that this may not be the full story.
  • Relationships
Be clear about what you might need from collaborators and partners early on, and if you would like a testimonial from them, be prepared to offer something back.
  • Focus
There is little value in capturing absolutely everything possible around an activity. If researchers and academics know what they want to achieve and what they hope the outcomes will be, it becomes clearer what evidence should be collected.

This is just a starting point, and these issues will be explored today at a HEFCE workshop on Capturing Evidence of Research Impact. The event will provide a platform for some of us who were involved in REF2014, and who work in this area, to share our experience of capturing evidence of research impact across all subjects and disciplines.

After the workshop, the plan is to create guidance for the sector, which will explore how best to evidence impact, not just for REF, but beyond. By bringing a range of people together to explore this issue, we can hopefully all get closer to answering the evidence challenge. We encourage people to participate in these discussions, and invite comments on this post and at the event.


Elizabeth Garcha is the Knowledge Exchange and Impact Manager in the Faculty of Social Sciences at the University of Sheffield, and is the ARMA Special Interest Group co-champion with Julie Bayley. The views expressed in this post are, however, personal and do not reflect those of the University of Sheffield.