How to Show Impact: Highlights from our Essential Research Management workshop

Last week we welcomed research managers to our bi-annual Essential Research Management workshop, aimed at improving understanding and awareness of different aspects of research management and showing how AMRC guidelines can help charities. We heard from the AMRC team, our member charities and a patient on different aspects of peer review, patient and public involvement and research impact. One theme that stood out was the challenge (and solutions) of demonstrating the impact of charity-funded research. In this blog we summarise the discussions and explain how they link to our wider work supporting the sector on this potentially tricky problem.

What do we mean by impact?

Charities are striving to support research that will develop treatments and cures for patients – from understanding the causes of diseases, to supporting the development of new treatments and collating evidence of their health benefits. So impact can mean the publication of a paper that demonstrates a new understanding of a disease, or research findings that influence policy.

How can we measure impact?

All AMRC members have robust mechanisms for funding research, but challenges arise when we try to evaluate its impact. What we measure, when and how are just some of the questions we are often asked. These challenges are compounded by the long time research takes to go from initial funding to directly benefiting a patient, which can make it difficult to link that benefit back to the original charity grant at the lab bench. During the workshop AMRC’s Research Data Analyst, Rachel Burden, explained how impact data can be collected and evaluated across the whole sector so that it can inform individual charities’ research aims.

Impact can be measured through many different outputs, such as the number of publications, intellectual property or influence on policy. These data are normally collected from researchers via annual reports or online tools such as Researchfish (which is free for member charities spending less than £25m per year). In the world of social media, other tools such as Altmetrics can quantify the early influence of research through tweets, blogs and even Reddit posts. This broadens research impact beyond a publication or journal impact factor, which is only a small part of the wider ‘impact puzzle’. These tools then help a charity evaluate its own research portfolio and tell the wider story of the difference its research has made.

How can we evaluate impact?

The workshop highlighted the importance of looking at impact through a wide lens to fully understand the whole research landscape. Many organisations use the Health Research Classification System to code research into health and research categories. This allows organisations to see what they are funding and to identify potential overlaps and opportunities. Other tools, such as Dimensions from ÜberResearch, allow a funder to compare its internal grants with global grants in a similar field. This helps a charity ensure that its funding (and subsequently its impact) can go further.

What can we do with this information?

Once a charity has collected all this data on the impact of its research funding and compared it to the wider research field, what next? Emily Burns, Diabetes UK’s Research Communications Manager, told us how impact data can be turned into a tool for communicating with wider audiences such as supporters, politicians and the public.

It is very important to work with your researchers, volunteers and supporters to help create a story around the research - from the initial identification of the problem, to what the charity did, the outcome and the subsequent impact on patients and the public. Research outcomes must be accessible and engaging, which means tailoring the approach to the audience. It’s also important to find the appropriate channel to communicate the story, making sure that the language and tone are right for the intended audience. We have several tools at our disposal - from social media, websites, the press, television and radio to the colleagues we work with.

‘The key to communicating research effectively is to harness the knowledge and enthusiasm of your researchers and supporters – they’ll be able to help you build an exciting and engaging story. You can maximise your impact by being absolutely clear on your key messages and communicating them in a way that works for your target audience.’ Emily Burns - Research Communications Manager, Diabetes UK

What does AMRC do with impact data?

Every year, AMRC asks for information on all grants awarded by our member charities. This is really important as it helps us to provide an overview of the research being funded across the sector, which we use when talking to policy makers and demonstrating the scale of charity funding in the UK. It’s incredibly powerful – over 40% of publicly funded medical research is supported by AMRC member charities. This allows us to build a strong case when campaigning for the best environment in which to undertake research.

AMRC is currently using Researchfish data (such as publications and collaborations) from 40 of its member charities to create a report showcasing the excellent work health and medical research charities are doing. In the future we hope to collate more output data to enable our members to benchmark themselves against each other and evaluate what type of research they should fund to have the greatest impact. If you are interested in finding out more, contact Rachel.

If you are interested in learning more about measuring impact and evaluating research outcomes, come to our Evaluation and Impact Masterclass. There we will help you understand what impact and evaluation mean for your charity, identify impacts for different types of research funding, and show you different tools and software that can help with impact and evaluation.