How we judge research outputs when making funding decisions
Robert Kiley, Head of Open Research, and Jim Smith, Director of Science, discuss the steps Wellcome is taking to fulfil the principles of the San Francisco Declaration on Research Assessment (DORA).
When we published our open access policy over a decade ago, we made it clear that what counts when we make funding decisions is the intrinsic merit of the work and not the journal or publisher.
Despite significant progress in making our research open access, we know many researchers remain unconvinced that it’s the work that counts and not where it’s published. We’re worried about this and so we’re continuing our efforts to promote best practice in research assessment.
What we’re doing
- Funding for DORA
We were one of the first funders to sign the San Francisco Declaration on Research Assessment (DORA), publicly committing that we will consider all research outputs and look at a broader range of qualitative measures of impact, such as influence on policy and practice. In partnership with other funders and publishers, we are providing funding to DORA to help promote the broad adoption of these principles and to collect evidence of good practice.
- Improvements to our grant application and decision process
We’ve modified our grant application forms so that we now ask researchers to list their research outputs, which may include (but are not limited to) preprints, datasets, software, research materials and inventions, patents and other commercial activities. Previously, we only asked researchers to cite their research publications.
Our forms also invite applicants to:
- describe (where appropriate) their training record and the contributions they have made to their trainees’ career development
- develop an output sharing plan, which describes how outputs will be managed and used to advance potential health benefits
- outline their plans for public engagement.
We make sure our advisory committees are fully aware of our values during their induction. This is to ensure they focus on the content and quality of publications when reviewing applications, rather than their number or the impact factors of the journals in which they were published.
What we’re going to do next
Although we’ve taken steps to improve how we assess funding applications, we know we need to do more.
- We’re working with the government’s Behavioural Insights Team to explore how funders can best support researchers in sharing outputs beyond the traditional research article.
- Through the Open Research Funders Group, we’re co-funding a roundtable with the National Academy of Sciences to see how best we can work together with other funders and institutions to encourage appropriate evaluation and open science.
- We’re exploring how we can support partner institutions to adopt DORA. For example, we could stipulate that the administering organisations of our grant holders sign up to DORA, and provide funding to ensure they have the resources to put it into practice.
A researcher recently lamented in Nature that despite its shortcomings he would "settle for impact factor as the least-bad option" for researcher assessment. Using journal impact factor as a tool for researcher evaluation is enticingly simple – and we suspect that simplicity explains its persistence. But researcher assessment requires a far more nuanced understanding of a researcher’s skills, qualities and attitudes, and these can never be expressed in a single metric.
At Wellcome, we recognise this and seek to be transparent in the approach we’re adopting.
If you'd like to comment on the issues discussed here, email Robert Kiley at r.kiley@wellcome.org.
Related links
- Wellcome to use Researchfish for reporting research outcomes
- What is the best way to decide who to fund?