REF 2021: Guidance Released

This afternoon HEFCE released its decisions on staff and outputs for REF 2021. Our very own Ruth Hattam digested the details over lunch and has come up with a useful bite-size summary. If that’s not enough REF for you, you can always read the full document (it’s only 19 pages): http://www.ref.ac.uk/media/ref,2021/downloads/REF%202017_04%20Decisions.pdf

  • All staff with significant responsibility for research must be returned to REF, provided they are independent researchers. As expected, institutions not going with 100% submission will be able to determine the criteria for identifying staff with significant responsibility. My reading of the guidance is that it will be possible to apply different criteria for different UoAs, although the rationale for all decisions will need to be captured in the Code of Practice (guidance summer 2018, provisional submission spring 2019). Further guidance on significant-responsibility criteria (determined in conjunction with the main panels) will form part of the Guidance on submissions and Panel criteria, the final versions of which will not be available until January 2019.
  • The criteria for identifying independent researchers will build on the REF 2014 definition and will be developed with the main panels.
  • ORCID is strongly encouraged but not mandated.
  • Average number of outputs is 2.5 per FTE submitted.
  • Minimum of one, maximum of five outputs per member of staff (this is a ‘soft’ five, with no limit on co-authored papers). Staff can be returned with zero outputs through an individual circumstances process. A unit of assessment can also make a case for ‘unit circumstances’ to reduce the overall number of outputs required.
  • Impact case studies: two for units of up to 15 FTE, then one for each additional 15 FTE up to 105 FTE, beyond which one additional case study is required per 50 FTE (see the sketch after this list).
  • The staff census date is 31 July 2020. HEFCE intends to work with HESA ‘to enable close alignment between the information collected in the staff record and the submission requirements of REF’.
  • Output portability follows the simplified model (i.e. outputs can be returned by both the current and the previous institution, with some caveats). 85% of the 157 respondents supported this model.
  • The original Open Access requirement (i.e. deposit within three months of the date of acceptance) will be implemented from April 2018, although there will be the opportunity to record exceptions where deposit takes place within three months of publication.
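
To make the outputs and case-study arithmetic above concrete, here is a minimal sketch in Python based solely on the figures in this summary. The function names and the rounding-up at partial FTE bands are assumptions for illustration; the official Guidance on submissions will define the exact thresholds.

    import math

    def required_outputs(fte):
        """Total outputs a unit must return: an average of 2.5 per FTE
        submitted. Individuals contribute between one and five outputs
        each, subject to the circumstances processes described above."""
        return 2.5 * fte

    def impact_case_studies(fte):
        """Case studies required under the tapering rule above: two for
        units up to 15 FTE, one more per additional 15 FTE up to 105 FTE,
        then one more per additional 50 FTE beyond that."""
        if fte <= 15:
            return 2
        if fte <= 105:
            return 2 + math.ceil((fte - 15) / 15)
        return 8 + math.ceil((fte - 105) / 50)  # 8 are required at 105 FTE

    print(required_outputs(20))     # 50.0 outputs for a 20 FTE unit
    print(impact_case_studies(20))  # 3 case studies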



HEFCE’s David Sweeney Blogs: Who Is Research Active?

HEFCE’s Director of Research, Education and Knowledge Exchange, and one of the key architects of the next REF, David Sweeney, has published an interesting blog about who counts as “research active”, ahead of the closure of the REF 2021 consultation tomorrow. He suggests there has been a lot of “push back” from the HE sector on the consultation proposal that “research active” should be determined by contractual status, and that there should instead be an evidence-based and mutually agreed approach. It’s not clear what this would be, but it is interesting to see that there is already movement on this even before consultation responses are in.


HEFCE Adjust REF Open Access Policy

HEFCE have today published a letter to universities outlining key changes to their Open Access policy for the next REF. These include: the deposit-on-acceptance requirement now comes into force in April 2017, rather than 2016; there is now an exception to the deposit requirement for outputs available via Gold OA; and “inadvertently non-compliant” outputs can be made compliant retrospectively.
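
For illustration only, here is a minimal sketch in Python of the deposit-timing rule as described in these posts: deposit within three months of acceptance, with deposit within three months of publication recordable as an exception. The function name, the argument names, and the reading of the windows as calendar months are assumptions for the example, not HEFCE’s wording.

    from datetime import date
    from typing import Optional
    from dateutil.relativedelta import relativedelta  # third-party: python-dateutil

    def deposit_compliant(accepted: date, deposited: date,
                          published: Optional[date] = None) -> bool:
        """Sketch of the deposit-timing rule described above."""
        # Core requirement: deposit within three months of acceptance.
        if deposited <= accepted + relativedelta(months=3):
            return True
        # Recordable exception: deposit within three months of publication.
        if published is not None and deposited <= published + relativedelta(months=3):
            return True
        return False

    # Example: accepted 1 Feb, published 1 Jun, deposited 15 Jul -> exception applies
    print(deposit_compliant(date(2017, 2, 1), date(2017, 7, 15), date(2017, 6, 1)))  # True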


The Use of Metrics in Research Assessment

Ruth Hattam (Assistant Director for Research) recently attended a session on the prospects and pitfalls of using metrics in research assessment. The event was hosted by SPRU (Science Policy Research Unit) at the University of Sussex, which is undertaking the HEFCE review of metrics; the report is due in June 2015. There was no indication of the likely outcomes, and Steven Hill (Head of Research Policy, HEFCE) was keen to stress that no decision had been made about metrics and the next REF.

The event was well balanced, with a variety of views on the issue from a number of speakers. There seemed to be a broad consensus that metrics alone should not be used to assess research, with general support for a mix of qualitative and quantitative approaches, although which should come first, or have prominence, was not resolved.

As an observation, those speakers with an interest in promoting metrics were careful to stress that metrics are only one indicator, whilst some speakers on the other side of the debate were more forceful in their criticism, arguing that metrics are an unreliable means of assessment. One speaker used his own citations to illustrate this point, asserting that his most frequently cited articles did not correlate with his best research. Some of the other general discussion points included:

  • metrics could only potentially be useful as an indicator of significance among the three REF criteria for outputs (originality, significance and rigour);
  • issues around impact metrics;
  • peer review is far from a perfect system and is potentially subject to individual bias;
  • the public interest should dominate;
  • the use of altmetrics (e.g. social media and blog posts, i.e. anything that isn’t citation-based).

The event featured a ‘metrics bazaar’, which allowed participants to explore metric tools and platforms with a range of developers and providers. Of particular interest was an overview of ‘The Conversation’, an independent source of news and views sourced from the academic and research community and delivered directly to the public.

The afternoon session explored the ‘darker side of metrics’, although the speakers did not perhaps delve into some of the gaming practices which have been unearthed (e.g. the self-citation malpractice uncovered at the Journal of Business Ethics). Some of the discussion points included:

  • the number of retractions is on the rise, including in ‘prestigious’ journals;
  • the sector has to be realistic and accept the principle of measurement, as other publicly funded sectors (e.g. health) have done;
  • the use of metrics would potentially change behaviour;
  • the term ‘metrics’ should be replaced by the term ‘indicators’;
  • arts and humanities academics need to engage in the debate.


Analysis of REF Impact Case Studies

The Policy Institute at King’s College London and Digital Science will be analysing the case studies submitted to the REF to illustrate the impact of research beyond academia. The analysis has been commissioned by the Higher Education Funding Council for England and its partners in the REF exercise. The work aims to maximise the value of the 6,975 case studies as a separate resource, analysing them and identifying what they can show about the wider impact of the research conducted by UK universities. The case studies will be made freely available for analysis in a database to be hosted on the HEFCE website. The outputs will be a well-curated, readily searchable database of the case studies, an overview report describing the strength of UK science, and a view on how the case study approach works in assessing and auditing impact.

The exercise will not affect the quality-related block funding awarded to institutions by the government, as the case studies have already been incorporated into that process.

See more at: https://www.researchprofessional.com/0/rr/news/uk/ref-2014/2014/9/King-s-to-lead-assessment-of-REF-impact-case-studies-.html


Survey to help major research project understand how you use books

OAPEN-UK, an AHRC and Jisc-funded project on open access monographs, is currently running a survey to understand how researchers in the humanities and social sciences use books, and especially monographs.

The survey design has been informed by a range of funders including HEFCE and Jisc, and the findings will help build an evidence base for future policies to support monograph publishing in the UK.

No identifiable data will be made public or shared beyond the OAPEN-UK project team. All respondents to the survey can enter a prize draw to win up to £100 of Amazon vouchers.

I hope you’ll spare 10-15 minutes to participate, and to help the researchers understand what you want as both authors and readers of books.

The survey can be found here: https://www.surveymonkey.com/s/K96XZD5 

The deadline for completed surveys is 6th June.

If you have any questions, please contact the survey researcher, Ellen Collins, on ellen.collins@researchinfonet.org.
