REF 2021: Guidance Released

This afternoon HEFCE released the decisions on staff and outputs for REF 2021. Our very own Ruth Hattam was digesting the details over lunch and has come up with a useful bitesize summary. If that’s not enough REF for you, you can always read the full thing (it’s only 19 pages): http://www.ref.ac.uk/media/ref,2021/downloads/REF%202017_04%20Decisions.pdf

  • All staff with significant responsibility for research are returned to REF provided they are independent researchers.  As expected, institutions not going with 100% submission will be able to determine the criteria for identifying staff with significant responsibility.  My reading of the guidance is that it will be possible to consider different criteria for different UoAs, although the rationale for all decisions will need to be captured in the Code of Practice (guidance summer 2018, provisional submission spring 2019).  Further guidance on significant responsibility criteria (determined in conjunction with main panels) will form part of the Guidance on submissions/Panel criteria – the final versions of which will not be available until January 2019.
  • The independent researcher criteria will build on the REF 2014 definition and will be developed with the main panels.
  • ORCID is strongly encouraged but not mandated.
  • Average number of outputs is 2.5 per FTE submitted.
  • Minimum of one, maximum of five outputs per member of staff (this is a ‘soft’ cap of five, with no limit on co-authored papers).  Staff can be returned with zero outputs through an individual circumstances process.  The unit of assessment can also make a case for ‘unit circumstances’ to reduce the overall number of outputs required.
  • Impact case studies: two for units of up to 15 FTE, then one additional case study for each further 15 FTE up to 105 FTE, after which one additional case study is required per 50 FTE (see the short worked example after this list).
  • Staff census date is 31 July 2020.  HEFCE intend to work with HESA ‘to enable close alignment between the information collected and the staff record and the submission requirements of REF’.
  • Output portability will use the simplified model (i.e. outputs can be returned by both the current and the previous institution, with some caveats); 85% of the 157 respondents supported this model.
  • The original Open Access requirement (i.e. deposit within three months of the date of acceptance) will be implemented from April 2018, although there will be the opportunity to record an exception where deposit takes place within three months of publication.
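
For anyone trying to gauge the scale of a submission, the output and impact requirements above boil down to some simple arithmetic. Below is a minimal Python sketch based on my reading of the summary; the function names, rounding and exact thresholds are my own illustrative assumptions, not anything defined in the guidance.

```python
import math

def required_outputs(fte):
    """Total outputs for a unit: an average of 2.5 per FTE submitted
    (each individual contributes between one and five of these)."""
    return round(2.5 * fte)

def required_case_studies(fte):
    """Impact case studies: two for units of up to 15 FTE, one more for
    each further 15 FTE up to 105 FTE, then one more per additional 50 FTE.
    Thresholds follow my reading of the summary above - illustrative only."""
    if fte <= 15:
        return 2
    if fte <= 105:
        return 2 + math.ceil((fte - 15) / 15)
    return required_case_studies(105) + math.ceil((fte - 105) / 50)

# Example: a unit returning 40 FTE
print(required_outputs(40))       # 100 outputs in total
print(required_case_studies(40))  # 2 + ceil(25/15) = 4 case studies
```

So, on these assumptions, a unit returning 40 FTE would need around 100 outputs and four impact case studies; the final Guidance on submissions will confirm the exact figures.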


Research Councils in 2013: BBSRC, STFC, Outputs and Audits

Image: Star trails and Star tails by Joe Dsilva, CC BY-NC-SA 2.0

Following the 2-part series last month summarising our visit to RCUK HQ in Swindon, our Assistant Director of Research, Ruth Hattam, has written the following summary of a few of the parallel sessions which she attended. This provides insights into the work and priorities of BBSRC and STFC, as well as a summary of discussions around outputs and the Research Councils’ approach to audits and assurance.

Gerald Owenson from BBSRC outlined forthcoming funding opportunities, both responsive mode (three calls per year) and open calls.  Three strands (basic, strategic and applied research) mapped to the new investigator scheme, Industrial Partnership Awards (IPA) and stand-alone ‘LINK’ schemes respectively.  The latter two schemes involve collaboration between academia and industry: IPAs supported more speculative research, with industry meeting 10% of the full economic cost (fEC), while LINK schemes were more market-focussed, with industry contributing 50% of the fEC.

Other initiatives highlighted were the ‘Excellence with Impact’ awards, opportunities through FAPESP (collaboration bids with Brazil) and the ISIS scheme to support academics making contact with international counterparts.

Unlike the rest of the Research Councils, STFC does not embed impact funding in its awards.  The STFC staff talked about consolidated grants (limited to one bid per department per year) and consortium grants (joint consolidated proposals).  A scheme for new applicants is also available.

STFC Innovation funding was being reviewed this year but currently comprised the Innovations Partnership Scheme (IPS), which involves some industry collaboration; the Challenge Led Applied Systems Programmes (CLASP); and IPS Fellowships, which co-fund technology transfer staff to work on knowledge exchange from STFC-funded research.  The CLASP scheme was worth £1.5m and was based on collaboration with industry and other disciplines to address global challenges in the areas of Energy, Environment, Healthcare and Security.  These challenges also formed the basis of the Futures Programme, which aims to initiate projects based on STFC’s strengths and capabilities through three routes: networks, concepts and studentships.

Output, Outcome and Income Data

There are currently two main systems in which research outcomes data can be stored: Researchfish and the Research Outcomes System (ROS).  RCUK is working to align and exchange data between the two systems.

Dr Ian Viney from MRC gave an overview of Researchfish, which evolved from the MRC e-Val system.  Currently 6,500 researchers use Researchfish, and nine research-intensive institutions have subscribed.  The data stored mainly relate to MRC and STFC projects.

Ben Ryan from EPSRC gave an overview of ROS, which is used by the other five Research Councils.  ROS is open source, with both individuals and institutions able to input data on a broad range of outcomes.  Bulk uploads are possible, the system handles bibliometric data well, and records can be updated or added to even several years after project completion.  From 1 January 2013 all final reports on EPSRC grants should go straight onto ROS, with the other Research Councils following suit soon.  ROS will provide certain data to the Gateway to Research system.  The question of whether a single research outcomes system was envisaged was raised, but not directly answered.

Audit and Assurance Processes (Gareth McDonald, Associate Director of the Audit and Assurance Services Group (AASG))

The AASG is newly formed from the merger of Research Councils UK Assurance and the Research Councils Internal Audit Service.   This was a lively last session focussing on assuring compliance with the terms and conditions of awards.  Transparency of approach was emphasised, as was the AASG aim of helping and supporting institutions rather than policing them.  Emphasis was on whether effective control systems were in place rather than the detail of individual transactions.  During an audit visit, the AASG would: assess the regularity of expenditure for around 50 grants; review the use and application of TRAC methodology (ensuring that the institution was complying with minimum requirements of TRAC and that costs calculated were appropriate);  assess the effectiveness of communications (within the institution and with RCUK); and examine the control environment.  There would be increased emphasis on procurement and value for money, but also on non-financial assurance (research integrity and ethics).

A model of ‘Pillars of Funding Assurance’ supported by the control environment at the institution was outlined, with the specific areas of scrutiny listed.

The AASG will be adopting a revised audit methodology.  Research organisations will be asked to undertake an annual self-certification exercise using a template based on the ‘pillars of funding assurance’, and the completed templates will inform the visit schedule.  Research-intensive institutions can expect the most frequent visits, as they receive the majority of funding, while institutions with low levels of RCUK funding will be audited less frequently (likely less than one visit every two years).  The measurement criteria and definitions were still being established, but there are likely to be three grades: substantial confidence, satisfactory and no confidence.
