HEFCE’s Director of Research, Education and Knowledge Exchange, and one of the key architects of the next REF, David Sweeney, has published an interesting blog about who counts as “research active” ahead of the closure of the REF2021 consultation tomorrow. He suggests there has been a lot of “push back” from the HE sector on the consultation proposal that “research active” should be determined by contractual status, and that there should instead be an evidence-based and mutually agreed alternative. It’s not clear what this would be, but it is interesting to see that there is already movement on this even before consultation responses are in.
As you have no doubt heard, the Higher Education Funding Council for England has launched a consultation on the next Research Excellence Framework. This includes proposals intended to streamline the REF process and make it less burdensome for UK universities whilst maintaining and improving incentives for research excellence. It includes recommendations relating to: the submission of staff and outputs, the approach to the assessment of impact, and the introduction of an institutional level assessment.
Northumbria University is preparing a response and would like to give the ECR community an opportunity to feed into this. We are holding an event on Tuesday 14th February from 1.00-2.30 in room 209, Sutherland Building, City Campus. If you are able to attend, please sign up on this doodle poll to reserve your place: http://doodle.com/poll/fdkxmmr7mvaw8mee
Responses to the consultation must be submitted by midday on Friday 17 March 2017. Full details of the consultation can be found on the HEFCE website here.
The Stern Review of the Research Excellence Framework has been published today.
Broadly speaking, the review recommends keeping the REF much the same as before – i.e. still a periodic exercise based predominantly on peer review rather than metrics, with the weightings of outputs, environment and impact broadly unchanged. This is not a proposal to radically overhaul the system.
However, the review does contain some fairly significant tweaks which will have implications for both academics and research managers, if they are adopted. The review makes the following recommendations (in bold below – followed by my own comment and reflections):
- All research active staff should be returned in the REF: This was a contentious issue going by the responses to the consultation on the REF, with some concerned that it could lead to a greater distinction between staff on teaching only contracts and those whose contracts include research responsibilities. On the positive side, it should lead to less burdensome selection processes for HEIs and reduce the negative stigma of not being “REF-able”.
- Outputs should be submitted at Unit of Assessment level with a set average number per FTE but with flexibility for some faculty members to submit more and others less than the average: The suggestion in the review is that the minimum should be 0 and the maximum 6 per FTE submitted, but further work will be required to model this so that it doesn’t lead to a large increase in work for panels.
- Outputs should not be portable: This has caused the largest outcry on Twitter following the publication of the review, particularly among early career researchers, many of whom argue that it will make it much more difficult to get new jobs as their previous publications will not count towards the next REF. On the other hand, the review makes the case that this will discourage the so-called “transfer market” of REF staff before the deadline.
- Panels should continue to assess on the basis of peer review. However, metrics should be provided to support panel members in their assessment, and panels should be transparent about their use: The report recognises that bibliometric data is not appropriate for use in all units of assessment (following the Metric Tide review), but that it can be used “judiciously” to help panels in their peer review assessment.
- Institutions should be given more flexibility to showcase their interdisciplinary and collaborative impacts by submitting ‘institutional’ level impact case studies, part of a new institutional level assessment: This is a genuinely new part of the assessment and perhaps reflects the slightly amorphous nature of research impact assessment, which in many cases is difficult to tie down to a particular body of work which fits neatly within the boundaries of a single UoA.
- Impact should be based on research of demonstrable quality. However, case studies could be linked to a research activity and a body of work as well as to a broad range of research outputs: Again this appears to be about increasing the flexibility of what counts as impact and reducing the instrumental approach of linking research outputs directly to impacts. How this will play out in reality will, I imagine, depend heavily on precisely how this is interpreted in the guidance, assuming the recommendation is adopted.
- Guidance on the REF should make it clear that impact case studies should not be narrowly interpreted, need not solely focus on socioeconomic impacts but should also include impact on government policy, on public engagement and understanding, on cultural life, on academic impacts outside the field, and impacts on teaching: The fact that the review contains three recommendations wholly focusing on impact shows that this element of the assessment is still critical. Several people have pointed out on ARMA mailing lists that the guidance for REF2014 already allowed these kinds of impacts (except perhaps for academic impacts), so this might partly be about emphasising to institutions and panels that these are eligible impacts and should be taken seriously.
- A new, institutional level Environment assessment should include an account of the institution’s future research environment strategy, a statement of how it supports high quality research and research-related activities, including its support for interdisciplinary and cross-institutional initiatives and impact. It should form part of the institutional assessment and should be assessed by a specialist, cross-disciplinary panel: As widely predicted, the REF3a “impact statement” is part of a wider statement about research environment, though at the institutional level rather than the UoA level.
- That individual Unit of Assessment environment statements are condensed, made complementary to the institutional level environment statement and include those key metrics on research intensity specific to the Unit of Assessment: Recommendations 8 and 9 are listed together in the review and they do appear to complement each other. The focus at UoA-level (9) appears to be on shorter, punchier metrics-based evidence, while at institutional level (8) it is on more narrative-based plans and strategies.
- Where possible, REF data and metrics should be open, standardised and combinable with other research funders’ data collection processes in order to streamline data collection requirements and reduce the cost of compiling and submitting information: The focus here is on reducing the burden of the REF, and the review here acknowledges current events in the form of the TEF and the uncertainty about the future relationship between the UK and the EU. With the emphasis on standardisation and streamlined data collection, there is surely a role here for organisations like Jisc and CASRAI.
- That Government, and UKRI, could make more strategic and imaginative use of REF, to better understand the health of the UK research base, our research resources and areas of high potential for future development, and to build the case for strong investment in research in the UK: Ant Bagshaw suggests in his WonkHE post that this appears to be a “bit of a cheeky request” for cash.
- Government should ensure that there is no increased administrative burden to Higher Education Institutions from interactions between the TEF and REF, and that they together strengthen the vital relationship between teaching and research in HEIs: This again returns to the theme of the review which is to reshape the REF to reduce the burden on HEIs.
The timetable suggested by the review on p32 is also instructive and suggests a lot of work lies in store for the Government and Funding Councils: a consultation on concrete proposals for the next REF by the end of 2016, with decisions made in the summer of 2017. This will also need to be “checked for consistency” with the TEF, as the two exercises will evolve in parallel. The review suggests that this timetable could see a deadline for submissions by the end of 2020, with the assessment itself taking place in 2021.
HEFCE have today published a letter to universities outlining key changes to their Open Access policy for the next REF. These include: deposit on acceptance now comes into force in April 2017, rather than 2016; there is now an exception to the deposit requirement for outputs available via Gold OA; “inadvertently non-compliant” outputs can be made compliant retrospectively.
Why not sign up to attend this informative one-hour session run by Lucy Jowett, Research Impact Manager (RBS), on Tuesday 3rd March 2015, 12 – 1pm, City Campus?
The session will encourage researchers from all disciplines and at various career stages to think actively about how they will achieve excellence with impact, and will detail the support available.
For more information visit the People Development pages on the Northumbria intranet.
Ruth Hattam (Assistant Director for Research) recently attended a session about the prospects and pitfalls around the use of metrics in research assessment. The event was hosted by SPRU (Science and Policy Research Unit) based at the University of Sussex, which is undertaking the HEFCE review of metrics, the report for which is due in June 2015. There was no indication of the likely outcomes, and Steven Hill (Head of Research Policy, HEFCE) was keen to stress that no decision had been made about metrics and the next REF.
The event was well-balanced with a variety of views on the issue from a number of speakers. There seemed to be a broad consensus that metrics alone should not be used to assess research, with general support for a mix of qualitative and quantitative approaches, although which should come first, or have prominence, was not resolved.
As an observation, those speakers with an interest in promoting metrics were careful to stress that metrics are only one indicator, whilst some speakers on the other side of the debate were more forceful in their criticism of the use of metrics, arguing that they were an unreliable means of assessment. One speaker used his own citations to illustrate this point, asserting that his most frequently cited articles did not correlate with his best research. Some of the other general discussion points included: metrics could only potentially be useful as an indicator of significance in the three REF criteria for outputs (originality, significance and rigour); issues around impact metrics; peer review is a far from perfect system, potentially subject to individual bias; the public interest should dominate; use of Altmetrics (e.g. social media, blog posts – anything that isn’t citation-based).
The event featured a ‘metrics bazaar’ which allowed participants to explore metric tools and platforms with a range of developers and providers. Of interest was an overview of ‘The Conversation’, which is an independent source of news and views sourced from the academic and research community and delivered direct to the public.
The afternoon session explored the ‘darker side of metrics’, although the speakers did not perhaps delve into some of the gaming practices which have been unearthed (e.g. the self-citation malpractice uncovered at the Journal of Business Ethics). Some of the discussion points included: the number of retractions is on the rise, including in ‘prestigious’ journals; the sector has to be realistic and accept the principle of measurement, as other publicly funded sectors (e.g. health) have; the use of metrics would potentially change behaviour; the term ‘metrics’ should be replaced by the term ‘indicators’; and arts and humanities academics need to engage in the debate.
The Policy Institute at King’s College London and Digital Science will be analysing the case studies submitted to the REF to illustrate the impact of research beyond academia. The analysis has been commissioned by the Higher Education Funding Council for England and its partners in the REF exercise. The work aims to maximise the value of the 6,975 case studies as a separate resource, analysing them and identifying what they can show about the wider impact of the research conducted by UK universities. The case studies will be made freely available for analysis in a database to be hosted on the HEFCE website. The outputs will be a well-curated, readily searchable database of the case studies, an overview report describing the strength of UK science, and a view on how the case study approach works in assessing and auditing impact.
The exercise will not affect the quality-related block funding awarded to institutions by the government, as the case studies have already been incorporated into that process.
Remember when HEFCE suggested replacing peer review with metrics after RAE2008? They backed off fairly quickly following strong opposition, and the REF ended up, like the RAE, largely based on peer review. Well, now that REF2014 is over, talk of metrics-based research assessment is back and HEFCE have just announced an independent review, chaired by Prof James Wilsdon. It will report by spring 2015. More from THE [£] and Research Fortnight [£].
HEFCE has today released its open access policy for the post-2014 Research Excellence Framework.
I was intending to write a short summary this morning, but I’ve already been beaten to it by the ever-excellent Martin Eve, who’s got a “Really Short Version” in four bullets, and an extended abridged version if you want to dig a little deeper.
Martin’s full summary is posted below thanks to the magic of open access (Creative Commons Attribution license to be specific). This guide is aimed at academic staff, so it doesn’t cover some of the stuff around “discoverability” of outputs since that will largely be handled by institutional repository staff. (Incidentally, you shouldn’t worry about the veracity of this account – Martin’s version has already been praised by Ben Johnson, the HEFCE policy manager who co-authored the original policy, on Twitter):
The Really Short Version:
- Submit journal article.
- Check journal policy at http://www.sherpa.ac.uk/romeo/ (the bits on “post-print” are the thing to pay attention to).
- On acceptance, go to your institution’s repository and create a record. Upload your author’s accepted version, setting the embargo as per SHERPA/RoMEO [at Northumbria, repository staff are available to help with this and check it’s set correctly].
- That’s it.
The Geekier Longer Version
Source: “Policy for Open Access in the Post-2014 Research Excellence Framework”, March 31, 2014
- Applies to: all journal articles and published conference papers (with an ISSN) accepted after 1 April 2016. (Paras 11, 13)
- Exempted: monographs (“and other long-form publications”), edited collections (without an ISSN), non-text outputs, data. (Para 14)
- Outputs to which this applies are subject to open access deposit, discovery and access requirements. (Para 16) It is anticipated that the “discovery” requirement will be met at the institutional and technological level.
- “Credit will be given” to institutions exceeding the letter of this policy in a future “research environment” component. (Para 15)
- “Non-compliant outputs will be given an unclassified score and will not be assessed in the REF.” (Para 42)
- upload the “accepted and final peer-reviewed text” to “an institutional repository, a repository service shared between multiple institutions, or a subject repository such as arXiv”. (Paras 17, 19)
- do so “as soon after the point of [firm (Para 19)] acceptance as possible, and no later than three months after this date”. (Para 18)
- The deposit requirement does not apply where the author did not work at an HEI at the time of acceptance;
- where “it would be unlawful to deposit, or request the deposit of, the output”;
- or where “depositing the output would present a security risk”. (Para 36)
- upload a subsequent version as a supplement or replacement if the publisher allows it. (Para 19) If a replacement, it must also fulfil the access requirement (Para 33).
- allow others “search electronically within the text, read it and download it without charge”. (Para 25)
- respect any ‘embargo period’ (an open access delay) specified by the publisher. (Para 25)
- but an embargo MUST NOT be longer than 24 months for panels C and D and 12 months for panels A and B. (Para 30)
- regardless of whether the publisher specifies a delay, you MUST deposit at the time of acceptance (the deposit requirement).
- no specific license is required. It is suggested that CC BY-NC-ND could meet the above provisions. (Para 25)
- if provisions are made to allow text-mining, which could include more liberal licensing, then credit will be given in the “environment” component. (Para 34)
The ONLY allowable exceptions:
- permission could not be obtained for third-party material within the output to be made OA
- the embargo period was above the maximum allowed, or the journal disallows deposit, but the venue was “the most appropriate” (Para 37)
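The two deadlines that drive the policy – deposit within three months of acceptance, and panel-specific maximum embargoes – can be sketched as a simple compliance check. This is a hypothetical illustration only, not an official tool: the function and variable names are my own, and the three-month window is approximated as 91 days.

```python
from datetime import date, timedelta

# Maximum embargo lengths in months by REF main panel (Para 30).
MAX_EMBARGO_MONTHS = {"A": 12, "B": 12, "C": 24, "D": 24}

def deposit_deadline(acceptance: date) -> date:
    """Outputs must be deposited no later than three months after
    acceptance (Para 18); approximated here as 91 days."""
    return acceptance + timedelta(days=91)

def embargo_compliant(panel: str, embargo_months: int) -> bool:
    """An embargo must not exceed the panel's maximum (Para 30)."""
    return embargo_months <= MAX_EMBARGO_MONTHS[panel]

# Example: an article accepted on 1 May 2016 with an 18-month embargo.
print(deposit_deadline(date(2016, 5, 1)))  # 2016-07-31
print(embargo_compliant("C", 18))          # True: within the 24-month limit
print(embargo_compliant("A", 18))          # False: exceeds the 12-month limit
```

Note that the deposit deadline applies regardless of the embargo: even a compliant embargo does not delay the requirement to deposit on acceptance.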
A summary of the REF submission data is now available on the REF website: www.ref.ac.uk/subguide