- Skúli Skúlason, former Rector of Hólar, will give a brief account of a preceding REAC event, a webinar on the same topic held on 10 September (for more information on that webinar, please see below and visit: https://qef.is/events/reac-webinar-on-evaluation-of-research/).
- Guðbjörg Linda Rafnsdóttir, Vice-Rector of Science and Professor at the University of Iceland, will talk about the ongoing review of the evaluation system of the public universities.
- Panel discussion with Rectors of all Icelandic universities (or their representatives).
- Open discussion.
Bjarnheiður K. Guðmundsdóttir will chair the meeting and Áslaug Helgadóttir, quality manager at the University of Iceland, will serve as rapporteur.
The workshop on October 1 will be of interest to researchers and stakeholders of research in Iceland. It is particularly timely, as the National Library is rolling out ÍRIS/PURE (a Current Research Information System) for Iceland, the public universities in Iceland are revising their Evaluation System (í. matskerfi opinberu háskólanna), and the government is developing its Green Book on public funding of universities.
The event will be streamed through the Facebook page of LÍS (National Union of Icelandic Students).
To register for the workshop, please click here.
Recap of September 10 webinar:
Prof. Philip Winn, Chair of REAC and rapporteur from the webinar, has provided the following summary of the September 10 webinar to guide discussions in the workshop:
Key questions set in advance:
1. How do we characterize research outputs across academic disciplines?
2. How can we capture the impact of research beyond academia – on the economy, society, culture, public policy, services, health, the environment or quality of life?
3. Can we balance quantitative research metrics and qualitative narratives in evaluating research?
Discussions centered on:
i. Rethinking recognition and reward – developing better forms of evaluation that embrace more than just research outputs and which go beyond just metrics. Recognition and reward can apply to individuals, parts of universities or whole institutions.
ii. Open Science – aims to improve quality, improve societal impact, use metrics appropriately in line with initiatives such as DORA (and end the addiction to impact factors and the h-index), create narratives for research, and include stakeholders in decisions about which problems to investigate. Key idea – develop a better way of evaluating research.
iii. Formative and Summative research evaluations are used globally. Summative assessments tend to be associated with funding mechanisms; formative assessments are more developmental and not directly connected to funding. Such an approach is concordant with the enhancement-led approach to learning and teaching used by the Quality Board in Iceland.
iv. The aims of research evaluation can be varied: for example, to justify the spending of taxpayers’ money; for recruitment and promotion; to strengthen the quality of research in universities; to stimulate healthy competition between universities; and to develop better research environments and doctoral training.
v. Research in the arts is multifaceted – research into arts; research through arts; research for the arts. There is a distinction between approaches to arts in the humanities and practice-based research, the arts that we make. The Icelandic University of the Arts exemplifies the practice-based approach.
vi. What is evaluated varies across academic disciplines. Research outputs include, inter alia, journal articles, monographs, book chapters, conference proceedings, physical artefacts, exhibitions, performances, compositions, designs, code. It is important to recognize the validity of research outputs as they apply to and are identified by academic disciplines.
NB: REAC has spent time carefully considering a large range of research output formats.
vii. Key features of research evaluation should include transparency, simplicity and independent in-discipline peer review. Research with both international and domestic reach should be appropriately accounted for.
viii. Academic publishing faces criticism. Contributing factors include the sheer number of journals and an increasing volume of predatory publishing; high costs and large profit margins; aggressive use of impact factors and the gaming that takes place to secure publication in high-impact-factor journals; and the debilitating cost of open access, particularly its affordability in developing economies. Solutions to these problems are uncertain. Professional bodies can have a role in owning journals, and changes in academic practice, such as a wider appreciation of the various talents academics have, could help.
ix. There is wide recognition of the need to valorize and support societal impact. This can happen across all disciplines, in appropriate ways. The only means of evaluation referred to during the webinar was the one used in the UK Research Excellence Framework – a case study narrative approach, assessed on a common scale by discipline-based peer review.
Please email firstname.lastname@example.org if you wish to be added to the Quality Board mailing list to receive updates on activities and events hosted by the Quality Board, the Quality Council and REAC.
Posted on 10 Sep 2021.