8.1 INTRODUCTION
8.1.1 Measuring What We Know (van Raan, 1992)
The title of this section is also the title of the inaugural address of
Professor Anthony van Raan upon becoming a full professor at Leiden
University in 1992 (in Dutch: Het meten van ons weten).
Many people are fascinated by the idea of measuring science in all its
aspects. They seem to forget that collecting and comparing specific data
(e.g., counting numbers of publications) is not measuring science. That
said, the research “business” has many stakeholders such as researchers,
public and private funders, not-for-profit organizations, and policy
makers. They all have an interest in auditing all aspects related to research
in general. So, for instance, research councils of universities try to obtain
a meaningful appreciation of the research conducted at their university.
High values for indicators may lead to higher visibility, which in turn
increases the probability of obtaining better funding opportunities.
Funding can be used for better-equipped research facilities, for reducing
brain drain (top researchers staying at the university), and maybe even
for achieving some brain gain: top researchers coming to work at the
university or returning to it. The same reasoning applies at the country level.
Trustees of research funds require researchers they support to publish
the funded research in the most visible way possible. Besides placing their
research results in public repositories, this also implies publishing in top
journals, and preferably in Open Access. Publishing in top journals and
with top publishers (for books) is one of the criteria of the Norwegian
model for research funding (see Subsection 8.9.1). Hence, it is no surprise
that journal publishers and editors try to increase their journals’ impact
factors (Krell, 2010), and not always by ethical means.
8.1.2 Elements Used in Research Evaluation
Moed and Plume (2011) provide the following overview of elements that
may play a role in research evaluation exercises.
Units of assessment: individuals, research groups, departments, institutions,
research fields, countries, regions (e.g., the European Union).
Purpose: allocating resources, improving performance, increasing regional
engagement, increasing visibility, stimulating (international) collaboration,
promotion, hiring.
Output dimensions: research productivity, quality, scholarly impact,
applications, innovation, social benefit, sustainability, research infrastructure.