Science: How academia is exploring new approaches for evaluating researchers
- 24 Jul · Claartje Chajes · Modified Jul 2023
- 1
- 68
https://www.science.org/content/article/how-aca...
Wide-ranging experiments have important implications for early-career researchers and the entire ...
Matilde Galli has ample experience writing grant applications. But a couple of years ago, as she sat down to prepare her application for a Dutch Research Council individual grant she had her eye on, she was thrown by the new CV requirements. Instead of a traditional long list of degrees, positions, papers, grants, and talks, Galli was asked to capture her research vision and academic qualities in a written personal statement complemented by a description of no more than 10 key scientific outputs.
Such narrative-style CVs are part of an expanding wave of new evaluation methods for grants and academic positions, aiming to assess applicants more directly on the merits of their research rather than on where their work was published, and to expand the range of activities researchers get recognition for. There is a lot at stake, believes policy adviser and research culture consultant Karen Stroobants. Many issues in academia, “including mental health, career progression, inclusion, and diversity … all come back to this combination of a very competitive environment together with very narrow definitions of success.” Tackling the publish-or-perish culture and updating approaches to assessing researchers can be “a lever to solve a lot of different problems.”
Among the earliest and most influential initiatives in this direction is the San Francisco Declaration on Research Assessment (DORA), spearheaded 10 years ago by a group of editors and publishers. To date endorsed by more than 23,000 signatories in 161 countries (including AAAS, publisher of Science), DORA has advocated for a radical move away from evaluating researchers using journal-based metrics such as impact factor and toward making sure researchers receive professional recognition for research outputs beyond manuscripts, including data sets, software, and influence on policy.
In the decade since DORA was established, it has been joined by a flurry of other reform movements—and a nascent field of research into what approaches could be taken and which of these may work or not. Recently, for example, more than 580 research institutions, funders, and other organizations in Europe and beyond—most of them pledging to work together toward systemic change as part of the Coalition for Advancing Research Assessment (CoARA)—have signed on to the Agreement for Reforming Research Assessment, committing to developing new criteria and procedures for their research evaluation processes within 5 years and generally contributing to the evidence base. Together, these initiatives are paving the way for a profound reconsideration of what research endeavors and academic careers should look like.
Experiments into these approaches are ever-evolving and too many to list. To name just a few, in 2021 Universities Norway released the Norwegian Career Assessment Matrix with specific examples of research outputs and academic competencies as well as prompts for reflection to foster a more transparent and holistic assessment of researchers and individual career development. University College London has developed its own academic careers framework with a broad range of activities and impact indicators. Ghent University invites faculty staff to help define their evaluation and promotion criteria based on activity portfolios. The European Molecular Biology Laboratory uses a CV listing varied research outputs with a narrative describing the significance of the most important ones to recruit researchers across its six sites.
One of the most advanced initiatives so far in its reach and implementation has been taking place in the Netherlands. Starting in 2019, Dutch universities, research institutes, and funders joined forces to implement a new recognition and rewards system giving visibility to the whole range of academic activities and diversifying the successful roles and career directions that faculty can take beyond the traditional research-focused model. “Changing culture is difficult and takes a long time,” the program’s website states, and it will take “[e]xperimentation, inspiration, co-creation, sharing good practices and learning from each other.”
Some institutions have already shown it can be done. In 2016, University Medical Center (UMC) Utrecht moved away from using traditional bibliometrics in faculty appointments by introducing a qualification portfolio that allows professors to highlight a broad range of activities, including not only scientific research, teaching, and patient care, but also innovation, outreach, leadership, mentorship, and collaboration. In 2020, a bottom-up initiative led one graduate program there to shift the focus of annual student evaluations from publications, conference abstracts, and prizes to students’ broader achievements and impacts. And starting this year at Maastricht University, all faculty members will choose a set of specific academic activities beyond core teaching and research to focus on—and be evaluated on, alongside their engagement in academic citizenship, team performance, and open science—based on their preferences and the broader needs of their units.
Maastricht associate professor Vanessa LaPointe says she has benefited from her institute’s early experiments toward more holistic assessment of young principal investigators. LaPointe has developed a mostly research-focused career, but she has also taken management and leadership roles including chairing the Maastricht Young Academy, which was involved in designing the new, university-wide research assessment policy. “It’s actually being viewed as a potential strength of mine, and something that can be developed” alongside research, she says. During the annual evaluations with her tenure committee, being able to go beyond CV bullet points and discuss why she made certain decisions and what she learned from them “has made me feel that it’s more possible to make mistakes … and to actually just get better at my job.”
But the road to reform can be bumpy, and the scientific community ambivalent. When Utrecht University announced in 2021 that it was moving away from using journal impact factors in researcher evaluations, all the way from recruitment to staff appraisals and promotion decisions, some scientists—including Galli, a cell biologist at the Royal Netherlands Academy of Arts and Sciences’s Hubrecht Institute—criticized the move and the broader national initiative as a threat to the competitiveness of Dutch science and scientists. Others, however, defended the new recognition and rewards system as a much-needed measure to adapt to today’s academic environment.
Many early-career researchers also feel largely left out of the process. More clarity is needed as to what is expected of early-career academics to succeed in the short term, said Charisma Hehakaya, an assistant professor with a 1-year contract at UMC Utrecht, at an annual recognition and rewards event held in April. Utrecht University postdoc Onur Sahin also urged reformers to consider “how are we preparing the early-career academics” for the diversity of skills and activities that will be expected in the next steps of their careers, adding that flexibility should start at the Ph.D. level.
Current doctoral students have mixed feelings. When Ph.D. students at Maastricht were asked how they would like their doctoral trajectory to be changed amid the ongoing reforms, for example, they said they wanted to keep their ability to focus on research for a few years rather than being pressured to develop their competencies in other domains, while still being offered some opportunities to do so, said university President Rianne Letschert at this year’s recognition and rewards event. Beyond the Netherlands, Ph.D. students at Hasselt University in Belgium are largely open to the reforms—though not without concerns, finds Noémie Aubert Bonn, a postdoctoral researcher on research assessment there and a senior policy adviser for Research England. When teaching students, “I do see them a bit worried as well because they have on the one hand the supervisor telling them, this is how you do things, as in the old system, and on the other hand, they have this whole debate that’s going on and things are changing, so there’s uncertainty about their futures,” says Aubert Bonn, who was involved in the CoARA agreement drafting process.
Funding bodies are also making changes. The European Molecular Biology Organization, an early signatory of DORA, states that it will delete any publication-based metrics from funding applications and allows candidates to include preprints, mentorship, and outreach in their application form. In the two-page Résumé for Research and Innovation currently being rolled out by UK Research and Innovation (UKRI), applicants must describe their contributions to scientific knowledge, the development of other researchers, the wider research and innovation community, and toward societal benefit to convince reviewers they are in a good position to carry out their proposed research project. They should refer to publications using digital object identifiers, not the name of the journals, and reviewers are instructed to not use journal indicators or author-level metrics such as the h-index in their evaluations. Along with standard biographical information, applicants for European Research Council (ERC) funding will soon have to provide a list of no more than 10 research outputs showing how they’ve advanced knowledge together with selected examples of peer recognition. There will be space for short narratives about the significance of the research outputs, the applicants’ capacity to undertake their project, broader contributions to the research community, or unusual career paths, and ERC will no longer provide prescriptive profiles for applying principal investigators based on an expected minimum publication track record.
Narrative CVs still have some wrinkles to work out, as Galli experienced. “In principle, I think it is a good movement to evaluate scientists more broadly,” especially by their employers, Galli says, but new funding rules must be introduced with care. The first time she encountered the new CV she was pushed for time in a tight application season and felt at a loss as to what to highlight beyond her four high-profile publications. A 2021 pilot CV project revealed similar confusion among early-career researchers asked for the first time to prepare a UKRI Résumé for Research and Innovation as to what information should go where and how it would be evaluated by reviewers.
Galli, who now lobbies the Dutch Research Council toward improving procedures, stresses the importance of funders stating the purpose of the new CV format and establishing clear evaluation criteria to guide applicants and reviewers. The Dutch agency has continued to develop its CV requirements and selection processes, providing more guidance and explanations.
And while participants in a Swiss National Science Foundation CV pilot appreciated the opportunity given by the new CV format to demonstrate connections between their research activities and highlight achievements that traditionally would receive little visibility, they found the amount of time necessary to author narratives off-putting. But this may be time well spent, including for researchers who will go on to leave academia, points out Stroobants, who designed a precursor format to the CV now used in the United Kingdom. “The narrative CV will help them massively in writing something that’s much more useful across sectors,” she says.
Such debates or signs of changing practices are unfolding in most parts of the world. But so far, the CoARA coalition primarily includes European entities—a region that generally prizes researchers’ international experience. It will be important to understand how the changes could affect researchers’ mobility, says Stroobants, who is vice-chair of the CoARA steering board and led the CoARA agreement drafting team, which also included representatives from the European University Association, Science Europe, and the European Commission. “How are they finding applying in other countries where organizations are at different stages in their reform journey? This is going to be one of the big challenges, how do we ensure that we don’t make this overcomplicated for those early in their career.” Researchers who move internationally “might have to deal with a variety of ways of being assessed, and that is time consuming,” she acknowledges. But she hopes that, in the longer term, the ongoing reforms “will give chances to people who currently feel they don’t belong” in academia.
Finding a working environment and assessment rationale that suit who you want to be as a scientist is key, Aubert Bonn says. That task should soon become easier thanks to DORA’s Tools to Advance Research Assessment (TARA) project, which is to release a dashboard of reform efforts around the globe this fall. Called ReformScape, the tool will allow people to see how far individual institutions have gone in implementing policies about metrics; diversity, equity, and inclusion; research integrity; open science; and more, says Alex Rushforth, a sociologist at Leiden University who studies research evaluation and is part of the TARA project team. But TARA will leave it up to the users to make up their minds as to whether the initiatives are a good idea. A strong evidence base has yet to be built to accompany the reforms, Rushforth says, with key questions including “For whom do they work? What kind of values do they promote? What outcomes have they reached?”
All the sources Science Careers spoke to stressed the importance of continuously monitoring and adapting the new assessment procedures to ensure they fulfill their promise. “It’s important to immediately think about, what are potential unintended consequences here? Are we able to mitigate these?” Stroobants says. For example, greater reliance on narrative CVs could unwittingly add new biases into the evaluation processes, possibly disadvantaging nonnative English speakers or favoring applicants with more assertive attitudes, a 2021 report by DORA and Funding Organisations for Gender Equality Community of Practice warned. “It’s important to really be quite critical around how things are working,” Stroobants says. And early-career researchers should be part of the discussion.
By Elisabeth Pain, Science Careers
Comments
I find it an insightful piece. One small critical note, though: whether the new procedures do what they were designed for is in fact being monitored. See, for example, the FNR's monitoring around their narrative CV. In its final paragraph the piece therefore oversimplifies a bit and ends on a more downbeat note than necessary.
And quite apart from the piece missing that monitoring: of course it is always important to monitor when making changes. At the same time, the status quo is not exempt from that requirement either. There is plenty of evidence that the old system had many problematic sides too. In my view, there must then also be room to move away from it without advance proof that doing so will be successful.
And that goes double in research: after all, I have never seen a research proposal at NWO that could guarantee its hypothesis would be confirmed. Otherwise, why would you still need to carry out the project? ;)