🕸️ CoARA Working Groups

Archived
  • Dec 2022
  • Stefan Penders
  • ·
  • Modified 10 Jul
  • 33
  • Kim Huijpen
  • Claartje Chajes
Stefan Penders
Nieuwsrubriek
  • Claartje Chajes
  • Kim Huijpen

At the CoARA Constitutive Assembly, the first ideas for Working Groups were collected via Mentimeter. You will find these ideas below. Which international Working Groups would you like to contribute to? Who would like to chair such a Working Group?

What are your first suggestions for possible topics of CoARA Working Groups?

  • Researcher participation
  • Sharing good practices in choosing a change approach
  • Responsible numerical bibliometric indicators
  • A working group on Narrative CVs
  • Rankings
  • The role of impact factors across disciplines; the role of books across disciplines; the role of open access with fees versus open access managed by universities as a non-profit
  • Rewarding OS practices
  • GYA proposes leading a working group: Research Assessment for a globally equitable and diverse research ecosystem
  • Narrative CV
  • The long-term impact of advancing research assessment on research culture
  • qualitative assessment beyond peer review (topical)
  • Link to the work on open science.
  • Early career researchers can be damaged by Open Research. Their career prospects need to be protected
  • basic principles and guidelines for evaluators
  • social sciences
  • Valuation of Research Data (perhaps jointly with RDA)
  • Alternative bibliometric metrics
  • (1) Developing practical workshops for implementing new research assessment approaches; (2) aligning evaluation at different levels (individual, institutional, national); (3) recognition of diverse career pathways; (4) ensuring diversity in evaluations
  • The societal impact of research activities
  • Narrative CVs
  • What is responsible use of metrics? How can open science practices be rewarded?
  • Geographical inequality in successful grant applications and fewer applications from certain regions
  • National government and funding agency engagement/sign-up to CoARA. Ranking agencies and their influence on research assessment
  • Research Funders WG, RRA for public participation in research
  • Present examples of HEIs that have already adopted CoARA guidelines or are just about to do so. Perhaps also simulations of what the adoption of CoARA guidelines would mean for the HEIs.
  • Innovative perspectives for rankings; how to measure impact
  • Practical ways to assess and value "human" skills.
  • National WGs
  • how to account for diversity
  • Framework to define metrics on an international level
  • Analysis of the differences between countries and the problems to be addressed in each of them (Europe)
  • Multilingualism and language bias: to provide ideas on recognising and rewarding research published in different languages; engaging learned societies; national activities in advancing research assessment.
  • Early-career evaluation
  • What is appropriate use of indicators? In the commitments, inappropriate use is mentioned, yet what is appropriate use? Currently, there is a lot of hesitancy in this regard.
  • reform of copyright
  • Rankings; Getting grant organizations involved (ERC in particular)
  • appointment procedures
  • Create a working group for the sharing of best practices and the creation of a network.
  • National working group on ex-post assessment
  • A calendar for working groups; some rules for interactions within groups
  • Assessment of institutions by a national evaluation agency; fair peer review; assessment of research projects
  • Improve the quality of evaluation of Research
  • Definition of qualitative indicators by research area
  • Peer review: ways to open it up to make it truly among peers (how to select reviewers?)
  • Working on open science and open
  • Develop new metrics for measuring teamwork, collaboration, and all the other aspects of research that are crucial
  • Promotion and progression
  • peer review
  • publication ethics
  • A group to identify the merits of researchers
  • Discuss possibly universal criteria for evaluating quality in research and, consequently, for creating the quality-based evaluating system for universities and research institutes
  • Role and status of peer review
  • reflecting on the role of the reviewer/expert in research assessments
  • State of the art of current initiatives for changing research assessment
  • Indicators to include in institutional rankings
  • Peer review: how to valorize it, how to teach ECRs to do it, how to assure it is done properly, how to ensure we have enough reviewers to perform the foreseen qualitative assessment ...
  • How different disciplines should be assessed
  • Diversity of contributions in careers
  • Global solidarity
  • Role and status of peer review in the assessment of individual researchers
  • Narrative evaluation
  • Narrative evaluation
  • How to incorporate reform of research assessment in overall job evaluations?
  • Impact of personality among researcher community
  • Principles for research assessment
  • Success metrics and KPIs
  • Agenda 2030 and assessment
  • Global / International Research Evaluation System
  • Positive experience with bibliometric indicators as a solely supporting tool for peer review
  • Support for funders to move forward with CoARA, and guidelines for them to use
  • Training evaluators and everyone involved in research and research assessment
  • cooperation in the field of technical sciences
  • evaluation of transdisciplinary/multidisciplinary research and publishing
  • Create a European assessment foundation (framework) based on qualitative criteria that is recognized within EU universities.
  • The future of HRS4R
  • Health Sciences
  • alternatives to university rankings
  • Acknowledge publications regardless of language
  • Development of recommendations or guidelines for funding organizations; development of a roadmap for informing the research community, stakeholders and decision-makers about the reform, its aims and expected outcomes
  • Research Ethics
  • More experienced evaluators to be involved, based on their participation in projects, rather than based on their education/academics.
  • How can we recognize and reward team science? What do we need to make the assessment transparent?
  • Joint workshops; joint agendas
  • First, how to ensure synergies between the WGs of CoARA, then extend to external initiatives. The task is huge.
  • career assessment
  • The funded project on research assessment for alignment
  • Research Assessment in Law & humanities
  • best practices
  • Pan-EU impact-based CV
  • Peer review
  • Framework for open assessment processes
  • How to perform the gap analysis of CoARA principles vs. organisational practices.
  • Supporting HEIs in engaging with the global rankings.
  • Gather evidence (data, stories, practices, etc.) on the various aspects of Research Assessment. This evidence is key as we move to new indicators and apply new practices.
  • Italy-specific WG centred on evaluation in Universities
  • Academic publishing: move away from evaluating publications in terms of prestige, impact factors, or quantitative metrics.
  • Synergies with quality assurance of education within higher education institutions and of study programmes
  • Alignment of CV templates; funding organisations working group; assessment criteria; transparency in practice
  • Collect best practices!
  • Impacts of interventions on individual groups (ECRs, Gender, Geographic, Research field, ...)
  • How to train international evaluators to take into account broader elements in the evaluation than classic bibliometrics. It is possible that in their own country they see things differently
  • We need to pay much attention to the evaluation processes of the humanities and social sciences, especially in connection with Digital Humanities.
  • Disciplinary communities
  • How to define research excellence?
  • the role and status of peer review in the assessment of individual researchers for funding, promotion or appointments?
  • How to involve the Humanities in the process?
  • role and status of peer review in the assessment of individual researchers
  • Tool for self assessment of quality and impact of research
  • research assessment for performance-based research funding
  • Research career positions and requirements
  • Alternative career paths; interoperability of KPIs (for lack of a better word); assessment criteria for Horizon Europe
  • Technical RIs/databases: how to collect comparable data; how to create a safe space for "guinea pigs"; WG on a "recipe" for their application procedure
  • How can we make the assessment sustainable and at an affordable cost?
  • How universities assess their academic staff
  • Cooperation/co-production: how can that be part of quality assessment of research?
  • First steps needed for universities that have not started any kind of reform; Open Access issues and the role of publishers
  • National dialogues
  • Research assessment criteria in national evaluation procedures for the assessment of research institutions
  • Criteria and procedures for research assessment
  • open science as a priority; research work assessment; publications and journal prices
  • Working group on responsible metrics. The working group should produce guidelines for appropriate use
  • Should research assessment be done by internal committees or external/international ones? How to compare early-stage researchers with more senior researchers? How to include institutional interests in the assessment procedure?
  • rewarding other output/activities than just articles
  • transformative agreements
  • Set the frame to guarantee balance between men and women, early-career and experienced investigators, and geographical and thematic balance
  • The role of the reviewer/expert in research assessments and how power dynamics shape decision-making (for better or worse) even when using responsible indicators, and outlining strategies to optimize fair, quality-driven assessments
  • The meaning of quality in research across different disciplines. The journals landscape and quality of research.
  • Career paths for researchers
  • Peer review; involve teaching
  • reflecting on the role of the reviewer/expert in research assessments
  • Priority should be given to sharing best practices of assessing research quality (balance qualitative & quantitative indicators; peer review).
  • A group on exchanging practices on engaging with critical/opposed scientists (who do not want to move away from the status quo)
  • How to involve the European Universities alliances in the process, and what could their role be in connection with national regulation?
  • How to bring the Agreement to reality for researchers? Making governments, the main funding bodies for universities, adhere to qualitative assessments is a challenge that should be carefully addressed.
  • A line of reflection could be explaining how to do research differently: most researchers are already educated in the logic of writing papers and do not see other ways to do research. Also propose tools to make it possible.
  • Grant and fellowship applications
  • Research assessment in the recruitment of researchers
  • narrative CV
  • Guidelines on peer review panel organisation; best practice on panel organisation; use of bibliometric indicators in panel assessment; how can research assessment take into consideration "team science" practices; how to consider non-academic impact in…
  • Diversity
  • What do we consider 'fair evidence'? In many meetings we hear 'How do we keep it fair?' This applies to both sides: the candidate being assessed as well as the assessor(s)
  • how power dynamics shape decisions making
  • research direction including disciplines
  • balancing robust assessment with reduced administrative burden
  • Over-assessment
  • Rankings
  • how to differentiate the two areas (bibliometric vs non-bibliometric) in the working groups
  • Peer review processes
  • How to quickly find evaluation indices that are not journal-driven and are more FAIR
  • open access
  • Research culture and meaningful diversity
  • Remarkable achievements the researcher obtained
  • On three levels: research projects, researchers (individuals & research groups) and institutions/units. Then special ones dedicated to the doctoral level (requirements for doctoral candidates and supervisors, accreditation etc.) and the governmental level.
  • Effect of narrative CVs on excellence of science
  • The impact of researchers upon economic life
  • peer review and metrics for fundamental versus applied science
  • reform copyright
  • Incentivising and measuring collaboration, and the impact from it
  • National Based WGs
  • Interdisciplinary Research on Research Assessment
  • Acknowledge work beyond language - how
  • Rankings - clear rules for research and researchers evaluation processes
  • Human and Social sciences
  • Research assessment - at doctoral level (requirements)
  • Open Metrics and Open Peer Review.
  • How can we measure impact? In the end it is all about impact.
  • Development and introduction of uniform standards for reporting and accounting documentation, to reduce the time researchers spend preparing it and to increase their effective time for conducting scientific research
  • competency based assessment
  • Tools for fine-tuned evaluation between similar candidates/organizations
  • Impact of OPEN ACCESS and OPEN SCIENCE issues on research assessment
  • Researcher careers
  • New set of indicators aligned with the commitments; mapping best practices among institutions.
  • Open metrics - alternatives to commercial products: what is existing and how can we support & scale up these initiatives
  • How to reward and incentivise researchers to practise open science?
  • Peer review
  • What inspiration to change research assessment can be drawn from existing communities (for example applied sciences, arts, qualitative social sciences, humanities, etc.)
  • Multilingualism in research evaluation
  • Insights and potential good practices in peer review
  • Peer review
  • What is the most significant output for the next year of CoARA's work?
  • Informed peer review: how to assure the good use of (biblio)metrics? Limitations of peer review: how to track biases?
  • Disciplinary comparable standards and disciplinary differences; open science impact; language difference; academic culture; science-society
  • Rankings; peer review
  • Disciplinary groups, good practices
  • Technological/Technical layer for Research Evaluation
  • Good alternatives to the current research quality assessments; how to get to more transparent rankings; assessment of research groups and researchers (individuals)
  • Reproducibility
  • Early career researchers
  • 1. WG on recruitment & promotion of individual researchers; 2. life sciences WG; 3. institutional assessment practices
  • Research assessment for different organisation types, i.e. "classic" universities, research organisations, universities of applied sciences
  • Research Assessment Project Management group for experts running institutional-level reviews in universities
  • Mode of evaluation of research projects.
  • HRS4R award
  • narrative CVs
  • WG on evaluation of open science practices; WG addressing evaluation of large groups with qualitative indicators (e.g. evaluation of 100 research groups)
  • A follow-up group for the actions of the Agreement and how they are implemented
  • Evaluating small, specific fields (e.g., in small organizations, or country-specific topics etc.), where no statistical population large enough is available.
  • Train reviewers
  • Type of University (e.g. Universities of Applied Sciences)
  • Embedding a range of qualitative indicators within a qualitative assessment
  • Researcher careers
  • Disciplinary Communities (such as Arts and Humanities, Social Sciences etc.), Challenges of Interdisciplinary Scholarship, Assessing Digital Methods and Outputs
  • Responsible numerical measures for research impact
  • We would like a WG comparing existing institutional evaluation systems in different countries, like REF in UK, Netherlands, Belgium (e.g. VIB), Academies of Science and similar.
  • Implementation and roll-out of CoARA at RPO level - inclusion of the research administrator and management profession
  • Researcher development
  • Academic assessment as a result of research assessment reform; qualitative research assessment in the human and social sciences; Open Science development and research assessment reform
  • which alternative metrics can be used to support qualitative assessment
  • working with assessment authorities
  • Working on dismantling rankings in public perception.
  • Forming European assessment boards (are there enough people to perform peer review if done separately at national levels); Open Access and its financing; abandoning bibliometrics completely, or using them as additional information for peer review
  • Reforming the way projects are selected (Europe) is a good first step to modify the culture of presenting what is excellent, what is innovation ...
  • A comparison between KPI-focused, benchmarking-oriented, and content-oriented systems should be made, to see how they developed through the years: what are the trends in more longstanding systems? How heavy are they on human resources?
  • Peer review evaluation processes in the context of funding agencies.
  • Working group on capacity building
  • We need to focus on Digital Humanities assessment and make proposals.
  • WG for developing new tools for research assessment; WG for developing variations of assessment criteria by branch of science
  • Social Sciences and Humanities research evaluation
  • Analysis of the state of the art to review the differences
  • Recruitment and selection
  • Topic 1: research quality in relation to teaching; Topic 2: research quality in cooperative contexts with non-academic partners
  • Evaluation of practices
  • Overview of existing evaluation criteria
  • European status/diversity of the Research Assessment
  • Interdisciplinary evaluation of research: how to give importance
  • Gender Equality in Research Teams, Gender Equality at the leadership levels of Research Teams, Inclusion and Diversity in the representation of researchers from different career levels
  • Researcher-centric and user-focused application of CoARA
  • costs of open access publications
  • assessment of societal impact of research
  • work out universal criteria for assessing research
  • List of the research outputs that could be assessed
  • National vs. organisational evaluation.
  • Mapping WG and scheduling regular meetings is needed
  • including usefulness of science and research
  • Peer review
  • Criteria for new indicators based on Open Science
  • Role and status of peer review in the assessment of individual researchers e.g. appointments or promotion
  • Peer review
  • assessing inclusion and diversity in research protocols
  • rewarding team science
  • Narrative CVs
  • Narrative CVs
  • impact of research in society
  • Interdisciplinary research.
  • How many young researchers have been trained by the respective researcher
  • Training of reviewers
  • Self-assessment tools for quality and impact (both societal and academic) and narrative CVs
  • Predatory Journals
  • Assessments/evaluation methods and processes that encourage research integrity.
  • A WG to focus on diversity criteria on various issues; a WG to discuss and propose those metric criteria that are "chosen with responsibility"
  • New research communication mechanisms (papers and journals are anachronistic and ineffective) leading to new metrics
  • Implementation of developed and uniform standards for all countries and institutions that have joined this agreement, and their approval at the level of the governments of all participating countries.
  • Identify successful models already working and establish collaboration with them
  • Maybe define goals and benchmark institutes/organisations etc. based on these goals. This will help to establish a state of the art of the situation