
1
Options for Evaluating Research: Inputs, Outputs, Outcomes
Michele Garfinkel
Manager, Science Policy Programme
ICSTI ITOC Workshop, 19 January 2015, Berlin
2
Today's talk
  • About EMBO
  • A policy view of research assessment
  • Stakeholder roles

3
About EMBO
  • European Molecular Biology Organization (Maria Leptin, Director)
  • Founded 1964, Heidelberg, Germany
  • Funded by the European Molecular Biology Conference
  • 27 Member States
  • 3 cooperation agreements
  • Advancing policies for a world-class European research environment

4

Science Policy Programme
  • Governance
  • Three main areas: biotechnology, responsible conduct of research, scientific publishing
  • Technology assessment
  • Scientific publishing
  • Open access
  • Data
  • Responsibilities of editors, administrators, authors

5
(No Transcript)
6
Scientific publishing
The publication of scientific information is intended to move science forward. More specifically, the act of publishing is a quid pro quo in which authors receive credit and acknowledgment in exchange for disclosure of their scientific findings.
7
Journal name as proxy for quality
  • Journal Impact Factor: a librarian's number (see the formula after this list)
  • The concern is not use, but misuse
  • Research assessment
  • JIF = 38.597: a subscription for the price of the IF
  • Why has this been adopted for research assessment?
  • Cross-disciplinary
  • Intuitive and reflective
  • Prospective
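
A sketch of the standard two-year JIF definition (not spelled out on the slide): the impact factor of a journal in year y is

    \mathrm{JIF}_y = \frac{C_y(y-1) + C_y(y-2)}{N_{y-1} + N_{y-2}}

where C_y(t) is the number of citations received in year y by items the journal published in year t, and N_t is the number of citable items the journal published in year t. Note that the numerator counts citations to all items while the denominator counts only "citable" ones, a well-known source of distortion.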

8
Research assessment is an ecosystem
Other assessors?
9
What DORA sets out
  • Main recommendation: do not use journal-based metrics, such as Journal Impact Factors, as a surrogate measure of the quality of individual research articles, to assess an individual scientist's contributions, or in hiring, promotion, or funding decisions
  • Implementation?

10
What DORA sets out
  • Research institutions and funding agencies: be clear on evaluation criteria and consider all contributions
  • Publishers: do not use the JIF as a marketing tool, make more article-level metrics available (see the sketch after this list), make all reference lists open, remove limits on reference list length
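
As a concrete illustration of what an article-level metric can look like, here is a minimal sketch that retrieves a per-article citation count from the public Crossref REST API. This is an editor's sketch, not anything DORA prescribes, and the DOI is a placeholder.

    import requests

    def article_citation_count(doi: str) -> int:
        """Return Crossref's citation count for one article: an
        article-level metric, unlike the journal-level JIF."""
        resp = requests.get(f"https://api.crossref.org/works/{doi}", timeout=10)
        resp.raise_for_status()
        # "is-referenced-by-count" is the number of citations Crossref
        # has recorded for this DOI.
        return resp.json()["message"]["is-referenced-by-count"]

    # Placeholder DOI for illustration; substitute a real one.
    print(article_citation_count("10.1234/example-doi"))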

11
What DORA sets out
  • Metrics suppliers: provide methodology and data in a useful form, account for variation in article types (reviews vs. research articles; see the formula after this list)
  • Researchers: as assessors, review for scientific content; as authors, cite the appropriate (primary) literature and challenge bad practices
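
One standard way suppliers can account for such variation (a common bibliometric construction, not named on the slide) is the mean normalized citation score:

    \mathrm{MNCS} = \frac{1}{N} \sum_{i=1}^{N} \frac{c_i}{e_i}

where c_i is the citation count of publication i and e_i is the average citation count of publications of the same field, publication year, and document type, so that reviews are compared with reviews and research articles with research articles.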

12
What DORA does not say
  • Metrics-based research assessment is wrong
  • The JIF is flawed for assessing journals
  • Citations are a flawed metric
  • There is a simple alternative
  • Publishers are to blame
  • Thomson Reuters is to blame

13
What DORA does not say
  • Metrics-based research assessment is wrong
  • The JIF is flawed for assessing journals
  • Citations are a flawed metric
  • There is a simple alternative
  • Publishers are to blame
  • X is to blame

[Slide image: Altmetric Score]
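
Composite attention scores of this kind are, in essence, weighted sums of mention counts across sources. Altmetric's actual weighting is proprietary; the weights in the sketch below are invented purely for illustration.

    # Sketch of a composite "attention score": a weighted sum of mention
    # counts per source. These weights are invented; Altmetric's differ.
    WEIGHTS = {"news": 8.0, "blog": 5.0, "twitter": 1.0, "facebook": 0.25}

    def attention_score(mentions: dict[str, int]) -> float:
        return sum(WEIGHTS.get(source, 0.0) * count
                   for source, count in mentions.items())

    print(attention_score({"news": 2, "blog": 1, "twitter": 40}))  # 61.0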
14
(No Transcript)
15
(No Transcript)
16
(No Transcript)
17
Incremental advances
  • More institutions and funders emphasizing biosketches and "select your five best papers" strategies over the IF
  • Constructive discussions with Thomson Reuters: more interest in dialogue and a willingness to improve the JIF as a metric
  • Competition is good for everyone

18
Incremental advances
  • Engagement with funders
  • Engaging additional research communities
  • Study national/regional variations
  • Editorials forthcoming
  • Key point: better analyses needed
  • Policy analysis: implementation and governance issues, metrics, stakeholders

19
It's the system (?)
  • This is not (just) about overworked or lazy promotion committees and rapacious journals
  • The reward system in science is (becoming) warped
  • Resources for thorough evaluation are not available
  • Journal articles have become the currency of rewards rather than a contribution to knowledge

20
Research Assessment Stakeholders
  • Researchers
  • Publishers
  • Research administrators
  • Funders
  • Metrics researchers
  • Metrics providers
  • Decision-makers

21
What should we be assessing?
  • We are great at measuring inputs (funding, numbers of students)
  • We are good at measuring outputs (numbers of papers, some impact measures)
  • Measuring outcomes remains a problem

22
What should we be assessing?
  • Papers
  • And how are they discovered?
  • Data
  • And how are they discovered?
  • Reviewing?
  • Teaching?
  • Committee work?
  • Responsible conduct?

23
Ongoing work
  • Workshops
  • Governance issues
  • Stakeholders
  • Engagement with funders