The Ranking Is Not Enough: Lessons From New Bibliometric Exercises. Second Leiden University International Symposium on Ranking - PowerPoint PPT Presentation

Transcript and Presenter's Notes
1
The Ranking Is Not Enough: Lessons From New Bibliometric Exercises
Second Leiden University International Symposium on Ranking, Leiden, February 2-3, 2007
Anthony F.J. van Raan, Center for Science and Technology Studies (CWTS), Leiden University
2
First and major challenge: measure, as well as possible, the research performance of the recent past in quantifiable terms (objectivity, reliability). If you succeed in quantifying performance, you can always make a ranking.
3
Two methodological approaches:
(1) broader peer review → expert survey → reputation assessment
(2) bibliometric indicators → performance and impact measurement
On the department or research-program level, (1) and (2) are strongly correlated.
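The claimed correlation between the two approaches can be checked with a simple rank correlation. A minimal sketch, using invented department scores (all numbers hypothetical, not CWTS data), computing Spearman's rho over untied ranks:

```python
# Sketch: rank correlation between peer-review scores and a bibliometric
# indicator for hypothetical departments (all data invented for illustration).

def ranks(values):
    """Assign 1-based ranks to values (ties broken by input order; fine for a sketch)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman(x, y):
    """Spearman's rho via the classic formula for untied ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

peer_scores = [4.2, 3.1, 4.8, 2.5, 3.9]   # hypothetical expert-survey scores
cpp_fcsm    = [1.3, 0.9, 1.6, 0.7, 1.2]   # hypothetical bibliometric scores

print(spearman(peer_scores, cpp_fcsm))  # → 1.0 (both orderings agree exactly)
```

A rho near 1 would support the slide's claim that reputation and bibliometric impact order departments similarly.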
4
- Peer/expert review is a slow indicator
- Popularization/media stars (e.g., Stephen Jay Gould) may considerably affect expert opinions
5
New, specialized, and often smaller universities may be at a disadvantage in reputation but may nevertheless score highly on bibliometric performance. For broad, old, often large classical universities, the opposite situation may hold.
6
Expert survey problems (methodological):
1. Biases: geographical, field-specific
2. Responding vs. non-responding characteristics
3. Sample size → reliability of measurement
4. Nomination procedure
5. Scaling procedure
6. Controlling variables
7. Standard deviation of scores
8. Statistical significance
9. Cognitive distance
7
Bibliometric principle:
- Researchers produce knowledge
- This new knowledge is mostly communicated through research papers in the international literature
- Other researchers criticize/use/apply this published work by referring to it in their own research papers
8
Not covered by bibliometric measurement are
important academic achievements such as the level
of teaching, research training, knowledge
transfer and technological innovation.
9
Not covered by bibliometric assessment is other evidence of research performance, such as prizes and awards, peer/expert reviews and surveys, invitations as keynote speaker, etc. In many cases, however, these non-bibliometric indicators do correlate strongly with bibliometric indicators.
10
Technical problems (major ones):
- delineation, definition, and unification of research institutions, particularly universities with their academic hospitals
- citing-cited mismatches
11
Methodological problems (major ones):
- Field definition
- Field-normalization of citation counts
- Black-box indicators (do not use)
- Impact factors (do not use)
- Highly cited scientists (do not use) → use highly cited articles
- Article-type normalization of citation counts
- US bias
- Language bias (Germany 25!)
- Engineering, social sciences, humanities
- Same data, same methodology, different rankings
12
Coverage within the citation index (CI) differs by document type (journal articles, conference proceedings, book chapters, reports, books), and is also field-specific!
13
Network of publications (nodes) linked by citations (edges).
Lower citation density: e.g., applied research, social sciences.
Higher citation density: e.g., basic natural sciences, medical research.
Indicators: CPP (citations per publication), CPP/JCS (journal-normalized), CPP/FCS (field-normalized).
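Citation density differences are what make field normalization necessary. A toy sketch of measuring density in a small citation network (papers, fields, and edges all invented for illustration):

```python
# Sketch: citation density per field in a toy publication network
# (nodes = papers, directed edges = citations; all data invented).
from collections import defaultdict

def field_density(citations, field):
    """Mean citations received per paper, per field (a density proxy)."""
    received = defaultdict(int)
    for _citing, cited in citations:
        received[cited] += 1
    per_field = defaultdict(list)
    for paper, f in field.items():
        per_field[f].append(received[paper])
    return {f: sum(c) / len(c) for f, c in per_field.items()}

# (citing_paper, cited_paper) edges
citations = [("p1", "p2"), ("p3", "p2"), ("p3", "p1"), ("p4", "p5")]
field = {"p1": "medicine", "p2": "medicine", "p3": "medicine",
         "p4": "social sciences", "p5": "social sciences"}

print(field_density(citations, field))  # → {'medicine': 1.0, 'social sciences': 0.5}
```

Raw citation counts from the high-density field would look "better" even at equal quality, which is why CPP is divided by a field reference value (FCS) rather than compared directly across fields.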
14
International field- and document-normalized impact of university A: CPP(A,f)/FCS(W,f). This is our crown indicator.
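One way to read the crown indicator CPP/FCSm: the university's citations per publication, divided by the world field citation score averaged over the fields it publishes in, weighted by its own publication mix. A minimal sketch with invented numbers (this is an illustrative reading, not CWTS's production implementation):

```python
# Sketch of a field-normalized impact indicator in the spirit of CPP/FCSm.
# All universities, fields, and numbers below are invented for illustration.

def crown_indicator(pubs_by_field, cites_by_field, world_fcs):
    """University citations per paper over the field-expected citations per paper."""
    total_pubs  = sum(pubs_by_field.values())
    total_cites = sum(cites_by_field.values())
    cpp = total_cites / total_pubs
    # expected citations per paper, weighted by the university's field mix
    fcsm = sum(pubs_by_field[f] * world_fcs[f] for f in pubs_by_field) / total_pubs
    return cpp / fcsm

pubs  = {"clinical medicine": 800, "engineering": 200}   # hypothetical P per field
cites = {"clinical medicine": 6400, "engineering": 600}  # hypothetical C per field
fcs   = {"clinical medicine": 7.0, "engineering": 2.5}   # hypothetical world averages

print(round(crown_indicator(pubs, cites, fcs), 2))  # → 1.15, i.e. 15% above world average
```

A value of 1.0 means impact exactly at the world average for the university's field mix; the normalization prevents a medicine-heavy university from outscoring an engineering school on citation density alone.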
15
Worldwide there are 349 "largest" universities, with P > 700 publications per year.
16
(1) CWTS created a unique bibliometric data-system in which these universities are defined and unified as accurately as possible.
(2) For these universities, all bibliometric indicators are calculated and updated, in particular P, C, CPP/FCSm, P·CPP/FCSm, and A/E(Top5), for the universities as a whole (averaged over all fields) and for each of the 16 main fields → Ranking.
(3) The system compares any of these universities with any selection → Benchmarking.

17
The CWTS ranking is based in large part on our work in a project funded by the European Commission, Research DG (the ASSIST project). CWTS team: Henk Moed, Clara Calero, Robert Tijssen, Thed van Leeuwen, Ed Noyons, Renald Buter, Martijn Visser, Ton Nederhof, Ton van Raan.
18
Major 33 European universities with P(y) > 2,000, ranked by normalized impact
19
Major 68 European universities with P(y) > 1,500, ranked by normalized impact
20
Major 100 European universities with P(y) > 700, ranked by brute force
21
Major 100 European universities with P(y) > 700, ranked by normalized impact
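The contrast between the two orderings on slides 20 and 21 can be sketched on toy data: "brute force" here is read as size times impact (P · CPP/FCSm), versus ranking by normalized impact alone. Universities and numbers are invented for illustration:

```python
# Sketch: the same toy data ranked two ways — by normalized impact and by
# "brute force" (read here as P * normalized impact). All data invented.

unis = {
    "Univ A": {"P": 3000, "impact": 1.1},   # large, near-average impact
    "Univ B": {"P": 900,  "impact": 1.6},   # small, high impact
    "Univ C": {"P": 2000, "impact": 1.3},
}

by_impact = sorted(unis, key=lambda u: unis[u]["impact"], reverse=True)
by_brute  = sorted(unis, key=lambda u: unis[u]["P"] * unis[u]["impact"],
                   reverse=True)

print(by_impact)  # → ['Univ B', 'Univ C', 'Univ A']: small high-impact rises
print(by_brute)   # → ['Univ A', 'Univ C', 'Univ B']: large universities dominate
```

This is exactly the "same data, same methodology, different rankings" point from the methodological-problems slide: the choice of indicator, not the data, decides who is on top.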
22
From universities as a whole to field-specific
rankings
23
Field = set of publications in a field-specific group of journals: established fields, scientific coarse structure (journals).
Field = set of publications with specific classification codes: scientific fine structure (classification).
Field = cluster of concept-related publications: new, emerging fields (clustering).
24
Main field = set of fields.
Field = set of journals: established fields, scientific coarse structure (journals).
25
(No Transcript)
26
Main field → fields → journals
27
Genetics & Heredity: 185 journals
28
Real-life examples from current projects: University of Manchester and Leiden University. As an example, two of the 16 main fields:
29
Clinical medicine: 1575 journals (2007)
30
Engineering
31
Benchmark universities: NL and LERU, 2000-2005
32
(No Transcript)
33
(No Transcript)
34
Trend analysis 1994-2005
35
Rankings put performance measurements into perspective
36
(No Transcript)
37
(No Transcript)
38
Latest development (yesterday)
39
(No Transcript)
40
(No Transcript)
41
(No Transcript)
42
(No Transcript)
43
Thank you for your attention