1 The Ranking Is Not Enough: Lessons From New Bibliometric Exercises
Second Leiden University International Symposium on Ranking
Leiden, February 2-3, 2007
Anthony F.J. van Raan
Center for Science and Technology Studies (CWTS), Leiden University
2 First and major challenge
- Measure as well as possible the research performance of the recent past in quantifiable terms (objectivity, reliability)
- If you succeed in quantifying performance, you can always make a ranking
3 Two methodological approaches
(1) broader peer review → expert survey → reputation assessment
(2) bibliometric indicators → performance and impact measurement
At the department or research-program level, (1) and (2) are strongly correlated
4 - Peer/expert review is a slow indicator
- Popularization and media stars (e.g., Stephen Jay Gould) may considerably affect expert opinions
5 New, specialized, and often smaller universities may be at a disadvantage in reputation, yet may still score highly on bibliometric performance.
For broad, old, and often large classical universities, the opposite situation may hold.
6 Expert Survey Problems (methodological)
1. Biases: geographical, field-specific
2. Responding vs. non-responding characteristics
3. Sample size → reliability of measurement
4. Nomination procedure
5. Scaling procedure
6. Controlling variables
7. Standard deviation of scores
8. Statistical significance
9. Cognitive distance
7 Bibliometric Principle
- Researchers produce knowledge
- This new knowledge is mostly communicated through research papers in the international literature
- Other researchers criticize, use, and apply this published work by referring to it in their own research papers
8 Not covered by bibliometric measurement are important academic achievements such as the level of teaching, research training, knowledge transfer, and technological innovation.
9 Not covered by bibliometric assessment is other evidence of research performance, such as prizes and awards, peer/expert reviews and surveys, invitations as keynote speaker, etc. However, in many cases these non-bibliometric indicators do correlate strongly with bibliometric indicators.
10 Technical Problems (major ones)
- Delineation, definition, and unification of research institutions, particularly universities with their academic hospitals
- Citing-cited mismatches
11 Methodological Problems (major ones)
- Field definition
- Field-normalization of citation counts
- Black-box indicators (do not use)
- Impact factors (do not use)
- Highly cited scientists (do not use) → highly cited articles
- Article-type normalization of citation counts
- US bias
- Language bias (Germany 25!)
- Engineering, Social Sciences, Humanities
- Same data, same methodology, different rankings
12 [Figure: document types (journal articles, books, book chapters, conference proceedings, reports) and their coverage within the CI; coverage is also field-specific!]
13 Network of publications (nodes) linked by citations (edges)
Lower citation-density: e.g., applied research, social sciences
Higher citation-density: e.g., basic natural sciences, medical research
[Figure: CPP of a publication set compared with the field citation score, CPP(f)/FCS, and the journal citation score, CPP(j)/JCS]
14 International field- and document-normalized impact of University A: CPP(A,f)/FCS(W,f). This is our crown indicator.
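The crown indicator above divides a university's mean citations per paper in a field by the world's mean citation rate in that field; a value above 1 means above-world-average impact. A minimal sketch with hypothetical data (the function name and numbers are illustrative, not from the slides):

```python
# Sketch of the crown indicator CPP(A,f) / FCS(W,f):
# mean citations per paper of university A's publications in field f,
# divided by the mean citation rate of all world publications in f.
from statistics import mean

def crown_indicator(univ_citations, world_citations):
    """univ_citations: citation counts of university A's papers in field f.
    world_citations: citation counts of all papers in field f worldwide."""
    cpp = mean(univ_citations)   # CPP(A,f)
    fcs = mean(world_citations)  # FCS(W,f)
    return cpp / fcs

# Hypothetical example: 5 papers by university A, a tiny "world" set.
univ = [10, 4, 7, 0, 9]           # CPP = 6.0
world = [3, 5, 1, 0, 8, 2, 6, 7]  # FCS = 4.0
print(crown_indicator(univ, world))  # prints 1.5 (above world average)
```

In practice the normalization is done per field and per document type before aggregating, which this toy sketch omits.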
15 Worldwide there are 349 "largest" universities, each with P > 700 publications/year
16 CWTS created a unique bibliometric data-system:
(1) These universities are defined and unified as accurately as possible
(2) For these universities all bibliometric indicators are calculated and updated, in particular P, C, CPP/FCSm, P·CPP/FCSm, A/E(Top5), for the universities as a whole (average over all fields) and for each of the 16 main fields → Ranking
(3) Any of these universities can be compared with any selection → Benchmarking
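The CPP/FCSm indicator in the list above extends the single-field normalization to a university publishing across several fields: citations per paper divided by FCSm, the publication-weighted mean of the world field citation scores. A hedged sketch with hypothetical data (function name and numbers are illustrative):

```python
# Sketch of CPP/FCSm for a multi-field publication set: each paper
# carries its own citation count and the world mean citation rate
# (FCS) of its field; FCSm is the mean FCS over the paper set.

def cpp_fcsm(papers):
    """papers: list of (citations, world_field_mean_FCS) per publication."""
    n = len(papers)
    cpp = sum(c for c, _ in papers) / n   # citations per publication
    fcsm = sum(f for _, f in papers) / n  # publication-weighted mean FCS
    return cpp / fcsm

# Hypothetical: four papers in fields with different citation densities.
papers = [(12, 8.0), (3, 2.0), (6, 4.0), (1, 2.0)]
# CPP = 22/4 = 5.5 ; FCSm = 16/4 = 4.0
print(cpp_fcsm(papers))  # prints 1.375
```

Weighting FCSm by the university's own publication mix is what keeps a medical school and an engineering school comparable despite very different citation densities.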
17 The CWTS Ranking is based for a large part on our work in a project funded by the European Commission, Research DG (the ASSIST project).
CWTS team: Henk Moed, Clara Calero, Robert Tijssen, Thed van Leeuwen, Ed Noyons, Renald Buter, Martijn Visser, Ton Nederhof, Ton van Raan
18 Major 33 European universities with P(y) > 2,000, ranked by normalized impact
19 Major 68 European universities with P(y) > 1,500, ranked by normalized impact
20 Major 100 European universities with P(y) > 700, ranked by "brute force"
21 Major 100 European universities with P(y) > 700, ranked by normalized impact
22 From universities as a whole to field-specific rankings
23 Three ways to define a field:
- Journals: set of publications in a field-specific group of journals (established fields, scientific coarse structure)
- Classification: set of publications with specific classification codes (scientific fine structure)
- Cluster: cluster of concept-related publications (new, emerging fields)
24 Main field = set of fields; field = set of journals (established fields, scientific coarse structure)
26 [Figure: hierarchy of main field, fields, journals]
27 Genetics &amp; Heredity: 185 journals
28 Real-life examples, current projects: University of Manchester, Leiden University. As an example, two of the 16 main fields.
29 [Figure: Clinical Medicine]
30 [Figure: Engineering]
31 [Figure: benchmark universities, NL and LERU, 2000-2005]
34 Trend analysis 1994-2005
35 Rankings put performance measurements in perspective
38 Latest development (yesterday)
43 Thank you for your attention