Size, Rankings and Bibliometrics - OECD/Nordic Universities Association, University of Iceland, Reykjavik - PowerPoint PPT Presentation


Transcript and Presenter's Notes



1
Size, Rankings and Bibliometrics
OECD/Nordic Universities Association, University of Iceland, Reykjavik, June 4-7, 2008
Anthony F.J. van Raan
Center for Science and Technology Studies (CWTS), Leiden University
2
Leiden University, founded in 1575, is the oldest university in the Netherlands and a member of the League of European Research Universities (LERU).
Leiden is a prominent and historic city (2nd, 11th) with a strong cultural and scientific tradition and one of the largest science parks in the EU.
3
(No Transcript)
4
Contents of this presentation
  • Basic Bibliometric Principles
  • Application to Rankings

5
Network of publications (nodes) linked by citations (edges).
Lower citation density: e.g., applied research, social sciences.
Higher citation density: e.g., basic natural and medical research.
Indicator labels: CPP, with JCSm and FCSm as the expected values for normalization (see the definitions below).
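For reference, a sketch of the usual CWTS definitions behind these abbreviations (my paraphrase of standard bibliometric practice, not text from the slides): with $C_i$ the citations received by publication $i$ of the unit (self-citations excluded), and $\mathrm{JCS}_i$, $\mathrm{FCS}_i$ the average citation rates of the journal and the field of publication $i$ for the same document type and citation window,

$$
\mathrm{CPP} = \frac{1}{P}\sum_{i=1}^{P} C_i, \qquad
\mathrm{JCSm} = \frac{1}{P}\sum_{i=1}^{P} \mathrm{JCS}_i, \qquad
\mathrm{FCSm} = \frac{1}{P}\sum_{i=1}^{P} \mathrm{FCS}_i .
$$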
6
Citing publications versus cited publications (schematic).
Field-specific normalization: CPP/FCSm = [C(A)/P(A)] / [C(f)/P(f)], with document-type normalization and no self-citations (also excluded from C(f)!).
c = C(A)/P(A): citations per publication of group A.
cf = C(f)/P(f): citations per publication of field f.
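A minimal sketch of this normalization in code (illustrative only, with hypothetical record fields; not the CWTS production system). Self-citations are assumed to have been removed from both the group's citation counts and the field baseline, as the slide requires:

```python
def crown_indicator(group_pubs, field_baseline):
    """CPP/FCSm for one group A.

    group_pubs:     list of dicts with keys 'field', 'doc_type', 'year' and
                    'citations' (self-citations already removed).
    field_baseline: dict mapping (field, doc_type, year) -> expected citations
                    per publication C(f)/P(f) for that stratum.
    """
    total_citations = sum(p["citations"] for p in group_pubs)        # C(A)
    total_expected = sum(
        field_baseline[(p["field"], p["doc_type"], p["year"])]       # C(f)/P(f)
        for p in group_pubs
    )
    # Ratio of means: (C(A)/P(A)) / FCSm, since the P(A) factors cancel
    return total_citations / total_expected

# Hypothetical example: 15 citations against an expected 12 gives 1.25
pubs = [
    {"field": "chemistry", "doc_type": "article", "year": 2003, "citations": 12},
    {"field": "chemistry", "doc_type": "letter",  "year": 2004, "citations": 3},
]
baseline = {("chemistry", "article", 2003): 8.0, ("chemistry", "letter", 2004): 4.0}
print(crown_indicator(pubs, baseline))  # 1.25
```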
7
Indicators
  • P: number of publications → scientific productivity, scale
  • CPP/FCSm: number of field-, document-type- and time-window-normalised citations → average scientific impact, influence
  • A/E(Top X): number of publications with impact in the top 5%, 10%, 20%, ..., relative to the expected number → distribution-related scientific impact, influence (see the sketch below)
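A sketch of how such a top-X% indicator could be computed (illustrative code under my own assumptions, not the CWTS implementation): the expected number of top-X% publications is simply X% of the group's output, and A/E compares the actual count against it.

```python
def top_x_indicator(group_citations, field_thresholds, x=10):
    """A/E(Top x): actual versus expected number of publications in the
    top-x% of their field's citation distribution.

    group_citations:  list of (field, citations) pairs for the group.
    field_thresholds: dict mapping field -> citation count marking the
                      top-x% boundary of that field (precomputed).
    """
    actual = sum(
        1 for field, cites in group_citations
        if cites >= field_thresholds[field]
    )
    expected = len(group_citations) * (x / 100.0)   # x% of the group's papers
    return actual / expected if expected else float("nan")

# Hypothetical example: 3 of 40 papers reach their field's top-10% threshold;
# the expected number is 4, so A/E(Top 10) = 0.75
pubs = [("physics", 25)] * 3 + [("physics", 2)] * 37
print(top_x_indicator(pubs, {"physics": 20}, x=10))
```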
8
[Diagram: basic research (high FCSm) versus applied research and engineering (low FCSm), combined with high/low CPP and the intermediate cases of high FCSm but low JCSm and low FCSm but high JCSm; differences of up to a factor of 20.]
Certainly for a period > 5 years, these measurements provide a very significant indicator of scientific impact (quality).
9

[Chart: CPP/FCSm versus time (1996-2005) for the institute as a whole and for groups A-E (A: 46 publications, half of which uncited; B: 10; C: 15; D: 10; E: 19); the world average is 1.0.]
10

[Diagram: the total publication universe versus the WoS sub-universe (around 8,000 journals, around 1,000,000 publications per year). Non-WoS output includes books, book chapters, conference proceedings and reports; related sources include ArXiv, Scopus, LNCS, Google, Compendex and Medline.]
Source expansion: e.g., Computer Science.
Target expansion: non-WoS analysis (references to non-WoS items).
CWTS has a license agreement with Scopus, currently compares Scopus versus WoS coverage, and can apply its bibliometric algorithms to Scopus data.
11
(No Transcript)
12
CWTS applies three types of field definitions: journal-based, classification-code based, and concept-relations based.
Field = set of journals → established fields, the medium-grained structure of science, with reference-based re-definition (expansion) of fields starting from the journals.
13
[Diagram: hierarchy from journals to fields to major fields and main fields, e.g., the main field Medical Life Sciences.]
14
  • NL national chemistry evaluation: 10 universities with major chemistry departments
  • One of these universities has 13 chemistry research groups:
  • Bioinformatics
  • Solid state NMR
  • Theoretical chemistry
  • NMR studies of proteins
  • Supramolecular chemistry
  • Synthesis of antitumor compounds
  • Synthetic organic chemistry
  • Bio-organic chemistry
  • Organometallic chemistry
  • Chemometrics
  • Crystal growth
  • Autoimmune biochemistry
  • Heat shock proteins

[Chart: chemistry at the research group level, field-normalized impact (CPP/FCSm), with a breakdown by fields.]
15
  • CWTS has a unique bibliometric data-system:
  • (1) 1,000 universities worldwide are defined and unified as accurately as possible
  • (2) For these universities all bibliometric indicators are calculated and updated, in particular P, C, CPP/FCSm, P·CPP/FCSm and A/E(Top5), for the universities as a whole (average over all fields) and for each of the 16 main fields → Ranking (a schematic sketch follows below)
  • (3) Any of these universities can be compared with any selection → Benchmarking
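A schematic sketch (illustrative only, with hypothetical column names; not the CWTS data-system) of how such per-university, per-main-field indicators and a ranking might be assembled from publication-level records:

```python
import pandas as pd

# One row per publication, with citations (self-citations removed) and the
# expected field/doc-type/year baseline value attached.
pubs = pd.DataFrame([
    ("Univ A", "Chemistry", 12, 8.0),
    ("Univ A", "Physics",    3, 6.0),
    ("Univ B", "Chemistry",  9, 8.0),
    ("Univ B", "Chemistry",  0, 8.0),
], columns=["university", "main_field", "citations", "expected"])

grouped = pubs.groupby(["university", "main_field"])
table = grouped.agg(P=("citations", "size"),      # number of publications
                    C=("citations", "sum"),       # total citations
                    E=("expected", "sum"))        # total expected citations
table["CPP/FCSm"] = table["C"] / table["E"]       # crown indicator
table["P*CPP/FCSm"] = table["P"] * table["CPP/FCSm"]  # size times impact

# Ranking: sort universities/fields by the crown indicator; benchmarking is
# then a matter of selecting and comparing rows of this table.
ranking = table.sort_values("CPP/FCSm", ascending=False)
print(ranking)
```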

16
Worldwide there are 400 'largest' universities, each with P > 700 publications per year.
17
(No Transcript)
18
[Plot: total visibility versus size.]
19
(No Transcript)
20
(No Transcript)
21
[Plot: total visibility versus size, with top- and bottom-performance universities shown separately.]
22
Finding 1: Size-dependent cumulative advantage for the impact of universities in terms of the total number of citations. Quite remarkably, lower-performance universities have a larger size-dependent cumulative advantage for receiving citations than top-performance universities.
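One common way to express such a cumulative advantage (an illustrative form added here, not taken from the slides) is a power-law fit of total citations against size; an exponent above 1 means impact grows faster than proportionally with size, and Finding 1 says the fitted exponent is larger for the lower-performance group than for the top-performance group:

$$ C \propto P^{\alpha}, \qquad \alpha > 1 \ \text{(size-dependent cumulative advantage)}. $$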
23
[Plot: invisibility versus size, top- and bottom-performance universities.]
24
[Plot: invisibility versus journal impact, top- and bottom-performance universities.]
25
Finding 2: For the lower-performance universities, the fraction of uncited publications decreases with size. The higher the average journal impact of a university, the lower the fraction of uncited publications; likewise, the higher the average number of citations per publication, the lower the fraction of uncited publications. In other words, universities that are cited more per paper also have a larger share of their papers cited at all.
26
[Plot: quality versus size, top- and bottom-performance universities.]
27
Finding 3: The average research performance of a university, measured by the crown indicator CPP/FCSm, does not dilute with increasing size. The large top-performance universities are characterized by 'big and beautiful': they succeed in keeping a high performance over a broad range of activities, which indicates their overall scientific and intellectual attractive power. But smaller universities may also perform very well.
28
[Plot: quality versus journal impact, top- and bottom-performance universities.]
29
Finding 4: Top universities publish in journals with higher journal impact than lower-performance universities. In journals with the same average impact, top universities still perform a factor of 1.3 better than bottom universities.
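Put as a formula (my paraphrase of the slide, using the notation introduced earlier): at equal average journal impact (JCSm), the measured impact of the top group is about 1.3 times that of the bottom group,

$$ \left.\frac{\mathrm{CPP}_{\text{top}}}{\mathrm{CPP}_{\text{bottom}}}\right|_{\text{equal } \mathrm{JCSm}} \approx 1.3 . $$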
30
[Plot: self-citation fraction versus journal impact.]
31
Finding 5: The fraction of self-citations decreases as a function of research performance, of average field citation density, and of average journal impact.
32
(No Transcript)
33
(No Transcript)
34
250 European universities with P(y) > 350: the top 20 in size in Physics, ranked by the crown indicator.
35
Political Science: the top 30 in Europe out of 500 universities in total.
36
Recent and current benchmark projects: Manchester, Leiden, Heidelberg, Rotterdam, Copenhagen, Zürich, Lisbon-UNL, Gent, Antwerp, VU Brussels, UCL, Southampton, Kent, East Anglia.
As an example, two of the 16 main fields.
37
[Chart: P (2000-2004) and C (2000-2005) for the benchmark university compared with NL and LERU universities.]
38
[Chart: large European university, among the top 25 in both publication output and citation impact; axes: impact ranking and publication ranking, each divided into top 25 and bottom 25.]
39
[Chart: top research university, with a top position in each discipline; same impact-ranking versus publication-ranking layout, top 25 and bottom 25.]
40
Departments (bottom-up analysis): input data (the assignment of researchers to departments) is necessary → a detailed research performance analysis of a university by department.
Fields (top-down analysis): the field structure is imposed on the university → a broad overview analysis of a university by field.
41
Large UK University vs. Leiden University
42
Large UK University vs. Leiden University
43
  • Some provoking conclusions:
  • Different research rankings by indicator are possible, but impact remains the most important
  • Different research rankings by field are possible, but strong universities are strong in most of the fields (cumulative advantage by reputation)
  • Nevertheless, universities without a very high average impact may be very strong in one or a few fields (focus, ambitions)

44
[Plot: impact versus rank.]
45
Thank you for your attention
46
Number of NL chemistry groups as a function of
c/cf
47
(No Transcript)
48
(No Transcript)
49
(No Transcript)
50
(No Transcript)
51
(No Transcript)
52
(No Transcript)
53
[Chart: Biol Sci, humans (2.5).]
54
(No Transcript)
55
(No Transcript)
56
Research excellence threshold
              c/cf > 3.0    c/cf
Cambridge     18            2.12
Leiden        11            1.35
57
Research excellence threshold
              c/cf > 3.0    c/cf
Cambridge     8             1.32
Leiden        19            1.92
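A small sketch (illustrative, under my own assumptions about the data layout; not the CWTS code) of how the share of a group's publications above such a research excellence threshold could be derived from per-publication normalized impact values c/cf:

```python
def excellence_share(c_over_cf_values, threshold=3.0):
    """Fraction of a group's publications whose field-normalized impact
    c/cf exceeds the research excellence threshold."""
    above = sum(1 for v in c_over_cf_values if v > threshold)
    return above / len(c_over_cf_values)

# Hypothetical values: 2 of 10 papers exceed c/cf = 3.0, i.e. a 20% share
values = [0.4, 0.9, 1.1, 1.3, 1.8, 2.2, 2.7, 2.9, 3.4, 5.0]
print(f"{excellence_share(values):.0%}")  # 20%
```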
58
(No Transcript)
59
...and on the basis of the 30,000,000 grammatically parsed publication abstracts (1980-2008):
Field = cluster of concept-related publications → new, emerging, often interdisciplinary fields, the fine-grained structure of science.
60
Field = set of publications with thematic/field-specific classification codes → again for new, emerging, often interdisciplinary fields, the fine-grained structure of science.
61
MeSH delineation versus journal classification: the problem of the right FCSm.
[Chart: FCSm based on ISI journal categories versus FCSm based on the PubMed (MeSH) classification.]
62
  • Time-dependent analysis is important for monitoring strengths (and/or weaknesses)
  • Unexpected strengths may show up!
  • International collaboration is very important in reinforcing a university's impact; this clearly underlines the importance of networks
  • There is a relatively large group of European universities with good overall quality and several centers of excellence; city and region give universities new opportunities (and vice versa)

63
(No Transcript)
64
(No Transcript)
65
(No Transcript)
66
Finding 4: Universities with low field citation density and low journal impact have a size-dependent cumulative advantage for the total number of citations. For lower-performance universities, field citation density provides a cumulative advantage in citations per publication. Top universities publish in journals with higher journal impact than lower-performance universities, and in journals with the same average impact they perform a factor of 1.3 better than bottom universities.
67
(No Transcript)
68
[Plot: quality versus field citation density, top- and bottom-performance universities.]