1
Research Assessment Exercise (RAE), Research Excellence Framework
(REF), Quality Related (QR) funding
  • Ellie James

2
Dual support system
  • 2009/10: Keele's total income for research was £18.3m
  • £6.6m from QR funding (based on RAE 2008 results)
  • £11.7m from research grant income
  • Research Assessment Exercise
  • A national exercise, run roughly every six years
  • Purpose: to assess, by discipline (Unit of Assessment, UoA), the
    quality of research in each university since the last RAE
  • Process is currently based on peer review
  • Results are the basis for allocating QR funding (in the HEFCE
    grant letter)

[Diagram: the dual support system, with research funding split between
QR funding (via the RAE) and Research Council grants.]
3
Definitions of quality levels (RAE 2008)
Four star: Quality that is world-leading in terms of originality, significance and rigour.
Three star: Quality that is internationally excellent in terms of originality, significance and rigour but which nonetheless falls short of the highest standards of excellence.
Two star: Quality that is recognised internationally in terms of originality, significance and rigour.
One star: Quality that is recognised nationally in terms of originality, significance and rigour.
Unclassified: Quality that falls below the standard of nationally recognised work, or work which does not meet the published definition of research for the purposes of this assessment.
4
Keele's RAE submissions
  • RAE 2001
  • In RAE 2001, 357.7 FTE staff were submitted across 27 Units of
    Assessment
  • 80% of academic staff were submitted
  • Two thirds of Keele's submissions were rated 4 or above (national
    to international excellence)
  • RAE 2008
  • A more selective approach, with high-quality, focused submissions
  • Fewer staff were submitted to fewer UoAs
  • 286.15 FTE staff submitted to 14 Units of Assessment
  • 48% of academic staff were submitted

5
RAE 2008 Results
  • All universities' results: 52,409 FTEs in 2,363 submissions
  • Keele's results: 286 FTEs submitted to 14 UoAs

Percentage of research activity at each quality level (the 2*-4* and
3*-4* columns are cumulative shares):

                   4*   3*   2*   1*   Unclass.   Mean score   2*-4*   3*-4*
All universities   17   37   33   11      1          2.58        87      55
Keele              11   35   38   14      1          2.41        85      46
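
As a sanity check, the mean score is simply the star-weighted average
of the quality profile. Using the published (rounded) sector
percentages:

    \bar{s} = \sum_k k \cdot p_k
            = 4(0.17) + 3(0.37) + 2(0.33) + 1(0.11) + 0(0.01) = 2.56

which is within rounding of the published 2.58 (presumably computed
from the unrounded profile).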
6
Keele results
7
Conclusions from RAE 2008 results
  • Keele performed well in its high-volume submissions (General
    Engineering; Social Policy; Business & Management)
  • Each discipline needs to compare its results to the sector
  • Social Policy rated 12th in the sector (out of 68)
  • History and Music have 20% at 4* (above the sector average of 17%)
  • The sciences did well
  • Law and Politics performed poorly
  • Disappointing results for Primary Care and Astrophysics compared
    to the sector (but not financially)

8
Overall feedback from RAE 2008
  • The research environment benefited from the establishment of
    research institutes
  • High proportion of early-career researchers
  • Below-average numbers of postgraduate research students
  • Below-average research income

9
Translating results into funding
  • The rules of the game are never known in advance
  • Each UoA is allocated a pot
  • HEFCE decided to weight funding as follows:
  • 1* attracts no funding
  • 2*/3*/4* funded in the ratio 1:3:9
  • 2* funding weight reduced to 0.294 (Feb 2011)
  • Emphasis on science-based (STEM) subjects
  • In RAE 2008 a much larger number of universities were successful,
    which means funding is more widely, and therefore more thinly,
    spread (see the sketch below)
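
A minimal sketch of how these weights turn a quality profile into a
relative funding volume, reusing Keele's overall RAE 2008 profile and
286 FTE from the results slide above. The function and figures are
illustrative only: real QR allocations also apply subject cost weights
and other factors not shown here.

    # Star-level weights before and after the Feb 2011 change to 2*.
    WEIGHTS_2009 = {"4*": 9.0, "3*": 3.0, "2*": 1.0, "1*": 0.0, "u": 0.0}
    WEIGHTS_2011 = {"4*": 9.0, "3*": 3.0, "2*": 0.294, "1*": 0.0, "u": 0.0}

    def weighted_volume(profile, fte, weights):
        """FTE volume weighted by a quality profile (shares sum to 1.0)."""
        return fte * sum(weights[star] * share
                         for star, share in profile.items())

    # Keele's overall RAE 2008 profile across its 286 FTE.
    keele = {"4*": 0.11, "3*": 0.35, "2*": 0.38, "1*": 0.14, "u": 0.01}
    print(round(weighted_volume(keele, 286, WEIGHTS_2009)))  # 692
    print(round(weighted_volume(keele, 286, WEIGHTS_2011)))  # 615

The drop from 692 to 615 quality-weighted FTEs shows how sharply the
2* reduction concentrates funding on 3* and 4* work.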

10
REF main changes from RAE
  • Assessing impact
  • Standardising three elements across UoAs
  • Reducing the number of UoAs (from 67 to 30) and main panels (to 4)
  • Up to a maximum of 4 outputs per person
  • Submissions due 2013, assessed 2014, informing funding from
    2015/16
  • Possibly including citation data (in some areas, to help assess
    outputs)

11
REF = RAE 2008 + impact + citations
  • Three elements to overall excellence
  • The consultation proposed the following weightings:
  • Outputs 60%
  • Impact 25%
  • Environment 15%
  • Final weighting still to be decided
  • Impact probably less than 25%, following consultation responses
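
Under the proposed weightings, a unit's overall quality score would be
a linear combination of its three element scores. A purely hypothetical
worked example (the element scores 2.8, 2.4 and 3.0 are invented for
illustration):

    \text{overall} = 0.60\,\text{outputs} + 0.25\,\text{impact}
                     + 0.15\,\text{environment}
                   = 0.60(2.8) + 0.25(2.4) + 0.15(3.0)
                   = 1.68 + 0.60 + 0.45 = 2.73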
12
Impact categories
  • REF assesses retrospective impact; Research Councils (RCs) assess
    prospective impact
13
Impact pilot
  • 29 institutions, 5 UoAs:
  • Clinical Medicine
  • Physics
  • Earth Systems and Environmental Sciences
  • Social Work and Social Policy & Administration
  • English Language and Literature

14
Evaluation framework (work in progress)
[Diagram not reproduced: underpinning research (1993-2004) and
interaction with users feed into research findings, influence and
impact, and future capacity (2005-09), alongside a snapshot of activity
across the whole unit; two worked examples illustrate underpinning
research and user interaction.]
15
Impact Pilot
  • Submissions included:
  • An impact statement for each unit
  • Case studies (1 per 10 FTE staff submitted)
  • Supporting indicators of impact
  • Panels will assess:
  • Reach: how widely the impacts have been felt
  • Significance: how transformative the impacts have been
  • Time-lag issue: research may have been undertaken 10-15 years
    earlier (but impact must be evident 2008 to 2012)

16
Pilot outcomes
  • Weighting to be agreed once the pilot completes at the end of 2010
  • Pilot successful: panels can differentiate quality
  • Case study templates have since been revised
  • The definition of impact (and benefit) may be broadened
  • The impact statement for the unit is likely to be excluded (or
    made part of environment)
  • Clarity of presentation is key (writing teams)
  • Each case study reviewed by 4 panel members (2 users, 2 academics)
  • Many case studies were based on individuals, not groups
  • Impact preparations are underway across the sector

17
Revised impact case study template
  • 1. Short summary of the case study (maximum 150 words)
  • 2. Underpinning research (maximum 500 words)
  • Provide information about the research and the specific insights
    that underpin the impact or benefit claimed in this case study.
  • 3. References to the research
  • Provide references to key research outputs, any key research
    grants, and evidence of the quality of the research (maximum of 10
    references).
  • 4. The contribution, impact or benefit (maximum 750 words)
  • Describe the impact or benefit and how the research contributed to
    this.
  • 5. References to corroborate the contribution, impact or benefit
    (normally a maximum of 10 references)

18
Lessons learned from pilot HEIs
  • It takes time to understand the concept of impact (non-academic
    impact or benefit); HEIs need to raise awareness NOW
  • "Interim" impact is ambiguous compared with "final" impact
  • A standard approach emerged:
  • Central administration project-managing submissions
  • Departmental academics leading the drafting
  • A high-level committee reviewing and giving tactical advice
  • The big challenge is acquiring supporting evidence (heavy reliance
    on the personal knowledge of senior academics)
  • Impact will impose real additional cost
  • Subject-specific challenges, e.g. English is more conceptual
  • Pilot HEIs advised other HEIs to start preparations now!

19
Lessons learned from pilot Panels
  • The best case studies make explicit the non-academic benefit from
    the research
  • Brief is best
  • Good case studies showed the link between research and impact, and
    provided supporting evidence
  • Case studies can get a high rating on either reach or significance
    (or both)
  • Engagement isn't impact
  • It is not convincing simply to state "distinguished professor"
  • Universities need to improve their presentation of evidence
  • There are issues for new departments, early-career researchers and
    small submissions
  • Don't expect panels to follow up references; these are just for
    verification

20
REF timetable
  • 17 September 2010: Deadline for applications for sub-panel chairs
  • 8 October 2010: Deadline for nominating panel members
  • October 2010: Sub-panel chairs appointed
  • November 2010: Reports from the impact pilot exercise
  • December 2010: Panel members appointed
  • Early 2011: Panels begin meeting
  • Mid 2011: Guidance on submissions published
  • Mid 2011: Panels consult on criteria
  • Late 2011: Panel criteria and methods published
  • Early 2013: Submission system operational
  • Mid- to late 2013: Panels meet to prepare for the assessment;
    further nominations sought; assessors appointed
  • Late 2013: Submissions deadline
  • 2014: Panels assess submissions
  • December 2014: Outcomes published

21
Further info
  • http://www.hefce.ac.uk/research/ref/
  • New UoAs:
  • http://www.hefce.ac.uk/research/ref/pubs/2010/01_10/
  • Pilot HEIs:
  • http://www.hefce.ac.uk/research/ref/impact/Institutions_byUOA.pdf