A case study of one institution's approach to institutional research

Transcript and Presenter's Notes



1
A case study of one institution's approach to institutional research
  • Penny Jones
  • Elizabeth Maddison
  • University of Brighton

2
Preliminaries: definition and purpose
  • "Self-study is about collective reflective practice carried out by a university with the intention of understanding better and improving its own progress towards its objectives, enhancing its institutional effectiveness, and both responding to and influencing positively the context in which it is operating. As such, self-study is intimately linked to university strategy, culture and decision-making, with an emphasis on each of the collective, reflective and practical components of this definition."
  • From Managing Institutional Self-Study by David Watson and Elizabeth Maddison, 2005

3
University of Brighton
  • >21,000 students; >2,000 staff; >£135m turnover
  • >5,500 awards in 2007
  • submitted 287 staff in 16 RAE units of assessment
  • highly distributed (five sites, UCH, four partner colleges)
  • joint medical school (first graduates July 2008)
  • major funding from HEFCE, TDA and NHS

4
University context
  • national debate on and requirements for
    accountability
  • HEFCE, TDA, NHS, PSBs etc
  • Better Regulation
  • the 'single conversation'
  • CUC PIs guidance
  • the Accountable Institution Project (HEFCE-funded, 3 universities)

5
University context
  • 1999: no real analytic capacity
  • problematic HESES return
  • 2000: first data analyst appointed on a fixed-term contract
  • 2008: two permanent data analyst posts plus one part-time survey post about to be filled
  • continuous improvement in data quality
  • 2008: clean data audit from HEFCE

6
University context
  • 2007: basket of indicators approved by the Board of Governors as the basis for their own monitoring of institutional performance against the Corporate Plan and for reporting to HEFCE
  • significant time series, including student retention; surveys of student finance, of why students chose Brighton, and of decliners
  • targets for Faculties (e.g. research grants bid for and won, research student completions, commercial income)

7
Critical success factors in IR at Brighton
  • senior management commitment; SU involvement
  • data quality improvement and sustained effort
  • real examples where data is informing practice
    and decision-making, and / or identifying
    questions to be addressed
  • feeding in at key moments (e.g. what we know
    about what students think)
  • expectation that Heads know the facts about their Schools and will investigate / challenge / respond / change practice

8
Using a data framework in an effective way
(from Managing Institutional Self-Study by David Watson and Elizabeth Maddison, 2005)
  • integrate the data cycle with the committee
    cycle, including Board of Governors
  • focus on Brighton's objectives and practices
  • focus on performance indicators identified in the corporate plan, assessing them in appropriate ways
  • keep it well organised and managed to fulfil
    internal and external requirements
  • ensure it supports risk management
  • The data framework at the University of Brighton

9
Challenges
  • timeliness of analysis
  • data quality and understanding when/where data
    does not have to be perfect
  • balancing analysis for information only with analysis to support and/or challenge decision-making
  • improving the quality of analysis over time and with changing requirements
  • data literacy: communicating analysis using different modes to provide appropriate access to different users

10
1. The Retention Report: an example of analysis well integrated into university cycles
[Diagram: an annual calendar aligning the student cycle (registration, withdrawals survey, HESES return 07/08, HESA returns 06/07 and 07/08, HESA performance indicators), the analysis cycle, and the committee cycle (Student Retention Review Group, Academic Standards Committee, Senior Management Team, Board of Governors), culminating in the Retention Report on the 06/07 student cohort.]
  • Addressing data literacy:
  • Report on the web
  • Hard copy of the report sent out to key customers
  • Lunchtime seminar tailored to attendees
  • An offer of one-to-one sessions with an analyst

Outcome: budget agreed for retention issues
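The report itself rests on a fairly simple cohort calculation. Purely as an illustration, the minimal sketch below shows how a per-School continuation rate for the 06/07 entry cohort might be derived from record-level student data; the field names ("school", "entry_year", "continuing") are invented, and this is not the University's actual method.

```python
# Minimal sketch only: one way a continuation rate for the 06/07 entry cohort
# might be computed from record-level student data. Field names are
# hypothetical, not HESA field names.
from collections import defaultdict

def continuation_rates(records, cohort="2006/07"):
    """Return {school: share of the cohort's entrants still active or completed}."""
    entrants = defaultdict(int)
    continuing = defaultdict(int)
    for r in records:
        if r["entry_year"] != cohort:
            continue
        entrants[r["school"]] += 1
        if r["continuing"]:
            continuing[r["school"]] += 1
    return {school: continuing[school] / n for school, n in entrants.items()}

# Illustrative records only.
cohort_0607 = [
    {"school": "School A", "entry_year": "2006/07", "continuing": True},
    {"school": "School A", "entry_year": "2006/07", "continuing": False},
    {"school": "School B", "entry_year": "2006/07", "continuing": True},
]
print(continuation_rates(cohort_0607))  # {'School A': 0.5, 'School B': 1.0}
```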
11
2. The National Student Survey: using incomplete data and other challenges
  • results are published at JACS subject level and do not map to internal schools and faculties
  • data are only published at department level if a threshold of 10 or more respondents is met
  • an example of the complexity follows on the next slide

12
The complexity
[Diagram illustrating the mapping problem: JACS Level 3 subjects with respondent numbers (Sociology (116), Social Policy (192), Others in Subjects Allied to Medicine (74), Psychology (113), English Studies (54)) cut across SCHOOLS A-F, whose courses (with respondent numbers) include BA Hons Social Science (30), BA Hons Criminology and Sociology (47), BA Hons Criminology and Social Policy (18), BA Hons Health and Social Care (13), BA Hons Sociology and Social Policy (11), BA Hons Criminology and Applied Psychology (77), BA Hons Applied Psychology and Sociology (36) and BA Hons English and Sociology (22), plus unidentified respondents from departments with > 10 respondents.]
13
The NSS: the challenge continued
  • difficult to ask academics to be accountable for data when we are unsure who the respondents making up the data are
  • why it matters: the Unistats website
  • resolution: this year the NSS is willing to provide a JACS mapping to make unpicking the results easier (see the sketch after this list)
  • increase response rates: more data at a lower level
  • a good example of the difficulty of balancing analysis for information only against analysis for challenge
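As a rough illustration of what such a mapping makes possible, the sketch below re-aggregates per-course respondent counts onto internal schools and suppresses any school total below the 10-respondent publication threshold mentioned earlier. The course-to-school lookup, course names and counts are invented; this is not the NSS's own process.

```python
# Minimal sketch, not the NSS's own process: re-aggregate per-course respondent
# counts onto internal schools using a course-to-school lookup, suppressing any
# school whose total falls below the 10-respondent publication threshold.
# Course and school names are invented for illustration.
from collections import defaultdict

PUBLICATION_THRESHOLD = 10

def school_totals(respondents_by_course, course_to_school):
    """Return {school: count or 'suppressed'} from per-course respondent counts."""
    totals = defaultdict(int)
    for course, n in respondents_by_course.items():
        totals[course_to_school.get(course, "Unidentified")] += n
    return {school: (n if n >= PUBLICATION_THRESHOLD else "suppressed")
            for school, n in totals.items()}

course_to_school = {
    "BA Hons Criminology and Sociology": "School A",
    "BA Hons Sociology and Social Policy": "School A",
    "BA Hons English and Sociology": "School B",
}
respondents_by_course = {
    "BA Hons Criminology and Sociology": 47,
    "BA Hons Sociology and Social Policy": 11,
    "BA Hons English and Sociology": 22,
    "BA Hons Hypothetical Course": 5,   # unmapped course -> 'Unidentified'
}
print(school_totals(respondents_by_course, course_to_school))
# {'School A': 58, 'School B': 22, 'Unidentified': 'suppressed'}
```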

14
3. The dashboard: improving analysis over time
  • new corporate plan 2007-2012
  • opportunity to improve high level analysis
    provided to senior management and Board of
    Governors
  • undertook comparator group analysis and
    researched dashboard techniques
  • the resulting UoB Dashboard (see the sketch after this list)
  • the challenges
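To give a flavour of the kind of structure such a dashboard can involve, here is a minimal sketch: indicators tracked against corporate-plan targets with a simple red/amber/green status. The indicator names, values, targets and thresholds are all invented and do not reproduce the UoB Dashboard.

```python
# Minimal sketch of a dashboard-style structure: each indicator carries a
# current value, a target and a simple RAG status. All figures are invented.
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str
    current: float
    target: float
    higher_is_better: bool = True

    def rag(self, amber_margin: float = 0.05) -> str:
        """Red/Amber/Green relative to target, with a small amber band."""
        gap = (self.current - self.target) if self.higher_is_better else (self.target - self.current)
        if gap >= 0:
            return "Green"
        return "Amber" if abs(gap) <= amber_margin * self.target else "Red"

dashboard = [
    Indicator("First-year continuation rate (%)", current=91.0, target=92.0),
    Indicator("Research income (£m)", current=8.2, target=7.5),
]
for ind in dashboard:
    print(f"{ind.name}: {ind.current} vs target {ind.target} -> {ind.rag()}")
```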

15
Tensions
(from Managing Institutional Self-Study by David Watson and Elizabeth Maddison, 2005)
16
Still to do
  • herd the plethora of people involved in data analysis and evaluation (practitioners and academics; quantitative and qualitative)
  • bring together data to give a complete perspective on each School (e.g. NSS, clearing, retention, student and staff data, student complaints / appeals)
  • clearer processes and timetable (revisiting the data cycle and framework)
  • reduce reinvention
  • review external frameworks (e.g. CSR)
  • align / create dialogue between IR and academic HE research interests
  • improve the level of analysis (school, course, subject)

17
Still to do
  • agree definitions (research, third stream)
  • continuous attention to data quality and to collecting, using and reporting on data
  • inter-institutional comparisons
  • contribute to national debate (e.g. metrics for community engagement)
  • technical capacity
  • market intelligence
  • is 'good enough' good enough?
  • continuous attention to 'so what?'
  • avoid spurious veracity