A proposed approach to demonstrating the impact of extended schools and school workforce initiatives

1
A proposed approach to demonstrating the impact
of extended schools and school workforce
initiatives
  • 7 October 2008

2
Rationale for work on impact
  • ESRAs and school workforce team members need to
    make a compelling case for the impact of their
    work, particularly in terms of LAA outcomes.
  • If LAs are successful at this, and are given
    tools to help them communicate research findings,
    a local evidence base will emerge that helps
    build a better national evidence base and also
    provides insights to inform service design.
  • TDA is looking to suggest some practical and
    pragmatic solutions to this issue.

3
Getting the language right
  • Three distinct challenges requiring three
    different types of evidence.
  • Verification: is the school meeting the full
    core offer as set out by DCSF? Is the school
    compliant with the revised Performance Management
    regulations?
    - Verification tools/questionnaires
    - Moderation practices
    - Data entry strategy
  • Outputs (quality/maturity): how good is the
    service provided?
    - Meets defined service standards (eg. gold,
      silver, bronze)
    - Uptake
    - Users' views
    - Providers'/staff's views
  • Impact: to what extent has the service delivered
    desired outcomes?
    - What is different about service users?
    - How have their attitudes changed?
    - How has their behaviour changed?
    - Do they have access to and use new resources?
    - Do they achieve things that they might
      otherwise not have?
    - Can you see any change in final outcomes data
      (eg. LAA data)?

4
Four central questions for testing hypotheses
about impact
  • Overall impact: at a basic level, we need to
    demonstrate whether the policy delivers any
    outcomes at all (eg. whether addressing needs
    outside the classroom through Extended Services
    can indeed improve learning potential, and hence
    outcomes). Is it actually possible to get from A
    to B?
  • Detailed impact: which particular aspects of the
    policy, as formulated both nationally and
    locally, seem to be more or less effective in
    delivering outcomes (for example learning
    potential, and hence outcomes) in different
    contexts? What are the best routes to get from A
    to B (or from C or D to B)?
  • Diagnostic: what else has been learnt that might
    help others improve the effectiveness and impact
    of policies? Other tips about how to get to B.
  • Value for money: on the basis of the evidence we
    have collected at a school/cluster/LA/national
    level, can we conclude that the resources
    invested in this policy have had a greater impact
    than they would have had if they were invested in
    something else? Is it worth trying to get to B?
    Perhaps we should go to K?

(Diagram: possible routes from starting points A, C
and D to destination B, with K as an alternative
destination.)
The approach outlined here is a way of moving
towards answers to these questions.
5
A few introductory concepts - towards a suggested
impact evaluation approach
6
Extended schools can be understood as innovations
or hypotheses
7
As are school workforce interventions…
  • Reduce teachers' workload → Teachers are
    happier → Happier teachers are better teachers
  • Increase number of support staff → Pupils with
    particular needs have more contact time with
    adults → Time spent with support staff removes
    barriers to learning
8
TDA tools are hypothesis forming/making in
practice
9
Which aspects of the hypothesis do we need to
test?
  • Reduce teachers' workload → Teachers are
    happier → Happier teachers are better teachers
  • Provision of Extended Services designed using
    SIPf → Improved well-being/ECM → Improved
    learning potential
  • 1) Links closely with the principles of Results
    Based Accountability
  • 2) Different standard of evidence:
    'persuasiveness' rather than 'proof'
  • 3) Objectivity
10
In conclusion, the solution to testing your
hypotheses is…
  • 1) Closely evaluate a small number of
    services/interventions, by a) articulating
    hypotheses and b) testing these against evidence
  • 2) Generalise from the findings of these
    evaluations
  • 3) Make the most of other data and information
    that you have already
The result: a persuasive case.
11
Part 1: Conducting (primary) research/evaluations
12
A step-by-step approach to doing impact
evaluations
  • Step 1: work out who should be involved in the
    evaluation.
  • Step 2: sample which services you will be
    evaluating (it is important to stress that this
    model should only be used on a small number of
    services, eg. a cluster manager could evaluate
    one service per year).
  • Step 3: get a common understanding about
    language (outputs, intermediate outcomes, final
    outcomes).
  • Step 4: agree what the final outcomes for the
    service will be, and then the intermediate
    outcomes that you think will lead to these
    outcomes.
  • Step 5: develop a general 'theory of change'
    that articulates how your intervention will lead
    to the intermediate outcomes you are looking to
    achieve.
  • Step 6: develop the theory of change into a
    detailed 'logic model' which outlines the key
    aims of your service and how these link to
    outcomes.
  • Step 7: use the logic model to determine what
    evidence you will need to collect.
  • Step 8: work out what indicators you will use
    and collect baseline data if possible.
  • Step 9: also think about what evidence you will
    need to collect from other services to enable
    you to generalise from these findings (you can
    use the SIPf design brief for this).
  • Step 10: run the service and collect the
    associated evidence.
  • Step 11: assess each step in the logic model
    against the evidence collected.
  • Step 12: conclude whether the evidence shows
    that your service has delivered impact, and
    present a coherent argument to support your
    conclusions.

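Steps 4 to 12 amount to writing the logic model down and then checking each claim against the evidence actually collected. A minimal sketch of that bookkeeping (a hypothetical illustration, not part of the TDA guidance; the `Step` class, `assess` function and example evidence are invented):

```python
from dataclasses import dataclass, field

@dataclass
class Step:
    stage: str           # eg. "Verification", "Outputs/quality", "Impact"
    claim: str           # what the logic model says should happen
    evidence_needs: str  # what Step 7 says you will need to collect
    evidence: list = field(default_factory=list)  # (source, supports_claim) pairs

def assess(model):
    """Steps 11-12: check each claim against its collected evidence."""
    report = {}
    for step in model:
        if not step.evidence:
            verdict = "no evidence"
        elif all(ok for _, ok in step.evidence):
            verdict = "supported"
        elif any(ok for _, ok in step.evidence):
            verdict = "mixed"
        else:
            verdict = "not supported"
        report[step.claim] = verdict
    return report

# Invented example, loosely following the breakfast club illustration.
model = [
    Step("Verification", "Breakfast club ran as planned",
         "Staff review", [("staff review", True)]),
    Step("Intermediate outcomes", "Attendance in maths lessons increased",
         "Attendance records",
         [("attendance records", True), ("pupil perceptions", False)]),
    Step("Impact", "Maths results improved", "Test scores", []),
]
print(assess(model))
```

The verdicts map onto Step 12's task: present a coherent, triangulated argument rather than claim proof.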
13
Developing and articulating a theory of change
Example - How do we use ES to improve the
learning potential of Year 6 boys?
  • Many of the pupils we want to target could
    potentially attend activities before school, and
    consultation with parents suggests this would be
    welcome.
  • Computing facilities are available and we have
    staff resources for a breakfast club.
  • Breakfast club activities could be targeted to
    engage pupils in mathematics, and we have a range
    of techniques for doing this.
  • We believe we can engage the pupils in these
    activities.
  • These activities will address some of the
    blockers identified through SIPf and improve
    learning potential. (How?)

14
The logic model as a suggested approach to
carrying out intensive research on specific
interventions
  • Intervention design/input - what has the service
    delivered? Evidence needs: eg. staff review,
    awareness survey.
  • Outputs/quality - how good was the service?
    Evidence needs: eg. staff review, users' views,
    partners' views.
  • Intermediate outcomes - in what ways can we show
    that users have benefited? Evidence needs: eg.
    assessments of service users, staff/partners'
    views.
  • Impact - can we link user benefits to change in
    LAA target data? Evidence needs: official
    statistics.
15
Illustrative logic model for school/cluster
evaluation of a breakfast club
  • Verification
    - We will run a pre-school maths breakfast club
      → Staff review of whether breakfast club and
      targeting happened as planned
    - This will be targeted at certain pupils
      → Attendance log
  • Outputs/quality
    - Targeted pupils will attend and continue to
      attend → Attendance log
    - Activities will be effectively tailored to
      individual pupil needs → Register of
      activities/staff review
    - Pupils will be engaged in the activities at
      the breakfast clubs → Staff review/parental
      perceptions
    - Targeted pupils say that they enjoy breakfast
      clubs → Consultation with pupils
  • Intermediate outcomes
    - Attendance and engagement at school and in
      maths lessons will increase; pupils will
      appear more confident in maths → Attendance
      records, behaviour measures (hard and soft),
      pupil perceptions, staff review
  • Final outcome
    - Maths results amongst targeted pupils show
      improvement → Improvement in test scores
16
Illustrative logic model for Performance
Management
  • Intervention design
    - School has clear Performance Management policy
      involving objective setting, review and
      classroom observation → Written PM policy and
      implementation plan
    - Teachers engage positively in school PM
      system; professional dialogue between reviewer
      and reviewee → Audit of participation,
      reviewer/reviewee assessment of level of
      engagement
    - Individual teachers' objectives are clearly
      linked to professional standards, career
      development goals and school improvement
      priorities → Review of plans
  • Outputs/quality
    - Teachers feel they understand and endorse
      their objectives and have a realistic plan for
      achieving them → Teachers' perceptions of
      process
    - CPD linked to teacher objectives is provided
      and taken up → School-wide comparison of
      teacher objectives to CPD options, take-up of
      training
  • Intermediate outcomes
    - 1) Teachers demonstrate a focus on achieving
      their PM objectives → Teacher/reviewer
      perceptions; school-wide assessment of extent
      to which objectives were met
    - 2) Teachers' personal professional development
      is informed by planning, review and feedback
      → Teachers' personal views and evidence of how
      PM learning has been applied
    - 3) All staff show understanding of school
      improvement objectives and appreciate own
      role/contribution → Staff survey, SLT
      observations
  • Impact
    - Improvements in quality of teaching → Ofsted,
      pupil/parent views
    - Improved attainment → Improved Key Stage 1-4
      scores
17
Example: how you could design an evaluation of a
multi-agency drop-in centre (real LA example)
  • Intervention design
    - A multi-agency drop-in centre for under-16s
      → Staff review of provision achieved, service
      standards met
    - Publicity/outreach effectively targets young
      people exhibiting pre-defined risk factors
      → Measure awareness of service in targeted
      community
    - Service is used by young people who exhibit
      risk factors → Attendance log, analysed by
      risk factors
  • Outputs/quality
    - Contact with/referral to partner agencies;
      multi-agency approach to individual cases
      → Log of cases/referrals, staff review
    - Provision is effectively tailored to
      individual need/circumstances; appropriate
      pathways suggested → Staff review/service user
      perceptions, case studies of individual users
    - Users respond positively to services (and
      continue to respond) → Ongoing attendance/
      engagement log, user perceptions
  • Intermediate outcomes
    - 1) Increased self-esteem/positive aspirations
      → Well-being assessments of service users
      (eg. questionnaires, interviews)
    - 2) Attendance/engagement at school → Liaison
      with school/other staff
    - 3) Users of service achieve accreditations
      → Recording and analysis of outcomes for
      individual service users
  • Impact
    - Reduced number of NEETs in community
      (long-term aim) → Official statistics
18
Elimination of teachers' administrative tasks
  • Intervention design
    - Identify tasks that teachers routinely perform
      which could be delegated to support staff or
      eliminated entirely → Identification of ??
      tasks
    - Legislation to express the principle that
      teachers do not perform these tasks → Phase 1
      of National Agreement
    - Promote change management techniques to help
      schools adopt change appropriately and
      positively → Remodelling; training to raise
      capacity and capability
    - Increase in training and numbers of support
      staff to undertake tasks that are still
      regarded as necessary → School census data on
      support staff numbers
  • Outputs/quality
    - Support staff effectively undertake routine
      tasks → DISS survey
    - All school staff understand and positively
      participate in the remodelling process;
      assessment and delegation of tasks is
      integrated into school-wide strategic vision
      for improving teaching and learning → Case
      studies, academic research, anecdotal feedback
    - Teachers have more time to focus on teaching
      and doing the right things → Teachers' views
      (through surveys and qualitative work)
  • Intermediate outcomes
    - 1) Teachers' work-life balance improves, as
      does morale → OME survey, teacher surveys
    - 2) Improved lesson preparation and innovation
      → Ofsted, other research
    - 3) Learners receive higher quality teaching
      and 1-1 support → Ofsted, other research
  • Impact
    - Improvements in quality of teaching and
      personalisation of learning → Ofsted,
      pupil/parent views
    - Improved attainment → Improved Key Stage 1-4
      scores
19
How? Data/evidence sources and indicators
  • Verification
    - Challenge is consistency at the LA level
    - Mix of yes/no checks and some more subjective
      judgements
    - Dealt with elsewhere
  • Outputs/quality
    - Participation/take-up measurements
    - Processes for staff reviews of services'
      effectiveness/quality
    - Processes for collecting pupil/parent feedback
      (linked to existing TDA consultation guidance)
  • Impact
    - Developing or using existing
      behaviour/engagement systems and turning them
      into indicators
    - Staff reviews
    - Interpreting attainment data
    - Collecting pupil/parental perceptions of
      impact
    - Measuring well-being/learning potential
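Turning raw attendance or engagement data into an indicator mostly means fixing a baseline and comparing the same cohort on the same measure. A minimal sketch (the figures, the `cohort_rate` helper and the variable names are hypothetical illustrations, not drawn from the TDA guidance):

```python
# Hypothetical termly attendance records for the targeted pupils:
# one (sessions attended, sessions possible) pair per pupil.
baseline = [(50, 60), (42, 60), (55, 60)]   # term before the service ran
follow_up = [(56, 60), (50, 60), (58, 60)]  # term the service ran

def cohort_rate(records):
    """Overall attendance rate: total attended / total possible."""
    attended = sum(a for a, _ in records)
    possible = sum(p for _, p in records)
    return attended / possible

change = cohort_rate(follow_up) - cohort_rate(baseline)
print(f"Attendance: {cohort_rate(baseline):.1%} -> "
      f"{cohort_rate(follow_up):.1%} ({change:+.1%})")
```

Aggregating before dividing keeps pupils with different numbers of possible sessions from being over- or under-weighted in the cohort figure.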

20
Guidance on writing up evaluations
  • Example structure:
  • Summarise: go back to the four levels of impact
    (overall, detailed, diagnostic and value) and
    outline the argument/conclusions in each.
  • Describe the background to the
    service/contextual factors.
  • What was the theory of change?
  • Present the intervention logic model.
  • Which assumptions within the model were proved
    correct or not? Present these with associated
    (triangulated) evidence.
  • How did external factors influence the relative
    success or failure of the service in delivering
    the desired outcomes? Could these be addressed?
  • What are the conclusions?
  • What advice would you give a school/cluster that
    is going through the same process?
  • Detailed evidence appended.

21
Relationship with other models
  • SIPf Bullseye tool
  • This approach: intervention design →
    outputs/quality → intermediate outcomes → impact
  • NFER Model
  • The Guskey Model
22
Part 2: Making the most of other (secondary)
data sources
23
How it all fits together at an LA level
  • Secondary data sources:
    - Statistical analysis of attainment, attendance
      and other relevant LAA data
    - SIMS/pupil-level data analysis, linked to
      participation
    - Case studies, good news stories, media
      coverage
    - Perceptions of pupils, staff, parents and
      communities
    - Analysis of Ofsted reports
    - External/published research and analysis
  • Primary data sources:
    - Verification and moderation system to record
      the extent of provision across the LA (eg.
      using the TDA Extended Schools Progress
      System)
    - SIPf tools provide a comprehensive view of
      service measures and outcomes across the LA
    - LA-level synthesis and generalisation from
      primary research
    - School/cluster research studies 1-4
  • Together, these build a persuasive case for
    local investment.
24
Questions?
25
Exercise
  • 1) Develop a theory of change:
    - Think of a service: who is it targeted at?
    - What final outcome(s) are you working towards?
    - What are the intermediate outcomes you believe
      are linked to this in terms of knowledge,
      attitudes and behaviour?
    - What factors do you think are associated with
      the intermediate outcomes? And how might your
      service seek to influence those factors, and
      through these the intermediate and then final
      outcomes?
  • 2) Turn it into a simple logic model.
  • 3) What sorts of evidence will you collect?
  • 4) Finally, what do you think of the
    process/methodology? (Feedback welcome)