Title: Starting from complete ignorance: Applying democracy and science to school improvement
1. Starting from complete ignorance: Applying democracy and science to school improvement
- Dr Robert Coe
- CEM Centre, Durham University
2. Do we know anything about school improvement?
"Over the past 30 years the United States has made almost no progress in raising the achievement of elementary and secondary school students, according to the National Assessment of Educational Progress, despite a 90 percent increase in real public spending per student."
- (Coalition for Evidence-Based Policy, 2002)
3. Standards rising in England?
4. (No transcript)
5. School Improvement
- The problem of school improvement
- Shortage of proven initiatives
- How do you know what works?
- The Secret of School Improvement
- Be ready to improve
- Start low
- Recruit better students
- Less cynical advice
- Be sceptical
- Monitor closely
- Evaluate properly
"It ain't ignorance as does the harm, it's knowing so many things that ain't so."
6. Problems with informal evaluation
- Hard to be objective
- Hard to see overview
- Too many other things change
- Results vary if you change nothing
- What would have happened otherwise?
7. What is evidence-based?
Before a policy is implemented or a practice recommended, it should have been tried out and evaluated.
- Intervention, not description
- Evaluation, not common sense
8. Do we need to intervene?
Effects of innovative medical therapies, from carefully designed studies: Gilbert, McPeek and Mosteller (1977)
"New treatments are as likely to be inferior as they are to be superior to existing alternatives."
- Chalmers, 1997
9. The need for rigorous evaluation
"Empirical research over the last 15 years suggests that nothing improves the chances of apparently successful innovation as much as lack of experimental control. Marked enthusiasm for an innovation is negligible in reports on controlled trials. Declarations that a program is successful are about four times more likely in research based on poor or questionable evaluation designs as in that based on adequate ones."
- Boruch (1997), Randomized Experiments for Planning and Evaluation
10. Distributed experiments
- 1. Target Setting: precise, challenging targets, systematically monitored
- 2. Assessment: formative comments and self-referenced grades
- 3. Mentoring: programme of pastoral/tutorial support
http://cem.dur.ac.uk/ebeuk/experiments
11. Research on Goal Setting
"Goals that are specific and difficult lead to a higher level of performance than vague, non-quantitative goals such as 'do your best', work at a moderate pace, or no assigned goals."
- Locke and Latham (1990), A Theory of Goal Setting and Task Performance
12. Targets ...
- focus attention on outcomes
- increase intensity of effort
- direct attention to goal-relevant activities
- increase persistence
Individuals develop task strategies in response
to targets
13. But ...
- You must have goal commitment
- You must have feedback in order to evaluate your performance
- Effect on complex tasks is smaller
- Very little evidence from educational settings
- Long-term effects not known
- Unintended consequences (Smith, 1985)
"There is only one thing worse than not getting what you want. And that is getting it."
14. Effects of different forms of assessment (comments, grades and both)
[Figure: bar chart of overall improvement (change in marks, scale -5 to +5) after two rounds of feedback, comparing three groups: comments and grades, grades only, comments only. Butler, 1988]
15. What they recalled
[Figure: recall results for the comment-and-grade group and the comments-only group]
16. "If you are going to grade or mark a piece of work, you are wasting your time writing careful diagnostic comments."
- Wiliam, 1999
17. Identifying under-aspirers
- Yellis Project allows schools to monitor progress and attitudes.
- Year 10 students complete a test of developed abilities and a questionnaire.
- Able students who say they are not planning to stay in education are identified as "under-aspirers".
- Schools then mentor, target etc.
- In 1999 some schools were asked if they would mind getting only half the list (selected at random).
18. The under-aspirers: results
- 120 Year 10 students in 15 typical schools
- Half named, half not named
- In terms of achievement in GCSEs (value added):
- Named students did worse in 12 of 15 schools
- Average difference: 0.3 grades per subject
- Overall effect size of naming: -0.38
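The reported effect size can be read as a standardised mean difference (Cohen's d, pooled standard deviation). A minimal sketch with made-up scores; the actual Yellis value-added data are not reproduced here:

```python
from statistics import mean, stdev

def effect_size(treated, control):
    """Standardised mean difference (Cohen's d) using a pooled SD."""
    n1, n2 = len(treated), len(control)
    s1, s2 = stdev(treated), stdev(control)
    pooled = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (mean(treated) - mean(control)) / pooled

# Hypothetical value-added scores: named students doing worse yields a negative d
named_scores = [4.0, 4.5, 3.8, 4.2, 3.9]
not_named_scores = [4.6, 4.8, 4.3, 4.9, 4.4]
d = effect_size(named_scores, not_named_scores)  # negative in this example
```

A negative d, as in the slide's -0.38, means the named (treated) group scored lower on average than the withheld controls.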
19. [Figure: value-added GCSE achievement for named vs. not-named students]
20. Was more counselling better?
21. Under-aspirers (2)
196 students in 26 schools; GCSEs in 2002
ES = 0.12
22. Making evidence-based decisions
- The problem (and possible solutions) must be generalisable
- Agreement about outcomes?
- What evidence exists already?
- Theory (formal and informal)
- Experience
- Research
- Systematic reviews of research
- Conduct an experiment
23. Why involve teachers?
- Only those who do the job can ask the right questions in the right ways
- Only those close to the outcomes can provide a rich and detailed understanding of them
- Many teachers are already experimenting
- Assimilation, not dissemination: other people's ideas don't have the same impact
- We must evaluate actual implementation, not just ideal policy
- Only multi-site trials can give generalisable findings
- The process of doing the research is hugely valuable
24. Implications
- Need for more trials
- Need for systematic reviews
- Need for debates about important outcomes
- Need for debates about important questions
- Need for involvement of those closest to the outcomes and implementation
- Need to evaluate actual practice, not just ideal policy
25. Collaborative Experiments
- Neil Appleby, Park View Community School, Chester-le-Street: Effects of a mentoring programme for Y8 pupils
- Ian Duckett, Slough Borough Council Education Department: Effectiveness of different ways of target setting
- Mike Ion, Blessed Robert Johnson Catholic College, Telford: Effectiveness of different ways of target setting in KS3 maths and science
- Kevin Sellwood, Ridgeway School, Plymouth: Effectiveness of different ways of target setting at KS4
- Richard Sinclair, Bedales School, Hampshire: Effects of different ways of giving pupils feedback on work
- Aidan Smith, De Lisle RC School, Loughborough: Effectiveness of different ways of target setting at KS4
- Zoe Spavold, Fitzharrys School, Abingdon: Using self-assessment to promote learning
- Paul Stevens, Fort Pitt Grammar School, Chatham: Effects of different ways of marking work
- Sabine Stroud, The Arnewood School, Hampshire: Effectiveness of different ways of target setting with G&T pupils at KS3/4
- Neil Walker, George Stephenson High School, Newcastle: Effectiveness of GOAL assessment
26. Evidence-Based Education Network
- Gain access to existing evidence: accessible, clear, brief, non-technical, not-oversimplified summaries (!)
- Keep in contact with others of like mind and with latest developments
- Create and share evidence: design and take part in experiments, disseminate results
- Campaign to oppose unjustified policies and to promote a culture of evidence
Sign up for details: http://cem.dur.ac.uk/ebeuk
27. Evidence-Based Policies and Indicator Systems Conference, 2003
28. Dr Robert Coe
Curriculum, Evaluation and Management (CEM) Centre, University of Durham
Mountjoy Research Centre 4, Stockton Road, Durham DH1 3UZ
Tel: 0191 374 4504; Fax: 0191 374 1900
Email: r.j.coe@dur.ac.uk
http://www.cem.dur.ac.uk/ebeuk