Impact Evaluation: Presentation to DAC Evaluation Working Group, Paris, June 2, 2005

Transcript and Presenter's Notes

1
Impact Evaluation: Presentation to DAC
Evaluation Working Group, Paris, June 2, 2005
  • Howard White
  • Operations Evaluation Department
  • World Bank

2
What is impact evaluation?
  • The term has had different meanings:
  • Sector- or country-wide evaluation
  • Long-run effects
  • Establishing the counterfactual
  • Focus on final welfare outcomes
  • OED adopts a combination of the last two: a
    counterfactual-based analysis of how the
    intervention has affected welfare outcomes (see
    the sketch below)
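
The counterfactual comparison can be written in standard potential-outcomes notation (a generic formulation, not taken from the slides): with $Y^{1}$ the welfare outcome with the intervention, $Y^{0}$ the outcome without it, and $D = 1$ indicating exposure to the intervention, the impact on those reached is

\[
  \text{Impact} \;=\; E\left[\,Y^{1} - Y^{0} \mid D = 1\,\right]
               \;=\; E\left[Y^{1} \mid D = 1\right] - E\left[Y^{0} \mid D = 1\right].
\]

The second term, what would have happened to those reached had there been no intervention, is never observed; the designs discussed later (randomization, matching and other non-experimental approaches) exist to estimate it from a comparison group.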

3
Impact evaluation in the development context
  • Has been associated with poverty, gender and
    environment effects, using mixed methods with a
    bias toward the qualitative
  • But there is a growing tide of quantitative
    impact evaluation, driven by:
  • New techniques and technology
  • The results-based agenda (including the MDGs)

4
Impact evaluation at the World Bank
  • As with all evaluation, much work takes place
    outside of OED
  • For impact, there is a new initiative (DIME)
    promoting greater use of impact evaluation. These
    studies are:
  • All prospective
  • Attempts to promote randomization
  • OED's own program is adopting a range of
    non-experimental approaches, firmly grounded in
    context

5
The existing OED program
  • Impact evaluation is not new to OED
  • Over 80 studies are classified as impact
    evaluations, and others not so classified adopt
    different approaches to measuring impact
  • The current program, under the OED-DFID
    partnership, supports three studies:
  • Ghana basic education
  • Bangladesh maternal and child health
  • India rural poverty

6
Ghana: method and approach
  • Main data collection was a survey following up
    the education module of the 1988 GLSS2
  • Combined with time spent in the field over three
    visits
  • Background analysis of the budget and political
    economy (context)
  • Multivariate analysis of the determinants of
    educational outcomes, linking those determinants
    to donor-financed activities (illustrated below)
  • Work in collaboration with MOEYS and GSS
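
As a rough sketch of what such a multivariate analysis might look like (illustrative only; the data file and variable names below are hypothetical, not taken from the Ghana study), one could regress a learning outcome on household and school characteristics, including a donor-financed input, using OLS in Python:

  import pandas as pd
  import statsmodels.formula.api as smf

  # Hypothetical follow-up survey data; all column names are illustrative
  df = pd.read_csv("ghana_followup_survey.csv")

  # Determinants of a test-score outcome, with an indicator of Bank-financed
  # school infrastructure among the regressors
  model = smf.ols(
      "test_score ~ improved_classrooms + textbooks_per_pupil"
      " + teacher_training + parental_education + log_household_expenditure",
      data=df,
  )

  # Cluster standard errors at the school level
  results = model.fit(cov_type="cluster", cov_kwds={"groups": df["school_id"]})
  print(results.summary())

The coefficient on the infrastructure variable is then the estimated link from a donor-financed input to the outcome, conditional on the other determinants.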

7
Ghana: findings
  • Enrolments are rising
  • Learning outcomes are improving
  • Better school infrastructure is part of the
    explanation
  • Hence Bank-financed school investments lie behind
    a substantial part of these improvements in
    education outcomes
  • But there is a growing dichotomization of the
    public sector (partly rectified in the new
    Bank-supported program)

8
Bangladesh: method and approach
  • Initial meta-analysis
  • Using existing data sets (mainly DHS)
  • Own analysis plus commissioned studies
  • Multivariate analysis of the determinants of
    mortality and nutrition (Oaxaca decomposition)
  • For BINP, propensity score matching combining two
    datasets (problems of poor data quality and a
    contaminated control group); see the sketch below
  • Holistic approach
  • Work with IMED and a local research company
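
Two of the techniques named above can be made concrete. The Oaxaca decomposition splits the gap in mean outcomes between groups A and B into a part explained by differences in characteristics and a part attributable to differences in coefficients (a standard textbook form, taking group A's coefficients as the reference):

\[
  \bar{Y}_A - \bar{Y}_B
    \;=\; (\bar{X}_A - \bar{X}_B)'\hat{\beta}_A
    \;+\; \bar{X}_B'(\hat{\beta}_A - \hat{\beta}_B).
\]

The propensity score matching for BINP might proceed along the lines of the following sketch (illustrative only; the file and column names are hypothetical and do not come from the BINP evaluation):

  import pandas as pd
  from sklearn.linear_model import LogisticRegression
  from sklearn.neighbors import NearestNeighbors

  # Hypothetical pooled data: programme participants and non-participants,
  # a child nutrition outcome, and pre-intervention covariates
  df = pd.read_csv("binp_pooled.csv")
  covariates = ["mother_education", "household_assets", "child_age_months"]

  # 1. Estimate the propensity score: probability of participation given covariates
  ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["participant"])
  df["pscore"] = ps_model.predict_proba(df[covariates])[:, 1]

  # 2. Nearest-neighbour matching of each participant to a non-participant
  treated = df[df["participant"] == 1]
  control = df[df["participant"] == 0]
  nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
  _, idx = nn.kneighbors(treated[["pscore"]])
  matched_control = control.iloc[idx.ravel()]

  # 3. Average effect on participants: matched difference in mean outcomes
  att = treated["weight_for_age_z"].mean() - matched_control["weight_for_age_z"].mean()
  print(f"Estimated effect on participants: {att:.3f}")

A contaminated control group, as flagged above, would tend to bias such an estimate toward zero, since some of the matched 'controls' have in fact received similar services.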

9
Bangladesh: findings
  • Publicly-supported interventions played a role in
    the successful reduction of fertility and
    mortality
  • Immunization is the most cost-effective
    intervention; also cost-effective are:
  • Training traditional birth attendants (TBAs)
  • Female secondary stipends
  • Strong cross-sectoral effects (which need not
    imply multi-sectoral programs)
  • The theory-based approach to BINP identifies weak
    links that help explain poor outcomes

10
India rural poverty
  • Looking at two interventions:
  • AP Irrigation II and III
  • AP Rural Livelihoods Project
  • Three rounds of surveys (2005-2007)
  • Using innovative data collection instruments for
    hard-to-measure ('immeasurable') outcomes

11
Do you want to do your own impact studies?
  • Pros:
  • Demonstrate results
  • Perceived as high-quality products, which appear
    to find an audience in country teams
  • Cons:
  • Expensive
  • Technically demanding
  • So when to use:
  • Periodic validation
  • Pilots

12
What do you need to do your own impact study?
  • Some opportunism in selecting cases
  • An intervention of sufficient scale to justify the
    cost
  • A lengthy lead time, especially if collecting your
    own data (18 months from start to finish is the
    best you can hope for; 24-30 months is more
    realistic)
  • An appropriate skills mix
  • Promote prospective evaluation

13
Data collection for impact evaluation
14
How OED is planning to take its program forward
  • The partnership has helped consolidate the
    commitment to continuing impact evaluation
  • A study a year is built into the work program
  • One prospective study is being agreed (possibly
    Karnataka health)
  • Open to discussion of an expanded program