1
Evaluating Program Impact: Some Random Thoughts
  • Paul Gertler
  • UC Berkeley

2
Today
  • Impact Evaluation versus traditional M&E
  • Internal validity
  • External validity
  • Operational issues

3
Internal Validity
  • Attribution and causality
  • The counterfactual
  • Counterfeit counterfactuals
  • Possible Solutions

4
Solving the evaluation problem
  • Counterfactual: what would have
    happened without the program?
  • Need to estimate the counterfactual
  • i.e. find a control or comparison group
  • Counterfactual criteria
  • Treated & counterfactual groups have
    identical characteristics on average
  • Only reason for the difference in outcomes
    is the intervention

5
2 Counterfeit Counterfactuals
  • Before and after
  • Same individual before the treatment
  • Problem: things can change over time
  • Beneficiaries versus non-beneficiaries
  • Those who choose not to enroll in the program
  • Those who were not offered the program
  • Problem: why did they not participate?
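The before-and-after problem can be sketched numerically. In this minimal simulation (all numbers are illustrative assumptions, not from the talk), outcomes improve for everyone over time, so a before-after comparison bundles that common trend into the "impact":

```python
import random
import statistics

random.seed(0)

TRUE_EFFECT = 2.0  # hypothetical program impact
TIME_TREND = 3.0   # outcomes improve for everyone over time, program or not

# Baseline outcomes for 1,000 treated individuals.
before = [50 + random.gauss(0, 1) for _ in range(1000)]
# Endline outcomes: each person gains the common trend plus the program effect.
after = [y + TIME_TREND + TRUE_EFFECT for y in before]

# The before-after "estimate" cannot separate the trend from the effect.
before_after = statistics.mean(after) - statistics.mean(before)
print(round(before_after, 1))  # 5.0: overstates the true effect of 2.0
```

The comparison attributes the full 5-point change to the program, even though 3 points would have happened anyway.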

6
Possible Solutions
  • Comparability of treatment & control groups so the
    ONLY remaining difference is the intervention
  • Solutions
  • Experimental or random assignment
  • Quasi-experiments
  • Regression Discontinuity
  • Double differences
  • Ex post Matching is NOT a solution

7
Random Assignment
  • Not same as random sampling
  • Random assignment to treatment & control
  • Balances observed and unobserved characteristics
  • Treatment status is uncorrelated with
    characteristics
  • Only difference is that one group gets the program
  • Assignment Level
  • Individual or community (group)
  • Random promotion
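The balancing property can be demonstrated with a small simulation (population size and trait distributions are illustrative assumptions): because assignment is a coin flip, it is uncorrelated with both an observed trait and an "unobserved" one, so the two groups match on average:

```python
import random
import statistics

random.seed(1)

# Hypothetical eligible population: one observed baseline characteristic
# (monthly income) and one unobserved characteristic (motivation).
population = [(random.gauss(300, 50), random.gauss(0, 1)) for _ in range(10000)]

# Random assignment: a coin flip per person, so treatment status
# is uncorrelated with either characteristic.
treated, control = [], []
for person in population:
    (treated if random.random() < 0.5 else control).append(person)

income_gap = statistics.mean(p[0] for p in treated) - statistics.mean(p[0] for p in control)
motivation_gap = statistics.mean(p[1] for p in treated) - statistics.mean(p[1] for p in control)
print(round(income_gap, 2), round(motivation_gap, 2))  # both close to 0
```

Note this balances the unobserved trait too, which no amount of ex post matching on observables can guarantee.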

8
Regression Discontinuity
  • Eligibility criteria
  • Observable
  • Continuous (e.g. income, age, geography)
  • Compare
  • Treatment group: just eligible
  • Comparison group: just ineligible
  • Local comparison is like random assignment
  • Estimation is only valid close to the cutoff
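A minimal sketch of the local comparison (the cutoff, effect size, and bandwidth are made-up assumptions): individuals just below the eligibility cutoff get the program, those just above do not, and comparing mean outcomes within a narrow band around the cutoff recovers roughly the true effect:

```python
import random
import statistics

random.seed(2)

CUTOFF = 60.0    # hypothetical eligibility cutoff on a poverty score
EFFECT = 4.0     # hypothetical program impact
BANDWIDTH = 1.0  # keep only observations close to the cutoff

def outcome(score):
    # Outcomes vary smoothly with the score; the program adds EFFECT
    # only below the cutoff (the eligible side).
    program = EFFECT if score < CUTOFF else 0.0
    return 0.2 * score + program + random.gauss(0, 1)

scores = [random.uniform(40, 80) for _ in range(20000)]
just_eligible = [outcome(s) for s in scores if CUTOFF - BANDWIDTH <= s < CUTOFF]
just_ineligible = [outcome(s) for s in scores if CUTOFF <= s <= CUTOFF + BANDWIDTH]

rd = statistics.mean(just_eligible) - statistics.mean(just_ineligible)
print(round(rd, 1))  # near EFFECT = 4.0; valid only locally, around the cutoff
```

Widening the bandwidth brings in individuals whose scores, and hence outcomes, differ for reasons other than the program, which is why the estimate is only trustworthy near the cutoff.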

9
Difference in Differences
  • Baseline & endline survey of
    treatment & comparison groups
  • Compare change in the treatment group to change in
    the control group
  • Use change in the control group to estimate what
    would have been the change in the treatment group
  • Can combine D-in-D with RD
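The arithmetic reduces to four group means. With made-up survey numbers (illustrative only), the control group's change estimates the common trend, and subtracting it from the treatment group's change isolates the impact:

```python
# Hypothetical baseline & endline survey means (not from the talk).
treat_baseline, treat_endline = 40.0, 49.0
control_baseline, control_endline = 42.0, 47.0

change_treat = treat_endline - treat_baseline        # 9.0 = impact + common trend
change_control = control_endline - control_baseline  # 5.0 = common trend only
did = change_treat - change_control                  # subtract out the trend
print(did)  # 4.0, the estimated program impact
```

A simple before-after on the treated group alone would have reported 9.0, overstating the impact by the 5.0 common trend.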

10
[Figure: outcome Y against time, from T0 to T1. The treated group's outcome moves from B at baseline to C at endline; the before-after estimate is (C − B).]
11
[Figure: outcome Y against time, from T0 to T1. The treated group moves from B to C; the control group moves from A to E; D marks the counterfactual endline for the treated group. The diff-in-diff estimate is (C − B) − (E − A). Note: (D − B) = (E − A).]
12
External Validity
  • Can results from one study population be used for
    policy regarding another?
  • Different culture, socio-economic status, environment
  • Efficacy versus effectiveness
  • Solutions
  • Random sample of the target population at scale
  • Multiple sites

13
Operational Issues
  • Retrospective versus prospective
  • Where do control groups come from?
  • Ethics
  • Communicating to policy makers
  • Data sources
  • Budget management

14
Retrospective Analysis
  • Examples
  • Randomization: Housing (Mexico)
  • Regression discontinuity: Bono Sol (Bolivia)
  • Issues
  • How were benefits assigned?
  • Were administrative records kept?
  • Is there an adequate baseline survey?

15
Prospective Analysis
  • The evaluation is designed in parallel with the
    assignment of the program
  • Baseline data can be gathered
  • Use program rollout
  • Example
  • Progresa/Oportunidades (México)

16
Program Rollout: Who gets the program first?
  • Eligibility criteria
  • Are benefits targeted?
  • How are they targeted?
  • Can we rank eligibles by priority?
  • Are measures good enough for fine rankings?
  • Operational rules need to be
  • Equitable, transparent and accountable

17
The Method depends on the rules of
operation
18
Ethical Considerations
  • Do not delay benefits: rollout is based on
    budget/administrative constraints anyway
  • Equity: equally deserving beneficiaries deserve
    an equal chance of going first
  • Transparent & accountable method
  • Give everyone eligible an equal chance
    (e.g. Colombia School Vouchers, Mexico Tu
    Casa)
  • If ranking is based on some criteria, the criteria
    should be quantitative and public

19
Data Sources
  • Need baseline & follow-up (endline)
  • Existing sources
  • DHS, labor force, other sample surveys
  • Administrative data: vital stats, crime, etc.
  • Can existing sources be used?
  • Do they have the indicators?
  • Are they in control & treatment areas?
  • Do they have sufficient sample sizes?
  • Typically no; need to collect own data
  • Biggest part of the budget

20
IE and Monitoring Systems
  • Projects/programs regularly collect
    data for management purposes
  • Typical content
  • Lists of beneficiaries
  • Distribution of benefits
  • Expenditures
  • Outcomes
  • Ongoing process evaluation
  • Key for impact evaluation

21
Use Monitoring data for IE
  • Program monitoring data are usually collected only in
    areas where the program is active
  • If collection starts in control areas at the same time as in
    treatment areas, there is a baseline for both
  • Add a couple of outcome indicators
  • Very cost-effective, as there is little need
    for additional special surveys
  • Most IEs use only monitoring data

22
Overall Messages
  • Internal validity
  • External validity
  • Use rules of operation to get control groups
  • Evaluation must be ethical
  • Save money by generating data for IE from M&E
  • IE is part of doing business
  • IE is just another operation

23
Design Messages
  • Start evaluation during program design
  • Prospective design
  • Find valid control groups
  • Equity, transparency & accountability
  • Baseline with a large enough sample & indicators
  • Integration of evaluation & operations
  • Evaluation is just another operation
  • Ensure the design is maintained
  • Good monitoring systems & administrative data can
    improve IE and lower costs