1
The RealWorld Evaluation Approach to Impact Evaluation
With reference to the chapter in the Country-led monitoring and evaluation systems book
  • Michael Bamberger and Jim Rugh
  • Note: more information is available at www.RealWorldEvaluation.org

2
The extensive use of weak impact evaluation
designs
  • Most impact evaluations are not able to use the
    textbook designs with pre-test/post-test project
    and control group comparisons
  • Most assessments of impact are based on
    methodologically weak designs
  • Many claims about project impacts are not justified, and there tends to be a positive bias in many evaluation reports
  • Very few evaluation reports assess the validity
    of the methodology and findings.

3
Weak evaluation designs are due to
  • Time constraints
  • Budget constraints
  • Data constraints
    • Non-availability, including lack of baseline data
    • Quality
  • Political constraints
    • Lack of evaluation culture
    • Lack of understanding of the value of evaluation
    • Unwillingness to accept criticism
    • Lack of expertise
    • Use of information as a political tool

4
The Real-World Evaluation Approach
  • Step 1: Planning and scoping the evaluation
    A. Defining client information needs and understanding the political context
    B. Defining the program theory model
    C. Identifying time, budget, data and political constraints to be addressed by the RWE
    D. Selecting the design that best addresses client needs within the RWE constraints
    E. Assessing methodological quality and validity and defining minimum acceptable design standards

  • Step 2: Addressing budget constraints
    A. Modify evaluation design
    B. Rationalize data needs
    C. Look for reliable secondary data
    D. Revise sample design (a sample-size sketch follows below)
    E. Economical data collection methods
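
To make the trade-off behind "D. Revise sample design" concrete, here is a minimal sketch of a standard two-group sample-size calculation (plain Python; the effect sizes, standard deviation, significance level and power are illustrative assumptions, not figures from the RWE materials). Halving the smallest effect you need to detect roughly quadruples the number of interviews, which is usually where the budget discussion starts.

```python
from math import ceil

def two_group_sample_size(delta, sigma, alpha=0.05, power=0.80):
    """Approximate sample size per group needed to detect a mean
    difference `delta` between project and comparison groups,
    assuming a common standard deviation `sigma` (two-sided test)."""
    # Normal-approximation critical values for the supported alpha/power choices.
    z_alpha = {0.05: 1.96, 0.10: 1.645}[alpha]
    z_beta = {0.80: 0.84, 0.90: 1.282}[power]
    return ceil(2 * (z_alpha + z_beta) ** 2 * sigma ** 2 / delta ** 2)

# Illustrative numbers only: detecting a 5-point gain on an outcome with
# standard deviation 15 needs roughly four times as many interviews per
# group as detecting a 10-point gain.
print(two_group_sample_size(delta=10, sigma=15))  # 36 per group
print(two_group_sample_size(delta=5, sigma=15))   # 142 per group
```
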
  • Step 3: Addressing time constraints
    All Step 2 tools, plus:
    F. Commissioning preparatory studies
    G. Hiring more resource persons
    H. Revising the format of project records to include critical data for impact analysis
    I. Modern data collection and analysis technology
  • Step 4: Addressing data constraints
    A. Reconstructing baseline data
    B. Recreating comparison groups
    C. Working with non-equivalent comparison groups
    D. Collecting data on sensitive topics or from difficult-to-reach groups
    E. Multiple methods

  • Step 5: Addressing political influences
    A. Accommodating pressures from funding agencies or clients on evaluation design
    B. Addressing stakeholder methodological preferences
    C. Recognizing the influence of professional research paradigms
  • Step 6: Assessing and addressing the strengths and weaknesses of the evaluation design
    An integrated checklist for multi-method designs:
    A. Objectivity / confirmability
    B. Replicability / dependability
    C. Internal validity / credibility / authenticity
    D. External validity / transferability / fittingness
  • Step 7: Helping clients use the evaluation
    A. Utilization
    B. Application
    C. Orientation
    D. Action
5
How RWE contributes to country-led Monitoring and
Evaluation
  • Increasing the uptake of evidence into policy making
  • Involving stakeholders in the design, implementation, analysis and dissemination
  • Using program theory to:
    • base the evaluation on stakeholder understanding of the program and its objectives
    • ensure the evaluation focuses on key issues
  • Presenting findings when they are needed
  • Using the client's preferred communication style

6
  • The quality challenge: matching technical rigor and policy relevance
  • Adapting the evaluation design to the level of rigor required by decision makers
  • Use of the threats-to-validity checklist at several points in the evaluation cycle
  • Defining minimum acceptable levels of methodological rigor
  • Avoiding positive bias in the evaluation design and presentation of findings
  • How to present negative findings

7
  • Adapting country-led evaluation to real-world constraints
  • Adapting the system to real-world budget, time and data constraints
  • Ensuring evaluations produce useful and actionable information
  • Adapting the M&E system to national political, administrative and evaluation cultures
  • Focus on institutionalization of M&E systems, not just ad hoc evaluations
  • Evaluation capacity development
  • Focus on quality assurance

8
The RealWorld Evaluation Approach
  • An integrated approach to ensure acceptable standards of methodological rigor while operating under real-world budget, time, data and political constraints.

See summary chapter and workshop presentations
at www.RealWorldEvaluation.org for more details
9
Reality Check: Real-World Challenges to Evaluation
  • All too often, project designers do not think evaluatively; the evaluation is not designed until the end
  • There was no baseline, at least not one with data comparable to the evaluation
  • There was/can be no control/comparison group
  • Limited time and resources for the evaluation
  • Clients have prior expectations for what the evaluation findings will say
  • Many stakeholders do not understand evaluation, distrust the process, or even see it as a threat (dislike of being judged)

10
Determining appropriate (and feasible) evaluation
design
  • Based on an understanding of client information
    needs, required level of rigor, and what is
    possible given the constraints, the evaluator and
    client need to determine what evaluation design
    is required and possible under the circumstances.

11
Design 1: Longitudinal Quasi-experimental

                       baseline   midterm   end of project   post project
Project participants      P1    X    P2   X       P3              P4
Comparison group          C1         C2            C3              C4

(P = observation of project participants, C = observation of the comparison
group, X = exposure to the project intervention)
12
Design 2: Quasi-experimental (pre and post, with comparison)

                       baseline   end of project
Project participants      P1    X       P2
Comparison group          C1             C2
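
As a purely illustrative sketch (the numbers are invented, not taken from the RWE materials), the four observation points of Design 2 are typically combined into a double-difference (difference-in-differences) impact estimate: the change in the project group minus the change in the comparison group.

```python
# Hypothetical group means for the four observation points in Design 2:
# P1/P2 = project group at baseline / end of project,
# C1/C2 = comparison group at baseline / end of project.
P1, P2 = 42.0, 55.0
C1, C2 = 40.0, 46.0

project_change = P2 - P1       # 13.0: change observed among participants
comparison_change = C2 - C1    #  6.0: change that happened without the project
impact_estimate = project_change - comparison_change
print(impact_estimate)         # 7.0: double-difference estimate of project impact
```

Designs 6 and 7 later in the deck drop the comparison row, leaving only the single difference P2 - P1 (or P alone), which is why they sit at the weak end of the range.
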
13
Design 2: Randomized Control Trial

                       baseline   end of project
Project participants      P1    X       P2
Control group             C1             C2

Research subjects are randomly assigned either to the project group or to the
control group.
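
Because assignment is random, the analysis of this design can be as simple as comparing end-of-project means between the two groups. The sketch below is illustrative only: the outcome data are simulated, and SciPy's independent-samples t-test is an assumed dependency rather than anything prescribed by the RWE chapter.

```python
import random
from statistics import mean
from scipy import stats  # assumed dependency, used only for the t-test

random.seed(1)
eligible = list(range(200))      # hypothetical pool of eligible individuals
random.shuffle(eligible)
project_ids = eligible[:100]     # randomly assigned to receive the project
control_ids = eligible[100:]     # randomly assigned to the control group

# Simulated end-of-project outcomes (the P2 and C2 observations in the diagram).
project_outcomes = [55 + random.gauss(0, 10) for _ in project_ids]
control_outcomes = [48 + random.gauss(0, 10) for _ in control_ids]

# With random assignment, the difference in end-of-project means is an
# unbiased estimate of impact; the t-test gauges statistical significance.
impact = mean(project_outcomes) - mean(control_outcomes)
t_stat, p_value = stats.ttest_ind(project_outcomes, control_outcomes)
print(impact, p_value)
```
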
14
Design 3: Truncated Longitudinal

                       midterm   end of project
Project participants  X   P1   X       P2
Comparison group          C1             C2
15
Design 4: Pre and post of project, post-only comparison

                       baseline   end of project
Project participants      P1    X       P2
Comparison group                          C
16
Design 5: Post-test only of project and comparison

                       end of project
Project participants  X       P
Comparison group              C
17
Design 6: Pre and post of project, no comparison

                       baseline   end of project
Project participants      P1    X       P2
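
For contrast with Design 2, a brief illustrative calculation (invented numbers) of what Design 6 allows analytically: without a comparison group, the single before-after difference bundles project impact together with whatever change would have occurred anyway.

```python
# Hypothetical means for Design 6: only the project group is observed.
P1, P2 = 42.0, 55.0
before_after_change = P2 - P1   # 13.0
# With no C1/C2 there is nothing to subtract for the secular trend, so this
# 13-point change cannot be attributed to the project with any confidence.
print(before_after_change)
```
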
18
Design 7: Post-test only of project participants

                       end of project
Project participants  X       P
19
Other questions to answer as you plan an impact
evaluation
  • What are the key questions to be answered? For
    whom? What evidence will adequately inform them?
  • Will there be a next phase, or other projects
    designed based on the findings of this
    evaluation?
  • Is this a simple, complicated or complex
    situation (see next slide)?

20
Implications for understanding impact and using impact evaluation

                        | SIMPLE              | COMPLICATED                            | COMPLEX
Question answered       | What works?         | What works for whom in what contexts?  | How do multiple interventions combine to produce the impact? What's working?
Process needed          | Knowledge transfer  | Knowledge translation                  | Knowledge generation
Nature of direction     | Single way to do it | Contingent                             | Dynamic and emergent
Metaphor for direction  | Written directions  | Map and timetable                      | Compass
21
Other questions to answer as you plan an impact
evaluation
  • Will focusing on one quantifiable indicator
    adequately represent impact?
  • Is it feasible to expect there to be a clear,
    linear cause-effect chain attributable to one
    unique intervention? Or will we have to account
    for multiple plausible contributions by various
    agencies and actors to higher-level impact?
  • Would one data collection method suffice, or should a combination of multiple methods be used?

22
Ways to reconstruct baseline conditions
  1. Secondary data
  2. Project records
  3. Recall
  4. Key informants
  5. PRA and other participatory techniques, such as timelines and critical incidents, to help establish the chronology of important changes in the community
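
A minimal sketch of option 1, reconstructing a baseline from secondary data, assuming a hypothetical district-level administrative dataset loaded with pandas; the file name, column names and project start year are invented for illustration.

```python
import pandas as pd

# Hypothetical secondary data: annual school enrolment by district, collected
# by the education ministry independently of the project.
records = pd.read_csv("district_enrolment.csv")  # columns: district, year, enrolment

PROJECT_START = 2018  # illustrative project start year
baseline = (records[records["year"] < PROJECT_START]
            .groupby("district", as_index=False)["enrolment"]
            .mean()
            .rename(columns={"enrolment": "baseline_enrolment"}))

endline = (records[records["year"] == 2022]
           .rename(columns={"enrolment": "endline_enrolment"}))

# Merge the reconstructed baseline with end-of-project values for the analysis.
panel = endline.merge(baseline, on="district")
panel["change"] = panel["endline_enrolment"] - panel["baseline_enrolment"]
print(panel.head())
```
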

23
Assessing the utility of potential secondary data
  • Reference period
  • Population coverage
  • Inclusion of required indicators
  • Completeness
  • Accuracy
  • Free from bias

24
Ways to reconstruct comparison groups
  • Judgmental matching of communities
  • When project services are introduced in phases, beneficiaries entering in later phases can be used as a pipeline control group
  • Internal controls, when different subjects receive different combinations and levels of services
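
Where community-level indicator data happen to exist, judgmental matching can be supplemented by a simple statistical matching step. The sketch below is one possible illustration (nearest-neighbour matching on standardized community characteristics, not a procedure prescribed by the RWE chapter); all figures and variable names are invented.

```python
import numpy as np

# Hypothetical characteristics (population, poverty rate, km to market) for
# 3 project communities and 6 candidate comparison communities.
project = np.array([[1200, 0.42, 15.0],
                    [ 800, 0.55, 22.0],
                    [2000, 0.35,  8.0]])
candidates = np.array([[1100, 0.40, 14.0],
                       [3000, 0.20,  5.0],
                       [ 850, 0.60, 25.0],
                       [1900, 0.33, 10.0],
                       [ 500, 0.70, 30.0],
                       [2100, 0.30,  9.0]])

# Standardize so each characteristic carries comparable weight in the distance.
mean, std = candidates.mean(axis=0), candidates.std(axis=0)
p_std = (project - mean) / std
c_std = (candidates - mean) / std

# For each project community, shortlist the closest candidate (Euclidean distance).
for i, row in enumerate(p_std):
    distances = np.linalg.norm(c_std - row, axis=1)
    print(f"project community {i} -> candidate community {distances.argmin()}")
```

In practice the shortlist produced this way would still be reviewed judgmentally, as the slide suggests.
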

25
Importance of validity
  • Evaluations provide recommendations for future decisions and action. If the findings and interpretation are not valid:
    • programs which do not work may continue or even be expanded
    • good programs may be discontinued
    • priority target groups may not have access or benefit

26
RWE quality control goals
  • The evaluator must achieve the greatest possible methodological rigor within the limitations of a given context
  • Standards must be appropriate for different types
    of evaluation
  • The evaluator must identify and control for
    methodological weaknesses in the evaluation
    design.
  • The evaluation report must identify
    methodological weaknesses and how these affect
    generalization to broader populations.

27
Main RWE messages
  1. Evaluators must be prepared for real-world evaluation challenges
  2. There is considerable experience to draw on
  3. A toolkit of rapid and economical RealWorld
    evaluation techniques is available (see
    www.RealWorldEvaluation.org)
  4. Never use time and budget constraints as an
    excuse for sloppy evaluation methodology
  5. A threats-to-validity checklist helps keep you honest by identifying potential weaknesses in your evaluation design and analysis