1
Using the CDC Evaluation Framework to Avoid
Minefields on the Road to Good Evaluation
  • Presented to
  • 2002 National Asthma Conference
  • October 24, 2002

2
Why We Evaluate
  • "...The gods condemned Sisyphus to endlessly
    roll a rock up a hill, whence it would return
    each time to its starting place. They thought,
    with some reason, that there was no punishment
    more severe than eternally futile labor..."
  • Albert Camus, The Myth of Sisyphus

3
Defining Evaluation
  • Evaluation is...
  • the systematic investigation of the merit,
    worth, or significance of an object (Michael Scriven)
  • Program is...
  • any organized public health action/activity

4
Research vs. Program Evaluation
  • A continuum, not a dichotomy, but at the far
    ends the two may differ in:
  • Framework and steps
  • Decision making
  • Standards
  • Key questions
  • Design
  • Data collection sources and measures
  • Analysis timing and scope
  • Role of values in making judgments
  • Centrality of attribution as conclusion
  • Audiences for dissemination of results

5
The Continuum
  • Efficacy: does my effort work in ideal
    circumstances?
  • Effectiveness: does my effort work in real-world
    settings, and does it work the same way across
    settings?
  • Implementation fidelity: is my (efficacious and
    effective) effort being implemented as intended?

6
Today's Focus
  • Top Minefields on the Road to Conducting Good
    Evaluation!

7
Minefield 8
  • Not linking planning and evaluation

8
Minefield 7
  • Evaluating only what you can measure

9
You Get What You Measure
  • In Poland in the 1970s, furniture factories
    were rewarded based on pounds of product shipped.
    As a result, today Poles have the world's
    heaviest furniture.
  • (New York Times, 3/4/99)

10
Minefield 6
  • Thinking evaluatively only at the end

11
When to Evaluate
  • Good program evaluation shifts our focus from
  • "Did it (my effort) work?"
  • to
  • "Is it (my effort) working?"

12
Minefield 5
  • Not asking who (else) cares

13
Minefield 4
  • Neglecting intermediate outcomes

14
Forgetting Intermediate Outcomes
15
Minefield 3
  • Neglecting process evaluation

16
Minefield 2
  • Confusing attribution and contribution

17
Networked Interventions
18
Minefield 1
  • Using more sticks than carrots

19
Framework for Program Evaluation
(Source: CDC. Framework for Program Evaluation in
Public Health. MMWR 1999;48(No. RR-11))
20
Standards for Effective Evaluation
  • Not HOW TO do an evaluation, but help direct
    choices among options at each step
  • At each step, the standards ask which choice(s):
  • Utility (7): best serve the information needs of
    intended users
  • Feasibility (3): are most realistic, prudent,
    diplomatic, and frugal given resources
  • Propriety (8): best meet law, ethics, and due
    regard for the welfare of those involved and
    affected
  • Accuracy (12): best reveal and convey
    technically accurate information (see the sketch
    below)

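Since the standards direct choices among options rather than prescribe a method, one way to picture applying them is as a screen over candidate choices at a given step. The sketch below is a minimal, hypothetical illustration; the candidate designs and their pass/fail ratings are invented, not content from this presentation:

```python
# Hypothetical screen: rate each candidate evaluation choice against the
# four standards groups (utility, feasibility, propriety, accuracy) and
# keep only the choices that clear all four.
candidates = {
    "randomized trial with external evaluator":
        {"utility": True, "feasibility": False, "propriety": True, "accuracy": True},
    "pre/post design with a matched comparison site":
        {"utility": True, "feasibility": True, "propriety": True, "accuracy": True},
}

for choice, meets in candidates.items():
    failed = [standard for standard, ok in meets.items() if not ok]
    verdict = "keep" if not failed else f"reconsider (fails: {', '.join(failed)})"
    print(f"{choice}: {verdict}")
```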
21
Broadening Our Thinking About Evaluation
  • What to evaluate
  • When to evaluate
  • Who should be involved in evaluation
  • How to evaluate

22
Who Should Evaluate?
23
Why Involve Stakeholders?
  • Smoke out disagreements in:
  • Definition of the problem
  • Activities and priorities of program
  • Outcomes that equate to success
  • What constitutes proof of success
  • Get their help with:
  • Credibility of findings
  • Access to key players
  • Follow-up
  • Dissemination of results

24
Using Logic Models for Evaluation
  • Clarity on:
  • What are the activities
  • What are the intended effects
  • What is the sequence/order of intended effects
  • Which activities are to produce which effects
  • Consensus with stakeholders on all of the above
  • Focus the evaluation design (see the sketch
    below)

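A logic model of this kind can be written down as a small data structure that makes the activity-to-effect links explicit and flags effects that no activity claims to produce. The sketch below is a hypothetical illustration; the activities and effects named are invented examples for an asthma program, not content from this presentation:

```python
from dataclasses import dataclass, field

@dataclass
class Activity:
    name: str
    intended_effects: list  # names of the effects this activity should produce

@dataclass
class LogicModel:
    activities: list = field(default_factory=list)
    effects: list = field(default_factory=list)  # ordered early -> long-term

    def unlinked_effects(self):
        """Effects that no activity claims to produce: a gap to raise with stakeholders."""
        claimed = {e for a in self.activities for e in a.intended_effects}
        return [e for e in self.effects if e not in claimed]

# Hypothetical asthma-program example
model = LogicModel(
    activities=[
        Activity("Home visits", ["Trigger reduction"]),
        Activity("Provider training", ["Guideline-concordant care"]),
    ],
    effects=["Trigger reduction", "Guideline-concordant care", "Fewer ER visits"],
)
print(model.unlinked_effects())  # ['Fewer ER visits'] -- no activity targets it directly
```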
25
Some Factors That Influence Choice of Evaluation
Focus
  • Users and uses: Who wants the information, and
    what are they interested in?
  • Accountability to (other) stakeholders: For what
    effects are key stakeholders expecting to see
    results?
  • Resources: Time, money, expertise
  • Stage of development: How long has the program
    been in existence?
  • Ripple effect: How far out would an
    intervention of this intensity reasonably be
    expected to have an effect?

26
Setting Evaluation Focus: Some Process Issues
  • What are the likely key challenges to
    implementation fidelity?
  • "Dropped baton" issues are key:
  • Partner failed to do their part
  • Client/family/patient failed to fulfill their
    referral
  • Other common challenges:
  • Inadequate dosage
  • Poor access
  • Failure to retain participants
  • Wrong match of staff and participant

27
Evidence Gathering: Choosing a Design
  • What intervention was actually delivered?
  • Were impacts and outcomes achieved?
  • Was the intervention responsible for the impacts
    and outcomes?

28
Justifying Claims About Intervention Effectiveness
  • Performance vs. a comparison/control group
  • Time sequence
  • Plausible mechanisms (or pathways toward change)
  • Accounting for alternative explanations
  • Similar effects observed in similar contexts
    (see the sketch below)

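The first of these criteria, performance against a comparison or control group, can be pictured with a minimal difference-in-differences computation. The sketch and its numbers below are hypothetical illustrations, not data from this presentation:

```python
# Hypothetical pre/post means of a symptom score (lower is better)
# for an intervention group and a comparison group.
intervention = {"pre": 6.2, "post": 4.1}
comparison = {"pre": 6.0, "post": 5.6}

# Change within each group over the same period (time sequence: pre -> post).
change_int = intervention["post"] - intervention["pre"]  # -2.1
change_cmp = comparison["post"] - comparison["pre"]      # -0.4

# Difference-in-differences: the change beyond what the comparison group
# experienced, which helps account for secular trends as an alternative
# explanation.
did = change_int - change_cmp
print(f"Estimated effect: {did:+.1f} points")  # -1.7
```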
29
Choosing Data Collection Methods
  • A function of:
  • Time
  • Cost
  • Sensitivity of the issue
  • Hawthorne effect
  • Ethics
  • Validity
  • Reliability (see the sketch below)

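Because the choice trades these factors off against one another, one way to structure the decision is a simple weighted scoring matrix. The methods, weights, and 1-5 scores below are hypothetical placeholders for discussion, not recommendations from this presentation:

```python
# Criteria weights (sum to 1.0): how much each factor matters in this context.
weights = {"time": 0.2, "cost": 0.2, "validity": 0.3, "reliability": 0.3}

# Scores from 1 (poor) to 5 (strong) per candidate method -- hypothetical.
methods = {
    "mail survey":           {"time": 4, "cost": 5, "validity": 3, "reliability": 3},
    "in-person interviews":  {"time": 2, "cost": 2, "validity": 5, "reliability": 4},
    "medical record review": {"time": 3, "cost": 3, "validity": 4, "reliability": 5},
}

# Weighted total per method; the highest score is a starting point for
# discussion, not a verdict.
for name, scores in methods.items():
    total = sum(weights[c] * scores[c] for c in weights)
    print(f"{name}: {total:.2f}")
```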
30
Maximizing Use of Results: Key Questions
  • Who is the audience?
  • What will be of greatest importance to them?
  • How will they use the information provided?
  • How much time will they be willing to spend
    reading and assimilating the material?
  • What type of vocabulary will express the
    information most clearly?

31
Some CDC Asthma Examples
  • Comprehensive School-Based Asthma Project
  • Controlling Asthma in American Cities (CAAP)
    Project

32
Helpful Publications at www.cdc.gov/eval
33
Community Tool Box: http://ctb.ku.edu/