Title: Building an Organizational Evaluation Infrastructure / Culture to Improve the Management and Outcomes of Health Programs
1 Building an Organizational Evaluation Infrastructure / Culture to Improve the Management and Outcomes of Health Programs
Judith Hager, MA, MPH; Molly Bradshaw, MPH; Geraldine Oliva, MD, MPH
Family Health Outcomes Project, University of California, San Francisco
October 24, 2001
2 An Evaluation Infrastructure
- An institutional philosophy / ethic to provide quality, relevant, cost-effective services
- Shared systematic approach / framework
- Staff with an understanding of evaluation and basic evaluation skills
- Access to evaluation resources (tools, information, data, etc.)
3 Why an Evaluation Infrastructure?
- A Formal System of Accountability
- Influence External Decision Making
- Assist Program Management Decisions
- How to best use resources; improve client and staff satisfaction
- Promote team-building
4 Evaluation Is
- The systematic investigation of the quality and effectiveness of organized efforts or activities
- The systematic assessment of the merit, worth, or significance of an object (CDC)
5 A Program Is
- Any organized set of activities intended to achieve an outcome (supported by resources)
- Can be an Initiative, Departmental Program, or a Project
6 Three Approaches to Building an Evaluation Infrastructure
- State MCH Sponsored Evaluation Course
- Non-profit Community Collaboration Framework
- County Health Department capacity-building
7 MCH Evaluation Course for County MCH Teams
- Two workshop days / county teams
- Choose and develop a real program to evaluate
- Technical assistance by phone or on-site
- Assignments and feedback
8 Solano Coalition for Better Health (501(c)(3))
- Comprised of hospitals, health plans, health departments, CBOs, community clinics
- Contract for development / facilitation / data collection and analysis
- Three goals / community health indicators
- Agree to share information
9 Mendocino County Health Department
- Mandate for all Division/program managers
- Philosophy of program improvement / resource allocation
- Two-day workshop
- Develop evaluations / group critique
- Manuals / train the trainers
10 Elements of An Evaluation Infrastructure
- Mandate
- Purpose
- Model/Approach
- Training
- Expertise
- Data Capability
- Gestalt / Integrated
11 Evaluation Planning Framework (CDC)
- Engage stakeholders
- Describe the program
- Focus the evaluation design
- Gather and analyze evidence
- Justify conclusions
- Ensure use and share lessons learned
12 Creating and Using a Logic Model
- Purposes
- Understand the program, how it works, and its desired results
- Test the program's logic
13 More About Why
- Examine / improve broad, fuzzy objectives
- Show the chain of events linking inputs to results
- Clarify the difference between activities and outcomes
- Identify gaps in logic / assumptions
- Help determine what to evaluate / key questions
- Build understanding and consensus about the program
14 Building a Logic Model
- Evaluation logic models
- depict how a program works to achieve its intended outcomes
- may be a flow chart, table, diagram, etc.
- have common elements
15 Elements of an Evaluation Logic Model
- Inputs (resources)
- Activities (interventions)
- Outputs
- Outcomes (short, intermediate, and long-term)
- Some models also include other elements such as problem statement, assumptions, environment, and program target (e.g., 13- to 17-year-olds / age, sex, location)
16 Today, we will use an adaptation of the UWEX Logic Model
- A graphic representation that shows logical relationships between inputs, outputs, and outcomes relative to a situation
- Elements
- Problem statement / situation
- Inputs
- Outputs
- Outcomes
- Assumptions
- Environment
Ellen Taylor-Powell, University of Wisconsin-Extension
17 Program Logic Model Preparation
- Problem Statement
- Program Description
- Goals and Objectives
- Stakeholders / Use of Evaluation
- Program concepts / action theory
- Literature Review
18 [Logic model template: Situation, then Inputs, Outputs, and Outcomes, with Assumptions (theories that guide your program) and Environment (external factors) noted beneath the model]
19 What It Tells Us
- Inputs: programmatic investments / resources allocated
- Outputs: activities (effort / what the program does) and participation (who the program targets)
- Outcomes: short, intermediate, and long-term (with what results)
20 [Diagram: the same chain of Inputs (programmatic investments), Outputs (activities, participation), and Outcomes (short, medium, long-term), showing how both planning and evaluation work across the logic model]
21(No Transcript)
22(No Transcript)
23(No Transcript)
24 Logical Linkages Example: Series of If-Then Relationships
- IF the program invests time and money (INPUTS), THEN a parenting curriculum can be designed
- IF a parenting curriculum is designed and delivered (OUTPUTS), THEN parents increase knowledge
- IF parents increase knowledge, THEN parenting is improved
- IF parenting is improved, THEN rates of child abuse decrease (OUTCOMES)
25 Program Example (Problem: Child Abuse)
- INPUTS: staff, money, partners
- OUTPUTS: design parent education curriculum; provide 6 training sessions; targeted parents attend
- OUTCOMES: parents increase knowledge of child development; parents learn new ways to discipline; parents use improved parenting skills; reduced rates of child abuse and neglect
26 Where Does Evaluation Fit?
- INPUTS: staff, money, partners
- OUTPUTS: design parent education curriculum; provide 6 training sessions; targeted parents attend
- OUTCOMES: parents increase knowledge of child development; parents learn new ways to discipline; parents use improved parenting skills; reduced rates of child abuse and neglect
- EVALUATION: What do you want to know? What data do you need?
27 Elements of An Evaluation Infrastructure
- Mandate
- Purpose
- Model/Approach
- Training
- Expertise
- Data Capability
- Gestalt / Integrated
28 Limitations of an Evaluation Infrastructure
- Requires staff commitment / time
- Difficult to evaluate one's own work
- May not support rigorous evaluation
- Not the answer to management problems
- Still need access to expertise
29 Recommendations
- Leadership: build trust
- Develop a common language
- Introduce evaluation model
- Provide training
30 Recommendations
- Build a team approach
- Provide evaluation tools
- Allocate time / long term
- Provide guidelines about what is important to evaluate
31 Recommendations
- Understand that evaluation is complex
- Accountability for the denominator population
- Understand stages of evaluation
- Know when consultants work best
32 Program Planning and Evaluation