Introduction to Program Evaluation (Ex-Post Policy Analysis) - PowerPoint PPT Presentation


Transcript and Presenter's Notes

Title: Introduction to Program Evaluation (Ex-Post Policy Analysis)

Introduction to Program Evaluation (Ex-Post
Policy Analysis)
  • Why It's NOT Ex-Ante Policy Analysis
  • Why It's NOT Research
  • Current Jumble of Approaches

Before and After
  • What we do before a policy is passed is generally
    referred to as Ex-ante Policy Analysis
  • What we do after a policy is passed and programs
    are established and operating can be referred to
    as Ex-post policy analysis, but most often is
    referred to as Program Evaluation

Ex-ante policy analysis "for policy making"
  • - Analyzing policies, programs, and projects
    BEFORE they are implemented
  • Commonly referred to as policy analysis

Policy formulation
Research for policy
Ex-post policy analysis "of policy"
  • - Analyzing policies, programs, and projects
    AFTER they have been implemented
  • Often called program evaluation

Research during and after policy implementation
Ex-ante policy analysis
  • Ex-ante seems straightforward, doesn't it?
  • Figure out what you want to do
  • Figure out ways to do it
  • Compare the ways
  • Choose the best one
  • But it's not! It is incredibly chaotic

Policy Tower of Babel
  • The policy analysis field is currently home to a
    babble of tongues
  • Dozens of approaches, methodologies, and
    frameworks are discussed throughout the
    literature
  • Almost always without reference to the other
    half-dozen nearly identical approaches

No unification of thought coming
  • The policy field is currently marked by an
    extraordinary variety of technical approaches,
    reflecting the variety of research traditions in
    contemporary social science. That variety is
    likely to persist for the foreseeable future, for
    the reductionist dream of a unified social
    science under a single theoretical banner is ...
  • Davis Bobrow and John Dryzek, 1987

  • While there are literally dozens (if not
    hundreds) of methods to conduct Policy
    Analysis, they can be summarized under some larger
    headings based on their underlying assumptions
    (beliefs about truth)
  • Of course, many policy professors have attempted
    just that, and now we have many different
    frameworks of methods

  • The one I find most conceptually clear is that of
    Bobrow and Dryzek, who see all of these
    different approaches falling into one of five
    overarching frameworks
  • Welfare Economics
  • Public Choice
  • Social Structure
  • Information Processing
  • And Political Philosophy

  • Welfare economics
  • has the greatest number of policy field
    practitioners and is manifested in such familiar
    techniques as cost-benefit analysis and
    cost-effectiveness analysis

  • Public Choice
  • Straddles the disciplines of microeconomics,
    political science, and public administration and
    concerns itself mostly with the analysis and
    design of decision structures

  • Social structure
  • Rooted in sociology, has some crucial
    subdivisions, most notably those focusing on
    individual endowments vs. those focusing on group
    structures

  • Information processing
  • Mostly focuses on the limits inherent in any
    participant in the policy process (although some
    optimistic practitioners see a chance for change
    through recognition of the situation)

  • Political philosophy
  • Practitioners focus on applying moral reasoning
    to the content of policy and the process of
    policy making
  • A key idea to remember is that policy analyses
    conducted within these frames, by and large, are
    not directly concerned with the final results in
    terms of program outcomes

So why is the policy analysis field so chaotic?
  • What do you think?
  • What does policy result in?
  • Given that, is there any way that a completely
    rational view can be achieved?

Wamsley and Zald
  • If we think of every program, organization,
    network, etc., as having an external/internal
    dimension and a political-economic dimension, the
    difference between policy analysis and program
    evaluation becomes clearer

Wamsley Zald Remapped
  • External to the program itself is a political and
    economic environment trying to decide WHAT TO DO!

Wamsley Zald Remapped
  • Internal to the program itself is a social and
    technical environment trying to decide IF WE DID
    IT WELL!

One Policy provides environment for many Programs
Policy Analysis
Program Evaluation
What is Program Evaluation (Ex-post Policy
Analysis)?
  • Early definition
  • "determining the worth or merit of something."
    Scriven (1967)
  • Contemporary definition
  • "the identification, clarification, and
    application of defensible criteria to determine
    an evaluation object's value (worth or merit) in
    relation to those criteria" Fitzpatrick, et al.
  • What's the difference?

What is Program Evaluation (Ex-post Policy
Analysis)?
  • One educator may like a new reading curriculum
    because of the love of reading it instills
  • Another educator may not like the same curriculum
    because it doesn't move the child along as
    rapidly as other curricula in terms of letter
    interpretation, word interpretation, or sentence
    interpretation
  • They are looking at the same program using
    different criteria

Program Evaluation is not Research
  • Research and Evaluation differ in their purposes
    and, as a result, in the roles of the researcher
    and evaluator in their work, their preparation,
    the generalizability of their results, and the
    criteria used to judge their work.

Program Evaluation is not Research
  • Purpose: research develops knowledge/theory;
    evaluation helps make judgments/decisions
  • Agenda setter: researchers largely set their own
    agendas; evaluators respond to their clients'
  • Generalizability: evaluation results are specific
    to the evaluation object
  • Criteria: research is judged by internal validity
    (causality) and external validity; evaluation by
    accuracy, utility, feasibility, and propriety
  • Preparation: researchers are trained in one
    discipline; evaluators draw on many
  • "Research seeks to prove, evaluation seeks to
    improve."
  • M.Q. Patton

Formal vs. Informal Evaluation
  • Evaluation is not new!
  • Neanderthals used it in determining which
    saplings made the best spears

Formal vs. Informal Evaluation
  • English yeomen abandoned their own crossbows in
    favor of the Welsh longbow
  • No GAO report has been found, but we assume an
    informal evaluation was conducted at some point
  • Result: clobbered the French, who tried the
    longbow but went back to the crossbow (a BAD
    decision)

Formal vs. Informal Evaluation
  • As humans we informally evaluate things every day
  • Administrators make quick judgments on personnel,
    programs, budgets, etc. These judgments lead to
    decisions
  • A policy maker may make a judgment leading to a
    voting decision on a policy based on a single
    piece of information

Formal vs. Informal Evaluation
  • Informal evaluation may result in poor or wise
    decisions
  • The point is that they are characterized by an
    absence of breadth and depth because they lack
    systematic procedures and formally collected
    data
  • Program evaluation is about formalizing our
    approaches to forming judgments and making
    decisions

Formal Evaluation Process
  • Determine standards for judging quality
  • Collect relevant information
  • Apply the standards to determine value, quality,
    utility, effectiveness or significance
  • Identify recommendations to optimize the
    evaluation object (program)

Evaluation's Purposes
  • Typical purpose
  • Determine the merit or worth of something; render
    judgments about the value of whatever is being
    evaluated
  • Alternative purposes
  • Serve political functions
  • Facilitate learning
  • Social betterment
  • Foster deliberative democracy

Why Evaluate Programs?
  • To gain insight about a program and its
    operations - to see where we are going and where
    we are coming from, and to find out what works
    and what doesn't
  • To improve practice - to modify or adapt practice
    to enhance the success of activities
  • To assess effects - to see how well we are
    meeting objectives and goals, how the program
    benefits the community, and to provide evidence
    of effectiveness
  • To build capacity - increase funding, enhance
    skills, strengthen accountability

What Can be Evaluated?
  • Laboratory diagnostics
  • Communication campaigns
  • Infrastructure-building projects
  • Training and educational services
  • Administrative systems
  • Direct service interventions
  • Community mobilization efforts
  • Research initiatives
  • Surveillance systems
  • Policy development activities
  • Outbreak investigations

When to Conduct Evaluation?
Planning a NEW program
Assessing a DEVELOPING program
Assessing a STABLE, MATURE program
Assessing a program after it has ENDED
The stage of program development influences the
reason for program evaluation.
Two basic types of Evaluation
  • Formative (Process)
  • Provide information for program improvement,
    typically to judge the merit and worth of a part
    of a program
  • Audience is generally the people delivering the
    program or those close to it.
  • Typically qualitative in nature

Two basic types of Evaluation
  • Summative (Impact or Outcomes)
  • Summative evaluation is a process of identifying
    larger patterns and trends in performance and
    judging these summary statements against criteria
    to obtain performance ratings
  • Provide information for making decisions about
    program adoption, continuation, or expansion
  • Audience is generally potential consumers
    (students, teachers, employees, managers, etc.)
  • Mostly quantitative in nature

Balance between Formative and Summative
Three subtypes: Needs Assessment, Process, and
Outcome Evaluations
  • Needs assessment
  • Does a problem/need exist?
  • Recommend ways to reduce the problem
  • Process/Monitoring
  • Description of program delivery
  • Outcome
  • Descriptions of changes in recipients or other
    secondary audiences based on program delivery

A Typology of Evaluation Studies
  • Needs Assessment
    - Formative (Revise/Change): Do we need to do
      things differently?
    - Summative (Begin/Continue/Expand): Should we
      begin a program? Is there sufficient need?
  • Process
    - Formative: Is more training of staff needed to
      deliver the program appropriately?
    - Summative: Are sufficient numbers of the target
      audience participating in the program to merit
      continuation?
  • Outcome
    - Formative: How can we revise our training to
      better achieve desired outcomes?
    - Summative: Is this program achieving its goals
      to a sufficient degree that its funding should
      be continued?
So thats it?
  • No way!
  • The way an evaluator looks at Truth vs. truth
    creates another dimension!
  • Each one of those study types can be conducted in
    a manner focusing on replication with a lot of
    data or focusing on deep understanding of very
    little data

Objectivist vs. Subjectivist Epistemology
  • Objectivism
  • Requires an evaluation study to utilize data
    collection and analysis techniques that yield
    results that are reproducible and verifiable by
    other competent persons using the same
    procedures
  • Subjectivism
  • Bases its validity claims on an appeal to
    experience rather than to the scientific method

A Typology of Evaluation Studies
  • Consider the Needs Assessment question Do we
    need to do things differently?
  • This question could be answered by a survey of
    all employees (quantitative-objective) or by
    convening a panel of experts in the field
    (qualitative-subjective)
Evaluation Approaches
  • The Objective-Subjective Dimension creates a
    broader set of approaches
  • Any of the types of evaluation could fall
    within any of these approaches
  • It all depends on the methodologies employed (how
    you collect and analyze your data)

Evaluation Approaches
  • Objectives-Oriented Approaches
  • Focus on specifying goals and objectives and
    determining the extent to which they have been
    attained
  • Management-Oriented Approaches
  • Central concern is identifying and meeting the
    information needs of managerial decision makers
  • Consumer-Oriented Approaches
  • Central issue is developing evaluative
    information on products and accountability, for
    use by consumers
  • Expertise-Oriented Approaches
  • Depend on the direct application of professional
    expertise to judge the quality of whatever is
    being evaluated
  • Participant-Oriented Approaches
  • Involvement of participants (primarily
    stakeholders) is central in determining the
    values, criteria, needs, data, and conclusions
    for the evaluation