Transcript: Introduction to Program Evaluation (Ex-Post Policy Analysis)


1
Introduction to Program Evaluation (Ex-Post
Policy Analysis)
  • Why It's NOT Ex-Ante Policy Analysis
  • Why It's NOT Research
  • Current Jumble of Approaches

2
Before and After
  • What we do before a policy is passed is generally
    referred to as Ex-ante Policy Analysis
  • What we do after a policy is passed and programs
    are established and operating can be referred to
    as Ex-post Policy Analysis, but it is most often
    referred to as Program Evaluation

3
(No Transcript)
4
Ex-ante policy analysis "for policy making"
  • Analyzing policies, programs, and projects
    BEFORE they are implemented
  • Commonly referred to as policy analysis

Research for policy formulation
5
Ex-post policy analysis "of policy"
  • Analyzing policies, programs, and projects
    AFTER they have been implemented
  • Often called program evaluation

Research during and after policy implementation
6
Ex-ante policy analysis
  • Ex-ante seems straightforward, doesn't it?
  • Figure out what you want to do
  • Figure out ways to do it
  • Compare the ways
  • Choose the best one
  • But it's not! It is incredibly chaotic

7
Policy Tower of Babel
  • The policy analysis field is currently home to a
    babble of tongues
  • Dozens of approaches, methodologies and
    frameworks are discussed throughout the
    literature
  • Almost always without reference to the other
    half-dozen nearly identical approaches

8
No unification of thought coming
  • "The policy field is currently marked by an
    extraordinary variety of technical approaches,
    reflecting the variety of research traditions in
    contemporary social science. That variety is
    likely to persist for the foreseeable future, for
    the reductionist dream of a unified social
    science under a single theoretical banner is
    dead."
  • Davis Bobrow and John Dryzek, 1987

9
Frames
  • While there are literally dozens (if not
    hundreds) of methods to conduct Policy
    Analysis, they can be summarized into some larger
    headings based on their underlying assumptions
    (beliefs about truth)
  • Of course, many policy professors have attempted
    just that and now we have many different
    frameworks of methods

10
Frames
  • The one I find most conceptually clear is that of
    Bobrow and Dryzek, who see all of these
    different approaches as falling into one of five
    overarching frameworks
  • Welfare Economics
  • Public Choice
  • Social Structure
  • Information Processing
  • Political Philosophy

11
Frames
  • Welfare economics
  • has the greatest number of policy field
    practitioners and is manifested in such familiar
    techniques as cost-benefit analysis and
    cost-effectiveness analysis (see the sketch below)
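
A minimal cost-benefit sketch in Python may help fix the idea (all figures are hypothetical, not from the slides): discount each year's net benefit to present value and sum, which is the core arithmetic behind cost-benefit analysis.

```python
# Net present value of a stream of yearly net benefits (benefits - costs).
def npv(net_benefits, rate):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(net_benefits))

# Hypothetical program: 100 spent up front, then 30 in benefits per year for 5 years.
flows = [-100, 30, 30, 30, 30, 30]
print(f"NPV at 5%: {npv(flows, 0.05):.1f}")  # ~29.9; positive, so benefits outweigh costs
```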

12
Frames
  • Public Choice
  • Straddles the disciplines of microeconomics,
    political science, and public administration and
    concerns itself mostly with the analysis and
    design of decision structures

13
Frames
  • Social structure
  • Rooted in sociology, it has some crucial
    subdivisions, most notably those focusing on
    individual endowments vs. those focusing on group
    endowments

14
Frames
  • Information processing
  • Mostly focuses on the limits inherent in any
    participant in the policy process (although some
    optimistic practitioners see a chance for change
    through recognition of the situation)

15
Frames
  • Political philosophy
  • Practitioners focus on applying moral reasoning
    to the content of policy and the process of
    policy making
  • A key idea to remember is that policy analyses
    conducted within these frames, by and large, are
    not directly concerned with the final results in
    terms of program outcomes

16
So why is the policy analysis field so chaotic?
  • What do you think?
  • What does policy result in?
  • Given that, is there any way that a completely
    rational view can be achieved?

17
Wamsley and Zald
  • If we think of every program, organization,
    network, etc., as having an external/internal
    dimension and a political-economic dimension, the
    difference between policy analysis and program
    evaluation becomes clearer

18
Wamsley and Zald Remapped
19
Wamsley and Zald Remapped
  • External to the program itself is a political and
    economic environment trying to decide WHAT TO DO!

20
Wamsley Zald Remapped
  • Internal to the program itself is a social and
    technical environment trying to decide IF WE DID
    IT WELL!

21
One Policy provides the environment for many Programs
Policy Analysis
Program Evaluation
22
What is Program Evaluation (Ex-post Policy
Analysis)?
  • Early definition
  • "determining the worth or merit of something"
    Scriven (1967)
  • Contemporary definition
  • "the identification, clarification, and
    application of defensible criteria to determine
    an evaluation object's value (worth or merit) in
    relation to those criteria" Fitzpatrick, et al.
    (2004)
  • What's the difference?

23
What is Program Evaluation (Ex-post Policy
Analysis)?
  • One educator may like a new reading curriculum
    because of the love of reading it instills
  • Another educator may not like the same curriculum
    because it doesn't move the child along as
    rapidly as other curricula in terms of letter
    interpretation, word interpretation, or sentence
    meaning
  • They are looking at the same program using
    different criteria

24
Program Evaluation is not Research
  • Research and Evaluation differ in their purposes
    and, as a result, in the roles of the researcher
    and evaluator in their work, their preparation,
    the generalizability of their results, and the
    criteria used to judge their work.

25
Program Evaluation is not Research
How Research and Evaluation differ:
  • Purpose: Research develops knowledge/theory;
    Evaluation helps make judgments/decisions
  • Agenda setter: in Research, the researcher; in
    Evaluation, the stakeholders
  • Generalizability: Research results are widespread;
    Evaluation results are specific to the evaluation
    object
  • Criteria: Research is judged on internal validity
    (causality) and external validity
    (generalizability); Evaluation on accuracy,
    utility, feasibility, and propriety
  • Preparation: Research draws on one discipline;
    Evaluation is interdisciplinary
26
  • "Research seeks to prove, evaluation seeks to
    improve"

  • M.Q. Patton

27
Formal vs. Informal Evaluation
  • Evaluation is not new!
  • Neanderthals used it in determining which
    saplings made the best spears

28
Formal vs. Informal Evaluation
  • English yeomen abandoned their own crossbows in
    favor of the Welsh longbow
  • No GAO report has been found, but we assume an
    informal evaluation was conducted at some point
  • Result: they clobbered the French, who tried the
    longbow but went back to the crossbow (BAD
    EVALUATION)

29
Formal vs. Informal Evaluation
  • As humans we informally evaluate things every day
  • Administrators make quick judgments on personnel,
    programs, budgets, etc. These judgments lead to
    decisions
  • A policy maker may make a judgment leading to a
    voting decision on a policy based on a single
    speech

30
Formal vs. Informal Evaluation
  • Informal evaluation may result in poor or wise
    decisions
  • The point is that they are characterized by an
    absence of breadth and depth because they lack
    systematic procedures and formally collected
    evidence
  • Program evaluation is about formalizing our
    approaches in forming judgments and making
    decisions

31
Formal Evaluation Process
  • Determine standards for judging quality
  • Collect relevant information
  • Apply the standards to determine value, quality,
    utility, effectiveness or significance
  • Identify recommendations to optimize the
    evaluation object (program); see the sketch below
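
These four steps can be made concrete in a toy Python sketch (the standards and observed values below are hypothetical, purely illustrative):

```python
# Step 1: determine standards for judging quality (hypothetical thresholds).
standards = {"attendance_rate": 0.80, "satisfaction": 4.0}

# Step 2: collect relevant information (made-up observed values).
observed = {"attendance_rate": 0.73, "satisfaction": 4.2}

# Step 3: apply the standards to determine quality.
judgments = {k: observed[k] >= threshold for k, threshold in standards.items()}

# Step 4: identify where to recommend improvements.
needs_work = [k for k, ok in judgments.items() if not ok]
print(judgments)   # {'attendance_rate': False, 'satisfaction': True}
print(needs_work)  # ['attendance_rate']
```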

32
Evaluation's Purposes
  • Typical purpose
  • Determine the merit or worth of something; render
    judgments about the value of whatever is being
    evaluated
  • Alternative purposes
  • Serve political functions
  • Facilitate learning
  • Social betterment
  • Foster deliberative democracy

33
Why Evaluate Programs?
  • To gain insight about a program and its
    operations: to see where we are going and where
    we are coming from, and to find out what works
    and what doesn't
  • To improve practice: to modify or adapt practice
    to enhance the success of activities
  • To assess effects: to see how well we are
    meeting objectives and goals, how the program
    benefits the community, and to provide evidence
    of effectiveness
  • To build capacity: to increase funding, enhance
    skills, and strengthen accountability

34
What Can be Evaluated?
  • Laboratory diagnostics
  • Communication campaigns
  • Infrastructure-building projects
  • Training and educational services
  • Administrative systems
  • Direct service interventions
  • Community mobilization efforts
  • Research initiatives
  • Surveillance systems
  • Policy development activities
  • Outbreak investigations

35
When to Conduct Evaluation?
(Timeline from conception to completion)
  • Planning a NEW program
  • Assessing a DEVELOPING program
  • Assessing a STABLE, MATURE program
  • Assessing a program after it has ENDED
The stage of program development influences the
reason for program evaluation.
36
Two basic types of Evaluation
  • Formative (Process)
  • Provide information for program improvement,
    typically to judge the merit and worth of a part
    of a program
  • Audience is generally the people delivering the
    program or those close to it.
  • Typically qualitative in nature

37
Two basic types of Evaluation
  • Summative (Impact or Outcomes)
  • Summative evaluation is a process of identifying
    larger patterns and trends in performance and
    judging these summary statements against criteria
    to obtain performance ratings
  • Provide information for making decisions about
    program adoption, continuation, or expansion
  • Audience is generally potential consumers
    (students, teachers, employees, managers, etc.)
  • Mostly quantitative in nature (see the sketch
    below)
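
A hedged sketch of that summative logic in Python (the per-site scores and the 70-point criterion are hypothetical): summarize performance, then judge each summary against the criterion to obtain a rating.

```python
# Hypothetical per-site outcome scores and a made-up passing criterion.
site_scores = {"site_a": [72, 68, 75], "site_b": [81, 79, 84]}
CRITERION = 70.0

for site, scores in site_scores.items():
    mean = sum(scores) / len(scores)                     # summary statement of performance
    rating = "meets" if mean >= CRITERION else "below"   # judged against the criterion
    print(f"{site}: mean={mean:.1f} -> {rating} criterion")
```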

38
Balance between Formative and Summative
39
Three subtypes: Needs Assessment, Process, and
Outcome Evaluations
  • Needs assessment
  • Does a problem/need exist?
  • Recommend ways to reduce the problem
  • Process/Monitoring
  • Description of program delivery
  • Outcome
  • Descriptions of changes in recipients or other
    secondary audiences based on program delivery

40
A Typology of Evaluation Studies
  • Needs Assessment
    Formative (Revise/Change): Do we need to do things
    differently?
    Summative (Begin/Continue/Expand): Should we begin
    a program? Is there sufficient need?
  • Process
    Formative: Is more training of staff needed to
    deliver the program appropriately?
    Summative: Are sufficient numbers of the target
    audience participating in the program to merit
    continuation?
  • Outcome
    Formative: How can we revise our training to
    better achieve desired outcomes?
    Summative: Is this program achieving its goals to
    a sufficient degree that its funding should be
    continued?
41
So that's it?
  • No way!
  • The way an evaluator looks at Truth vs. truth
    creates another dimension!
  • Each one of those study types can be conducted in
    a manner focusing on replication with a lot of
    data or focusing on deep understanding of very
    little data

42
Objectivist vs. Subjectivist Epistemology
  • Objectivism
  • Requires an evaluation study to utilize data
    collection and analysis techniques that yield
    results that are reproducible and verifiable by
    other competent persons using the same
    techniques.
  • Subjectivism
  • Bases its validity claims on an appeal to
    experience rather than to the scientific method

43
A Typology of Evaluation Studies (revisited)
  • Consider the formative Needs Assessment cell: "Do
    we need to do things differently?"
  • This question could be answered by a survey of
    all employees (quantitative-objective) or by
    convening a panel of experts in the field
    (qualitative-subjective); a sketch of the
    quantitative-objective route follows
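
A minimal sketch of the quantitative-objective route (the responses are fabricated; the point is reproducibility: any analyst tallying the same data gets the same result):

```python
# Made-up yes/no answers to "Do we need to do things differently?"
responses = ["yes", "no", "yes", "yes", "no", "yes"]

# A simple, reproducible tally: anyone re-running this gets the same share.
share_yes = responses.count("yes") / len(responses)
print(f"{share_yes:.0%} of employees say we need to do things differently")  # 67%
```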
44
Evaluation Approaches
  • The Objective-Subjective Dimension creates a
    broader set of approaches
  • Any of the types of evaluation could fall
    within any of these approaches
  • It all depends on the methodologies employed (how
    you collect and analyze your data)

45
Evaluation Approaches
  • Objectives-Oriented Approaches
  • Focus on specifying goals and objectives and
    determining the extent to which they have been
    attained.
  • Management-Oriented Approaches
  • Central concern is identifying and meeting the
    information needs of managerial decision makers.
  • Consumer-Oriented Approaches
  • Central issue is developing evaluative
    information on products and accountability for
    consumers.
  • Expertise-Oriented Approaches
  • Depend on the direct application of professional
    expertise to judge quality of whatever is being
    evaluated.
  • Participant-Oriented Approaches
  • Involvement of participants (primarily
    stakeholders) is central in determining the
    values, criteria, needs, data, and conclusions
    for the evaluation.