Program Evaluation Screencast - PowerPoint PPT Presentation

About This Presentation
Title:

Program Evaluation Screencast

Date added: 23 August 2018
Slides: 23
Provided by: mcse150
Learn more at: http://rampages.us
Transcript and Presenter's Notes


1
Program Evaluation Screencast
https://ess.echo360.vcu.edu:8443/ess/echo/presentation/fff3175a-9600-4bda-91fa-a21a1fe32758
  • Prepared by Mary Secret
  • Based on materials from the following sources:
  • Babbie, E. (2014). The Practice of Social Research (14th ed.). Boston, MA: Thomson Wadsworth.
  • Corcoran, J., & Secret, M. (2013). Social Work Research Skills Workbook. New York: Oxford.
  • Engel, R. J., & Schutt, R. K. (2013). The Practice of Research in Social Work (3rd ed.). Thousand Oaks, CA: Sage.

ECHO 360 links:
https://ess.echo360.vcu.edu:8443/ess/echo/presentation/fff3175a-9600-4bda-91fa-a21a1fe32758?ec=true
https://ess.echo360.vcu.edu:8443/ess/echo/presentation/fff3175a-9600-4bda-91fa-a21a1fe32758
2
What is the purpose of program evaluation?
  • To investigate social programs
  • To assess the effectiveness of social policies and programs

3
Program Evaluation Prologue
  • Program evaluation is not a specific activity or method that you can point to or associate with any particular step of the research process
  • It encompasses all aspects of research processes and methods

4
  • A major comprehensive program evaluation can:
  • Include experimental and non-experimental research designs,
  • Use both qualitative and quantitative approaches,
  • Collect data from secondary data sources or interview participants,
  • Use standardized or non-standardized measurement instruments,
  • Include both probability and nonprobability samples,
  • and must adhere to standard research ethics

5
  • Program Evaluation
  • is distinguished from other types of social science research not by the design, method, or approach,
  • but by the underlying intent, the purposes that guide the evaluation process

6
What is the purpose of program evaluation?
  • To investigate social programs
  • To assess the effectiveness of social policies and programs

7
Question FIRST!!!
The specific method depends on the evaluation
question of interest about a specific program,
policy, or intervention.
  • Questions to be answered:
  • Is the program needed?
  • Do a needs assessment
  • How does the program operate?
  • Do a formative or process evaluation
  • What is the program's impact?
  • Do a summative or outcome evaluation
  • How efficient is the program?
  • Do a cost-benefit or a cost-effectiveness analysis
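The question-to-method pairings above can be summarized as a small lookup table. This is an illustrative sketch (the function name and dictionary are ours, not part of the slides):

```python
# Hypothetical mapping from an evaluation question of interest to the
# matching type of evaluation, as paired on this slide.
QUESTION_TO_METHOD = {
    "Is the program needed?": "needs assessment",
    "How does the program operate?": "formative (process) evaluation",
    "What is the program's impact?": "summative (outcome) evaluation",
    "How efficient is the program?": "cost-benefit or cost-effectiveness analysis",
}

def choose_evaluation(question: str) -> str:
    """Return the evaluation type matching a question of interest."""
    return QUESTION_TO_METHOD[question]

print(choose_evaluation("Is the program needed?"))  # needs assessment
```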

8
The language of evaluation Fill in the Blank
  • the impact of the program; the intended result; the response variable; the dependent variable
  • Outcomes
  • the services delivered or new products produced by the program process
  • Outputs
  • the resources, raw materials, clients, and staff that go into a program
  • Inputs
  • the population for whom the program is designed
  • Target population
  • individuals and groups who have some basis of concern with the program, often setting the research agenda and controlling research findings
  • Stakeholders
  • information about service delivery system outputs, outcomes, or operations that is available to any program stakeholders
  • Feedback

9
What is a Needs Assessment?
  • Systematically researching questions about the needs of a target population for program planning purposes, obtaining information from:
  • Key informants
  • expert opinions from individuals who have special knowledge about the needs and about the existing services
  • Rates under treatment
  • secondary analysis of existing statistics to estimate the need for services based on the number and characteristics of clients who are already being served
  • Social indicators
  • existing statistics that reflect conditions of an entire population, e.g., census data, Kids Count data

Rubin, A., & Babbie, E. (2007). Essential Research Methods for Social Work. CA: Brooks/Cole.
10
What is a Process or Formative Evaluation?
  • How do you know whether or not the service was delivered in the manner intended, i.e., according to protocol or an evidence-based practice model?
  • You must measure (collect data on) the independent variable, the intervention: what services were actually delivered, e.g., number of counseling sessions, hours of training, number of meetings, etc.

11
What is an Outcome Evaluation? Also known as an
impact evaluation or summative evaluation
  • Evaluation research that examines the effectiveness of the treatment or other service
  • The program is the independent variable (treatment)
  • Outcomes are the dependent variables
  • An experimental design is the preferred method for maximizing internal validity because of:
  • Random assignment into an experimental group and a control/comparison group
  • Manipulation of the independent variable

12
A closer look at Experimental Designs
  • Research design notation:
  • R = Random assignment
  • O = Observation, data collection
  • X = Intervention or treatment

13
Classic Experimental Design
A local drug treatment program wanted to assess the effectiveness of adding an 8-week yoga class to its current counseling and medication treatment. A counselor has 30 clients who had been in treatment for less than 2 weeks. She randomly assigned 15 clients to participate in yoga; the other 15 participated in the regular treatment activities. She had all 30 clients complete an assessment packet that examined their substance abuse history and their social and emotional functioning. She provided the yoga class to the experimental group. After 8 weeks, she evaluated the participants' substance use and their social and emotional functioning.

R  Experimental Group:  O (pretest)  X1 (8-week yoga class, medication, and counseling)  O (posttest)
R  Comparison Group:    O (pretest)  X2 (medication and counseling)  O (posttest)

Controls for the selection bias, history, maturation, and statistical regression threats to internal validity.
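The random-assignment (R) step in the vignette above can be sketched in code. This is an illustrative sketch with invented client IDs; it shows only the assignment step, not the observations or the intervention itself:

```python
import random

# Hypothetical sketch of the classic experimental design's R step:
# 30 clients randomly split between a yoga (experimental) group and a
# treatment-as-usual (comparison) group of 15 each.
random.seed(42)                 # fixed seed so the split is reproducible

clients = list(range(1, 31))    # 30 invented client IDs
random.shuffle(clients)         # R: random assignment order
experimental = clients[:15]     # X1: yoga + medication and counseling
comparison = clients[15:]       # X2: medication and counseling only

# Every client is in exactly one group.
assert len(experimental) == 15 and len(comparison) == 15
assert set(experimental).isdisjoint(comparison)
```

Random assignment is what lets the design attribute pretest/posttest differences to the intervention rather than to pre-existing group differences.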
14
Quasi-experimental design
A statewide evaluation of the ENABL program was conducted to assess its ability to increase adolescents' knowledge and beliefs about pregnancy prevention. ENABL is a 4-week program aimed at preventing teenage pregnancy through abstinence. Middle school students in one school district comprised a treatment group (n = 974), and students in another school district comprised a control group who received no intervention. Subjects completed a pretest and posttest reflecting knowledge and beliefs about teenage pregnancy before and after the ENABL program.

NO RANDOM ASSIGNMENT

Experimental Group:  O (pretest)  X (4-week ENABL program)  O (posttest)
Control Group:       O (pretest)  O (posttest)

Less control for threats to internal validity: possibility of selection bias.
15
Pretest/Posttest design (pre-experimental)
This study assessed the impact of a dating violence sensitivity group intervention. The participants were 190 high school students, ages 13 to 19 years. Questionnaire data about the knowledge, values, and skills needed to avoid dating violence were collected from the participants before the 12-week group training and at the end of the training sessions.

NO RANDOM ASSIGNMENT; NO CONTROL/COMPARISON GROUP

Experimental Group:  O (pretest)  X (12-week training program)  O (posttest)

Least control for threats to internal validity: history, maturation.
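The one-group pretest/posttest comparison reduces to measuring change in scores for the same participants. A minimal sketch with invented scores (the real study used a dating-violence knowledge questionnaire with 190 students):

```python
# Hypothetical pretest/posttest (one-group) analysis sketch.
# Scores are invented for illustration only.
pretest  = [52, 48, 60, 55, 49, 58]   # O: before the 12-week training
posttest = [61, 55, 66, 62, 54, 63]   # O: after the training (X)

# Change score per participant, then the mean change.
changes = [post - pre for pre, post in zip(pretest, posttest)]
mean_change = sum(changes) / len(changes)
print(f"Mean change: {mean_change:.1f}")  # prints: Mean change: 6.5
```

Without a comparison group, a positive mean change could still reflect history or maturation rather than the training, which is exactly the threat this slide notes.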
16
What about Measurement?
  • Use of many different types of measurement tools,
  • dependent on the intent and type of evaluation research

17
USING MULTIPLE MEASURES AND SEVERAL DATA COLLECTION STRATEGIES TO EVALUATE THE FACT PROGRAM
  • Independent variable
  • Measures for outcomes and causal mechanisms: Does the program cause change? How does change happen?
  • Measures for inputs and program efficiency
  • Measures for process/implementation evaluation: What services are being delivered, by whom, and how?
18
LOGIC MODEL
  • A program evaluation yields many pieces of information that must be organized and then interpreted.
  • We need a way in which this information can be organized.

19
What is the Logic Model?
  • A schematic representation of the various components that make up a social service program.
  • Logic models may describe:
  • theory and its link to change (theory approach model), where attention is on how and why a program works
  • outcomes (outcome approach model), where the focus of the logic model is to connect resources and activities to expected changes
  • activities (activities approach model), describing what the program actually does
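The components of an outcome-approach logic model can be laid out as a simple data structure. The entries below are hypothetical examples (loosely echoing the yoga vignette), not content from the slides:

```python
# A minimal, hypothetical outcome-approach logic model, connecting
# resources and activities to expected changes.
logic_model = {
    "inputs": ["funding", "staff", "clients", "facilities"],
    "activities": ["counseling sessions", "8-week yoga class"],
    "outputs": ["sessions delivered", "clients served"],
    "short_term_outcomes": ["reduced substance use at 8 weeks"],
    "long_term_outcomes": ["sustained recovery"],  # difficult to measure
}

# Print each component and its items in order.
for component, items in logic_model.items():
    print(component, "->", ", ".join(items))
```

Writing the model down this way makes explicit which links (inputs to activities to outcomes) an evaluation would need to measure.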

20
Logic model: outcomes example
  • Program inputs
  • Program processes
  • Short-term outcomes (measured by the research)
  • Long-term outcomes (difficult to measure)
  • Identifying the causal mechanism
21
What's an Evaluability Assessment?
Newly emergent programs that are not fully
operational are not ready for, and indeed can
be tarnished by, a summative evaluation geared to
assessing program outcomes. HOW SO?
  • a systematic process that helps identify whether program evaluation is justified, feasible, and likely to provide useful information
  • determines whether a program is ready for evaluation, either a process or outcome evaluation, or both
  • Is the program able to produce the information required for a process evaluation? AT WHAT STAGE OF IMPLEMENTATION IS THE PROGRAM?
  • Can the program meet the other criteria for beginning an outcome evaluation?
  • determines whether a program has the basic foundation for an evaluation to take place

Evaluability Assessment: Examining the Readiness of a Program for Evaluation. Juvenile Justice Evaluation Center, Justice Research and Statistics Association. Program Evaluation Briefing Series 6, May 2003, p. 6. http://www.jrsa.org/pubs/juv-justice/evaluability-assessment.pdf
22
Evaluability of a program is based on:
  • AN ESTABLISHED PROGRAM
  • measurable outcomes
  • defined service components
  • an established recruiting, enrollment, and participation process
  • a good understanding of the characteristics of the target population, program participants, and program environment
  • the ability to collect and maintain information
  • adequate program size
  • RESEARCH-SAVVY SERVICE DELIVERY STAFF
  • problem-solving values and skills
  • prior experience with evaluation; confidence in the program
  • commitment to new knowledge
  • openness to change