9-step Evaluation Process

1
Basic Program Evaluation
NTSC Training Materials
2
Purpose/Objectives
  • Increase your knowledge of processes involved in
    program evaluation
  • Provide information and resources to help you
    design and conduct your own program evaluation

3
Program Evaluation Training
This training presentation comprises 16 modules,
encompassing the 9 steps of the evaluation process.
4
Program Evaluation Training Modules
  • Module 1 Introduction
  • Module 2 Overview
  • Module 3 Defining the Purpose
  • Module 4 Specifying the Questions
  • Module 5 Identifying Evidence Needed
  • Module 6 Specifying the Design

5
Program Evaluation Training Modules
  • Module 7 Data Collection Plan
  • Module 8 How to Collect Data
  • Module 9 Using Commercial Instruments
  • Module 10 Using Self-Constructed
    Instruments
  • Module 11 Collecting Data

6
Program Evaluation Training Modules
  • Module 12 Analyzing Data
  • Module 13 Drawing Conclusions
    and Documenting Findings
  • Module 14 Disseminating Information
  • Module 15 Feedback for Program
    Improvement
  • Module 16 Conclusion

7
Program Evaluation Training Modules
  • Module 1 Introduction
  • Module 2 Overview
  • Module 3 Defining the Purpose
  • Module 4 Specifying the Questions
  • Module 5 Identifying Evidence Needed
  • Module 6 Specifying the Design

8
Module 1 Introduction
  • Why evaluate?
  • What is evaluation?
  • What does evaluation do?
  • Kinds of evaluation

9
Why Evaluate?
  • Determine program outcomes
  • Identify program strengths
  • Identify and improve weaknesses
  • Justify use of resources
  • Increased emphasis on accountability
  • Professional responsibility to show
    effectiveness of program

10
What is Program Evaluation?
  • Purposeful, systematic, and careful collection
    and analysis of information used for the purpose
    of documenting the effectiveness and impact of
    programs, establishing accountability, and
    identifying areas needing change and improvement

11
What Evaluation Does
  • Looks at the results of your investment of time,
    expertise, and energy, and compares those results
    with what you said you wanted to achieve

12
Kinds of Evaluation
  • Outcome
  • Implementation
  • Formative
  • Summative

13
Outcome Evaluation
What: Identifies the results or effects of a
program
When: You want to measure students' or clients'
knowledge, attitudes, and behaviors as a result of
a program
Examples: Did the program increase achievement,
reduce truancy, create better decision-making?
14
Implementation Evaluation
What: Documents what the program is and to what
extent it has been implemented
When: A new program is being introduced; identifies
and defines the program; identifies what you are
actually evaluating
Examples: Who receives the program? Where is the
program operating? Is it being implemented the
same way at each site?
15
Timing of Evaluation
  • Formative
  • as the program is happening, to make changes
    while the program is being implemented
  • Summative
  • at the end of a program to document results

16
Module 2 Overview
  • Module 1 Introduction
  • Module 2 Overview
  • Module 3 Defining the Purpose
  • Module 4 Specifying the Questions
  • Module 5 Identifying Evidence Needed
  • Module 6 Specifying the Design

17
Overview: The 9-step Process
  • Planning
  • Development
  • Implementation
  • Feedback

18
Overview: The 9-step Process
19
Overview: The 9-step Process
20
Overview: The 9-step Process
21
Module 3 Defining the Purpose
  • Module 1 Introduction
  • Module 2 Overview
  • Module 3 Defining the Purpose
  • Module 4 Specifying the Questions
  • Module 5 Identifying Evidence
    Needed
  • Module 6 Specifying the Design

22
9-step Evaluation Process
Step 1: Define Purpose and Scope
23
Step 1: Scope/Purpose of Evaluation
  • Why are you doing the evaluation?
  • mandatory? program outcomes? program
    improvement?
  • What is the scope? How large will the effort
    be?
  • large/small broad/narrow
  • How complex is the proposed evaluation?
  • many variables, many questions?
  • What can you realistically accomplish?

24
Resource Considerations
  • Resources
  • Staff
  • who can assist?
  • need to bring in expertise?
  • do it yourself?
  • advisory team?
  • Time
  • Set priorities
  • How you will use the information

25
Module 4 Specifying the Questions
  • Module 1 Introduction
  • Module 2 Overview
  • Module 3 Defining the Purpose
  • Module 4 Specifying the Questions
  • Module 5 Identifying Evidence Needed
  • Module 6 Specifying the Design

26
9-step Evaluation Process
Step 2: Specify Evaluation Questions
27
Evaluation Questions
What is it that you want to know about your
program? Operationalize it (make it measurable).
Do not move forward if you cannot answer this
question.
28
Sources of Questions
  • Strategic plans
  • Mission statements
  • Policies
  • Needs assessment
  • Goals and objectives
  • National standards and guidelines

29
Broad Questions
  • Broad Scope
  • Do our students contribute positively to society
    after graduation?
  • Do students in our new mentoring program have a
    more positive self-concept and better
    decision-making skills than students without
    access to the mentoring program?
  • To what extent does the state's career
    development program contribute to student
    readiness for further education and training and
    success in the workforce?

30
Narrow Questions
  • Narrow Scope
  • Can our 6th grade students identify appropriate
    and inappropriate social behaviors?
  • How many of our 10th grade students have
    identified their work-related interests using an
    interest inventory?
  • Have 100% of our 10th grade students identified
    at least 3 occupations to explore further based
    on their interests, abilities, and knowledge of
    education and training requirements?

31
Exercise 1: Scope (p. 2 of Workbook)
  • From the list of questions, identify those that
    might be considered broad and those that might be
    considered narrow
  • How large will the resources need to be to
    answer the question?

32
Exercise 2: Scope Write (p. 3 of Workbook)
  • List one broad evaluation question and one
    narrow evaluation question

33
Module 5 Identifying Evidence Needed
  • Module 1 Introduction
  • Module 2 Overview
  • Module 3 Defining the Purpose
  • Module 4 Specifying the Questions
  • Module 5 Identifying Evidence Needed
  • Module 6 Specifying the Design

34
Identifying Evidence Needed to Answer Your
Questions
  • What evidence do you have to answer your
    question?

35
Identifying Evidence Needed to Answer Your
Questions
  • Need to think about what information you need in
    order to answer your evaluation questions

36
Example Evidence: Broad Scope
  • Do our students contribute positively to society
    after graduation?
  • Percent of our students that are employed, in
    education or training programs, in the military,
    are supporting a family by working at home,
    and/or are volunteering for charitable causes 3
    years after high school graduation
  • Percent of our students that vote in local and
    national elections 5 years after graduation

37
Example Evidence: Narrow Scope
  • Have 100% of our 10th grade students identified
    at least 3 occupations to further explore that
    are based on their interests, abilities, and
    knowledge of the education and training
    requirements?
  • Number of 11th and 12th grade students
    participating in the career class that
    demonstrated increased career maturity from a
    pre- and post-test

38
Exercise 3: Evidence (p. 4 of Workbook)
  • List evidence you need to have to answer the
    question

39
Module 6 Specifying the Design
  • Module 1 Introduction
  • Module 2 Overview
  • Module 3 Defining the Purpose
  • Module 4 Specifying the Questions
  • Module 5 Identifying Evidence
    Needed
  • Module 6 Specifying the Design

40
9-step Evaluation Process
Step 3: Specify Evaluation Design
41
Types of Designs
  • Relates to when data should be collected
  • Status (here and now snapshot)
  • Comparison (group A vs. group B; program A vs.
    program B)
  • Change (what happened as a result of a program;
    what differences are there between time A and
    time B)
  • Longitudinal (what happens over extended time)

42
Exercise 4: Design (p. 5 of Workbook)
  • What type of design fits each evaluation
    question?
  • Status
  • Comparison
  • Change
  • Longitudinal

43
Module 7 Data Collection Plan
  • Module 7 Data Collection Plan
  • Module 8 How to Collect Data
  • Module 9 Using Commercial
    Instruments
  • Module 10 Using Self-Constructed
    Instruments
  • Module 11 Collecting Data

44
9-step Evaluation Process
Step 4: Create a Data Collection Action Plan
45
Organize Your Evaluation With a Data Collection
Action Plan
46
Components of a Data Collection Action Plan
  • What Will be Collected?
  • based on evidence required
  • How Collected? Instrumentation
  • surveys? published instrument? focus group?
    observations?

47
Components of a Data Collection Action Plan
  • From Whom Collected?
  • who or what provides evidence
  • When Collected and by Whom?
  • specific dates, times, persons
  • How will the data be analyzed? (a sketch of one
    plan row follows below)

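A minimal Python sketch, using hypothetical field names and example values, of how one row of a Data Collection Action Plan could be captured as a structured record; the Workbook's template (pages 10-12) remains the authoritative format:

```python
# A minimal sketch, using hypothetical field names and example values, of one
# row of a Data Collection Action Plan captured as a structured record.
from dataclasses import dataclass

@dataclass
class PlanRow:
    evidence: str    # what will be collected
    instrument: str  # how it will be collected
    source: str      # from whom it will be collected
    when: str        # specific dates/times
    collector: str   # who collects it
    analysis: str    # how the data will be analyzed

row = PlanRow(
    evidence="Career maturity scores",
    instrument="Pre-/post-test",
    source="10th grade students",
    when="September and May",
    collector="School counselor",
    analysis="Compare pre- and post-test averages",
)
print(row)
```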
48
Data Sources Who and What
  • Students
  • Parents
  • Teachers
  • Counselors
  • Employers
  • Friends
  • Documents and other records

49
Exercise 5: Data Sources (p. 6 of Workbook)
  • Who/what are the data sources for the following
    questions?

50
Module 8 How to Collect Data
  • Module 7 Data Collection Plan
  • Module 8 How to Collect Data
  • Module 9 Using Commercial
    Instruments
  • Module 10 Using Self-Constructed
    Instruments
  • Module 11 Collecting Data

51
Data Collection Options
  • Commercial instrument
  • Survey/questionnaire
  • Focus group/interviews
  • Observations
  • Archived information

52
Module 9 Using Commercial Instruments
  • Module 7 Data Collection Plan
  • Module 8 How to Collect Data
  • Module 9 Using Commercial
    Instruments
  • Module 10 Using Self-Constructed
    Instruments
  • Module 11 Collecting Data

53
Commercial Instruments
  • Sometimes best to use published or research
    instruments
  • particularly for tough constructs
  • since it's not made specifically for you, it may
    not answer your question entirely

54
Sources of Information on Instruments
  • Counselor's Guide to Career Assessment
    Instruments
  • Relevance, the Missing Link - A Guide for
    Promoting Student Success Through Career
    Development Education, Training, and
    Counseling
  • The Buros Institute
  • ETS Test Collection
  • The Association for Assessment in Counseling and
    Education

55
Exercise 6: Decision-Making Checklist (p. 7 of
Workbook)
  • This checklist will help you conduct a review of
    data collection instruments that you are
    considering using in your evaluation

56
Module 10 Using Self-Constructed Instruments
  • Module 7 Data Collection Plan
  • Module 8 How to Collect Data
  • Module 9 Using Commercial
    Instruments
  • Module 10 Using Self-Constructed
    Instruments
  • Module 11 Collecting Data

57
Self-Constructed Instruments: Questionnaires
  • Focus on evidence you need
  • Use simple language
  • Ask only what you need; keep it short
  • Don't use jargon
  • Each question should focus on one idea
  • Make sure terms are clear
  • Make it easy for the person to answer the
    questions (check rather than write, where
    possible)
  • Use extended response when you want details

58
Types of Scales (1)
Specific (yes/no; number; gender); Extended (1-3,
1-5, 1-7)
59
Types of Scales (2)
60
Anchored Scales
61
Scales for Younger Students
62
Self-Constructed Instruments: Focus
Groups/Interviews
  • Good to use when you want extended and detailed
    responses
  • Craft an agenda and stick to it
  • Keep groups small (6-10) and time short (1-1.5
    hours)
  • Specify objectives of the session
  • Questions need to be clear; ask one question at
    a time
  • Encourage everyone to participate
  • Use opportunity to probe deeper on a topic

63
Observations and Observational Checklist
  • You can observe a lot just by watching.
  • -- Yogi Berra
  • Go to pages 8 and 9 of your Workbook and review
    an example of an observational checklist

64
Archives and Documents
  • Examine What's Already Available
  • Examples
  • Attendance records
  • Truancy reports
  • Grades
  • Bullying incidents
  • Report cards
  • Portfolios
  • Discipline referrals
  • Public service hours
  • Police reports

65
Exercise 7: Data Collection Action Plan
  • Review examples of a completed Data Collection
    Action Plan on pages 10-12 of the Workbook

66
Module 11 Collecting Data
  • Module 7 Data Collection Plan
  • Module 8 How to Collect Data
  • Module 9 Using Commercial
    Instruments
  • Module 10 Using Self-Constructed
    Instruments
  • Module 11 Collecting Data

67
9-step Evaluation Process
Step 5: Collect Data
68
How Much Data Should You Collect?
  • How much data do you need?
  • 100% of the target audience is ideal, but may be
    too expensive and time consuming
  • If not 100%, a sample is OK if the group is
    representative of the group as a whole (the
    population); see the sketch below

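A minimal Python sketch, using a hypothetical roster, of drawing a simple random sample when surveying 100% of the population is impractical:

```python
# A minimal sketch, using a hypothetical roster, of drawing a simple random
# sample when surveying 100% of the target audience is impractical.
import random

population = [f"student_{i:03d}" for i in range(1, 401)]  # hypothetical 400-student roster

random.seed(42)                           # fixed seed so the draw is reproducible
sample = random.sample(population, k=80)  # a 20% simple random sample

print(f"Sampled {len(sample)} of {len(population)} students")
```

In practice, also check that the sampled group mirrors the population on the attributes that matter (grade level, program participation, and so on).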
69
Types of Samples
70
Data Collection Considerations
  • When should you collect the information?
  • Who should collect it?

71
Module 12 Analyzing Data
  • Module 12 Analyzing Data
  • Module 13 Drawing Conclusions and
    Documenting Findings
  • Module 14 Disseminating Information
  • Module 15 Feedback for Program
    Improvement
  • Module 16 Conclusion

72
9-step Evaluation Process
Step 6: Analyze Data
73
What is Data Analysis?
  • Data collected during program evaluation are
    compiled and analyzed (counting, number
    crunching)
  • Inferences are drawn as to why some results
    occurred and others did not
  • Can be very complex depending on your evaluation
    questions
  • We will focus on simple things that can be done
    without expert consultants

74
Types of Data Analysis: Simple Frequency Counts
75
Types of Data Analysis: Sort by Relevant Categories
76
Types of Data Analysis: Calculate Percentages
Exercise 8 (p. 13 of Workbook)
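A minimal Python sketch, using hypothetical survey responses, of the three analyses named above: frequency counts, sorting by category, and percentages:

```python
# A minimal sketch, using hypothetical survey responses, of frequency counts,
# sorting by category, and percentages.
from collections import Counter

responses = ["yes", "no", "yes", "yes", "no", "yes", "undecided", "yes"]

counts = Counter(responses)               # simple frequency counts
total = sum(counts.values())

for category, n in counts.most_common():  # categories sorted by frequency
    print(f"{category}: {n} ({100 * n / total:.1f}%)")
```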
77
Types of Data Analysis: Showing Change or
Differences
78
Types of Data Analysis: Reaching an Objective or
Goal
79
Types of Data Analysis: Observing Trends
80
Types of Data Analysis: Graph Results
81
Types of Data Analysis: Calculate Averages
Exercise 9 (p. 14 of Workbook)
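A minimal Python sketch, using hypothetical pre-/post-test scores, of calculating averages and the change between two administrations (as in the "Showing Change or Differences" slide above):

```python
# A minimal sketch, using hypothetical pre-/post-test scores, of calculating
# averages and the change between two points in time.
pre_scores = [62, 71, 58, 80, 67]
post_scores = [74, 79, 66, 88, 73]

pre_avg = sum(pre_scores) / len(pre_scores)
post_avg = sum(post_scores) / len(post_scores)

print(f"Pre-test average:  {pre_avg:.1f}")   # 67.6
print(f"Post-test average: {post_avg:.1f}")  # 76.0
print(f"Average change:    {post_avg - pre_avg:+.1f}")
```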
82
Types of Data Analysis: Calculate Weighted Averages
83
Types of Data Analysis: Calculate Weighted Averages
84
Types of Data Analysis: Rank Order Weighted
Averages
85
Types of Data Analysis: Graph Weighted Averages
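A minimal Python sketch, using hypothetical Likert-style ratings, of calculating weighted averages and rank-ordering items by them; the resulting per-item values are what a bar graph of weighted averages would plot:

```python
# A minimal sketch, using hypothetical Likert-style ratings, of computing a
# weighted average per item and rank-ordering the items by that average.
ratings = {
    # item: {scale value: number of respondents}
    "Program was useful":    {1: 2, 2: 3, 3: 10, 4: 15, 5: 20},
    "Materials were clear":  {1: 5, 2: 8, 3: 12, 4: 15, 5: 10},
    "Staff were responsive": {1: 1, 2: 2, 3: 7, 4: 20, 5: 20},
}

def weighted_average(dist):
    """Sum of (scale value x count) divided by total respondents."""
    total = sum(dist.values())
    return sum(value * n for value, n in dist.items()) / total

# rank order from highest to lowest weighted average
for item, dist in sorted(ratings.items(),
                         key=lambda kv: weighted_average(kv[1]),
                         reverse=True):
    print(f"{weighted_average(dist):.2f}  {item}")
```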
86
Using Focus Group/Interview Information
  • Qualitative findings from focus groups, extended
    response items, etc., should be analyzed in a
    different way
  • Code words/frequency (see the sketch below)
  • Identify themes
  • Pull quotes
  • Summarize and draw conclusions

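A minimal Python sketch, using hypothetical focus-group notes, of coding word frequency as a first pass at identifying candidate themes (pulling quotes and summarizing still require reading by hand):

```python
# A minimal sketch, using hypothetical focus-group notes, of coding word
# frequency as a first pass at spotting candidate themes.
import re
from collections import Counter

notes = """
Students said the mentoring sessions helped with confidence.
Several parents mentioned confidence and decision-making.
One teacher felt the sessions were too short but praised the mentors.
"""

stopwords = {"the", "and", "with", "said", "were", "but", "one", "too"}
words = re.findall(r"[a-z'-]+", notes.lower())
themes = Counter(w for w in words if w not in stopwords)

print(themes.most_common(5))  # candidate theme words to review by hand
```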
87
Module 13 Drawing Conclusions and Documenting
Findings
  • Module 12 Analyzing Data
  • Module 13 Drawing Conclusions and
    Documenting Findings
  • Module 14 Disseminating Information
  • Module 15 Feedback for Program
    Improvement
  • Module 16 Conclusion

88
9-step Evaluation Process
Step 7: Drawing Conclusions and Documenting
Findings
89
Drawing Conclusions
  • Examine results carefully and objectively
  • Draw conclusions based on your data
  • What do the results signify about your program?

90
Exercise 10: Interpreting Results (pp. 15-16 of
Workbook)
  • Complete the Interpreting Results Exercise on
    pages 15-16 of the Workbook.

91
Unintended Consequences
  • Watch for positive and negative outcomes that
    you did not plan on
  • - For example, if your career development
    program focuses on increasing students' awareness
    of how to identify their interests and skills, it
    may have the unintended consequence of leaving
    little time for students to explore occupations
    and jobs in their area.
  • - Or, if your program has overemphasized the
    importance of getting a college education,
    students may not be considering the positive
    benefits of other kinds of postsecondary
    training.

92
What to Include in Your Documentation
  • Program description
  • Evaluation questions
  • Methodology (how and from whom and when)
  • Response rate
  • Methods of analysis
  • Conclusions listed by evaluation question
  • General conclusions and findings
  • Action items
  • Recommendations for program improvement and
    change

93
Document the Successes and Shortfalls
  • Highlight and brag about positive outcomes
  • Document shortfalls
  • Provides opportunities to
  • improve program
  • make recommendations to benefit the program

94
Module 14 Disseminating Information
  • Module 12 Analyzing Data
  • Module 13 Drawing Conclusions and
    Documenting Findings
  • Module 14 Disseminating Information
  • Module 15 Feedback for Program
    Improvement
  • Module 16 Conclusion

95
9-step Evaluation Process
Step 8: Disseminate Information
96
Determining Dissemination Methods
  • Inform all your relevant stakeholders of the
    results
  • Dissemination methods should differ by your
    target audience

97
Potential Audiences
  • Your program staff
  • Businesses
  • Partners that work with your program
  • Employers
  • School Level
  • School administrators
  • Counselors
  • Teachers
  • Students
  • Parents

98
Potential Audiences
  • Media
  • Local newspaper
  • TV station
  • Radio program
  • Community or school newsletter
  • Education Researchers
  • Members of community or faith-based
    organizations
  • Church members
  • Religious leaders
  • Rotary club
  • Boys or girls club
  • Anyone who participated in your evaluation!

99
Dissemination Techniques
  • Reports
  • Journal articles
  • Conferences
  • Career Newsletter/Tabloids
  • Presentations
  • Brochures
  • TV and newspaper interviews
  • Executive summary
  • Posting on Web site

100
Exercise 11: Disseminating Information (p. 17 of
Workbook)
  • Using the information provided in Exercise 6,
    describe how you would disseminate the
    information to
  • Program funders
  • Parents

101
Module 15 Feedback for Program Improvement
  • Module 12 Analyzing Data
  • Module 13 Drawing Conclusions and
    Documenting Findings
  • Module 14 Disseminating Information
  • Module 15 Feedback for Program
    Improvement
  • Module 16 Conclusion

102
9-step Evaluation Process
Step 9: Feedback for Program Improvement
103
Opportunities to Fix Shortfalls
  • Evaluation results may show areas where
    improvement is necessary
  • - 25% of 11th graders are unable to complete a
    skills-based resume
  • - 85% of our students drop out of college in
    the first year
  • - Most employers do not want your students to
    serve as interns in their companies

104
Feedback for Program Improvement
  • You can use evaluation findings to make program
    improvements
  • Consider adjustments
  • Re-examine/revise program strategies
  • Change programs or methodologies
  • Increase time with the program
  • Use your results as a needs assessment for
    future efforts

105
Module 16 - Conclusion
  • Module 12 Analyzing Data
  • Module 13 Drawing Conclusions and
    Documenting Findings
  • Module 14 Disseminating Information
  • Module 15 Feedback for Program
    Improvement
  • Module 16 Conclusion

106
Conclusion
  • Evaluation helps you
  • determine the effects of the program on
    recipients
  • know if you have reached your objectives
  • improve your program

107
Conclusion
  • The 9-step process works
  • A credible evaluation can be done with careful
    planning and some basic math skills

108
Exercise 12: Developing a Data Collection Action
Plan (page 18 of Workbook)
Using all the information you have gathered from
the workbook exercises and the PowerPoint slides,
you can develop your own Data Collection Action
Plan on page 18 of your Workbook.