1
Bringing Your Program Full Circle: Evaluation Tips and Techniques
2008 National Air Quality Conference
Elizabeth Schmitz, KY Division for Air Quality
Julie Ernst, University of MN Duluth
2
Overview of Program Evaluation
3
What is Program Evaluation?
  • The systematic collection of information about
    the activities, characteristics, and outcomes of
    programs to make judgments about the program,
    improve program effectiveness, and/or inform
    decisions about future programming.
    (Patton, 1997).

4
Why evaluate?
  • Understand what is working and what isn't
    (program improvement)
  • Make sound decisions (should the program be
    continued, scaled back, discontinued, enhanced?)
  • Justify programs, explain accomplishments
  • Promote programs, products, and services
  • Gain funding
  • Guide program development

5
Importance of Use
  • Idea of using the evaluation data to improve a
    program or make decisions is central to
    evaluation
  • Evaluations should be conducted with a specific
    use for and user of the evaluation in mind

6
Types of Evaluation Based on Purpose (Intended Use)
  • Front-End: guides program development (Is this program needed? How should it be designed? What should the program outcomes be?) Used by those developing the program
  • Formative: guides program improvement (What is working? What needs to be improved? How can it be improved?) Generally used internally; often occurs in early stages of program development
  • Summative: guides decisions about the program's future; used internally and externally by key decision-makers (program staff, supervisors, funders); often occurs later in program development

7
Formative v. Summative
  • When the cook tastes the soup, that's formative evaluation; when the guest tastes it, that's summative evaluation.
  • (Robert Stake, in Patton, 1997)

8
Check for Understanding
  • What type of evaluation do the following examples
    describe?
  • After developing a series of potential messages,
    the APCD held focus groups to determine what
    message was most likely to move Louisville
    residents to take action that would benefit air
    quality.

9
Check for Understanding
  • DAQ is currently designing an online survey that
    will be emailed to all 6-12 grade teachers in
    Kentucky. The results of this survey will assess
    teacher needs in order to guide development of a
    unit of study.
  • Pre- and post-tests are routinely used to
    evaluate the learning gains of workshop
    participants, and tallied at the end of a year
    for an overall picture of program efficacy.

10
The Evaluation Process
  1. Focus your evaluation
  2. Develop your evaluation plan
  3. Develop data collection tools
  4. Collect data
  5. Analyze data and interpret results
  6. Communicate and use the results to improve the
    program or make decisions

11
Focusing your Evaluation
12
Focusing Your Evaluation
  • A. Identify the purpose for your evaluation
    (clarify uses and users)
  • For example
  • The purpose of this evaluation is to determine
    which EE programs support the mission of the KY
    DAQ, in order for the Director to make summative
    decisions regarding which programs to continue
    and which to suspend.

13
Focusing Your Evaluation
  • Questions to consider
  • Who are your program stakeholders?
  • Why are you considering evaluating your program?
  • Who are your evaluation stakeholders?
  • Who is the primary intended user of your
    evaluation?
  • Specifically, how will the results be used?

14
Focusing Your Evaluation
  • Identify the purpose for your evaluation (clarify
    uses and users)
  • Describe your program
  • Introducing the logic model!

15
What is a Logic Model?
  • Diagram that summarizes key elements of a program
    in a way that shows the relationships among
    program elements
  • Shows the relationship between what we put in, what we do, and what results; describes the sequence of events thought to bring about benefits or change

16
Everyday Logic Model
HUNGRY → Gather Ingredients → Cook and Eat → Feel Satisfied
17
Logic Model
SITUATION: INPUTS → OUTPUTS → OUTCOMES
18
  • SITUATION
  • The conditions that give rise to the
    program
  • What needs to be done?
  • What do our stakeholders want done?
  • What are our priorities?

19
  • INPUTS
  • What we invest
  • Resources and contributions
  • Staff, volunteers, time, money, materials,
    equipment, technology, partners, facilities

20
  • OUTPUTS
  • What we do and who we reach
  • Activities (training, recruitment, workshops, etc.)
    and Products (activity guide, exhibit,
    curriculum, poster, etc.)
  • People we reach (visitors, citizens,
    participants, students)

21
  • OUTCOMES
  • The results
  • Learning: Awareness, Knowledge, Attitudes, Skills, Opinions, Motivations
  • Action: Behavior, Decision-making, Social action, Policies
  • Ultimate Impact: Social, Economic, and Environmental Conditions

22
Outputs v. Outcomes
  • Output (Activity) driven
  • Teens volunteered an average of 10 hours over the
    summer in community service projects.
  • Outcome (Impact) driven
  • Teens learn how to identify and solve a community
    need.
  • Teens feel more responsible for their community.
  • Outcomes answer "So what?" What difference does the program make?

23
Logic Model: 2 other pieces
SITUATION: INPUTS → OUTPUTS → OUTCOMES
External Factors
24
External Factors
  • Context in which the program is situated and
    external conditions which influence the success
    of the program, such as
  • Politics
  • Policies
  • Demographics
  • Economics
  • Culture
  • Biophysical environment

25
Logic Model: 2 other pieces
SITUATION: INPUTS → OUTPUTS → OUTCOMES
External Factors
Assumptions
26
Underlying Assumptions
  • Beliefs we have about the program and the way we
    think it will work
  • the participants
  • the way the program will operate
  • how resources will be used
  • Faulty assumptions lead to poor results. Are your assumptions realistic and sound?
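
Not part of the original deck: a minimal sketch, using made-up program details, of how the logic model elements from the preceding slides (situation, inputs, outputs, outcomes, external factors, assumptions) could be captured in a simple data structure for planning or review.

```python
# Hypothetical example only: a logic model for an imaginary air quality
# outreach program, stored as a plain dictionary so each element from the
# slides has an explicit place.
logic_model = {
    "situation": "Residents are unaware of how daily choices affect ozone levels.",
    "inputs": ["staff time", "grant funding", "classroom materials", "partner schools"],
    "outputs": {
        "activities": ["teacher workshops", "classroom visits"],
        "reach": ["6-12 grade teachers", "students"],
    },
    "outcomes": {
        "learning": ["increased awareness of air quality issues"],
        "action": ["students turn off lights and reduce idling at home"],
        "impact": ["reduced emissions on high-ozone days"],
    },
    "external_factors": ["weather", "school schedules", "local politics"],
    "assumptions": ["teachers will use the materials after the workshop"],
}

# Quick consistency check: flag any element left empty.
missing = [key for key, value in logic_model.items() if not value]
print("Missing elements:", missing or "none")
```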

27
Check for Understanding
  • 13 counties may face non-attainment designation
    as a result of the new 8-hour ozone standard.
  • Teachers applied their new air quality
    understanding in the classroom.
  • The Office of Energy Policy awarded a grant of $350 for the purchase of CFLs.
  • The Clean Air for KY program reached 4,000
    students.
  • Increasing students' knowledge about air quality through school-based outreach will encourage students to take action, like turning off the lights, at home.

28
Focusing Your Evaluation
  • Identify the purpose for your evaluation (clarify
    uses and users)
  • Describe your (EE) program
  • Consider logistics
  • Available staff for the evaluation
  • Timeframe
  • Money/other resources available
  • Contextual or other external factors that may
    affect the evaluation process

29
Developing your Evaluation Plan
30
Evaluation Plan
  • Evaluation Questions: What do you want to know?
  • Indicators: How will you know it?
  • Sources of Info.: Who can provide the info?
  • Data Collection Tools: What will you use to gather the info?
  • Design/Sampling: From whom and when will info be gathered?

31
Evaluation Questions | Indicators | Sources of Info. | Tools | Design/Sampling
  • Evaluation Questions: 2 Phases
  • 1. Generate a list of potential evaluation questions.
  • 2. Narrow your list. Would the evaluation question:
  • Be of interest to the primary intended user?
  • Provide information that addresses the intended use for the evaluation results?
  • Contribute information that is not already known?
  • Be of continuing interest?
  • Be feasible, in terms of time, money, and skill?
  • Issues can emerge that require new or revised questions; be flexible, yet do not chase every new or interesting direction that emerges

32
Evaluation Questions | Indicators | Sources of Info. | Tools | Design/Sampling
Have fire fighters' understanding of 401 KAR 63:005 and the health effects of illegal burning increased?
Has the training changed their attitude towards and/or understanding of the issue?
33
Evaluation Questions | Indicators | Sources of Info. | Tools | Design/Sampling
  • Indicators
  • Evidence or information that represents the phenomena of interest: What would indicate this program objective/outcome has been achieved? What does success look like?
  • Help you know something; they are usually specific and measurable
  • For each aspect you want to measure, ask: What would it look like? What kind of information is needed?

34
Evaluation Questions | Indicators | Sources of Info. | Tools | Design/Sampling
Have fire fighters' understanding of 401 KAR 63:005 and the health effects of illegal burning increased? | Can list what can/can't be burned legally; can list health effects, identify routes of exposure and sensitive populations
Has the training changed their attitude towards and/or understanding of the issue? | Discussion indicates attitude/understanding shift; more illegal burns referred to DAQ by F.D.s
35
Evaluation Questions | Indicators | Sources of Info. | Tools | Design/Sampling
  • Determine Sources of Information
  • (Who will provide the data?)
  • Participants
  • Non-participants
  • Key informants (parents, teachers, previous
    participants)
  • Program staff, administrators, or partners
  • Program documents (logs, records, minutes of
    meetings)

36
Evaluation Questions | Indicators | Sources of Info. | Tools | Design/Sampling
Have fire fighters' understanding of 401 KAR 63:005 and the health effects of illegal burning increased? | Can list what can/can't be burned legally; can list health effects, identify routes of exposure and sensitive populations | Fire fighters
Has the training changed their attitude towards and/or understanding of the issue? | Discussion indicates attitude/understanding shift; more illegal burns referred to DAQ by F.D.s | Fire fighters; regional office staff
37
Evaluation Questions | Indicators | Sources of Info. | Tools | Design/Sampling
  • Determine Data Collection Tools
  • Tool: what you use to collect data
  • Choice of tool depends on:
  • Intended users of and use for evaluation
  • Evaluation question, indicator, and source of
    information previously identified
  • Amount of time and money
  • Skill and philosophy of evaluator
  • Weighing of advantages and disadvantages

38
Evaluation Questions | Indicators | Sources of Info. | Tools | Design/Sampling
  • Determine Data Collection Tools
  • Types of Data (Information) that Can Be
    Collected
  • Qualitative: Descriptive, narrative, rich in explanation. Often collected from a smaller set of participants. Depth.
  • Quantitative: Numerical measurement. Often collected from a larger set of participants. In some cases, can be generalizable to a population. Breadth.

39
Evaluation Questions | Indicators | Sources of Info. | Tools | Design/Sampling
Have fire fighters' understanding of 401 KAR 63:005 and the health effects of illegal burning increased? | Can list what can/can't be burned legally; can list health effects, identify routes of exposure and sensitive populations | Fire fighters | Pre-/post-test (Quant.)
Has the training changed their attitude towards the issue? | Discussion indicates attitude/understanding shift; more illegal burns referred to DAQ by F.D.s | Fire fighters; regional office staff | Focus group (Qual.); TEMPO reports (Quant.)
40
Evaluation Questions | Indicators | Sources of Info. | Tools | Design/Sampling
  • Design
  • When would you need to collect data, and from whom, if you want to show:
  • gain or change in participants' knowledge?
  • participants outperform another group?
  • participants have the desired characteristic, behavior, or knowledge after your program?
  • changes in participants over time?
  • results can be attributed to your program in a causal sense?

41
Evaluation Questions | Indicators | Sources of Info. | Tools | Design/Sampling
  • Sampling (Who/how many?)
  • A sample is a subgroup of a larger group
    (population)
  • Sampling refers to the method used to select the
    people, classrooms, counties, etc. to study
  • Sampling decisions are based on population size,
    what you want to know, and the resources
    available.
  • First questions to ask What is the population of
    interest and is sampling needed? If the
    population is small, you likely will include all
    its members.

42
Evaluation Questions What do you want to know? Indicators How will you know it? Sources of Info. Who can provide the info? Tools What will you use to gather the info? Design/ Sampling From whom and when will info be gathered?
Has fire fighters knowlege of 401 KAR 63005 and the health effects of illegal burning increased? Can list what can/cant be burned legally can list health effects, identify routes of exposure and sensitive populations Fire fighters Pre-Post test All training participants for each session pre- and post- test
Has the training changed their attitude towards and/or under-standing of the issue? Discussion indicates attitude/understanding shift more illegal burns referred to DAQ by F.D.s Fire fighters Regional office staff Focus Group TEMPO reports Volunteer sub-groups All regional offices Bi-annual reports
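
A minimal sketch (not from the slides) of how one completed row of the planning matrix above might be recorded so each column stays tied to its evaluation question; the field names are illustrative only.

```python
# One row of the evaluation planning matrix, with the columns from the
# slides kept together as fields of a single record.
plan_row = {
    "evaluation_question": (
        "Have fire fighters' understanding of 401 KAR 63:005 and the "
        "health effects of illegal burning increased?"
    ),
    "indicators": [
        "can list what can/can't be burned legally",
        "can list health effects, routes of exposure, sensitive populations",
    ],
    "sources": ["fire fighters"],
    "tools": ["pre-/post-test (quantitative)"],
    "design_sampling": "all training participants, tested before and after each session",
}

for field, value in plan_row.items():
    print(f"{field}: {value}")
```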
43
Developing Data Collection Tools
44
Developing Data Collection Tools
  • Interviews or surveys?
  • Interview: open-ended questions designed to elicit thoughts, feelings, experiences, and stories from respondents; no response options provided. Qualitative.
  • Survey: a list of stable and primarily closed questions. Quantitative.

45
Interviews
  • Create an Interview Guide
  • How Many Respondents?
  • Iterative Process

46
Conducting Interviews
  • Be consistent with questions and cues!
  • Reassure participants that you will protect their
    identity
  • Begin and end by thanking participants
  • Request permission to tape
  • Hold in neutral, private territory where the
    respondent will be comfortable

47
Focus Groups
  • Group interviews that encourage participants to build on each other's responses
  • Guidelines for developing interviews are true for
    focus groups as well
  • Last roughly 1-2 hours and involve 6-10 people

48
Surveys
  • Process large amounts of data
  • Best for generalizing results to a larger
    population
  • Can be used for planning, formative, and
    summative types of evaluation
  • Always pilot test your surveys!

49
Collect Data
50
Collect Data
  • First, review your logic model, evaluation focus,
    and planning matrix to ensure that you are on
    target with the data you collect
  • Look for existing data that may meet your needs
    before designing new data collection protocol

51
Set Data Collection Schedule
  • What data collection tools are you using?
  • What are the sources for your data?
  • Who will collect the data?
  • Who can confirm the schedules?
  • When should data be collected?

52
Data Collection
  • Maintain consistency!
  • Data collection methods and instruments need to
    be pilot tested
  • Standardize data collection
  • Ethics of Conducting an Evaluation
  • Ask only needed information
  • Confidentiality and anonymity
  • Informed consent

53
Implementing a Survey
  • How will you (or will you?) entice people to complete your survey?
  • Explain why the survey is important
  • Thank people for their participation
  • Send a postcard reminder after 3 weeks
  • Train surveyors to be gracious, polite,
    consistent, and friendly

54
Focus Groups
  • If respondent types vary on 2 or more important
    dimensions, hold separate meetings
  • Have at least 2 sessions for each group.
  • Continue to schedule focus groups as long as you
    continue to get different types of information.

55
Focus Group Planning
  • Location: pleasant, neutral (e.g., library, extension office, school)
  • Time: convenient for participants
  • Incentives
  • Thank and confirm location, time, date with
    participants in advance

56
Focus Group Implementation
  • Moderator, note-taker, recording device, quiet
    snacks!
  • If possible, seat dominant personalities near the moderator and shy participants across from the moderator; let participants know about the recorder in a non-threatening way
  • Transcribe tapes within 3-5 days

57
Analyze Data and Interpret Results
58
Surveys
  • When the surveys come back
  • Code them
  • Enter data on a spreadsheet
  • Reduce the data to summary statements and relate
    these back to your evaluation questions
  • Analyze the data
  • Frequencies
  • Means
  • Correlate one variable to another
  • Look for significant differences
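
Not from the original slides: a minimal sketch, using pandas and SciPy with made-up scores, of the analysis steps listed above (frequencies, means, a correlation, and a test for a significant pre/post difference).

```python
import pandas as pd
from scipy import stats

# Made-up coded survey data: one row per workshop participant.
df = pd.DataFrame({
    "region":         ["east", "west", "east", "central", "west", "east"],
    "pre_score":      [55, 60, 62, 58, 65, 60],
    "post_score":     [70, 72, 80, 68, 78, 77],
    "hours_attended": [4, 6, 8, 4, 8, 6],
})

# Frequencies for a categorical item.
print(df["region"].value_counts())

# Means for the numeric items.
print(df[["pre_score", "post_score"]].mean())

# Correlate one variable with another.
print(df["hours_attended"].corr(df["post_score"]))

# Look for a significant pre/post difference (paired t-test).
t_stat, p_value = stats.ttest_rel(df["post_score"], df["pre_score"])
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```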

59
Focus Groups
  • Transcription is not always necessary; notes are often sufficient
  • Identify similarities and patterns
  • Group themes into meaningful categories
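
A minimal sketch (not from the slides), assuming notes have already been hand-coded with theme labels, of tallying how often each theme appears so similar comments can be grouped into categories.

```python
from collections import Counter

# Hypothetical theme codes assigned by hand to focus group comments.
coded_comments = [
    "burning_habits", "health_concerns", "enforcement", "health_concerns",
    "burning_habits", "health_concerns", "cost", "enforcement",
]

theme_counts = Counter(coded_comments)
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count}")
```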

60
Communicate and Use Evaluation Results
61
Organize and synthesize your data
  • Tie specific results to your evaluation
    questions, using your coded responses.
  • Present your findings without interpretation; save that for conclusions
  • Use qualitative data to describe quantitative
    data, if available

62
Evaluation Questions | Indicators | Sources of Info. | Tools | Design/Sampling | Evaluation Results: What did you find?
Have fire fighters' knowledge of 401 KAR 63:005 and the health effects of illegal burning increased? | Can list what can/can't be burned legally; can list health effects, identify routes of exposure and sensitive populations | Fire fighters | Pre-/post-test | All training participants for each session, pre- and post-test | Average test score increased from 60 to 75
63
Draw Conclusions
  • Explain results
  • Review open ended questions for insights
  • Don't ignore unexpected results
  • Consider unsolicited comments
  • Avoid claiming causality (use "the data suggest" or "the results indicate")

64
Make Recommendations
  • Focus on the most important, data-based
    conclusions
  • Be practical
  • List specific action items
  • Request stakeholder review

65
Sharing Your Findings
  • Formal written report
  • PowerPoint
  • Press Release
  • Web Site
  • Customize for audience and occasion

66
Resources
  • Ernst, J.A., Monroe, M.C., and Simmons, B. Evaluating Your Environmental Education Programs: A Workbook for Practitioners, 2007 (Draft), to be published by Earth Island Press
  • Education Program Evaluation (Course OUT 8102): 4-day course through the USFWS National Conservation Training Center, www.nctc.org
  • Applied EE Program Evaluation (NRES 410/610): online course through University of Wisconsin Stevens Point, www.uwsp.edu/natres/rwilke/eetap/Website2006/AEEPE.htm

67
Thanks!
  • Dr. Julie Ernst, U of MN Duluth
  • US Fish & Wildlife Service, Division of Education and Outreach, National Conservation Training Center
  • National Oceanic and Atmospheric Administration
  • North American Association for Environmental
    Education