Evaluation Tools - PowerPoint PPT Presentation

About This Presentation
Title: Evaluation Tools
Description: Evaluation Tools & Techniques, KNR 271. Program Evaluation - Guiding Questions: why do the evaluation, primary and secondary purposes, utilization-focused, program ...
Slides: 29
Provided by: HPRDepa
Transcript and Presenter's Notes

Title: Evaluation Tools


1
Evaluation Tools & Techniques
  • KNR 271

2
Program Evaluation - Guiding Questions
  • Why
  • Why do the evaluation?
  • Primary and secondary purposes
  • Utilization-focused
  • Program improvement?
  • Satisfy external standards?
  • Compare to similar programs?
  • Document impacts of programs on participants?

3
Program Evaluation - Guiding Questions
  • Who
  • Who will conduct eval?
  • Who is the eval designed for?
  • Who will participate (sample)?
  • What
  • What will be evaluated?
  • Comprehensive or one component
  • Areas: facilities, personnel, participants,
    policies & procedures, or program

4
Program Evaluation - Guiding Questions
  • When
  • When will the eval be conducted?
  • When does the report need to be disseminated?
  • Where
  • Where will the eval be conducted?
  • On-site, mailed
  • How
  • Which paradigm? Which techniques?

5
Sampling
  • Choosing who will participate in eval
  • Population
  • All people who could be included
  • Users and nonusers
  • Recreation students in Illinois
  • Recreation students at ISU
  • ISU students in KNR 271
  • ISU students in Dr. Klitzing's KNR 271

6
Sampling (cont.)
  • Sample
  • Subset of larger population who will participate
    in evaluation
  • Representative of population
  • If representative, can generalize findings to
    larger population
  • Or make statements about the overall program's
    quality
  • Need to state how sample was selected

7
Sampling (Cont.)
  • Random sample
  • Every person in population has the same chance as
    every other person to be selected for the sample
  • Need to identify EVERY person

8
Sampling (Cont.)
  • How large a sample? (Random)
  • As population size decreases, sample size
    increases
  • Rule of thumb (Rossman & Schlatter, 2003)
  • Under 500: 50%
  • 500-1,500: 30%
  • 1,500-2,500: 25%
  • Over 2,500: 400 people
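The rule-of-thumb brackets above can be sketched as a small helper. This is a minimal sketch: the function name and the choice to round the result are illustrative, not from the slides; the percentages follow Rossman & Schlatter (2003) as listed above.

```python
def rule_of_thumb_sample_size(population: int) -> int:
    """Sample size per the Rossman & Schlatter (2003) rule of thumb."""
    if population < 500:
        return round(population * 0.50)  # under 500: sample 50%
    elif population <= 1500:
        return round(population * 0.30)  # 500-1,500: sample 30%
    elif population <= 2500:
        return round(population * 0.25)  # 1,500-2,500: sample 25%
    else:
        return 400                       # over 2,500: 400 people

print(rule_of_thumb_sample_size(1200))  # 30% of 1,200 -> 360
print(rule_of_thumb_sample_size(5000))  # over 2,500 -> 400
```

Note how the percentage rises as the population shrinks, which is the "as population size decreases, sample size increases" point above.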

9
Sampling (Cont.)
  • Select sample (Random)
  • Fishbowl with replacement
  • Random numbers table
  • Good with samples under 100
  • Computer programs
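The "computer programs" option can be sketched with Python's standard library. The 100-person roster and the seed are hypothetical; note that `random.sample` draws without replacement, while the fishbowl option above is with replacement (which `random.choices` would model instead).

```python
import random

# Hypothetical roster: a true random sample requires being able to
# list EVERY person in the population, as the slide notes.
population = [f"student_{i:03d}" for i in range(1, 101)]

random.seed(42)                           # fixed seed so the draw is repeatable
sample = random.sample(population, k=30)  # simple random sample, no repeats
print(len(sample))
```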

10
Sampling (cont.)
  • Systematic random sample
  • 1st is random, then by interval
  • Nth name from list
  • Mailing list

11
Sampling (cont.)
  • Determine sample size (Systematic Random)
  • See rule of thumb for random
  • Population × sample % = sample size
  • 1,200 × .30 = 360
  • Determine interval (Systematic Random)
  • Population / sample size = interval
  • 1,200 / 360 = 3.33 (3)
  • Rossman & Schlatter, 2003
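The two calculations above (sample size, then interval) can be sketched as follows; the 1,200-name mailing list is hypothetical.

```python
import random

# Hypothetical 1,200-name mailing list standing in for the population.
population = [f"member_{i:04d}" for i in range(1, 1201)]

sample_size = round(len(population) * 0.30)  # 1,200 x .30 = 360
interval = len(population) // sample_size    # 1,200 / 360 = 3.33, truncated to 3

random.seed(1)                        # fixed seed so the draw is repeatable
start = random.randrange(interval)    # 1st selection is random...
sample = population[start::interval]  # ...then every Nth (here 3rd) name

# Note: every 3rd name from 1,200 yields 400 names, slightly over the 360 target.
print(len(sample))
```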

12
Sampling (cont.)
  • Systematic Random of unknown population (festival
    attendance)
  • Estimated attendance / sample size = interval
  • 5,000 / 400 = 12.5
  • Every 12th ticket would be sampled
  • Rossman & Schlatter, 2003
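The unknown-population variant above works the same way, but from an attendance estimate; a minimal sketch using the slide's festival numbers.

```python
# Festival attendance is unknown, so work from an estimate (slide's numbers).
estimated_attendance = 5000
target_sample = 400

interval = estimated_attendance / target_sample  # 5,000 / 400 = 12.5
every_nth = int(interval)                        # truncate: every 12th ticket

# Ticket numbers that would be pulled for the evaluation.
tickets_sampled = list(range(every_nth, estimated_attendance + 1, every_nth))
print(every_nth, len(tickets_sampled))
```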

13
Sampling (cont.)
  • Nonrepresentative sampling
  • Can't generalize findings to population
  • 2 types
  • Convenience
  • Purposive

14
Sampling (cont.)
  • Convenience sampling
  • People who are readily available
  • Folks at a festival
  • Purposive sampling
  • Chosen based on criteria
  • Participate in program
  • Homeless women who live at CWT

15
Practice
  • Consider KNR 271 to be the population
  • Select a sample using
  • Random technique
  • Systematic random
  • Convenience
  • Purposive
  • How do the samples differ?
  • Which allows you to generalize to the class?

16
Judgment Criteria
  • Reliability validity
  • Quantitative
  • Trustworthiness
  • Qualitative

17
Reliability
  • How reliable is the instrument?
  • Does it measure consistently over time?
  • 2 different groups
  • Same group at 2 different times
  • Compare answers (80% agreement)
  • Quantitative
  • Well-written items, lengthen, pilot test, clear
    directions, appropriate for audience
  • Qualitative (audit trail) (Dependability)
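The "same group at two different times" check above amounts to computing percent agreement between the two administrations. A minimal sketch with hypothetical yes/no responses, constructed here to land on the 80% figure the slide mentions.

```python
# Hypothetical responses from the same group at two administrations.
time1 = ["yes", "no", "yes", "yes", "no", "yes", "no", "yes", "yes", "yes"]
time2 = ["yes", "no", "yes", "no",  "no", "yes", "no", "yes", "yes", "no"]

matches = sum(a == b for a, b in zip(time1, time2))
agreement = matches / len(time1) * 100
print(f"{agreement:.0f}% agreement")  # 8 of 10 answers match -> 80%
```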

18
Validity
  • Does it measure what we want it to measure?
  • Face validity
  • Content validity
  • Cover all areas we want?
  • Literature review, expert panel
  • Construct validity
  • Measures construct

19
Validity (cont.)
  • Quantitative
  • Pilot test
  • Clear directions
  • Qualitative (Credibility)
  • Prolonged engagement
  • Use of examples
  • Negative cases
  • Thick, rich descriptions

20
Usability
  • How easy and convenient?
  • Usefulness
  • Quantitative
  • 15 minutes or less
  • Easy to score and interpret
  • Process is reasonable in cost
  • Qualitative
  • Trained evaluator
  • Triangulation

21
Triangulation
  • More than 1 source
  • Participant, staff, observers
  • Multiple evaluators
  • 2-3 program leaders
  • Multiple tools
  • Observation, interview, document review

22
Tools & Techniques
  • Quantitative
  • Head counts
  • Questionnaires/surveys
  • Closed-ended (Likert scale, semantic
    differential)
  • Forced-choice (Y/N, T/F, checklist, rank order)
  • Open-ended (single stage, multiple stage)
  • See p. 277, 274, 275
  • Importance-Performance analysis

23
Tools Techniques (cont.)
  • Qualitative
  • Review documents and records
  • Observation
  • Open observer
  • Covert participant observer
  • Case study
  • Structured or flexible
  • Sociometry
  • Relationships in graphic form

24
Tools Techniques (cont.)
  • Qualitative cont.
  • Focus groups
  • 8-12 participants
  • Interviews
  • Informal conversation
  • Interview guide
  • Standardized open-ended
  • Closed quantitative interview

25
Elements to Include in Evaluation Instruments
  • Title
  • Purpose
  • 1-2 sentences
  • "The purpose of this evaluation is to seek input
    from participants so that program improvements
    can be made in the future."

26
Elements to Include in Evaluation Instruments
  • Directions
  • Instructions on completing all question formats.
  • If Likert, define the scale
  • Provide a time estimate for completion
  • "This evaluation should take less than 10 minutes
    to complete. For each item, use the following
    scale."

27
Elements to Include in Evaluation Instruments
  • Items
  • Conclusion
  • Thank for participation
  • Tell what to do with completed form

28
Practice
  • Create an evaluation form that measures
    satisfaction with a T-Ball League
  • Use pages 275-277
  • Use elements to include from above
  • 6 closed questions
  • 3 semantic differential, 3 Likert
  • 3 open ended