Evaluation Planning III: Identifying and Selecting the Evaluation Questions and Criteria

Transcript and Presenter's Notes

1
Evaluation Planning III: Identifying and
Selecting the Evaluation Questions and Criteria
Dr. Suzan Ayers, Western Michigan
University (courtesy of Dr. Mary Schutten)
2
Evaluation Questions
  • Evaluations are conducted to answer questions and
    to apply criteria to judge the value of something
  • Evaluation Questions provide the direction and
    foundation for the evaluation
  • They articulate the focus of the study

3
Criteria and Standards
  • Criteria identify the characteristics of a
    successful program (what to measure)
  • Standards designate the level of performance the
    program must achieve on these criteria to be
    deemed a success (how well it must perform)
  • Without standards, the evaluator cannot judge the
    results; without criteria, the evaluator cannot
    judge the program itself
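To make the criterion/standard distinction concrete, here is a minimal Python sketch (the measures and numbers are hypothetical, not from the slides): each criterion names a characteristic to measure, and its standard sets the level that counts as success.

```python
# Hypothetical illustration of criteria vs. standards:
# the criterion is what we measure; the standard is the
# performance level required on it.

criteria_and_standards = {
    # criterion: (observed result, standard to meet)
    "fitness test pass rate (%)": (82.0, 75.0),
    "daily activity minutes":     (38.0, 45.0),
}

for criterion, (result, standard) in criteria_and_standards.items():
    verdict = "meets standard" if result >= standard else "falls short"
    print(f"{criterion}: {result} vs. {standard} -> {verdict}")
```

With both in hand, the evaluator can judge the results (against standards) and the program itself (on its criteria), as the slide notes.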

4
Phases of Identifying and Selecting Questions
  • Divergent phase: a comprehensive laundry list
    of potentially important questions and concerns
    is gathered from many sources; all questions are
    listed
  • Convergent phase: evaluators select from the
    laundry list the most critical questions to be
    answered
  • Criteria are developed after the convergent phase

5
Divergent Phase Sources
  • Questions, concerns, values of stakeholders
  • Policy makers (legislators, board members)
  • Administrators, managers (direct program)
  • Practitioners (operate program)
  • Primary consumers (clients, students, patients)
  • Secondary consumers (affected audiences)
  • What is their perception of the program? What
    questions or concerns do they have? How well do
    they think it is doing? What would they change if
    given the chance?

6
Stakeholder Interview Questions (Fig. 12.1)
  • What is your general perception of the program?
    What do you think of it?
  • What do you perceive as the purposes?
  • What do you think the program theory is?
  • What concerns do you have about the program?
    Outcomes? Operations?
  • What major questions would you like the
    evaluation to answer? Why?
  • How could you use the information provided by
    these questions?

7
  • Use of evaluation models/approaches
  • Objectives-oriented: are goals defined, and to
    what extent are they achieved?
  • Management-oriented: questions about CIPP
    context (need), input (design), process
    (implementation), product (outcomes)
  • Participant-oriented: consider all stakeholders
    and listen to what they have to say; the process
    of the program is critical
  • Consumer-oriented: checklists and sets of
    criteria help determine what to study and what
    standards to apply
  • Expertise-oriented: standards and critiques that
    reflect the views of experts in the field

8
  • Findings and issues raised in the literature in
    the field of the program
  • The evaluator should be conversant with salient
    issues in the program's area
  • Use existing literature to help develop causative
    models and questions to guide the evaluation
  • Literature search may be a useful start to the
    planning process

9
  • Professional standards, checklists, instruments,
    and criteria developed or used elsewhere
  • Standards for practice exist in many fields,
    including PE and athletics
  • Views and knowledge of expert consultants
  • If consultants have expertise in the content
    area, they may provide a neutral and broader view
  • They can be asked to generate a list of questions
    and can identify previous evaluations of similar
    programs

10
  • Evaluator's own professional judgment (p. 244)
  • Trained to raise thoughtful questions
  • Is the program really serving an important
    purpose?
  • Are goals and objectives consistent with
    documented needs?
  • What critical elements and events should be
    studied and observed?
  • Summarizing suggestions from multiple sources
    (pp. 245-246)

11
Convergent Phase
  • Three reasons to reduce the range of variables
  • There will always be a budget limit
  • If the study gets very complicated, it gets
    harder and harder to manage
  • Audience attention span is limited
  • Who should be involved?
  • Evaluator
  • Stakeholders
  • Sponsor
  • Parties affected by the evaluation

12
Determining Which Questions to Study (Cronbach,
1980)
  • Who would use the information? Who wants to know?
    Who will be upset if this question is dropped?
  • Would an answer to the question reduce
    uncertainty or give information not now available?
  • Would the answer to the question yield important
    information?
  • Is this question merely of passing interest or
    does it focus on critical issues of continued
    interest?
  • Would the scope of the evaluation be seriously
    limited if this question were dropped?
  • Is it feasible to answer this question given the
    available financial and human resources? Time?
    Methods? Technology?

13
Convergent Phase
  • Sit down with the sponsor and/or client and
    review the laundry list and the items marked as
    doable (from the Fig. 12.2 matrix)
  • Reduce the list via consensus
  • An advisory board is the typical format
  • Provide the new list with a short explanation
    indicating why each question is important, and
    share it with stakeholders

14
Matrix for Selecting Questions (Fig. 12.2)
Would the evaluation question...
  • Be of interest to key audiences?
  • Reduce present uncertainty?
  • Yield important information?
  • Be of continuing (not fleeting) interest?
  • Be critical to the study's scope?
  • Have an impact on the course of events?
  • Be answerable in terms of financial resources,
    time, and methods/technology?
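One way to put the Fig. 12.2 matrix to work is to rate each candidate question against the screening criteria and rank by total score. The sketch below is a hypothetical illustration (the questions and ratings are invented), not part of the original figure:

```python
# Hypothetical sketch of the Fig. 12.2 screening matrix:
# rate each candidate question 0-2 on every criterion,
# then rank questions by total score.

CRITERIA = [
    "interest to key audiences",
    "reduces present uncertainty",
    "yields important information",
    "of continuing (not fleeting) interest",
    "critical to the study's scope",
    "impact on the course of events",
    "answerable (funds, time, methods/technology)",
]

# Ratings are made-up examples (0 = no, 1 = partly, 2 = yes).
questions = {
    "Are goals consistent with documented needs?": [2, 2, 2, 1, 2, 1, 2],
    "Do participants like the new brochure?":      [0, 1, 0, 0, 0, 0, 2],
}

for q, ratings in sorted(questions.items(),
                         key=lambda item: sum(item[1]), reverse=True):
    print(f"total {sum(ratings):2d}: {q}")
    for criterion, rating in zip(CRITERIA, ratings):
        print(f"   {rating}  {criterion}")
```

The scores only order the laundry list; as slide 13 notes, the final cut is still made by consensus with the sponsor, client, or advisory board.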
15
Criteria and Standards
  • Developed to reflect the degree of difference
    that would be considered meaningful enough to
    adopt the new program
  • Absolute standard: a defined level is met or not
    met
  • Learn stakeholders' range of expectations and
    determine standards from that
  • Relative standard: comparison to other groups or
    standards
  • Relative standards typically use the statistical
    concepts of significance and effect size to
    determine whether the program is enough better
    than what is in place (see example, p. 253)
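As a hedged sketch of how a relative standard might be checked statistically (the scores are invented, and SciPy is assumed to be available), one can pair a significance test with an effect size such as Cohen's d:

```python
# Hypothetical check of a relative standard: is the new
# program enough better than the program in place?
from statistics import mean, stdev

from scipy.stats import ttest_ind  # assumes SciPy is installed

new_program = [78, 85, 81, 90, 88, 84, 79, 86]  # made-up outcome scores
in_place    = [72, 75, 80, 74, 77, 73, 76, 78]

# Significance: two-sample t-test.
t_stat, p_value = ttest_ind(new_program, in_place)

# Effect size: Cohen's d with a pooled standard deviation.
n1, n2 = len(new_program), len(in_place)
pooled_sd = (((n1 - 1) * stdev(new_program) ** 2 +
              (n2 - 1) * stdev(in_place) ** 2) / (n1 + n2 - 2)) ** 0.5
cohens_d = (mean(new_program) - mean(in_place)) / pooled_sd

print(f"p = {p_value:.3f}, d = {cohens_d:.2f}")
```

A decision rule such as "adopt only if p < 0.05 and d exceeds a threshold agreed on with stakeholders" reflects the slide's point that the difference must be meaningful enough to adopt the new program, not merely detectable.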

16
  • Be flexible (not indecisive)
  • Allow new questions, criteria, and standards to
    emerge
  • Remember, the goal for this step is to lay the
    foundation to create a meaningful and useful
    evaluation