Designing and Conducting Useful Self-Evaluations at UNESCO - PowerPoint PPT Presentation

Slides: 36
Learn more at: http://portal.unesco.org
Transcript and Presenter's Notes

1
Designing and Conducting Useful Self-Evaluations
at UNESCO
  • Hallie Preskill, Ph.D.
  • University of New Mexico USA
  • hpreskil@unm.edu
  • and
  • Brad Cousins, Ph.D.
  • University of Ottawa, CANADA
  • bcousins@uottawa.ca

June 2004
2
Workshop Objectives
  • As a result of having taken this workshop,
    participants will:
  • Understand how this workshop fits in the broader
    scope of evaluation at UNESCO.
  • Understand how self-evaluation in UNESCO can be
    useful and potentially contribute to individual,
    team, and organizational learning.
  • Understand how to practically and realistically
    design, implement and use self-evaluation studies
    as a working tool in the current context of their
    projects or activities.
  • Have developed a self-evaluation plan for a
    project or activity in which they are involved.
  • Know how to integrate the self-evaluation
    activities into existing work structures and
    processes.

3
Agenda
  • Evaluation in UNESCO
  • Components of a Self-Evaluation Plan
  • Focusing Your Self-Evaluation
  • Choosing Among Data Collection Methods
  • Analysing Evaluation Data  
  • Communicating and Reporting Evaluation Processes
    and Findings
  • Reflecting on the Context of Self-Evaluations
  • Maximizing the Usefulness and Impact of
    Self-Evaluations
  • Workshop evaluation and follow-up

4
Background for this Workshop UNESCO Evaluation
Strategy
  • The workshops are part of a set of capacity
    building activities in self-evaluation,
    implemented by Internal Oversight Service (IOS)
    on a pilot basis mainly in collaboration with the
    Education Sector. This initiative constitutes an
    important aspect in the implementation of the
    UNESCO Evaluation Strategy developed by IOS and
    endorsed by the Executive Board. The Evaluation
    Strategy (as well as other recent Audit and
    Evaluation reports) calls for self-evaluation as
    a necessary complement to external independent
    evaluation.

5
Definition of Evaluation
  • A systematic assessment of a planned, ongoing or
    completed intervention to determine its
    relevance, efficiency, effectiveness, impact and
    sustainability. The intent is to incorporate
    lessons learnt into the decision making process.
  • (Source: adapted from OECD/DAC Glossary, 2002)

6
Judgement
  • Judgement implies comparison of program
    performance data against some standard
  • Performance in program at prior point in time
  • Performance of those receiving similar programs
    (comparative treatment)
  • Performance of those receiving no program
    (control)
  • External standard

7
Evaluation is the use of systematic inquiry to
make judgements about program merit, worth and
significance and to support program decision
making.
  • Summative evaluation (judgement)
  • Formative evaluation (improvement)
  • Who is the judge?

8
External Evaluation
  • OECD Glossary Definition of External Evaluation
    (2002, p. 23)
  • The evaluation of a development intervention
    conducted by entities and/or individuals outside
    the donor and implementing organisations.
  • Independent systematic approach to answering
    evaluative questions
  • Typically commissioned by senior management
  • Written into the C/5 or conducted upon donor
    demand
  • IOS facilitates the process and oversees the
    quality of the evaluations
  • Conducted by external (to UNESCO) evaluation
    experts
  • Selection of C/5 evaluations is presented to ExB

9
Self-Evaluation
  • OECD / DAC Glossary Definition of Self-Evaluation
    (2002)
  • An evaluation by those who are entrusted with the
    design and delivery of a development
    intervention.
  • In the context of the UNESCO Evaluation Strategy
  • Self-evaluations are small-scale evaluation
    projects carried out by staff and management as
    part of their every-day work activities, which
    help them collect and use monitoring and
    evaluation data to answer their own questions
    concerning the quality and direction of their
    work.

10
Purposes of Self-Evaluation
  • Provides opportunities for continuous reflection
    and learning (individual, group, organization)
  • Provides timely information for decision making
    and action on a day-to-day implementation level
  • Draws on organization members' knowledge of the
    project and evaluation context
  • Results in useful findings and recommendations
    that meet specific information needs
  • If done well, results come from systematic, valid,
    and purposeful processes, minimizing perceptive
    fallacies
  • Provides opportunity to share achievements
  • Documents what works, what does not, and possible
    reasons why

11
Benefits of Using a Collaborative Approach to
Self-Evaluation
  • Greater credibility to those involved
  • Shared work saves resources and creates team
    spirit
  • Increased learning using reflection and dialogue
    with others
  • More informed interpretations of findings
  • Greater breadth of recommendations
  • Enhanced stakeholder evaluation capacity

12
A Systems Framework for Evaluation
  • The Evaluation Process
  • The Evaluation Environment
  • The Organization's Environment
  • External Requirements and Demands

13
An evaluation use conceptual framework
Evaluation Resources and Context
Evaluation Practice
Decision or Policy Setting
14
Evaluation Practice
  • Planning (divergent / convergent)
  • Instrument development
  • Data collection, processing
  • Data analysis, interpretation
  • Reporting and follow up

15
Self-Evaluation Plan Components (Terms of
Reference)
  • Identifying Self-Evaluation Team Members
  • Focusing the Self-Evaluation
  • Background information (and logic model)
  • Purpose of the evaluation
  • Evaluation stakeholders (intended users of
    results)
  • Evaluation scope (key questions)
  • Designing and Implementing the Self-Evaluation
  • Data collection methods, instruments, sample
  • Evaluation timeline with specified roles and
    responsibilities
  • Communicating and reporting plan
  • Budget

16
Self-Evaluation Stakeholders
  • Users of the evaluation findings
  • Primary
  • Yourself/your team
  • Secondary
  • Implementers of projects/activities
  • Colleagues doing similar work
  • BSP (to feed into current reporting requirements)
  • Immediate or Intermediate Managers
  • Leadership of the organization

17
Evaluation Key Questions
  • Are the broad overarching questions that guide
    the evaluation
  • Form the boundaries and scope of the evaluation
  • Are typically written in an open-ended format
  • Guide the choice of data collection methods
  • Reflect the stakeholders' information needs

18
Sample Self-Evaluation Key Questions
  • To what extent does the project bring about the
    intended changes in its target group?
  • How can this project benefit from enhanced
    collaboration with partners?
  • Why does this activity work well in one region,
    but not in the other?
  • For whom is this project working best? Why?
  • What additional services, materials, and/or
    activities are needed to reach better outcomes?
  • What are the unintended consequences of this
    activity?

19
Using a Program's Logic Model to Focus a
Self-Evaluation
  • A logic model
  • Articulates a program's theory of action: how it
    is supposed to work.
  • Is a systematic and visual way to represent a
    program's underlying theory.
  • Helps focus an evaluation by making assumptions
    and expectations explicit.
  • Increases stakeholders' understanding about a
    program and its evaluation.

20
Logic Model Template
  • Assumptions: the underlying assumptions that influence the project's design, implementation, or objectives
  • Resources: human, financial, organizational, and community resources needed to achieve the project's objectives
  • Activities: things the project does with the resources to meet its objectives
  • Outputs: products of implementing the activities, which are necessary but not sufficient indications of achieving the project's objectives
  • Short-term Outcomes: short-term intended and unintended changes (e.g., in knowledge, attitudes, skills) as a result of the project
  • Long-term Outcomes: long-term intended and unintended changes (e.g., in behavior, status, systems) as a result of the project
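The logic-model columns above can be sketched as a small data structure, which some teams find handy for keeping a draft model in a shared file. This is an illustrative sketch only; the field names mirror the template's columns, and the example project entries are hypothetical.

```python
# A minimal sketch of the logic model template as a Python data structure.
# The field names mirror the template's columns; the sample entries are
# hypothetical, for illustration only.
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    assumptions: list[str] = field(default_factory=list)          # underlying assumptions
    resources: list[str] = field(default_factory=list)            # human, financial, organizational
    activities: list[str] = field(default_factory=list)           # what the project does
    outputs: list[str] = field(default_factory=list)              # products of the activities
    short_term_outcomes: list[str] = field(default_factory=list)  # e.g., knowledge, skills
    long_term_outcomes: list[str] = field(default_factory=list)   # e.g., behavior, systems

model = LogicModel(
    assumptions=["Teachers want in-service training"],
    resources=["Two trainers", "Workshop venue"],
    activities=["Run five regional workshops"],
    outputs=["120 teachers trained"],
    short_term_outcomes=["Improved lesson-planning skills"],
    long_term_outcomes=["Better classroom practice"],
)
print(model.outputs)
```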

21
Developing a Logic Model for Your
Self-Evaluation - Activity
  • Think of a project or work activity that you
    would like to self-evaluate. It should be an
    evaluation
  • That is narrow in scope
  • That is doable
  • Where there is an intended use of findings
  • Where there are realistic opportunities for using
    the findings
  • You may work in groups of 1-3, depending on how
    your work is actually organized.
  • Using the Logic Model Template worksheet, begin
    to develop a logic model for your
    program/activity.
  • Try to make a few notes in each of the columns.

22
Focusing Your Self-Evaluation Activity
  • Revisit the Logic Model you began to draft.
  • Using the worksheet "Focusing Your
    Self-Evaluation":
  • Make some notes regarding the background of the
    program/activity
  • Write an evaluation purpose statement
  • Develop 2-3 evaluation questions
  • Identify potential self-evaluation stakeholders
  • Describe the intended use of the
    self-evaluation's findings

23
Criteria for Choosing Among Data Collection
Methods
  • Evaluation questions
  • Stakeholder preferences
  • Respondent characteristics
  • Respondent availability/accessibility
  • Level of acceptable intrusiveness
  • Validity (trustworthiness of data)
  • Costs (time, materials, subject matter experts)
  • Organization's experience

24
A Menu of Data Collection Methods
  • Surveys (mail, online, phone; open-ended, closed
    questions)
  • Interviews (individual, focus group;
    conversational, semi-structured, structured)
  • Observations (quantitative/structured;
    qualitative/unstructured)
  • Records and Documents (e.g., meeting minutes,
    emails, technical reports, existing databases)
  • Tests (paper, simulation, computer)

25
Enhancing the Validity of Data
  • Pilot testing
  • Try out interview protocol, survey, or
    observation form with a sample similar to
    respondent population or have it critiqued by
    colleagues and/or experts.
  • Triangulation
  • Multiple methods, data sources, evaluators,
    and/or theories
  • Sampling
  • Random/probability: generalizable
  • Nonrandom/nonprobability: not generalizable
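The sampling distinction above can be illustrated in a few lines of code; this is a hedged sketch with a hypothetical respondent list, not part of the original workshop materials.

```python
# Illustrative sketch: random (probability) vs. nonrandom sampling.
# The respondent names are hypothetical; a real self-evaluation would
# draw from an actual list of the respondent population.
import random

population = [f"respondent_{i}" for i in range(1, 101)]

# Random/probability sample: every member has an equal chance of being
# selected, so findings can be generalized to the population.
random.seed(42)  # fixed seed so the example is reproducible
probability_sample = random.sample(population, k=20)

# Nonrandom/nonprobability sample: e.g., a convenience sample of the
# first 20 names on the list; findings are not generalizable.
convenience_sample = population[:20]

print(len(probability_sample), len(convenience_sample))
```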

26
Designing Your Self-Evaluation Activity
  • Transfer your evaluation questions to the
    worksheet (top row).
  • Discuss and note which data collection methods
    might be most appropriate and feasible for your
    self-evaluation study.
  • Discuss and note who the respondents might be and
    whether you will include the entire population,
    or will select a sample (indicate how many you
    would like to include in your sample).

27
Considerations for Analyzing Data
  • Evaluation Key Questions
  • Stakeholders' understanding of, and experience
    with, data analysis methods
  • Types of data (quantitative, qualitative)
  • Levels of quantitative data (nominal, ordinal,
    interval)
  • Choices for analyzing quantitative data
  • Choices for analyzing qualitative data
  • Evaluator skills and time budget implications
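The choices above for quantitative data depend on its level of measurement; the sketch below illustrates this with the Python standard library. The ratings are hypothetical survey responses, for illustration only.

```python
# Illustrative sketch: matching analysis choices to levels of
# quantitative data. The ratings are hypothetical responses on a
# 1-5 scale from an imagined workshop survey.
import statistics
from collections import Counter

ratings = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5]

# Nominal data: report frequencies and the mode (most common value).
frequencies = Counter(ratings)
mode = statistics.mode(ratings)

# Ordinal data: the median is a safe summary of the middle response.
median = statistics.median(ratings)

# Interval data: the mean and standard deviation become meaningful.
mean = statistics.mean(ratings)
stdev = statistics.stdev(ratings)

print(frequencies, mode, median, mean)
```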

28
Why Communicate and Report?
  • To help organization members learn from one
    another and jointly improve their work
  • To build internal capacities - learn about
    UNESCO's substantive work and evaluation practice
  • To inform decision making by program staff and
    management about changes that will improve their
    own, as well as overall, organizational
    performance

29
Why Communicate and Report?
  • To inform funders, community members, clients,
    customers, program staff, management, other parts
    of the organization, and other organizations
  • To demonstrate results, accountability
  • To build awareness and support within your unit,
    division, sector or across sectors and other
    organizational entities
  • To reflect jointly with others on findings and
    derive future actions
  • To aid decision making about continued
    implementation and funding, as well as
    replication at other sites

30
Communicating and Reporting Strategies
  • Facilitates Individual Learning
  • Short communications: memos, email, postcards
  • Interim reports
  • Final reports
  • Executive summaries
  • Newsletters, Bulletins, Briefs, Brochures
  • News media
  • Website communications
  • Facilitates Interactive (Group) Learning
  • Verbal presentations
  • Videotape/computer-generated presentations
  • Posters and Poster Sessions
  • Working sessions
  • Synchronous electronic communications
  • Personal discussions
  • Photography
  • Cartoons
  • Drama-Performance
  • Poetry

31
Developing Your Communicating and Reporting
Plan Activity
  • Using the Communicating and Reporting Plan
    worksheet, work on Steps 1-6.
  • Steps 7-8 can be completed when more of your
    self-evaluation plan has been developed.

32
How Can We Maximize the Usefulness and Impact of
Our Self-Evaluations?
  • Hold meetings with each other to discuss
    progress, ask questions, seek feedback
  • Use the evaluation planning worksheets provided
    in this workshop
  • Record questions and lessons learned throughout
    the process (email)
  • Make use of the IOS resource person specifically
    available to support self-evaluation projects
  • Consider linkages between this self-evaluation
    work and RBM reporting requirements
  • Participate in a poster session in mid-September
    to share findings from the planned
    self-evaluations

33
Types of evaluation use, on two dimensions (use vs.
non-use; legitimate use vs. misuse):
  • Legitimate use: ideal use
  • Misuse: mistaken use, mischievous use
  • Legitimate non-use: rational non-use, political
    non-use
  • Misuse of non-use: abuse
34
Workshop Follow up
  • Current status of the "Learning from Evaluation"
    survey process (with Education Sector staff)
  • Follow-up to this workshop
  • Support for self-evaluation projects (IOS
    contact: Sandy Taut)
  • Online support materials: slides, handouts,
    workshop audiotape transcription
  • Ongoing assessment of self-evaluation processes
    based on observations and discussions

35
Additional Resources
  • Canadian Evaluation Society
  • www.evaluationcanada.ca
  • American Evaluation Association
  • www.eval.org
  • Australasian Evaluation Society
  • www.aes.asn.au
  • European Evaluation Society
  • www.europeanevaluation.org
  • Société Française de l'Évaluation
  • www.sfe.asso.fr
  • See these sites for standards of professional
    practice, ethics, etc.
  • Canadian Journal of Program Evaluation
  • www.cjpe.ca