Title: Evaluation Knowledge Management and Utilization: Findings and Lessons from International Case Studies
Evaluation Knowledge Management and Utilization: Findings and Lessons from International Case Studies
- Maximilien Tereraho, Ph.D., Adm.A., HRSDC and University of Quebec (UQO)
- 2007 AfrEA Conference, Niamey, January 15-21, 2007
INTRODUCTION
1. Is Evaluation in the Business of Knowledge Management?
- We can learn to do things well or we can learn to do things poorly. Best practice is about learning to do the right things right, at the right time.
- Evaluation is about doing the right thing
- Audit and some other oversight functions are about doing the thing right
2. Presentation Objectives
- Highlight the distinctiveness of the evaluation role in corporate knowledge management and in support for evidence-based policy/program work
- Review good practices in evaluation knowledge management (EKM) and utilization
- Discuss lessons and challenges for EKM and utilization practice in Africa
Presentation Outline
- Introduction
- Approach and methodology
- Key review findings
- Next steps and challenges
- Conclusion
- EKMU in Africa
3. Why do we conduct evaluations?
- To improve the design and delivery of our programs and projects
- To inform (evidence-based) organizational policy
- To meet Central Agency policy requirements
  - Transfer Payments Policy (national/international)
  - Evaluation Policy
- To report to stakeholder groups and the general public
4. Evaluation and Other Oversight Functions
- We evaluate for both Accountability and Learning purposes
  - Accountability: the obligation to perform up to agreed-upon expectations, show results clearly and strive constantly for improvement
  - Learning: feedback, knowledge sharing and transfer to inform evidence-based policy/program development and implementation
- Other functions of the oversight spectrum mostly play a significant role in support of one or the other.
5. Why This Study?
- Evaluation supply does not necessarily induce evaluation demand, nor does evaluation demand induce supply
- A significant gap between the potential and actual use of evaluation is generally observed
- There is an emerging consensus that evaluation should provide not only oversight but also insight and foresight for policy and program work.
APPROACH AND METHODOLOGY
1. What is meant by evaluation knowledge management and utilization?
- EKM is workably defined as the process of generating, gathering, organizing, sharing, using and exploiting what the organization knows from its evaluations: what works, how and why.
- Evaluation knowledge utilization is both a product and a process for mainstreaming evaluation lessons and findings into policy/program cycle management.
- Types of evaluation utilization include:
  - Forms: instrumental use, conceptual use, political/symbolic use, imposed use
  - Influence source: results-based or process-based use.
2. Evaluation is political, but one source of evidence, and evidence but one input into policy
- Policy is itself but one influence on practice, and practice is but one influence on outcome
- Evaluation is political because it generates knowledge, and knowledge is power
- A further focus of evaluation on organizational learning is a cultural shift that calls for deliberate and formal tools and processes
- Good/best practices are themselves a contingent reality.
3. How best to manage evaluation use and measure evaluation utilization?
- Evaluation findings and processes affect thoughts and actions (influence processes) at the individual, interpersonal, organizational and societal levels, along with interactions between these levels
- An organizational strategic configuration perspective was used
- EKM strategic positioning
  - Overall goal
  - Location in the organization structure, including stakeholder relationships (e.g. audit, policy/program, research, monitoring, etc.)
- EKM processes: building, accumulation, dissemination, support for utilization
4. How best to manage evaluation use and measure evaluation utilization? - continued
- Forms of evaluation knowledge utilization (above)
- Reported conditions for effective EK utilization:
  - Evaluation results' relevance, validity and reliability; independence and objectivity; timeliness and accessibility (marketing and distribution)
  - Absence of threats to openness and the sharing of opinions
  - Availability of intellectual and/or practical challenges
  - Opportunities for practical follow-up.
- Commonly highlighted possible inhibiting factors:
  - Lack of institutionalization
  - Evaluation results either too project/program-specific or too general to be helpful
  - Workload pressures
  - Unbalanced view of the dual role of evaluation as both a management learning and an accountability function.
5. Good practices identified through three lines of evidence
- Brief review of recent specialized literature on evaluation utilization
- Interviews in leading national and international organizations based in New York and Washington, D.C.
- Review of documents from 15 cases:
  - 4 United Nations agencies
  - 7 bilateral and multilateral development agencies
  - 4 national governments.
KEY REVIEW FINDINGS
1. EKM Strategic Positioning
- Evaluation results are transformed into used knowledge when analyzed, systematized, disseminated and internalized within an organization through participatory evaluation processes and appropriate dissemination strategies.
- Organizations have moved, or are undertaking a shift, towards standalone independent evaluation functions coordinated at the highest level (direct reporting to the head or the board of directors of the agency).
- At the same time, to promote lessons from evaluations, Evaluation is connected with policy/program work through closer links with the Knowledge and Performance reporting functions.
- Audit is generally seen as a bad cousin of evaluation.
2. EKM Strategic Positioning - continued
- In a few cases, evaluation knowledge management was systematically implemented to better incorporate evaluation results into future program/project planning.
- Processes and products supporting evaluation dissemination and utilization are more structured and comprehensive in more autonomous and/or business-oriented agencies.
- Some organizations have a distinct evaluation section or staff dedicated to knowledge building, dissemination and application, including a reference/help desk service (e.g., UNDP, GAO, World Bank Group/IFC, IADB, EBRD, USAID).
3. Evaluation Knowledge Building
- Carry out thematic or strategic evaluations that can facilitate learning across programs/policies/jurisdictions through extraction of lessons learned from experience.
- Lessons learned series
- Periodical review of evaluation findings, e.g., annual/biennial reports on evaluation activities and results.
- Real-time evaluations and policy abstracts, selectively based on the potential for retrospective examination or in-depth program reviews, to inform strategic planning.
- Use of recognized professional standards for systematic and transparent reviews of evaluation research (the Cochrane Collaboration model).
4. Evaluation Knowledge Dissemination
- Make the knowledge from evaluations user-friendly and easily accessible, based on user needs and priorities, the latest technologies and diversified approaches
- Adaptable risk-based evaluation plan that identifies policy priorities, reporting priorities, and the level of risk to the agency
- Common dissemination channels include:
  - Posting reports on the agency website
  - Workshops/training
  - Formal and informal help desk (advice)
  - Newsletters or notes series
  - Participation in internal and external knowledge management systems and communities of practice.
5. Evaluation Knowledge Accumulation
- The need for a streamlined and centralized knowledge repository that integrates all of the oversight bodies' information is recognized, in place or under consideration.
- For example, the US DOL Annual Report incorporates a review of self-evaluations and all of the audits, reviews and evaluations conducted by the DOL Office of Inspector General, GAO and other external evaluators.
6. Support for Evaluation Knowledge Utilization
- Support the effective use of evaluation knowledge for policy/program work through:
  - An established system for management response and follow-up
  - Decision forums for senior management to discuss evaluation findings and lessons and to identify implications for existing and future policies and programs
  - Better incorporation of evaluation knowledge into existing tools and processes.
7. Evaluation Knowledge Utilization
- Evaluation knowledge utilization allows for continuous improvement in operations, programs and policies, but policy decisions are less directly informed by evaluation, especially in programs/organizations driven by political decision-making.
- In the latter organizations, the actors involved emphasize the relative importance of indirect and process-based use of evaluation.
7. Evaluation Knowledge Utilization - continued
- Instrumental utilization for decision-making on policy or program direction at the senior management or cabinet level is usually tied to a formal utilization process.
- Checks on whether lessons learned are adequately used during the various stages of the project/program cycle are made and reported (World Bank/EBRD, IFC, and as part of the US OMB/PART process)
- Performance-based incentives are used to promote the use of evaluation in some organizations (e.g. IFC, WBI, US Government under PART), or are being contemplated in others (e.g. USAID).
8. Support for Evaluation Knowledge Utilization - continued
- Most of the emerging factors fostering the use of evaluation are those found in the specialized literature, including timing and purpose; senior management support/leadership; evaluation process and report quality; monitoring and follow-up; and performance-based rewards.
- However, the creation of an evaluation culture remains essential for organizational learning, as the use of evaluation appears to be determined more by the overall organizational arrangements for dissemination, consultation and routine liaison between evaluators and other operational and policy colleagues.
NEXT STEPS AND CHALLENGES
- Strengthen and expand dissemination and transfer strategies
- Take advantage of technologies
- Further link with policy/program clients and stakeholders
- Further coordinate/integrate with other knowledge functions
- Manage risks associated with possible high expectations created by the Central Agency Evaluation Policy.
- Capacity of the evaluation function to retain required specialized and qualified staff and to recruit in a highly competitive labor market.
- Potential increased tension between the accountability and learning functions of evaluation from a further focus on learning from evaluations, especially in the current era of accountability.
CONCLUSION
- Feedback to the decision-making processes along the policy and program cycle is now recognized as an essential and integral part of the Evaluation function.
- A further focus of evaluation on organizational learning is a cultural shift that calls for deliberate and formal tools and processes.
- The agency's necessarily continuous effort to ensure more systematic use of evaluation knowledge for the improvement of planning and subsequent activities should keep in mind that:
  - Evaluation is but one source of evidence, and evidence is but one input into policy
  - Good practices are not necessarily transferable across different organizational contexts.
Conclusion - continued
- Potential areas for further investigation include:
  - The influence of evaluation control by competing interests in the policy decision-making process on effective EKMU
  - How knowledge management techniques may be better adapted to assist evaluation utilization in different organizational contexts.
EKM IN AFRICA: POTENTIAL FOCUS FOR DISCUSSION
- Reported conditions for (in)effective EK utilization
- Implications for capacity building
- Need for a tailored/balanced EKMU model for Africa
- Usefulness and drawbacks of systematic review/meta-evaluation of external evaluative studies
KEY PEER-REVIEWED LITERATURE
- Amara, N. et al. 2004
- Brown, L. & N. Kiernan. 2000
- Carden, F. 2004
- Chelimsky, E. 1984
- Cook, T. & W. Wittmann. 1998
- Cousins, B. & L. Lee. 2004
- Cummings, R. 2002
- Dahler-Larsen, P. 1998
- Davies, P. 2004
- de Lancer, J. P. & M. Holzer. 2001
- Dubois, N. et al. 2005
- List of reviewed corporate documents and interviewees also available
Key Peer-Reviewed Literature - continued
- Engel, P. et al. 2003
- Feinstein, O. 2002
- Forss, K. et al. 2002
- Ginsburg, A. & N. Rhett. 2003
- Grasso, P. 2003
- Greenberg, D. et al. 2000
- Hahn, E. et al. 1984
- Henry, G. 2003
- Henry, G. & M. Mark. 2002, 2004
- Kirkhart, K. 2000
- Kool, D. 2004
Key Peer-Reviewed Literature - continued
- Leviton, L. C. & E. F. X. Hughes. 1981
- Lipton, D. 1992
- Mackay, K. 2006
- Marra, M. 2000
- McClintock, C. & S. Lower. 2001
- Neilson, S. 2001
- Nutley, S. et al. 2003
- Patton, M. Q. 1998, 2001
- Preskill, H. et al. 2003
- Rebolloso, E. et al. 2005
Key Peer-Reviewed Literature - continued
- Russ-Eft, D. et al. 2002
- Schaumburg-Müller, H. 2005
- Simons, H. 2004
- Stern, E. 2002
- Shulha, L. & B. Cousins. 1997
- Thompson & King. 1981
- Torres, R. 2002
- Valovirta, V. 2002
- Vingilis, E. et al. 2003
- Weiss, C. 1982, 1999, 2005
- Widmer, T. & P. Neuenschwander. 2004