Title: From Impact Assessment to Learning: Experience with Institutional Learning and Change in the CGIAR
1. From Impact Assessment to Learning: Experience with Institutional Learning and Change in the CGIAR
- Jamie Watts and Doug Horton
- Presentation by the ILAC Initiative to the PGRA Impact Assessment Workshop, October 19-21, 2005, Mexico
2. Topics Addressed
- ILAC Concepts and Ideas
- Implications of ILAC for impact assessment
- Questions for consideration
3. The rapidly changing context
- Greater focus on poverty
- Better understanding of poverty and development processes
- Dire state of the poor, particularly in Africa
- Concerns about our inability to address these problems
- Change in institutions and technology
- Emergence of a more diverse range of organizations involved
- New modes of working: participation, partnership, alliances, networks
- Globalization of markets
- Rapid developments in biotechnology and ICT
- Increasingly rapid diffusion of information and expanded potential for learning
- Rapidly accelerating pace of change: environmental, social, technological
4. All of this change means that...
- Our preconceptions about what makes a successful programme probably no longer hold true
- The future is unknown, and possibly even unknowable
- We need to re-engineer our brains for more continuous learning and for the rapid uptake of lessons
5. What is ILAC?
- Institutions
  - Agricultural innovation involves diverse actors at different levels, and the norms and rules that govern their interactions
- Experiential Learning
  - Analyzing and understanding the work we do
  - Learning as a (social) process of reflection and analysis
- Change
  - Applying lessons learned to improve our programmes
6. Institutions: An innovation-system perspective
- Agricultural research is one part of a complex, adaptive system with multiple sources of innovation
- Innovation is a social and technological process
- Innovation emerges at the interface of knowledge production/dissemination and economic activity
7. Actor network maps: Farmer Research Groups in Honduras
[Figure: actor network map of farmer research groups in Honduras. Source: Douthwaite, 2004.]
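An actor network map is, at bottom, a graph of actors and the interactions that link them. The sketch below is purely illustrative: the actor names and links are invented for the example, not taken from Douthwaite's map, and networkx is just one convenient tool for this kind of analysis.

```python
# Illustrative only: encode a hypothetical actor network as a graph and
# ask which actors broker the most connections.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("farmer research group", "NGO facilitator"),              # hypothetical links,
    ("farmer research group", "national research institute"),  # not from the
    ("national research institute", "CGIAR center"),           # original map
    ("NGO facilitator", "donor"),
])

# Degree centrality: a simple indicator of how connected each actor is.
print(nx.degree_centrality(G))
```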
8. The experiential learning cycle
[Diagram: a cycle of concrete experience → reflection → conceptualization → application, returning to concrete experience.]
9. Constructivist and Transformational Learning
- Constructivist learning: a social process in which individuals and groups learn by interpreting, understanding and making sense of their experience
- This can lead to...
- Transformational learning: mental transformations that enable a break from traditional knowledge, beliefs and practices, and the adoption of new ones
10. Single-Loop and Double-Loop Learning
[Diagram: goals and assumptions lead to actions, which lead to consequences. A match with goals closes the cycle; a mismatch feeds back to actions (single-loop learning) or to the underlying goals and assumptions (double-loop learning). Source: Argyris, 1977.]
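As a rough illustration of the distinction (the act() function and all numbers below are invented for the sketch, not part of Argyris's model): single-loop learning keeps the goal fixed and adjusts the action, while double-loop learning questions the goal itself when corrections keep failing.

```python
# A minimal sketch of single-loop vs. double-loop learning.
# act() is a stand-in programme activity with diminishing returns, so
# adjusting the action alone cannot reach an over-ambitious goal.
def act(intensity: float) -> float:
    return min(0.5 * intensity, 6.0)  # illustrative response curve

goal, intensity = 10.0, 4.0
for cycle in range(6):
    outcome = act(intensity)
    if abs(outcome - goal) < 0.5:  # match: no further learning needed
        print(f"cycle {cycle}: match at outcome {outcome:.1f}")
        break
    if cycle < 3:
        # Single-loop learning: keep the goal, correct the action.
        intensity *= goal / outcome
    else:
        # Double-loop learning: repeated mismatch prompts questioning the
        # goal and the assumptions behind it, not just the action.
        goal = outcome
```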
11. Use of Impact Assessment for Learning and Change
- Two major types of evaluation: formative and summative
- Three major uses of evaluation findings: direct, indirect and symbolic use
- Importance of process use:
  - Learning to learn
  - Developing work-related networks
  - Forging common understandings
  - Strengthening the project or program
  - Appreciating how the project relates to the organization's mission and goals
  - Boosting morale and confidence
Source: Patton, 2003.
12. Use of IA for Learning
[Diagram: learning plotted against reach, contrasting learning from involvement with learning from reports.]
13. Elements of a learning organization
- Systematically gathering information
- Making sense of information
- Sharing knowledge and learning
- Drawing conclusions and developing guidelines for action
- Implementing action plans
- Institutionalizing lessons learned and applying them to new and ongoing work
14. Implications of ILAC for Impact Assessment
- Institutions
  - If agricultural innovation (improvement) takes place within systems of multiple players at different levels, with norms and rules that govern their interactions, then IA must redefine impact within a partnership context
- Learning
  - If IA is to maximize its contribution to learning, then it must promote a process of reflection and analysis among those responsible for the programme
- Change
  - If IA is to contribute to the direct uptake of lessons for programme improvement, then it must be oriented towards this as its objective and designed accordingly
15. Redefining Impact
- Science quality and impacts should be defined by a broader range of actors
- Questions should be broadened beyond "Did we do what we intended to do, efficiently and effectively?" to "Are we on the right course? Are we asking the right questions? Are our assumptions still appropriate? Is our basic approach still valid?"
- Both quantitative and qualitative approaches add value
- IA should build knowledge of process and institutional issues
- Greater emphasis should be placed on the processes by which impact is achieved
- The roles of all actors throughout the innovation system should be considered; this implies less emphasis on attribution
16. Redefining Impact: Which is the hammer and which is the nail?
17. Processes of reflection, analysis and change
- Participatory reviews and evaluations have spin-off benefits (e.g., a common understanding of research effectiveness, expected impacts, goals and objectives)
- Self-assessment can be employed as a means of promoting experiential learning
- Developing consensus with partners promotes relevance as well as buy-in by political constituencies
- Analysis of the causes of errors and unexpected outcomes
- Lessons (about what worked, what didn't work and why) need to feed into decisions on programmes and activities
19. Ex Post Impact Assessments
- Multidisciplinary teams and stakeholders reflect on research and technology promotion initiatives and explore sources of success and failure
- Indicators are jointly defined by scientists and other stakeholders
- Analysis is conducted in a timely manner, so that it can be used to improve emerging programmes
- Methods: innovation histories; reflection and learning workshops; mixed methods from various disciplines; impact pathway analysis; just-in-time impact analysis; outcome mapping
20. Adoption Studies
- Track the uptake of a technology, but expand the analysis to explore the whole process of innovation associated with it, including the institutional context
- Farmer surveys as part of innovation histories
- Institutional analysis of the innovation process
- Impact pathway analysis
- Outcome mapping
21. IA for ILAC
- Ensure that IA has learning and programme-improvement objectives
- Focus on the questions of target audiences
- Select from a wide variety of methods to address the questions of relevance
- Use collaborative approaches to interpret findings and develop recommendations
- Report in ways that facilitate understanding and assimilation, and suggest practical uses
- Assess the processes by which impact is (or isn't) achieved, as well as the magnitude of impacts
- Assess the roles that different agents play in achieving impact
- Broaden the scope of impact assessments to include changes in institutions, policies and capacity
22. Lessons from Experience
- We learn most from our errors, but there are seldom incentives to admit them and learn from the experience
- We learn most in the field, but seldom get there
- Organizations often have learning disabilities:
  - The "Center of Excellence" complex
  - An inverse relationship between position in the organization and learning: the higher you are, the less you can afford to learn
  - "New Boss" syndrome
  - Staff turnover and knowledge loss
- Impact assessments seldom support organizational learning and change
- Organizational learning requires TLC
23. Cornerstones of Support
[Diagram: impact assessment for ILAC rests on three cornerstones of support: capacity, management and donors.]
24. Implications for CG Managers
- IA for ILAC is only effective within a learning and risk-taking culture
- It's essential to ensure that impact assessors have a clear, formal mandate to support organizational learning and change (not just the production of reports)
- Support the training of staff (e.g., facilitation skills, participatory process management, monitoring and evaluation skills, diagnostic skills)
- Dedicate time and resources to learning
25. Focus of a Recent Donor-Sponsored Impact Assessment
- Synergies between the project and other actors, and their roles
- Domains of impact
- The causal chain by which results have come about
- Useful lessons learned that can be applied to future projects
26. Some Questions
- Are CGIAR centers ready to openly and critically assess the strengths and weaknesses of their activities? Do they have the capability to do so (methods, resources, disciplinary balance, facilitation skills, etc.)?
- Is the CGIAR system ready for ILAC? What are the implications for standards and guidelines for planning, impact assessment, external reviews, and performance measurement?
- Are donors prepared to reward the identification of failures and learning from them? Are they willing to pay the associated costs?