Transcript and Presenter's Notes

Title: A Continuous Quality Improvement Approach to Evaluation


1
A Continuous Quality Improvement Approach to
Evaluation
  • Smaller World Communications
  • Barb van Maris

Health Communications Unit Special Topics
Workshop January 29, 2002
2
Learning Objectives
  • 1. Participants will understand the similarities
    between evaluation and quality improvement.
  • 2. Participants will be able to explain the
    benefits of approaching evaluation from a quality
    improvement perspective.
  • 3. Participants will understand how evaluation
    fits into quality improvement.
  • 4. Participants will be able to describe the
    Model for Improvement.
  • 5. Participants will identify different methods
    and tools used in a CQI approach to evaluation.

3
Learning Objectives
  • Tools we will cover today
  • Needs assessment
  • SWOT analysis
  • Flow charts
  • Fishbone diagram
  • Affinity Diagram
  • Brainstorming
  • Prioritization Matrix
  • Various measurement tools

4
Presentation Outline
  • Overview of evaluation and quality improvement
  • Discussion: how does this relate to your
    practice?
  • Challenges experienced with the traditional
    approach to program evaluation
  • Benefits
  • Creating a CQI approach to evaluation
  • The Model for Improvement
  • Drawbacks
  • Conclusions

5
Smaller World Communications
  • Performance measurement and evaluation
  • Primarily in the health care and health promotion
    field
  • Clients include health units, hospitals,
    not-for-profit organizations, professional
    colleges, and funding agencies
  • Some work in the private sector

6
The evolution of an evaluation practice
Program Evaluation
  • Applied research skills to evaluating health
    promotion programs
  • Evaluation design
  • Indicators
  • Goals/objectives/outcomes
  • Internal evaluation
  • Lack of experimental control

Clinical Research
  • Controlled experiments in the lab
  • Clinical trials in hospitals
  • Research design
  • Threats to validity/reliability
  • Highly controlled
  • Multivariate statistics

7
The evolution of an evaluation practice
Evaluation of PT Clinics / QA Programs for
Professional Colleges
  • Adapted to the language of standards and RHPA
  • Move from QA to Continuous Learning
  • Demonstrate effectiveness for accountability
    (Insurance Co.)
  • Demonstrate evaluating practice for College
    requirements

Performance Measurement
  • Applied market research skills and program
    evaluation in hospital setting
  • Indicators
  • Client and employee opinions
  • Clinical outcomes
  • Balanced scorecards

8
The evolution of an evaluation practice
Organizational Development
  • Natural progression to increase utilization of
    performance measurement data
  • Facilitation skills
  • Team building
  • Business strategy
  • Leadership
  • Continuous Quality Improvement

9
Overview Evaluation and CQI
  • The two disciplines developed in parallel with
    the same goal: to improve our services and
    programs.
  • But they stemmed from two different areas of
    study:
  • Program evaluation - social sciences research
  • CQI - organizational development/business

10
Evaluation
  • Application of social science methods
  • Emerged from the Education and Public Health
    fields prior to WWI
  • By 1930s, applied social science and program
    evaluation grew at a rapid rate
  • Evaluation of government programs first took off
    in the U.S., with Canada not far behind

11
Evaluation
  • Evaluation research is the systematic
    application of social research procedures for
    assessing the conceptualization, design,
    implementation, and utility of social
    intervention programs.
  • Rossi and Freeman, 1993

12
Evaluation
  • The increase in popularity of program evaluation
    emerged from the need for government programs to
    be accountable
  • Although most textbooks state there are many
    purposes or uses for evaluation, the underlying
    tone is still to demonstrate effectiveness or
    proper use of resources

13
Evaluation
  • Treasury Board Policy on the Evaluation of
    Programs - Guidelines
  • Program evaluation in federal departments and
    agencies should involve the systematic gathering
    of verifiable information on a program and
    demonstrate evidence on its results and
    cost-effectiveness. Its purpose should be to
    periodically produce credible, timely, useful and
    objective findings on programs appropriate for
    resource allocation, program improvement and
    accountability.
  • (1981)

Treasury Board of Canada (1981). Guide on the
Program Evaluation Function.
14
Evaluation
  • Evaluation helps you to make decisions about
  • - the optimal use of time and resources
  • - determining if the program/service is meeting
    the needs of participants
  • - ways of improving a program/service
  • - demonstrating the effectiveness of a program
    to funders and other stakeholder groups

15
Evaluation
  • Evaluation's focus is on measurement
  • In order to collect valid and reliable
    information, evaluators utilize, as much as
    possible, the scientific method for collecting
    information.
  • It was quickly recognized that what we need to
    measure depends on the program's stage of
    development

16
Evaluation
  • Types of Program Evaluation
  • - Formative
  • - Process
  • - Summative (outcome)

17
Evaluation
  • Formative - information you collect to help you
    plan or implement a program/service
  • - needs assessments
  • - pre-testing program materials
  • - audience analysis

18
Evaluation
  • Process - studies program implementation
  • - tracking quantity and description of people
    who are reached
  • - tracking quantity and types of services
  • - description of how services are provided
  • - quality of services provided
  • - was everything implemented the way you
    thought it would be?

19
Evaluation
  • Outcome - studies the outcomes of the
    program/service
  • - changes in attitudes, knowledge or behaviour
  • - changes in morbidity or mortality rates
  • - changes in policy
  • - program results in relation to program costs

20
Evaluation
  • Different approaches emerged
  • External evaluation - highly rigorous and
    objective
  • Internal evaluation
  • Participatory evaluation
  • Empowerment evaluation
  • Utilization-focused evaluation

21
Program Evaluation
  • Internal Evaluation Approach
  • Definition: carried out by persons who are
    responsible for the evaluation process in an
    organization
  • (John Mayne, 1992)
  • Benefits
  • An organization better understands its own
    programs and environment by doing the evaluation
  • There is greater acceptance for the changes
    required

22
Evaluation
  • Participatory Evaluation
  • engages stakeholders in all or key aspects of the
    evaluation
  • collective learning
  • sharing of power

23
Steps in Evaluating HP Programs
Step 1: Clarify your program
Step 2: Engage stakeholders
Step 3: Assess resources for the evaluation
Step 4: Design the evaluation
Step 5: Determine appropriate methods of
        measurement and procedures
Introduction to Evaluating Health Promotion
Programs - HCU
24
Steps in Evaluating HP Programs
Step 6: Develop work plan, budget and timeline
        for evaluation
Step 7: Collect the data using agreed-upon
        methods and procedures
Step 8: Process and analyze the data
Step 9: Interpret and disseminate the results
Step 10: Take action
Introduction to Evaluating Health Promotion
Programs - HCU
25
Continuous Quality Improvement
  • Stems from work in the organizational development
    field
  • 1950s - Dr. W. Edwards Deming introduced Total
    Quality Control to Japanese manufacturers
  • 1980s - Total Quality Management begins in the
    U.S.
  • 1990s - TQM fades as a fad, yet focus is still
    on continually improving products and services
    (CQI)

26
Continuous Quality Improvement
  • Principles of CQI
  • Develop a strong customer focus
  • Continually improve all processes
  • Involve employees
  • Mobilize both data and team knowledge to improve
    decision making

Brassard, M., and Ritter, D. (1994). A Pocket
Guide of Tools for Continuous Improvement and
Effective Planning.
27
Continuous Quality Improvement
  • 3 Key Questions
  • What are we trying to accomplish?
  • How will we know that a change is an
    improvement?
  • What changes can we make that will result in an
    improvement?

28
Plan - Do - Study - Act
Plan: Identify purpose and goals, formulate
theory. Define how to measure. Plan activities.
Do: Execute the plan, applying our best knowledge
to the pursuit of our desired purpose and goals.
Study: Monitor the outcomes. We study the results
for signs of progress or success, or unexpected
outcomes.
Act: Integrate the lessons learned and adjust the
program. Do we need to reformulate the theory?
Identify what more we need to learn.
Scholtes, 1998. The Leader's Handbook (based on
the work of Dr. W. Edwards Deming)
29
PDSA cycle creates continuous learning
[Diagram: repeated Plan, Do-Study, Act-Plan cycles
link theories and application over time - the
nature of true learning.]
30
Activity 1
  • Select someone in your group to
  • facilitate the discussion
  • record your ideas on the flip chart
  • keep track of time

31
Activity 1
  • Identify where group members are already doing
    program evaluation.
  • What are some of the challenges you are facing?

32
Challenges to the Traditional Approach
  • Program staff are resistant!
  • Program staff focus on showing effectiveness
    rather than looking at what needs to be improved
  • They also get hung up on what design and
    statistical techniques are needed, which in many
    cases are beyond their skills or beyond what the
    evaluation requires
  • Programs are expected to be effective in an
    unrealistic time frame; it takes time for
    programs to evolve

33
Levels of Accomplishment
  • Levels of accomplishment
  • Issue mapping
  • Capacity building
  • Environmental shift
  • Behaviour change
  • Within each Level of Accomplishment, identify
    relevant performance indicators that might signal
    progress toward health promotion goals.

Michael Hayes (1999) Ontario Tobacco Research Unit
34
Programs Evolve
[Diagram: programs evolve from NEED and activities
through short-term, intermediate-term and long-term
outcomes to IMPACT. The matching evaluation focus
shifts from formative/process (1. relationships,
capacity) to some summative (2. quality and
effectiveness) to summative and extended impact
analysis (3. magnitude, satisfaction) - realistic
evaluation.]
Kellogg Foundation - CES Conference 1999
35
A CQI approach to evaluation
  • Focus is not on showing what we did well, or
    whether the program passed or failed, but on what
    we can do better and the changes we can make to
    improve our work!
  • You measure what you need to know to improve your
    program and to determine whether it works (process
    and outcome)
  • All evaluation becomes formative in some way
  • Staff are encouraged to look for what is not
    working and why not

36
A CQI approach to evaluation
  • Methods of measurement are still the same, but
    there are additional tools we can adapt
  • root cause analysis, flow diagrams, affinity
    diagrams, etc...
  • Measurement would become a continuous aspect of
    any program where staff could utilize results and
    see the benefits
  • The measurement-to-decision-making cycle is
    faster
  • The approach, and in some cases the language
    used, is different

37
A CQI approach to evaluation
  • A CQI approach doesn't mean an ineffective
    program should not be terminated.

38
A CQI approach to evaluation
  • Need to create a learning culture
  • safe
  • increase understanding and benefits of ongoing
    measurement
  • debunk the myth that measurement is difficult
  • begin by measuring what you can in the best way
    possible
  • then improve on it as you go
  • utilize staff observations and hard data
  • empower staff to critically assess and observe
    their programs

39
A CQI approach to evaluation
  • Focus staff on the positive change they are
    trying to create and not on their defined program
    and activities
  • Key short term evaluation questions
  • What information will help us improve our
    program?
  • Think about this month or the next 6 months

40
Activity 2
  • Review Case Study
  • Each table is going to focus on one of the key
    elements we just discussed and brainstorm about
    strategies or ways the staff of this program
    could incorporate them
  • 5 minutes to write ideas down independently on
    post-it notes
  • 10 minutes to put ideas on flip chart

41
Activity 2
  • 1. Making it safe to measure
  • 2. Increase understanding and benefits of
    measurement
  • 3. Debunk the myth that measurement is difficult
  • 4. Empower staff to critically assess and observe
    their program
  • 5. Focus staff on the change they are trying to
    create and not on the defined program and
    activities
  • 6. What would some of your key short-term
    evaluation questions be?

42
A CQI approach to evaluation
  • Small scale changes to make improvements
  • Measure both processes and monitor outcomes
  • Built-in process for changing the program based
    on what is learned

43
What to Measure
  • Audience Reached
  • Who are you reaching/who are you not reaching…
  • Numbers reached
  • Activities implemented
  • What activities did you do? What did you not do
    and..
  • How well did you do them?
  • What could you have done better?
  • Challenges to implementation
  • Client satisfaction
  • Outcomes achieved and not achieved
  • Effect of program on those reached, were there
    any unintended effects?
  • What is it changing? What is it not changing?….
  • Costs (in-kind, staff time and )
  • External influences on program success

44
Program Evaluation
  • Treasury Board Evaluation Policy
  • The Government of Canada is committed to becoming
    a learning organization. Evaluation supports
    this aim by helping to find out what works and
    what does not and by identifying cost-effective
    alternative ways of designing and improving
    policies, programs and initiatives.
    (February 1, 2001)

Treasury Board of Canada Secretariat (2001).
Evaluation Policy.
45
Key Definitions
  • Evaluation
  • Performance measurement
  • Benchmarking
  • Results based management
  • Quality improvement

46
Continuous Quality Improvement
[Diagram: how performance measurement,
benchmarking, evaluation, and clinical or academic
research inform organization/program/policy
development.]
47
The Improvement Model
48
Improvement Model
  • What are we trying to accomplish?
  • AIM
  • How will we know that a change is an improvement?
  • INDICATORS
  • What changes will result in an improvement?
  • ACTIVITIES or SOLUTIONS

Improvement or program development cycle:
Plan - Do - Study - Act
Langley, Nolan, et al. The Improvement Guide.
49
Setting Aims
What are we trying to accomplish?
  • 1a. Understand your program/service
  • -client needs and expectations
  • -goals and objectives
  • -what are you currently doing?
  • -strengths - what is working
  • -challenges - what is not working...why not?

50
Setting Aims
What are we trying to accomplish?
  • Tools
  • -needs assessment
  • -SWOT analysis
  • -Flowcharting (illustrates a process)
  • -Fishbone diagram - Cause/Effect (getting to the
    root causes of a problem)

51
Setting Aims
What are we trying to accomplish?
  • How do you conduct a needs assessment?
  • -find out from clients and/or stakeholders what
    they need and expect out of the program or
    service
  • -define clients and stakeholders...who do you
    need to talk to?
  • -hold focus groups, informal interviews, conduct
    a survey
  • -how many people do you ask?

52
Steps in doing a needs assessment
  • 1. Decide who your program/service is intended
    for. Be as specific as possible.
  • 2. Identify what you know about them and what you
    don't know about them.
  • Demographics (age, gender, education, culture,
    etc)
  • Where are they currently accessing services?
  • What are their service needs?
  • What are some of their barriers to receiving
    service?
  • What is the best way to communicate with them?

53
Steps in doing a needs assessment
  • 3. Who else is involved with their care? How are
    they involved? What are their needs?
  • 4. Decide on the best way to collect information
    from them
  • Can you reach them through another program or
    service?
  • Can you reach them through the mail? Internet?
    Phone? In-person?
  • Decide on a method of measurement (way to collect
    information)

54
Steps in doing a needs assessment
  • 5. Formulate your questions based on what you
    don't know about them and what would help you
    plan your program/service
  • 6. Put the questions into the data collection
    format you plan to use
  • 7. Recruit people to answer your questions or
    participate in a focus group
  • 8. Collect and analyze the data

55
Tips for conducting focus groups
  • Hold a group at a time that is convenient for
    your target audience
  • Provide refreshments, daycare, transportation
    etc.
  • Recruit more people than you need (10-12)
  • Call a day ahead to confirm participation
  • Facilitator should not be expected to record
  • Audio-taping is best but may not be necessary
  • Run at least 2 groups and ideally 3-5 groups
  • Develop and use a moderator's guide which outlines
    the questions you need answered
  • Analyze the groups independently and then look
    for what is common across groups

56
Setting Aims
What are we trying to accomplish?
  • SWOT Analysis
  • looks at your organization's capabilities and the
    current environment
  • -Strengths
  • -Weaknesses
  • -Opportunities
  • -Threats

57
Setting Aims
What are we trying to accomplish?
  • Flowcharting
  • -Determine boundaries...identify where the
    process starts and ends
  • -Identify major steps in the process
  • -Sequence the steps
  • -unless it is a new process, document what is
    happening, not what you would like to be happening

58
Setting Aims
What are we trying to accomplish?
  • Flowcharting - the basics
  • Oval: used to show starts or ends in the process
  • Rectangle: used to show a task or activity
  • Diamond: used to show points in the process where
    decisions are made
  • Arrow: used to show direction or flow

59
Setting Aims
What are we trying to accomplish?
  • Fishbone diagram - find the causes

[Fishbone diagram: effect - "Teenagers not
attending the clinic"; main causes - inconvenient,
uncomfortable, "I don't need it", cost, time - each
probed by asking "Why?"]
60
Setting Aims
What are we trying to accomplish?
  • Fishbone diagram - find the causes

[Fishbone diagram (expanded): effect - "Teenagers
not attending the clinic"; main causes -
inconvenient, uncomfortable, "I don't need it",
cost, time - with sub-causes such as environment,
staff, cultural beliefs, not sure what you do, not
able to get transportation, location of clinic, and
"programs are not for me".]
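If it helps to keep the output of a fishbone exercise in a reusable form, here is a minimal sketch in Python. It is purely illustrative: the effect and the main cause categories come from the example above, but the grouping of sub-causes under each bone is an assumption made for this sketch, not the slide's layout.

# Illustrative sketch only. The effect and main cause categories come from the
# fishbone example above; the sub-cause grouping is an assumption made for this
# example, not the slide's exact layout.
fishbone = {
    "effect": "Teenagers not attending the clinic",
    "causes": {
        "Inconvenient": ["Not able to get transportation", "Location of clinic"],
        "Uncomfortable": ["Environment", "Staff", "Not sure what you do"],
        "I don't need it": ["Cultural beliefs", "Programs are not for me"],
        "Cost": [],
        "Time": [],
    },
}

# Walk the bones, asking "why?" at each level, as in a root-cause review.
print("Effect:", fishbone["effect"])
for bone, sub_causes in fishbone["causes"].items():
    print("  Why?", bone)
    for sub in sub_causes:
        print("    Why?", sub)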
61
Activity 3
  • Read the case study
  • Clarify the program or issue by listing on a flip
    chart what the main concerns are
  • Review the flow chart
  • Brainstorm about possible opportunities for
    improvement

62
Setting Aims
What are we trying to accomplish?
  • 1b. Decide on your aim: give a general
    description of what you are going to do
  • -focus on the results you want, not the
    activities you will do to achieve them
  • -explain in broad terms what is to be
    accomplished and why
  • you may or may not decide to use numerical targets

63
Setting Aims
What are we trying to accomplish?
  • 1c. Guidance statements: include information
    that helps to answer the fundamental questions
    about the program or the improvement effort.
  • -suggestions for measures
  • -aspects of system you will focus on
  • -boundaries in which changes are developed

64
Improvement Model
  • What are we trying to accomplish?
  • AIM
  • How will we know that a change is an improvement?
  • INDICATORS
  • What changes will result in an improvement?
  • ACTIVITIES or SOLUTIONS

Improvement or program development cycle:
Plan - Do - Study - Act
65
How will you know you made an improvement?
  • Select 6 or fewer measures
  • Keep the interests of the customer represented in
    the measurements chosen
  • Comparison data from before and after is helpful
    but not always possible or necessary
  • Sometimes the data needed to measure the impact
    of a change is not available for a long time.
    Select intermediate measures.
  • Use multiple measures to balance competing
    interests and to assure the system as a whole is
    improved

66
How will you know you made an improvement?
  • Levels of measurement
  • Global measures - relate directly to the aim of
    the study. Big system changes. Improvement
    signifies accomplishment of the aim.
  • Intermediate measures (milestones) - related to
    the global measures but not sufficient on their
    own to ensure the accomplishment of the aim.
  • Process measures - assess whether actions were
    implemented as planned. Help to explain why the
    intermediate and global measures are or are not
    reached.

67
How will you know you made an improvement?
  • Examples of measures
  • Global measures
  • low birth weight babies
  • teen pregnancies
  • population smoking
  • accidents due to drinking and driving

68
How will you know you made an improvement?
  • Examples of measures
  • Intermediate measures
  • increase in awareness
  • increase in knowledge
  • participation in physical activity
  • use of birth control
  • attendance at quit smoking clinic

69
How will you know you made an improvement?
  • Examples of measures
  • Process measures
  • participants
  • demographics of participants
  • client satisfaction
  • employee satisfaction
  • partnerships

70
How will you know you made an improvement?
  • Examples of process measures
  • Efficiency and costs
  • Time to receive service
  • Time for test results
  • Cost per client visit

71
How will you know you made an improvement?
  • Examples of process measures
  • Resource utilization
  • staff hours
  • amount of non-productive hours
  • medications dispensed/day

72
How will you know you made an improvement?
  • Examples of process measures
  • Stakeholders' perceptions/satisfaction
  • client satisfaction
  • employee satisfaction
  • complaints
  • repeat clients or requests

73
Activity 4 - Selecting indicators
  • Identify
  • 3 global measures
  • 3 intermediate measures
  • 3 process measures

74
Improvement Model
  • What are we trying to accomplish?
  • AIM
  • How will we know that a change is an improvement?
  • INDICATORS
  • What changes will result in an improvement?
  • ACTIVITIES or SOLUTIONS

Improvement or program development cycle:
Plan - Do - Study - Act
75
What changes will result in improvement?
  • For simple systems and/or programs a list of
    changes or activities could be developed and
    tested quickly
  • For more complex systems, which require
    fundamental redesign or multiple program
    components, consider:
  • developing the design concept first
  • then identify detailed activities/changes for
    each component

76
What changes will result in improvement?
  • What do other organizations or programs do?
  • Research other programs
  • Search the internet for ideas
  • What does the literature suggest?
  • What do the stakeholders suggest?

77
What changes will result in improvement?
  • Making changes to existing systems
  • Critically think about the current system
  • Flow charting and documenting where the
    challenges arise can quickly highlight
    opportunities
  • Consider automating/eliminating redundancies
  • Creative thinking - Tool (Brainstorming)
  • Use Change Concepts

78
Affinity Diagram
Brainstorm the issue, problem, opportunity...
[Diagram: individual items identified in the
brainstorm session, written on cards and then
sorted into related groupings under header cards.]
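As a purely illustrative sketch in Python, the result of an affinity exercise can be recorded as themed groups of items. The brainstormed items and header names below are invented for this example and do not come from the case study.

# Illustrative only: the brainstormed items and the theme headers are invented.
# An affinity diagram sorts raw ideas into natural groupings and labels each
# grouping with a header card.
affinity_groups = {
    "Access and convenience": [
        "Clinic hours conflict with school",
        "No bus route to the clinic",
    ],
    "Comfort and environment": [
        "Waiting room feels unwelcoming",
        "Staff seem rushed",
    ],
    "Awareness and communication": [
        "Teens unsure what services are offered",
        "Posters only in English",
    ],
}

for header, items in affinity_groups.items():
    print(header)
    for item in items:
        print("  -", item)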
79
Activity 5 - Brainstorming solutions
  • -Brainstorming exercise
  • -Affinity diagram

80
Setting Priorities
  • Voting Tools
  • Prioritization matrix
  • Multi-voting - open
  • Multi-voting - closed

81
Prioritization Matrix Steps
  • Generate or clarify the options
  • Determine and agree on criteria
  • Compare the options to each other using the
    criteria
  • Vote on the best option

82
Prioritization Matrix
  • 1. Create a matrix with the options on the
    vertical and the criteria across the top.
  • 2. Assign a score for each option on each
    criterion.
  • 3. Tally totals to determine the score for each
    option.
  • 4. Each participant could do this independently;
    then take the average score for each option (a
    worked sketch follows below).
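Here is a minimal Python sketch of the arithmetic in steps 2 to 4: each participant scores every option against every criterion, the scores are tallied per option, and the totals are averaged across participants. The options, criteria, scores, and the 1-to-5 scale are invented for this example and do not come from the case study.

# Illustrative only: options, criteria, and scores are made up for this example.
options = ["Peer counselling", "Increase promotion", "Change clinic hours"]
criteria = ["Reach", "Cost", "Feasibility"]

# One score sheet per participant: sheet[option][criterion], on a 1-5 scale.
participant_sheets = [
    {"Peer counselling":    {"Reach": 4, "Cost": 3, "Feasibility": 2},
     "Increase promotion":  {"Reach": 3, "Cost": 5, "Feasibility": 4},
     "Change clinic hours": {"Reach": 2, "Cost": 4, "Feasibility": 5}},
    {"Peer counselling":    {"Reach": 5, "Cost": 2, "Feasibility": 3},
     "Increase promotion":  {"Reach": 3, "Cost": 4, "Feasibility": 4},
     "Change clinic hours": {"Reach": 2, "Cost": 5, "Feasibility": 4}},
]

# Tally a total per option on each sheet, then average across participants.
averages = {}
for option in options:
    totals = [sum(sheet[option][c] for c in criteria) for sheet in participant_sheets]
    averages[option] = sum(totals) / len(totals)

# Highest average score = highest-priority option.
for option, avg in sorted(averages.items(), key=lambda item: item[1], reverse=True):
    print(f"{option}: average score {avg:.1f}")

The option with the highest average becomes the group's top priority; if the group agrees on weights for the criteria, they can be applied in the same tally.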

83
Plan - Do - Study - Act
Plan: Identify purpose and goals, formulate
theory. Define how to measure. Plan activities.
Do: Execute the plan, applying our best knowledge
to the pursuit of our desired purpose and goals.
Study: Monitor the outcomes. We study the results
for signs of progress or success, or unexpected
outcomes.
Act: Integrate the lessons learned and adjust the
program. Do we need to reformulate the theory?
Identify what more we need to learn.
84
Testing out changes or a new program
PDSA cycles
  • allow you to study your new ideas on a small
    scale
  • learn from tests and then build on them
85
Linking Cycles on a Ramp
[Diagram: linked PDSA cycles climb a ramp over
time, from theories and ideas toward changes that
result in improvement, with knowledge growing
along the way.]
86
The Plan - Do - Study - Act Cycle
  • Primary means for turning the ideas into action
    and for connecting action to learning
  • Can use the cycle to build knowledge
  • when you do not have sufficient information to
    answer one or more of the three questions
  • Can use the cycle to test improvements or
    programs on small scale
  • based on a trial and learning approach to
    improvement or development
  • Allows you to gradually implement a program or
    change

87
Multiple tests at one time
[Diagram: several PDSA tests run in parallel - run
a clinic at one high school, increase promotion,
peer counselling, selecting a physician.]
88
What if you are developing a new program?
  • Creating new programs
  • Identify aim or goals
  • Identify indicators
  • Identify components needed to achieve program
    goal
  • What activities are needed for each component
  • Look for best practice examples
  • What do other organizations do?
  • What does the theory suggest is needed?
  • Prepare a program logic model - diagram of your
    program

89
Benefits
  • Staff are more open to collecting information on
    how to improve their program.
  • Less threatening for staff
  • Increases likelihood results will be used
  • Program planners can be more responsive to what
    is working and not working.
  • Creates a learning environment for both program
    staff and funders

90
Drawbacks
  • May be criticized for not being objective
    enough
  • More challenging
  • More skills needed
  • Need to develop a culture of critical assessment
    and quality improvement in order for the
    evaluation to be as objective as possible
  • Requires staff time and training

91
Conclusions
  • The techniques used in program evaluation are not
    that different from those used in continuous
    quality improvement
  • They evolved from completely separate disciplines,
    but even the language is not that far off
  • The main difference is
  • program evaluation stemmed from the need to
    demonstrate effectiveness - accountability
  • CQI stemmed from the need to find ways to make
    programs/systems better

92
Conclusions
  • Most programs take time to evolve before we can
    expect them to be effective.
  • A continuous quality improvement approach to
    evaluation will facilitate that evolution and
    increase the likelihood that a program, given
    time, will be effective and have an impact.

93
Conclusions
  • A CQI approach can transform evaluation practice
    to meet new challenges
  • Opportunity to learn from the organizational
    development field
  • Opportunity to close the gap between
    management/business strategy and program planning
    and evaluation