Transcript and Presenter's Notes

Title: Some Practical Tips for Measuring Financial Success, Dr. Angela Lyons, University of Illinois


1
Some Practical Tips for Measuring Financial Success
Dr. Angela Lyons, University of Illinois
Measuring Financial Success Using Program Evaluation
Presented by Dr. Angela Lyons, University of Illinois, November 2008
2
The Million Dollar Question
  • At the end of the day, does
    financial education make a difference?

3
Lessons Learned from Current Research
  • Jump$tart Survey of Financial Literacy Among High School Students: captures knowledge levels
  • NEFE High School Financial Planning Program: impact of formal financial education on confidence levels and behaviors of high school students
  • Bernheim, Garrett, and Maki (2001): effect of mandated financial education during high school (longitudinal study)
  • FDIC's Money Smart Program: moving the unbanked into the financial mainstream
  • See Reading List for recent research on
    financial education and program evaluation.

4
Becoming a critical evaluator is important!
  • Read media stories carefully.
  • Look at the samples being used.
  • Information vs. education.
  • Planned behavior vs. actual behavior.
  • Avoid focusing only on the successes.
  • Think beyond participants' finances.
  • Be aware of the barriers and challenges related
    to measuring program impact.

5
An Overview of the Training Session
  • Setting the Stage for Program Success
  • The Evaluation Process
    Creating Your Toolkit
  • Putting It All Together
    Sample Evaluations
  • NEFE Financial Education and
    Program Evaluation Toolkit
  • Barriers and Challenges to
    Building Successful Programs
  • Building Program Success
    Reporting Program Impact

6
Part I: Setting the Stage for Program Success
7
Current State of Program Evaluation
  • Current evaluation efforts are still far from
    satisfactory.
  • General lack of evaluation capacity and
    understanding of how to conduct effective
    evaluations.
  • Evaluation is still often treated as an afterthought; it needs to be built into the design of the program upfront.
  • Lack of attention given to evaluation at all
    levels.
  • Need for industry standards for program
    evaluation.

Source: Lyons, A. C., Palmer, L., Jayaratne, K.S.U., and Scherpf, E. (2006). "Are We Making the Grade? A National Overview of Financial Education and Program Evaluation." The Journal of Consumer Affairs, 40(2), 208-235.
8
One non-profit administrator commented:
  • "The people that typically end up being told that they have to do evaluation, it's dumped on them and it's usually not a person that has any experience with financial education or expertise in evaluation. They're pretty much told here's your new hat, we've been told we have to do this and here's your new hat, and they don't know. It's not for lack of wanting to do a good evaluation or trying to do a good evaluation. They just don't know... it's not the right person trying to oversee it."

9
On the front lines.
  • What even is an evaluation?
  • What do we mean by evaluation?
  • How do we know if participants are getting better? It's difficult to assess.
  • What are we trying to measure? There's a lot of confusion out there.
  • What constitutes a successful, or even
    acceptable, evaluation?

10
Getting Started: Thinking like an evaluator (Program Planning Guide)
  • Take stock of who you are: What is your vision?
  • Conduct a needs assessment.
  • Collect baseline information from your target
    audience.
  • Identify your signature program(s).
  • Identify your program objectives. Be realistic!
  • Create an evaluation action plan.
  • What do you want to accomplish? At the end of
    the day, what do you want to show?

11
What is an outcome-based evaluation?
  • Outcomes are benefits to clients from
    participating in the program.
  • What do you want your participants to know or be
    able to do when they have finished the program?
  • Outcomes are usually in terms of enhanced
    learning and improved behaviors.
  • Outcomes are often confused with program outputs
    or units of service (e.g., number of clients who
    went through the program).

12
The Logic Model
  • A picture of the program.
  • A simple representation of the program theory, or theory of action, which explains the program and what it is intended to accomplish.
  • Shows relationships between inputs, outputs, and
    outcomes.

13
The Logic Model (cont.)
  • INPUTS → OUTPUTS → OUTCOMES
  • Inputs: Resources used to develop the program. Time and money are the most common inputs needed to implement educational programs.
  • Outputs: If inputs are invested into the financial education program, then learning opportunities will be created for the target audience. The educational materials, services, and opportunities created are called the program outputs.
  • Outcomes: Changes in participants' perceptions, knowledge, and behavior that represent real impact in their lives. The benefits derived by the participants from the program are called outcomes.
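To make the inputs-outputs-outcomes chain concrete, here is a minimal Python sketch (not part of the original deck); the program and all example entries are hypothetical.

```python
# Illustrative only: the logic-model chain recorded as a small data structure.
from dataclasses import dataclass, field
from typing import List

@dataclass
class LogicModel:
    inputs: List[str] = field(default_factory=list)    # resources invested (e.g., time, money)
    outputs: List[str] = field(default_factory=list)   # materials, services, opportunities created
    outcomes: List[str] = field(default_factory=list)  # changes in perceptions, knowledge, behavior

# Hypothetical program used purely for illustration
budgeting_program = LogicModel(
    inputs=["staff time", "curriculum development funds"],
    outputs=["six-session budgeting workshop", "take-home spending plan worksheet"],
    outcomes=["participants track monthly spending", "participants reduce credit card debt"],
)
print(budgeting_program)
```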
14
University of Wisconsin-Extension: http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html
15
Impact Hierarchy of Outcomes
16
Another useful framework: Transtheoretical Model of Behavior Change (TTM)
  • TTM integrates major psychological theories into a theory of behavior change.
  • Used to identify the stage at which individuals are ready and able to change their financial behaviors.
  • Appropriate educational interventions are then tailored to meet individuals' specific needs at that particular stage.

17
5 Stages of Change
  • Precontemplation
  • Individual not ready to take action and change
    behavior in the immediate future.
  • Rarely seeks help and rarely uses information.
  • Contemplation
  • Individual is getting ready to take action and
    intends to change behavior in next
    6 months.
  • Open to education.
  • Preparation
  • Individual is ready to take action and intends to
    change behavior in next 30 days.
  • Practices behavior by taking small steps towards
    the goal.
  • Seeks information and support, but often
    concerned that changing behavior may be too
    difficult and they may not succeed.

18
5 Stages of Change (cont.)
  • Action
  • Individual changes behavior and maintains
    behavior for at least 6 months.
  • Believes they can change.
  • Can control triggers that cause them to relapse
    into old behaviors.
  • Has a support system to get them through
    challenging times.
  • Maintenance
  • Individual has changed behavior and it has lasted
    for more than 6 months.
  • May relapse into old behaviors, but can overcome
    temptations so that behavior becomes permanent.
  • Can assess the conditions under which relapse
    might occur.
  • Can establish successful coping strategies.
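A minimal sketch, assuming hypothetical survey responses, of how the stage definitions above could be turned into a simple classification rule (illustrative only, not part of any published toolkit):

```python
# Illustrative only: classify readiness to change one financial practice
# using the TTM stage definitions summarized on the slides above.
def ttm_stage(doing_now: bool, months_doing: int,
              intends_within_30_days: bool, intends_within_6_months: bool) -> str:
    if doing_now:
        # Maintenance after more than 6 months of sustained behavior; otherwise Action.
        return "Maintenance" if months_doing > 6 else "Action"
    if intends_within_30_days:
        return "Preparation"
    if intends_within_6_months:
        return "Contemplation"
    return "Precontemplation"

# Example: saving regularly for 2 months -> "Action"
print(ttm_stage(doing_now=True, months_doing=2,
                intends_within_30_days=False, intends_within_6_months=False))
```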

19
Example
20
Identifying Program Objectives
  • Objectives should be
  • Specific
  • Measurable
  • Achievable and observable
  • Reasonable
  • Time specific
  • S.M.A.R.T. objective statements should clearly
    define what you want to achieve with your
    program.
  • They should list the end outcomes the program
    intends to affect or change.

21
Writing objective statements
  • First-time home buyer education program
  • The objectives of this program are to
  • Develop first-time home buyers' ability to shop for the lowest mortgage interest rate.
  • Teach first-time home buyers how to save money
    for closing costs.
  • Teach first-time home buyers how to assess
    affordable housing.
  • Debt reduction education program
  • The objectives of this program are to
  • Develop participants' ability to identify needs and wants separately.
  • Develop participants' ability to control wants to reduce expenditures.
  • Develop participants' ability to avoid impulse and emotional spending.

22
Achieving your objectives: Selecting appropriate indicators
  • General Indicators (objective and subjective)
  • Number of programs, participants, etc.
  • Knowledge gains
  • Changes in attitudes and satisfaction
  • Changes in skills and confidence
  • Changes in intended and actual behaviors
  • Specific Indicators (objective)
  • Actual dollar changes (reduce debt, increase
    savings)
  • Development of financial plans
  • Changes in spending habits
  • Building or rebuilding credit reports and credit
    scores

23
(No Transcript)
24
ACTIVITY: Your Evaluation Road Map
25
ACTIVITY: Defining Your Objectives (Evaluation Action Plan, Part A)
  • What is your signature program
  • (e.g., course, workshop, educational materials,
    initiative, campaign)?

26
(No Transcript)
27
(No Transcript)
28
(No Transcript)
29
(No Transcript)
30
  • Who will use the evaluation and how?

31
Part II: The Evaluation Planning Process
Creating Your Toolkit
32
What evaluation method should you use to collect
impact data?
  • Surveys
  • Focus groups
  • Interviews
  • Observations
  • Case studies
  • Tests of ability
  • Some examples from financial education.

33
Questions to ask yourself...
  • What are the pros and cons of each method?
  • What is the purpose of the evaluation?
  • Who will use the information and how?
  • What information do you want to collect?
  • Who is your target audience?
  • What is your primary delivery method?
  • What are your available resources (i.e., time,
    money, and staff)?
  • What is your timeline?
  • What is your expertise and evaluation capacity?
  • Who are your partners, funders, and stakeholders?

34
Common survey methods used to collect impact data
  • Post evaluation only
  • Retrospective pre-test (RPT)
  • Pre and post evaluation
  • Follow-up
  • Stages to Change (TTM)
  • Control groups and longitudinal studies
  • Key question to ask: What is the length of your program?

35
Post evaluation only
  • When to use: Short programs that are less than 2 hours
  • Advantages
  • Only need to survey group once.
  • Good for limited-resource audiences and groups that are transient.
  • Relatively inexpensive and less time intensive.
  • Can document participants' levels of knowledge, skills, and planned behaviors at the end of the program.
  • Disadvantages
  • With no pre-assessment, it's difficult to document potential and actual changes in knowledge, attitudes, and behavior.
  • Retrospective pre-tests (RPTs): The Post-Then-Pre Evaluation

36
Retrospective pre-tests (RPTs)
  • When to use: Any program, but typically 2 hours or less
  • Advantages
  • Only need to survey group once.
  • Good for limited-resource audiences and groups
    that are transient.
  • Controls for response shift bias.
  • Can document relative change.
  • Disadvantages
  • Potential for respondent bias (social
    desirability factor).
  • Self-assessment measures are subjective.
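A minimal sketch of scoring a post-then-pre item, assuming hypothetical 1-5 self-ratings collected on a single form after the session (illustrative, not the deck's instrument):

```python
# Illustrative RPT scoring: each participant rates themselves "before" and "now"
# on one item (e.g., confidence in tracking spending), on a 1-5 scale.
rpt_responses = [
    {"before": 2, "now": 4},
    {"before": 3, "now": 4},
    {"before": 1, "now": 3},
]

shifts = [r["now"] - r["before"] for r in rpt_responses]
improved = sum(1 for s in shifts if s > 0)

print(f"Average self-reported change: {sum(shifts) / len(shifts):.1f} points")
print(f"Rated themselves higher after the session: {100 * improved / len(rpt_responses):.0f}%")
```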

37
RPTs (continued)
  • Examples and more info on RPTs:
  • Collecting Evaluation Data: End-of-Session Questionnaires. University of Wisconsin-Extension. www.uwex.edu/ces/pdande/evaluation/evaldocs.html
  • Lyons, A. C., Y. Chang, and E. Scherpf. "Translating Financial Education into Behavior Change for Low-Income Populations." Financial Counseling and Planning Journal, 17(2), 27-45.
  • Chang, Y., and A. C. Lyons. "Are Financial Education Programs Meeting the Needs of Financially Disadvantaged Consumers?" Networks Financial Institute, Indiana State University, 2007-WP-02.

38
Pre and post evaluations
  • When to use: Programs that are 2 hours or longer
  • Advantages
  • Can compare pre and post responses and document changes in knowledge, attitudes, and behavior.
  • Can be used to document immediate changes in knowledge, skills, and planned behaviors following the program.
  • Disadvantages
  • More time intensive.
  • Identification numbers are needed to match pre and post surveys (see the matching sketch after this list).
  • May be difficult to show actual behavior change.
  • May be difficult to show that the intervention caused the change.
  • Doesn't account for other possible reasons for change.
  • Transient populations may result in many unmatched evaluations.
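Because pre and post forms must be matched by identification number, a minimal bookkeeping sketch may help; the file names, column names, and scores below are hypothetical assumptions, not the toolkit's format:

```python
# Illustrative sketch: match pre- and post-surveys by participant ID and compute
# knowledge change. Assumes two CSV files, each with 'id' and 'score' columns.
import csv

def load_scores(path):
    """Read a CSV into a dict mapping participant ID -> knowledge score."""
    with open(path, newline="") as f:
        return {row["id"]: float(row["score"]) for row in csv.DictReader(f)}

pre = load_scores("pre_survey.csv")    # hypothetical file
post = load_scores("post_survey.csv")  # hypothetical file

# Only matched IDs can be compared; unmatched forms are the cost of transient audiences.
matched = sorted(set(pre) & set(post))
changes = [post[i] - pre[i] for i in matched]

print(f"Matched {len(matched)} of {len(pre)} pre-surveys")
if changes:
    print(f"Average knowledge change: {sum(changes) / len(changes):.2f} points")
```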

39
Follow-ups
  • When to use
  • Program is comprehensive enough to potentially
    result in intermediate and long-term impact.
  • Must have adequate resources and evaluation
    capacity.
  • Usually administered three to six months after
    the program.
  • Can document changes in actual financial
    behaviors, ability to achieve financial goals,
    and overall financial position.

40
Delivery methods for follow-ups
  • Face-to-face
  • Mail (paper survey, post cards)
  • Telephone
  • Internet (e-mail, website)
  • Group interviews

41
Stages to Change (TTM)
  • When to use: Programs that have multiple sessions
  • Advantages
  • Can document intermediate and long-term change.
  • Easier to measure actual behavior change and to
    control for other factors that may lead to change
    over time.
  • Can identify stage at which individual is ready
    and able to change behavior.
  • Behaviors can be recorded at the beginning,
    middle, and end of the program so that changes in
    actual behavior can be observed.
  • Disadvantages
  • Time and resource intensive.
  • May require additional progress reporting and
    long-term follow-up.
  • Can only be used with multi-session programs.

42
Train-the-trainer evaluations
  • Similar to pre and post evaluation, but more
    content specific.
  • Covers subject material in more detail to ensure
    that trainers have an adequate level of knowledge
    to teach the program to others.
  • Can be used to document changes in both the instructor's teaching skills and personal financial behaviors.
  • Follow-ups can document how the curriculum
    materials are being used and identify additional
    programming needs.

43
Designing the evaluation instrument: Survey content
  • General reactions to the session
  • Changes in knowledge
  • Changes in motivation, confidence, and abilities
  • Intended changes in behavior
  • Actual changes in behavior
  • Future programming needs and preferences
  • Demographics
  • Qualitative / open-ended responses

44
General reactions to the session
  • Please rate the instructor(s), materials, and the
    overall program
  • by checking the box that best applies.

45
Measuring changes in knowledge
  • Testing Knowledge
  • Please circle your answer to each of the
    following statements.

46
Measuring changes in knowledge (cont.)
  • Format can be True/False or multiple choice.
  • True/False is a reliable indicator for low-literacy audiences and youth.
  • The more questions you ask, the greater the reliability measure.
  • May include a "don't know" option to control for guessing (see the scoring sketch after this list).
  • Post-test: 10 questions (established standard)
  • Pre- and post-test: 10-20 questions
  • Train-the-trainer: 10-25 questions
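A minimal scoring sketch, assuming a hypothetical 10-item True/False quiz with a "don't know" option (the answer key and responses below are made up for illustration):

```python
# Illustrative scoring of a 10-item True/False knowledge quiz.
# "DK" (don't know) counts as incorrect rather than forcing a guess.
ANSWER_KEY = {1: "T", 2: "F", 3: "T", 4: "F", 5: "T",
              6: "T", 7: "F", 8: "T", 9: "F", 10: "T"}

def score_quiz(responses, key=ANSWER_KEY):
    return sum(1 for q, correct in key.items() if responses.get(q) == correct)

participant = {1: "T", 2: "F", 3: "DK", 4: "F", 5: "T",
               6: "F", 7: "F", 8: "T", 9: "DK", 10: "T"}
print(f"Score: {score_quiz(participant)} out of {len(ANSWER_KEY)}")
```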

47
Changes in motivation, confidence, and abilities
  • Building Skills/Confidence Indicators
  • Please check the box that best describes your
    confidence to do
  • the following

48
Changes in motivation, confidence, and abilities (cont.)
  • Recording Participants' Attitudes
  • Please check the box that best describes how much
    you agree
  • with the following statements.

49
Intended changes in behavior
  • Taking Charge Indicators
  • Please check the box that best describes your
    answer.

50
Actual changes in behavior
  • Financial Behavior Indicators
  • Please indicate how often you are currently doing
    each of the following
  • financial practices. There is no right or
    wrong answer. (Choose only one)

51
Using TTM scale (general categories)
  • Financial Behavior Indicators
  • For each financial practice, please check the box
    that best describes
  • your current behavior.

52
Using TTM scale (specific categories)
  • Financial Behavior Indicators
  • For each financial practice, please check the box
    that best describes
  • your current behavior.

53
Capturing behavior change with follow-ups
  • Since completing the program, please check the
    box that best describes
  • how often you are doing each financial practice.
    There is no right or
  • wrong answer. (Choose only one)

54
Capturing behavior change with follow-ups (cont.)
Financial Progress Indicators: Please indicate how the following numbers have changed for you personally since completing the program.
55
Capturing behavior change with follow-ups (cont.)
  • Progress Reporting
  • Please record your financial position based on
    your current progress in
  • the program.

56
A few words about train-the-trainer programs.
  • Testing knowledge
  • Building teaching skills
  • Shaping personal skills
  • Taking action for teaching
  • Taking action for personal financial success
  • Follow-ups

57
Qualitative / Open-Ended Questions (common
examples)
  • Post Evaluation Only and Pre and Post
    Evaluation
  • What did you like the most about this program?
  • What did you like the least about this program?
  • How could this program be improved?
  • Would you recommend this program to others?
  • Stages to Change Evaluation
  • What has made it easier for you to improve your
    financial practices?
  • What has prevented you from improving your
    financial practices?
  • With respect to the overall program, what did you
    like the most?
  • What did you like the least?
  • How could this program be improved?
  • Have you shared what you learned with others?
  • Would you recommend this program to others?

58
Qualitative / Open-Ended Questions (cont.)
  • Train-the-Trainer Evaluation
  • What was the most helpful information you
    received during this training program?
  • How could this training program be improved?
  • How do you plan to share this information with
    your target audience(s)?
  • What information and materials from this training
    do you plan to share with your target
    audience(s)?
  • Will you share what you learned with other
    instructors and colleagues?
  • Would you recommend this training program to
    other instructors and colleagues?

59
Demographic Questions
  • Age
  • Gender
  • Race, Ethnicity, and Language
  • Marital Status
  • Education
  • Employment
  • Family Structure
  • Health Status
  • Income, Assets, and Debts
  • Region/Location
  • Financial Experience
  • Students/Youth
  • Instructors/Educators

60
Common types of survey questions
  • Yes/No questions
  • True/False
  • Agree/Disagree
  • Multiple choice
  • One best answer
  • Multiple answers
  • Rating and ranking questions
  • Qualitative / open-ended questions

61
Choosing measurement scales and scoring
  • Example
  • Resource
  • Collecting Evaluation Data: End-of-Session Questionnaires.
  • University of Wisconsin-Extension, p. 62-64.
  • www.uwex.edu/ces/pdande/evaluation/evaldocs.html
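As one example of choosing and scoring a scale, here is a minimal sketch that maps a 5-point agreement scale to numbers and averages the results; the labels and responses are hypothetical, not taken from the referenced questionnaires:

```python
# Illustrative scoring of a 5-point agreement (Likert-type) scale.
SCALE = {"Strongly disagree": 1, "Disagree": 2, "Neutral": 3,
         "Agree": 4, "Strongly agree": 5}

responses = ["Agree", "Strongly agree", "Neutral", "Agree"]  # hypothetical answers
scores = [SCALE[r] for r in responses]

print(f"Mean agreement: {sum(scores) / len(scores):.2f} on a 1-5 scale")
```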

62
Other helpful tips on survey design
  • Think carefully about how to write the questions
    given your target audience. Use plain language.
  • Make the evaluation form easy to complete
    (i.e., white space and font).
  • Include simple instructions.
  • Start with non-threatening questions.
  • Keep the evaluation as short as possible.
  • Cluster similar items to save time and space.
  • Protect the participants' identity.

63
A few words of caution when selecting
indicators.
  • Measurement error and validity of indicators.
  • Financial knowledge
  • Confidence level
  • Financial behavior
  • EXAMPLE: Which of the following are valid indicators of behavior change?
  • Participant opened a bank account.
  • Participant increased savings.
  • Participant avoided bankruptcy.
  • Participant did not default on mortgage payments.

64
Other measurement issues
  • Self-reports are subject to bias.
  • Social desirability
  • Norms and rules of thumb
  • Misperceptions and over-optimism
  • Memory distortion and recall bias
  • Samples may not be representative.
  • Non-response bias
  • Program attrition
  • Self-selection
  • Low response rates (e.g., follow-ups)

65
  • Environmental factors may affect outcomes.
  • Unexpected life events
  • Program incentives (e.g., rewards, special
    benefits, enrollment programs)
  • Individualized financial advice or coaching
  • Psychological factors.
  • Inherent motivation
  • Ability
  • Attitudes

66
Solutions for measurement issues
  • Longitudinal data?
  • Control groups?
  • Randomized experiments?

67
There are numerous behavior indicators. Here are some examples.
  • Increases in savings.
  • Decreases in debt.
  • Maintaining a regular budget.
  • Comparison shopping.
  • Increases in new accounts opened.
  • Improved credit scores.
  • Improved communication with spouse/partner/parents
    about finances.
  • Other common indicators?

68
  • How do these indicators change for various target
    populations?
  • Youth?
  • Underserved?
  • Adults?
  • Members of your organization?

69
Part III: Putting It All Together! Sample Evaluations
70
ACTIVITY: Selecting Your Evaluation Methods (Evaluation Action Plan, Part B)
71
Thinking about your signature program, what is the most appropriate evaluation method?
  • Post-test only
  • Retrospective pre-test
  • Pre and post-test
  • Follow-up survey
  • Stages-to-change
  • Focus groups
  • Interviews
  • Case studies
  • Observations
  • Stories/anecdotal evidence
  • Tests of ability
  • Other

72
  • What types of questions will
  • the evaluation seek to answer?

73
What types of indicators will you use to document
this impact?
  • Changes in satisfaction levels
  • Changes in knowledge
  • Changes in skills and confidence levels
  • Changes in attitudes
  • Changes in aspirations
  • Anticipated or intended changes in behavior
  • Actual changes in behavior
  • Socio-economic changes
  • Other

74
(No Transcript)
75
(No Transcript)
76
(No Transcript)
77
Useful references for evaluation design
  • NEFE Financial Education Evaluation Toolkit: http://www2.nefe.org/eval/index.php
  • Collecting Evaluation Data: End-of-Session Questionnaires. University of Wisconsin-Extension. www.uwex.edu/ces/pdande/evaluation/evaldocs.html
  • A Step-by-Step Guide to Developing Effective Questionnaires and Survey Procedures for Program Evaluation Research. Rutgers Cooperative Research Extension, FS995. www.rcre.rutgers.edu/pubs/publication.asp?pid=FS995

78
Part IV: NEFE Financial Education Evaluation Toolkit
http://www2.nefe.org/eval/intro.html
79
NEFE Financial Education Evaluation Toolkit
  • Database
  • Post evaluation only with option for follow-up
  • Pre and post evaluation with option for follow-up
  • Stages to Change Evaluation
  • Train-the-Trainer
  • Testing Knowledge
  • Building Skills
  • Taking Charge
  • Manual
  • How-to guide for grassroots-level organizations
  • Examples (survey instruments, executive summary,
    reports)
  • Guidance on how to organize and present impact
    data

80
Manual: http://www2.nefe.org/eval/manual.html
81
(No Transcript)
82
Part I Financial Education Overview
83
Part II Understanding Program Evaluation
84
(No Transcript)
85
(No Transcript)
86
Part III The Evaluation Planning Process
87
(No Transcript)
88
Part IV Using the Evaluation Database
89
Part V Reporting Program Impact
90
(No Transcript)
91
(No Transcript)
92
Appendix Sample Evaluation Instruments
93
Database: http://www2.nefe.org/eval/index.php
94
Step 1 Program Info and Follow-up
95
Step 2 Knowledge Questions
96
Step 2a Selecting Questions
97
Step 2b Customizing Questions
98
Step 3 Confidence and Behavior Indicators
99
Step 4a Recommendations
100
Step 4b Selecting Statements
101
Step 4c Customizing Statements
102
Step 5 Qualitative Data
103
Step 6 Demographics
104
(No Transcript)
105
Step 7 Follow-Up Financial Progress Indicators
106
Step 8 Follow-Up Personal Achievements
107
Step 9 Follow-Up Demographics
108
Step 10 Finalizing Evaluation
109
Part V: Implementing Your Evaluation
Putting Your Tools into Action
110
5 Biggest Evaluation Challenges
  • 1. Identifying the ideal approach to
    evaluation.
  • Evaluation methods and measures vary widely
    across programs and academic disciplines.
  • Wide variation in financial outcomes across
    programs.
  • Significant differences in financial
    needs across consumers.
  • Some participants are unable to implement certain financial behaviors.

111
5 Biggest Challenges (cont.)
  • 2. Defining program success.
  • Setting realistic expectations for program participants.
  • Choosing appropriate outcomes and indicators based on participants' financial situation or other external constraints.
  • Identifying participants' individual financial needs and applying appropriate educational interventions.
  • Finding the "teachable moment."

112
5 Biggest Challenges (cont.)
  • "What is driving this financial education movement? Why is it so important? What are we ultimately trying to address? Is it reducing the poverty gap in this country? Between those that have and those that don't have. And it's widening. And those at the bottom end of the spectrum... what we're asking them is to build wealth. And at the same time, what we're asking people in this country who make $20,000 or less is: Absent us raising your wages in this country, we're asking you to build wealth, to participate in IDA programs. We're asking you to save with the little amount of money you're making. We're asking you to reduce your debt burden, learn how to manage your money, and clean up your credit history with the little amount of money you're working with. And we want you to get from point A to point B with all those constraints."

Source: Lyons, A. C., Palmer, L., Jayaratne, K.S.U., and Scherpf, E. (2006). "Are We Making the Grade? A National Overview of Financial Education and Program Evaluation." The Journal of Consumer Affairs, 40(2), 208-235.
113
5 Biggest Challenges (cont.)
  • 3. Collecting impact data from program participants.
  • Little incentive to complete evaluations ("like pulling teeth").
  • Reluctance to divulge personal information (surveys too personal; lack of trust).
  • High dropout rates, low response rates, and participants who are difficult to track.
  • Literacy levels (i.e., ESL, reading level).
  • Collecting sensitive data and information.
  • Tradeoff between participation and evaluation rigor.

114
5 Biggest Challenges (cont.)
  • 4. Designing and implementing effective program evaluations.
  • Evaluation process is cumbersome.
  • Lack of time, staff, and financial resources.
  • The PUSH for increased rigor and the rush to the finish line.
  • Rigor vs. Reality (e.g., measurement issues)
  • The limitations of one-shot evaluations (pre- and post-tests; intended vs. actual behavior change).
  • The reality of conducting longitudinal studies with control groups (follow-ups and tracking of program participants).

115
5 Biggest Challenges (cont.)
  • 5. Conducting more rigorous, theory-based
    evaluation research.
  • We need to back up and spend more time trying to
    understand financial behavior and why people do
    what they do.
  • Until then, financial education will only
  • serve as a band-aid rather than
  • a long-term solution, and we will
  • continue to struggle with how
  • to define financial success.

116
Simple Steps to Overcoming the Barriers
  • Increase rigor by planning more strategically.
  • Focus on signature programs and on multi-session
    programs.
  • Partner and pool resources.
  • "We're jumping into evaluating everything, instead of taking a couple of projected outcomes or a subset of all that we work with and trying to do evaluations with those."

117
Overcoming the Barriers (cont.)
  • Identify available resources: financial and non-financial.
  • Understand funders' needs and how they fit into your evaluation plan.
  • Take into consideration program delivery methods.

118
Overcoming the Barriers (cont.)
  • Establish a consistent and workable set of standards for measuring program impact.
  • Create evaluation tools that are flexible enough to account for the wide range of programs (i.e., a one-stop shop with survey instruments, best practices, online training workshops, etc.).
  • Recognize the reality of program evaluation at all levels (disconnect; need for better awareness of resource constraints; continued recognition of traditional evaluation methods).

119
You first need to ask.
  • If resources were not a constraint,
  • what would your ideal program evaluation look
    like?

120
Then, the reality check.
  • What challenges do you face
  • in trying to implement
  • your ideal evaluation?

121
Thinking outside of the box.
  • How can you overcome these challenges?
  • What financial and non-financial resources are
    available (e.g., time, money, staff, expertise)?
  • Are there others who can help (e.g., partners,
    stakeholders, funders, volunteers)?
  • What financial and non-financial resources do
    they have available?
  • Given constraints, what can you realistically do?

122
ACTIVITY: Overcoming Your Challenges (Evaluation Action Plan, Part C)
123
Part VI: Building Program Success
Reporting Program Impact
124
The common fear of evaluation
  • It will show what we're doing wrong!
  • Learning from the successes and the failures.

125
Putting it all together
  • Look for themes.
  • Work with what you've got.
  • Learn as you go and be flexible.
  • Tell the story, which can be the most powerful
    depiction of the benefits and services of your
    program.
  • Use the findings to improve your program.

126
Tips for telling your story
  • Know your audience.
  • Use simple descriptive statistics (i.e., counts, percentages, and averages) when analyzing and interpreting data (see the sketch after this list).
  • Don't use jargon. Be straightforward and clearly state major findings.
  • Use language that is suggestive rather than decisive (i.e., "the data suggest" rather than "the data show"). Be careful not to overstate your findings.
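A minimal sketch of the simple descriptive statistics suggested above (counts, percentages, and averages); the ratings and follow-up responses are hypothetical:

```python
# Illustrative reporting with counts, percentages, and averages only.
ratings = [5, 4, 4, 3, 5, 4, 2, 5, 4, 4]                    # 1-5 overall program ratings
increased_savings = [True, True, False, True, True, False]  # hypothetical follow-up responses

print(f"Participants rating the program: {len(ratings)}")
print(f"Average rating: {sum(ratings) / len(ratings):.1f} out of 5")
print(f"Reported increased savings: {100 * sum(increased_savings) / len(increased_savings):.0f}%")
```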

127
  • Blend the presentation with quantitative and
    qualitative data.
  • Do not generalize the findings beyond the participant group. Report the results in terms of the program participants rather than "all U.S. families" or "all New York residents."
  • Clearly describe who the results represent. Provide information and demographics on the sample of program participants.
  • Be honest about your program's strengths and weaknesses, while highlighting the positive.

128
Writing Impact Statements - Examples
  • Statements that reflect intentions
  • As a result of participating in this financial education program, X percent reported that they...
  • plan to do/use/adopt
  • are more knowledgeable
  • are more confident in their ability to do
  • are more likely to do/use/adopt
  • will do/use/adopt
  • ...a particular attitude, piece of information, or behavior.

129
  • Statements that reflect actual actions
  • As a result of participating in this financial education program, X percent reported that they...
  • are now doing
  • did
  • used
  • increased knowledge of
  • adopted
  • ...a particular attitude, piece of information, or behavior. (See the percentage sketch after this list.)
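A minimal sketch of computing the "X percent" figure behind an impact statement; the behavior item and responses are hypothetical:

```python
# Illustrative calculation behind an impact statement.
responses = ["yes", "yes", "no", "yes", "not sure", "yes", "no", "yes"]

adopted = sum(1 for r in responses if r == "yes")
percent = 100 * adopted / len(responses)

print(f"As a result of participating in this financial education program, "
      f"{percent:.0f} percent reported that they are now following a written budget.")
```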

130
Analyzing the findings
  • How will you use the findings for program
    improvement and internal reporting?
  • How will the evaluation findings be communicated
    and shared with others?

131
Disseminating the findings
  • Written reports
  • Short summary statements
  • Media releases
  • Internet postings
  • Graphs and visuals
  • Presentations
  • Displays, posters, etc.

132
ACTIVITY: Analyzing and Reporting Your Findings (Evaluation Action Plan, Part D)
  • How will you analyze the data?
  • And, how will you use the findings?

133
  • What do you hope to learn from the findings?
  • What are the potential impacts?

134
  • How will you disseminate the findings?
  • Who will you share the findings with?

135
Useful references for reporting impact
  • Collecting Evaluation Data: Surveys.
  • University of Wisconsin-Extension.
  • www.uwex.edu/ces/pdande/evaluation/evaldocs.html
  • Taking Stock: A Practical Guide to Evaluating Your Own Programs.
  • Programs.
  • Horizon Research, Inc.
  • www.horizon-research.com/reports/1997/stock.pdf
  • Tipsheets 66, 80, 81.
  • Penn State Cooperative Extension.
  • www.extension.psu.edu/evaluation/titles.html

136
Where do we go from here? Online resources at your fingertips
137
University of Wisconsin-Extension: http://www.uwex.edu/ces/pdande/evaluation/index.html
138
Cornell University Extension: http://staff.cce.cornell.edu/administration/program/evaluation/evalrefs.htm
139
Penn State Extension: http://www.extension.psu.edu/evaluation/
140
Reading List
  • Lyons, A. C., Palmer, L., Jayaratne, K.S.U., and Scherpf, E. (2006). "Are We Making the Grade? A National Overview of Financial Education and Program Evaluation." The Journal of Consumer Affairs, 40(2), 208-235.
  • Lyons, A. C. (2005). "Financial Education and Program Evaluation: The Challenges and Potentials for Financial Professionals." Journal of Personal Finance, 4(4), 56-68.
  • U.S. Government Accountability Office. (2004). The Federal Government's Role in Improving Financial Literacy, GAO-05-93SP.
  • Financial Literacy and Education Commission. (2006). Taking Ownership of the Future: The National Strategy for Financial Literacy. www.mymoney.gov

141
Contact Information
  • Dr. Angela Lyons
  • Associate Professor
  • Department of Agricultural and Consumer Economics
  • University of Illinois
  • Phone: 217-244-2612
  • E-mail: anglyons@illinois.edu

142
Questions