Title: Evaluating HRD Programs
1. Evaluating HRD Programs
2. Effectiveness
- The degree to which a training (or other HRD program) achieves its intended purpose
- Measures are relative to some starting point
- Measures how well the desired goal is achieved
3. Evaluation
4. HRD Evaluation
- Textbook definition
- The systematic collection of descriptive and
judgmental information necessary to make
effective training decisions related to the
selection, adoption, value, and modification of
various instructional activities.
5. In Other Words
- Are we training
- the right people
- the right stuff
- the right way
- with the right materials
- at the right time?
6. Evaluation Needs
- Descriptive and judgmental information needed
- Objective and subjective data
- Information gathered according to a plan and in a desired format
- Gathered to provide decision-making information
7. Purposes of Evaluation
- Determine whether the program is meeting the intended objectives
- Identify strengths and weaknesses
- Determine cost-benefit ratio
- Identify who benefited most or least
- Determine future participants
- Provide information for improving HRD programs
8. Purposes of Evaluation (continued)
- Reinforce major points to be made
- Gather marketing information
- Determine if training program is appropriate
- Establish management database
9. Evaluation Bottom Line
- Is HRD a revenue contributor or a revenue user?
- Is HRD credible to line and upper-level managers?
- Are benefits of HRD readily evident to all?
10. How Often Are HRD Evaluations Conducted?
- Not often enough!!!
- Frequently, only end-of-course participant reactions are collected
- Transfer to the workplace is evaluated less frequently
11. Why HRD Evaluations Are Rare
- Reluctance to have HRD programs evaluated
- Evaluation needs expertise and resources
- Factors other than HRD cause performance improvements, e.g.:
- Economy
- Equipment
- Policies, etc.
12. Need for HRD Evaluation
- Shows the value of HRD
- Provides metrics for HRD efficiency
- Demonstrates value-added approach for HRD
- Demonstrates accountability for HRD activities
- Everyone else has it; why not HRD?
13. Make-or-Buy Evaluation
- "I bought it, therefore it is good."
- "Since it's good, I don't need to post-test."
- Who says it's
- Appropriate?
- Effective?
- Timely?
- Transferable to the workplace?
14. Evolution of Evaluation Efforts
- Anecdotal approach: talk to other users
- Try before buy: borrow and use samples
- Analytical approach: match research data to training needs
- Holistic approach: look at the overall HRD process, as well as individual training
15. Models and Frameworks of Evaluation
- Table 7-1 lists six frameworks for evaluation
- The most popular is that of D. Kirkpatrick
- Reaction
- Learning
- Job Behavior
- Results
16. Kirkpatrick's Four Levels
- Reaction
- Focus on trainees' reactions
- Learning
- Did they learn what they were supposed to?
- Job Behavior
- Was it used on the job?
- Results
- Did it improve the organization's effectiveness?
17. Issues Concerning Kirkpatrick's Framework
- Most organizations don't evaluate at all four levels
- Focuses only on post-training
- Doesn't treat inter-stage improvements
- WHAT ARE YOUR THOUGHTS?
18. A Suggested Framework (1)
- Reaction
- Did trainees like the training?
- Did the training seem useful?
- Learning
- How much did they learn?
- Behavior
- What behavior change occurred?
19. A Suggested Framework (2)
- Results
- What were the tangible outcomes?
- What was the return on investment (ROI)?
- What was the contribution to the organization?
20. Data Collection for HRD Evaluation
- Possible methods
- Interviews
- Questionnaires
- Direct observation
- Written tests
- Simulation/Performance tests
- Archival performance information
21. Interviews
- Advantages
- Flexible
- Opportunity for clarification
- Depth possible
- Personal contact
- Limitations
- High reactive effects
- High cost
- Face-to-face threat potential
- Labor intensive
- Trained observers needed
22. Questionnaires
- Advantages
- Low cost to administer
- Honesty increased
- Anonymity possible
- Respondent sets the pace
- Variety of options
- Limitations
- Possible inaccurate data
- Response conditions not controlled
- Respondents set varying paces
- Uncontrolled return rate
23. Direct Observation
- Advantages
- Nonthreatening
- Excellent way to measure behavior change
- Limitations
- Possibly disruptive
- Reactive effects are possible
- May be unreliable
- Need trained observers
24. Written Tests
- Advantages
- Low purchase cost
- Readily scored
- Quickly processed
- Easily administered
- Wide sampling possible
- Limitations
- May be threatening
- Possibly no relation to job performance
- Measures only cognitive learning
- Relies on norms
- Concern for racial/ethnic bias
25. Simulation/Performance Tests
- Advantages
- Reliable
- Objective
- Close relation to job performance
- Includes cognitive, psychomotor and affective
domains
- Limitations
- Time consuming
- Simulations often difficult to create
- High costs to develop and use
26. Archival Performance Data
- Advantages
- Reliable
- Objective
- Job-based
- Easy to review
- Minimal reactive effects
- Limitations
- Criteria for keeping/discarding records
- Information system discrepancies
- Indirect
- Not always usable
- Records prepared for other purposes
27. Choosing Data Collection Methods
- Reliability
- Consistency of results, and freedom from collection-method bias and error (see the sketch after this list)
- Validity
- Does the device measure what we want to measure?
- Practicality
- Does it make sense in terms of the resources used
to get the data?
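
A minimal sketch of one common way to check reliability before trusting an instrument: test-retest correlation across two administrations. The scores, variable names, and the 0.80 floor are illustrative assumptions, not from the textbook (requires Python 3.10+ for statistics.correlation).

```python
# Hypothetical test-retest reliability check for an evaluation questionnaire.
from statistics import correlation

# Same eight trainees, same instrument, administered two weeks apart.
scores_week1 = [72, 85, 64, 90, 78, 69, 88, 75]
scores_week3 = [70, 88, 61, 93, 80, 66, 85, 77]

# Test-retest reliability: Pearson correlation between the two administrations.
r = correlation(scores_week1, scores_week3)

RELIABILITY_FLOOR = 0.80  # common rule of thumb, not a fixed standard
print(f"test-retest r = {r:.2f}")
print("instrument looks consistent" if r >= RELIABILITY_FLOOR
      else "results may reflect the instrument, not the trainees")
```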
28. Types of Data Used/Needed
- Individual performance
- Systemwide performance
- Economic
29. Individual Performance Data
- Individual knowledge
- Individual behaviors
- Examples
- Test scores
- Performance quantity, quality, and timeliness
- Attendance records
- Attitudes
30. Systemwide Performance Data
- Productivity
- Scrap/rework rates
- Customer satisfaction levels
- On-time performance levels
- Quality rates and improvement rates
31. Economic Data
- Profits
- Product liability claims
- Avoidance of penalties
- Market share
- Competitive position
- Return on investment (ROI)
- Financial utility calculations
32. Use of Self-Report Data
- Most common method
- Pre-training and post-training data
- Problems
- Mono-method bias
- Desire to be consistent between tests
- Socially desirable responses
- Response Shift Bias
- Trainees adjust expectations to training
33. Research Design
- Specifies in advance
- the expected results of the study
- the methods of data collection to be used
- how the data will be analyzed
34. Research Design Issues
- Pretest and Posttest
- Shows trainee what training has accomplished
- Helps eliminate pretest knowledge bias
- Control Group
- Compares performance of group with training
against the performance of a similar group
without training
35. Recommended Research Design
- Pretest and posttest with control group
- Whenever possible
- Randomly assign individuals to the test group and the control group to minimize bias
- Use a time-series approach to data collection to verify that performance improvement is due to training (a sketch of the group comparison follows)
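
A minimal sketch of the comparison this design enables, using invented scores: the improvement credited to training is the trained group's average gain beyond the control group's gain over the same period.

```python
# Hypothetical pretest/posttest scores for randomly assigned groups.
from statistics import mean

trained_pre  = [55, 60, 48, 62, 58]
trained_post = [72, 78, 65, 80, 74]
control_pre  = [54, 61, 50, 63, 57]
control_post = [58, 63, 52, 66, 60]

trained_gain = mean(b - a for a, b in zip(trained_pre, trained_post))
control_gain = mean(b - a for a, b in zip(control_pre, control_post))

# The control group's gain estimates what would have happened without
# training; the difference is the effect attributable to training.
print(f"trained gain {trained_gain:.1f}, control gain {control_gain:.1f}")
print(f"gain attributable to training: {trained_gain - control_gain:.1f} points")
```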
36. Ethical Issues Concerning Evaluation Research
- Confidentiality
- Informed consent
- Withholding training from control groups
- Use of deception
- Pressure to produce positive results
37. Assessing the Impact of HRD
- Money is the language of business.
- You MUST talk dollars, not HRD jargon.
- No one (except maybe you) cares about the effectiveness of training interventions as measured by an analysis of formal pretest/posttest control-group data.
38. HRD Program Assessment
- HRD programs and training are investments
- Line managers often see HR and HRD as costs, i.e., revenue users, not revenue producers
- You must prove your worth to the organization
- Or you'll have to find another organization
39. Evaluation of Training Costs
- Cost-benefit analysis
- Compares the cost of training to benefits gained, such as improved attitudes, reduction in accidents, reduction in employee sick days, etc.
- Cost-effectiveness analysis
- Focuses on increases in quality, reduction in
scrap/rework, productivity, etc.
40. Return on Investment
- Return on investment = Results / Costs
41. Calculating Training Return on Investment

| Operational Results Area | How Measured | Results Before Training | Results After Training | Differences (+ or -) | Expressed in $ |
|---|---|---|---|---|---|
| Quality of panels | % rejected | 2% rejected (1,440 panels per day) | 1.5% rejected (1,080 panels per day) | 0.5% (360 panels) | $720 per day; $172,800 per year |
| Housekeeping | Visual inspection using 20-item checklist | 10 defects (average) | 2 defects (average) | 8 defects | Not measurable in $ |
| Preventable accidents | Number of accidents | 24 per year | 16 per year | 8 per year | |
| | Direct cost of each accident | $144,000 per year | $96,000 per year | $48,000 | $48,000 per year |

Total savings: $220,800.00

Return on Investment = Operational Results / Training Costs
ROI = $220,800 / $32,564 = 6.8

SOURCE: From D. G. Robinson & J. Robinson (1989). Training for impact. Training and Development Journal, 43(8), 41. Printed by permission.
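
The same arithmetic as a short script; the dollar figures come straight from the table above.

```python
# ROI from the Robinson & Robinson (1989) example.
panel_savings    = 172_800   # quality of panels: $720/day over the year
accident_savings =  48_000   # preventable accidents: direct costs avoided
training_costs   =  32_564

total_savings = panel_savings + accident_savings   # $220,800
roi = total_savings / training_costs

print(f"total savings: ${total_savings:,}")
print(f"ROI = ${total_savings:,} / ${training_costs:,} = {roi:.1f}")  # 6.8
```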
42. Types of Training Costs
- Direct costs
- Indirect costs
- Development costs
- Overhead costs
- Compensation for participants
43. Direct Costs
- Instructor
- Base pay
- Fringe benefits
- Travel and per diem
- Materials
- Classroom and audiovisual equipment
- Travel
- Food and refreshments
44. Indirect Costs
- Training management
- Clerical/Administrative
- Postal/shipping, telephone, computers, etc.
- Pre- and post-learning materials
- Other overhead costs
45. Development Costs
- Fee to purchase program
- Costs to tailor program to organization
- Instructor training costs
46. Overhead Costs
- General organization support
- Top management participation
- Utilities, facilities
- General and administrative costs, such as HRM
47. Compensation for Participants
- Participants' salary and benefits for time away from the job
- Travel, lodging, and per-diem costs
48. Measuring Benefits
- Change in quality per unit, measured in dollars
- Reduction in scrap/rework, measured in the dollar cost of labor and materials
- Reduction in preventable accidents, measured in dollars
- ROI = Benefits / Training costs (see the sketch below)
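
A sketch of how the five cost categories from the preceding slides roll up into the denominator of ROI = Benefits / Training costs. Every dollar amount below is a hypothetical placeholder.

```python
# Hypothetical cost roll-up for one training program.
costs = {
    "direct":       12_000,  # instructor, materials, classroom, food
    "indirect":      4_500,  # admin support, shipping, pre/post materials
    "development":   8_000,  # purchase fee, tailoring, instructor training
    "overhead":      3_000,  # general organizational support
    "compensation": 10_500,  # participants' salary/benefits, travel
}
benefits = 95_000            # measured benefits in dollars (quality, scrap, accidents)

total_cost = sum(costs.values())
print(f"total training cost: ${total_cost:,}")
print(f"ROI = ${benefits:,} / ${total_cost:,} = {benefits / total_cost:.2f}")
```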
49. Utility Analysis
- Uses a statistical approach to support claims of training effectiveness
- N = number of trainees
- T = length of time benefits are expected to last
- dt = true performance difference resulting from training
- SDy = dollar value of untrained job performance (in standard deviation units)
- C = cost of training
- ΔU = (N)(T)(dt)(SDy) - C
50. Critical Information for Utility Analysis
- dt = difference in units produced between trained and untrained, divided by the standard deviation in units produced by the trained group
- SDy = standard deviation, in dollars, of the organization's overall productivity (see the worked sketch below)
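
A worked sketch of ΔU = (N)(T)(dt)(SDy) - C; the formula is from the slides, but every input value here is hypothetical.

```python
# dt per the slide above: difference in mean units produced between trained
# and untrained workers, divided by the SD of units produced by the trained.
units_trained, units_untrained, sd_units = 120.0, 108.0, 20.0
d_t = (units_trained - units_untrained) / sd_units        # 0.6 SD units

N    = 50         # number of trainees
T    = 2.0        # years the benefit is expected to last
SD_y = 10_000.0   # dollar value of one SD of job performance
C    = 150_000.0  # total cost of training

delta_U = N * T * d_t * SD_y - C
print(f"estimated utility of the program: ${delta_U:,.0f}")  # $450,000
```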
51. Ways to Improve HRD Assessment
- Walk the walk and talk the talk: MONEY
- Involve HRD in strategic planning
- Involve management in HRD planning and estimation efforts
- Gain mutual ownership
- Use credible and conservative estimates
- Share credit for successes and blame for failures
52. HRD Evaluation Steps
- Analyze needs.
- Determine explicit evaluation strategy.
- Insist on specific and measurable training objectives.
- Obtain participant reactions.
- Develop criterion measures/instruments to measure results.
- Plan and execute the evaluation strategy.
53. Summary
- Training results must be measured against costs
- Training must contribute to the bottom line
- HRD must justify itself repeatedly as a revenue
enhancer, not a revenue waster