1
Evaluating HRD Programs
  • Chapter 7

2
Learning Objectives
  • Define evaluation and explain its role/purpose
    in HRD.
  • Compare different models of evaluation.
  • Discuss the various methods of data collection
    for HRD evaluation.
  • Explain the role of research design in HRD
    evaluation.
  • Describe the ethical issues involved in
    conducting HRD evaluation.
  • Identify and explain the choices available for
    translating evaluation results into dollar terms.

3
Effectiveness
  • The degree to which a training program (or other
    HRD program) achieves its intended purpose
  • Measures are relative to some starting point
  • Measures how well the desired goal is achieved

4
Evaluation
5
HRD Evaluation
  • It is the systematic collection of descriptive
    and judgmental information necessary to make
    effective training decisions related to the
    selection, adoption, value, and modification of
    various instructional activities.

6
In Other Words
  • Are we training
  • the right people
  • the right stuff
  • the right way
  • with the right materials
  • at the right time?

7
Evaluation Needs
  • Descriptive and judgmental information needed
  • Objective and subjective data
  • Information gathered according to a plan and in a
    desired format
  • Gathered to provide decision making information

8
Purposes of Evaluation
  • Determine whether the program is meeting the
    intended objectives
  • Identify strengths and weaknesses
  • Determine cost-benefit ratio
  • Identify who benefited most or least
  • Determine future participants
  • Provide information for improving HRD programs

9
Purposes of Evaluation (continued)
  • Reinforce major points to be made
  • Gather marketing information
  • Determine if training program is appropriate
  • Establish management database

10
Evaluation Bottom Line
  • Is HRD a revenue contributor or a revenue user?
  • Is HRD credible to line and upper-level managers?
  • Are benefits of HRD readily evident to all?

11
How Often are HRD Evaluations Conducted?
  • Not often enough!!!
  • Frequently, only end-of-course participant
    reactions are collected
  • Transfer to the workplace is evaluated less
    frequently

12
Why HRD Evaluations are Rare
  • Reluctance to have HRD programs evaluated
  • Evaluation needs expertise and resources
  • Factors other than HRD can cause performance
    improvements, e.g.:
  • Economy
  • Equipment
  • Policies, etc.

13
Need for HRD Evaluation
  • Shows the value of HRD
  • Provides metrics for HRD efficiency
  • Demonstrates value-added approach for HRD
  • Demonstrates accountability for HRD activities

14
Make or Buy Evaluation
  • I bought it, therefore it is good.
  • Since it's good, I don't need to post-test.
  • Who says it's
  • Appropriate?
  • Effective?
  • Timely?
  • Transferable to the workplace?

15
Models and Frameworks of Evaluation
  • Table 7-1 lists six frameworks for evaluation
  • The most popular is that of D. Kirkpatrick
  • Reaction
  • Learning
  • Job Behavior
  • Results

16
Kirkpatrick's Four Levels
  • Reaction
  • Focus on trainees' reactions
  • Learning
  • Did they learn what they were supposed to?
  • Job Behavior
  • Was it used on job?
  • Results
  • Did it improve the organization's effectiveness?

17
Issues Concerning Kirkpatrick's Framework
  • Most organizations don't evaluate at all four
    levels
  • Focuses only on post-training
  • Doesn't treat inter-stage improvements
  • WHAT ARE YOUR THOUGHTS?

18
Data Collection for HRD Evaluation
  • Possible methods
  • Interviews
  • Questionnaires
  • Direct observation
  • Written tests
  • Simulation/Performance tests
  • Archival performance information

19
Interviews
  • Advantages
  • Flexible
  • Opportunity for clarification
  • Depth possible
  • Personal contact
  • Limitations
  • High reactive effects
  • High cost
  • Face-to-face threat potential
  • Labor intensive
  • Trained observers needed

20
Questionnaires
  • Advantages
  • Low cost to administer
  • Honesty increased
  • Anonymity possible
  • Respondent sets the pace
  • Variety of options
  • Limitations
  • Possible inaccurate data
  • Response conditions not controlled
  • Respondents set varying paces
  • Uncontrolled return rate

21
Direct Observation
  • Advantages
  • Nonthreatening
  • Excellent way to measure behavior change
  • Limitations
  • Possibly disruptive
  • Reactive effects are possible
  • May be unreliable
  • Need trained observers

22
Written Tests
  • Advantages
  • Low purchase cost
  • Readily scored
  • Quickly processed
  • Easily administered
  • Wide sampling possible
  • Limitations
  • May be threatening
  • Possibly no relation to job performance
  • Measures only cognitive learning
  • Relies on norms
  • Concern for racial/ ethnic bias

23
Simulation/Performance Tests
  • Advantages
  • Reliable
  • Objective
  • Close relation to job performance
  • Includes cognitive, psychomotor and affective
    domains
  • Limitations
  • Time consuming
  • Simulations often difficult to create
  • High cost to develop and use

24
Archival Performance Data
  • Advantages
  • Reliable
  • Objective
  • Job-based
  • Easy to review
  • Minimal reactive effects
  • Limitations
  • Criteria for keeping/ discarding records
  • Information system discrepancies
  • Indirect
  • Not always usable
  • Records prepared for other purposes

25
Choosing Data Collection Methods
  • Reliability
  • Consistency of results, and freedom from
    collection method bias and error
  • Validity
  • Does the device measure what we want to measure?
  • Practicality
  • Does it make sense in terms of the resources used
    to get the data?

26
Type of Data Used/Needed
  • Individual performance
  • Systemwide performance
  • Economic

27
Individual Performance Data
  • Individual knowledge
  • Individual behaviors
  • Examples
  • Test scores
  • Performance quantity, quality, and timeliness
  • Attendance records
  • Attitudes

28
Systemwide Performance Data
  • Productivity
  • Scrap/rework rates
  • Customer satisfaction levels
  • On-time performance levels
  • Quality rates and improvement rates

29
Economic Data
  • Profits
  • Product liability claims
  • Avoidance of penalties
  • Market share
  • Competitive position
  • Return on investment (ROI)
  • Financial utility calculations

30
Use of Self-Report Data
  • Most common method
  • Pre-training and post-training data
  • Problems
  • Mono-method bias
  • Desire to be consistent between tests
  • Socially desirable responses
  • Response Shift Bias
  • Trainees adjust expectations to training

31
Research Design
  • Specifies in advance
  • the expected results of the study
  • the methods of data collection to be used
  • how the data will be analyzed

32
Assessing the Impact of HRD
  • Money is the language of business.
  • You MUST talk dollars, not HRD jargon.
  • No one (except maybe you) cares about the
    effectiveness of training interventions as
    measured by an analysis of formal pretest,
    posttest control-group data.

33
HRD Program Assessment
  • HRD programs and training are investments
  • Line managers often see HR and HRD as costs,
    i.e., as revenue users, not revenue producers
  • You must prove your worth to the organization
  • Or you'll have to find another organization

34
Two Basic Methods for Assessing Financial Impact
  • Evaluation of training costs
  • Utility analysis
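
Utility analysis, listed above, is typically carried out with the standard
Brogden-Cronbach-Gleser model, which converts a training effect into an
estimated dollar gain. Below is a minimal Python sketch of that model; the
function name and every numeric value are hypothetical illustrations, not
figures from this presentation.

    # Minimal sketch of a utility analysis, assuming the standard
    # Brogden-Cronbach-Gleser model; all values below are hypothetical.

    def utility_gain(n_trained, years, effect_size, sd_y, cost_per_trainee):
        """Estimated dollar gain (delta-U) from a training program.

        n_trained        -- number of employees trained (N)
        years            -- expected duration of the training effect (T)
        effect_size      -- performance difference between trained and
                            untrained employees, in SD units (dt)
        sd_y             -- dollar value of one SD of job performance (SDy)
        cost_per_trainee -- cost of training one employee (C)
        """
        return (n_trained * years * effect_size * sd_y
                - n_trained * cost_per_trainee)

    # Hypothetical example: 50 trainees, 2-year effect, dt = 0.5,
    # SDy = $10,000, cost = $1,500 per trainee.
    print(utility_gain(50, 2, 0.5, 10_000, 1_500))   # 425000 -> $425,000

Because inputs such as SDy are estimates, figures like these should be built
from credible and conservative assumptions (see the later slide on ways to
improve HRD assessment).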

35
Evaluation of Training Costs
  • Cost-benefit analysis
  • Compares cost of training to benefits gained such
    as attitudes, reduction in accidents, reduction
    in employee sick-days, etc.
  • Cost-effectiveness analysis
  • Focuses on increases in quality, reduction in
    scrap/rework, productivity, etc.

36
Return on Investment
  • Return on investment = Results / Costs

37
Calculating Training Return On Investment

Operational        How               Results Before    Results After     Differences     Expressed
Results Area       Measured          Training          Training          (+ or -)        in $
------------------------------------------------------------------------------------------------------
Quality of         % rejected        2% rejected       1.5% rejected     .5%             $720 per day
  panels                             1,440 panels      1,080 panels      360 panels      $172,800
                                     per day           per day           per day         per year

Housekeeping       Visual            10 defects        2 defects         8 defects       Not measurable
                   inspection        (average)         (average)                         in $
                   using 20-item
                   checklist

Preventable        Number of         24 per year       16 per year       8 per year
  accidents        accidents

                   Direct cost       $144,000          $96,000           $48,000         $48,000
                   of each           per year          per year                          per year
                   accident
------------------------------------------------------------------------------------------------------
Total savings: $220,800.00

ROI = Return / Investment = Operational Results / Training Costs
    = $220,800 / $32,564
    = 6.8

SOURCE: From D. G. Robinson & J. Robinson (1989). Training for impact.
        Training and Development Journal, 43(8), 41. Printed by permission.
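
The ROI figure above can be reproduced with a short calculation. The following
is a minimal Python sketch that uses only the dollar figures reported in the
table; the variable names are illustrative.

    # Minimal sketch of the ROI calculation from the table above
    # (Robinson & Robinson, 1989): ROI = operational results / training costs.

    quality_savings  = 172_800   # fewer rejected panels, per year
    accident_savings = 48_000    # lower preventable-accident costs, per year
    total_savings = quality_savings + accident_savings   # $220,800

    training_costs = 32_564      # total cost of the training program

    roi = total_savings / training_costs
    print(f"ROI = {roi:.1f}")    # ROI = 6.8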
38
Measuring Benefits
  • Change in quality per unit measured in dollars
  • Reduction in scrap/rework measured in dollar cost
    of labor and materials
  • Reduction in preventable accidents measured in
    dollars
  • ROI = Benefits / Training costs

39
Ways to Improve HRD Assessment
  • Walk the walk, talk the talk: MONEY
  • Involve HRD in strategic planning
  • Involve management in HRD planning and estimation
    efforts
  • Gain mutual ownership
  • Use credible and conservative estimates
  • Share credit for successes and blame for failures

40
HRD Evaluation Steps
  1. Analyze needs.
  2. Determine explicit evaluation strategy.
  3. Insist on specific and measurable training
    objectives.
  4. Obtain participant reactions.
  5. Develop criterion measures/instruments to measure
    results.
  6. Plan and execute evaluation strategy.

41
Summary
  • Training results must be measured against costs
  • Training must contribute to the bottom line
  • HRD must justify itself repeatedly as a revenue
    enhancer, not a revenue waster