Session P10 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Sept 22nd, Online Learning 2002)

1
Session P10 Evaluating Online Learning:
Frameworks and Perspectives (Workshop: Sunday
Sept 22nd, Online Learning 2002)
  • Dr. Curtis J. Bonk
  • President, CourseShare.com
  • Associate Professor, Indiana University
  • http://php.indiana.edu/cjbonk
  • cjbonk@indiana.edu
  • Dr. Vanessa Paz Dennen
  • Assistant Professor, San Diego State University
  • vdennen@mail.sdsu.edu
  • http://edweb.sdsu.edu/people/vdennen

2
Workshop Overview
  • Part I. The State of Online Learning
  • Part II. Evaluation Purposes, Approaches, and
    Frameworks
  • Part III. Applying Kirkpatrick's 4 Levels
  • Part IV. ROI and Online Learning
  • Part V. Collecting Evaluation Data: Online
    Evaluation Tools

3
Sevilla & Wells (July 2001), e-learning
  • We could be very productive by ignoring
    assessment altogether and assume competence if
    the learner simply gets through the course.

4
Why Evaluate?
  • Cost-savings
  • Becoming a less important reason to evaluate as
    more people recognize that the initial expense is
    balanced by long-term financial benefits
  • Performance improvement
  • A clear place to see impact of online learning
  • Competency advancement

5
16 Evaluation Methods
  • 1. Formative Evaluation
  • 2. Summative Evaluation
  • 3. CIPP Model Evaluation
  • 4. Objectives-Oriented Evaluation
  • 5. Marshall & Shriver's 5 Levels of Evaluation
  • 6. Bonk's 8-Part Evaluation Plan
  • (the "Ridiculous" Model)
  • 7. Kirkpatrick's 4 Levels
  • 8. Return on Investment (ROI)
  • 9. K-Level 6: budget and stability of the
    e-learning team
  • 10. K-Level 7: whether e-learning champion(s) are
    promoted
  • 11. Cost/Benefit Analysis (CBA)
  • 12. Time to Competency
  • 13. Time to Market
  • 14. Return on Expectation
  • 15. AEIOU: Accountability, Effectiveness, Impact,
    Organizational Context, Unintended Consequences
  • 16. Consumer-Oriented Evaluation

6
Part I. The State of Online Learning
7
Survey of 201 Trainers, Instructors, Managers,
Instructional Designers, CEOs, CLOs, etc.
8
Survey Limitations
  • Sample pool: e-PostDirect
  • The Web is changing rapidly
  • Lengthy survey, low response rate
  • No password or keycode
  • Many backgrounds; hard to generalize
  • Does not address all issues (e.g., ROI
    calculations, how training is supported, specific
    assessments)

9
(No Transcript)
10
(No Transcript)
11
(No Transcript)
12
Why Interested in E-Learning?
  • Mainly cost savings
  • Reduced travel time
  • Greater flexibility in delivery
  • Timeliness of training
  • Better allocation of resources, speed of
    delivery, convenience, course customization,
    lifelong learning options, personal growth,
    greater distrib of materials

13
(No Transcript)
14
A Few Assessment Comments
15
Level 1 Comments: Reactions
  • We assess our courses based on participation
    levels and online surveys after course
    completion. All of our courses are
    asynchronous.
  • I conduct a post course survey of course
    material, delivery methods and mode, and
    instructor effectiveness. I look for suggestions
    and modify each course based on the results of
    the survey.
  • We use the Halo Survey process of asking them
    when the course is concluding.

16
Level 2 Comments: Learning
  • We use online testing and simulation frequently
    for testing student knowledge.
  • Do multiple choice exams after each section of
    the course.
  • We use online exams and use level 2 evaluation
    forms.

17
Level 3 Comment: Job Performance
  • I feel strongly there is a need to measure the
    success of any training in terms of the
    implementation of the new behaviors on the job.
    Having said that, I find there is very limited
    [interest] by our clients in spending the dollars
    required.

18
More Assessment Comments: Multiple Level Evaluation
  • Using Level One Evaluations for each session
    followed by a summary evaluation. Thirty days
    post-training, conversations occur with learners'
    managers to assess Level 2 (actually Level 3).
  • We do Level 1 measurements to gauge student
    reactions to online training using an online
    evaluation form. We do Level 2 measurements to
    determine whether or not learning has occurred
  • Currently, we are using online teaching and
    following up with manager assessments that the
    instructional material is being put to use on the
    job.

19
Who is Evaluating Online Learning?
  • 59% of respondents said they did not have a
    formal evaluation program
  • At Reaction level: 79%
  • At Learning level: 61%
  • At Behavior/Job Performance level: 47%
  • At Results or Return on Investment: 30%

20
(No Transcript)
21
Assessment Lacking or Too Early
  • We are just beginning to use Web-based
    technology for education of both associates and
    customers, and do not have the metric to measure
    our success. However, we are putting together a
    focus group to determine what to measure (and)
    how.
  • We have no online evaluation for students at
    this time.
  • We lack useful tools in this area.

22
Limitations with Current System
  • I feel strongly there is a need to measure the
    success of any training in terms of the
    implementation of the new behaviors on the job.
    Having said that, I find there is very limited
    [interest] by our clients in spending the dollars
    required.
  • We are looking for better ways to track learner
    progress, learner satisfaction, and retention of
    material.
  • Have had fairly poor ratings on reliability,
    customer support, and interactivity

23
Pause: How and What Do You Evaluate?
24
Readiness Checklist
  • 1.      ___ Is your organization undergoing
    significant change, in part related to
    e-learning?
  • 2.      ___ Is there pressure from senior
    management to measure the results of e-learning?
  • 3.      ___ Has your company experienced one or
    more training/learning disasters in the past?
  • 4.      ___ Is the image of the training/learning
    function lower than you want?

25
Part II. Evaluation Purposes, Approaches, and
Frameworks
26
What is Evaluation???
  • Simply put, an evaluation is concerned with
    judging the worth of a program and is essentially
    conducted to aid in the making of decisions by
    stakeholders (e.g., does it work as effectively
    as the standard instructional approach?).
  • (Champagne & Wisher, in press)

27
What is assessment?
  • "Assessment refers to ... efforts to obtain info
    about how and what students are learning in order
    to improve ... teaching efforts and/or to demo to
    others the degree to which students have
    accomplished the learning goals for a course"
    (Millar, 2001, p. 11).
  • It is a way of using info obtained through
    various types of measurement to determine a
    learner's performance or skill on some task or
    situation (Rosenkrans, 2000).

28
Who are you evaluating for?
  • The level of evaluation will depend on
    articulation of the stakeholders. Stakeholders
    of evaluation in corporate settings may range
    from???

29
Evaluation Purposes
  • Determine learner progress
  • What did they learn?
  • Document learning impact
  • How well do learners use what they learned?
  • How much do learners use what they learn?

30
Evaluation Purposes
  • Efficiency
  • Was online learning more effective than another
    medium?
  • Was online learning more cost-effective than
    another medium/what was the return on investment
    (ROI)?
  • Improvement
  • How do we do this better?

31
Evaluation Purposes
  • An evaluation plan can evaluate the delivery of
    e-learning, identify ways to improve the online
    delivery of it, and justify the investment in the
    online training package, program, or initiative.
    (Champagne & Wisher, in press)

32
Evaluation Plans
  • Does your company have a training evaluation plan?

33
Steps to Developing an OL Evaluation Program
  • Select a purpose and framework
  • Develop benchmarks
  • Develop online survey instruments
  • For learner reactions
  • For learner post-training performance
  • For manager post-training reactions
  • Develop data analysis and management plan

34
1. Formative Evaluation
  • Formative evaluations focus on improving the
    online learning experience.
  • A formative focus will try to find out what
    worked or did not work.
  • Formative evaluation is particularly useful for
    examining instructional design and instructor
    performance.

35
Formative Questions
  • -How can we improve our OL program?
  • -How can we make our OL program more efficient?
  • -More effective?
  • -More accessible?

36
2. Summative Evaluation
  • Summative evaluations focus on the overall
    success of the OL experience (should it be
    continued?).
  • A summative focus will look at whether or not
    objectives are met, the training is
    cost-effective, etc.

37
Course Completion
  • Jeanne Meister, Corporate University Xchange,
    found a 70 percent drop-out rate compared to
    classroom rates of 15 percent.
  • Perhaps need new metrics. Need to see if they
    can test out.
  • "Almost any measure would be better than course
    completion, which is not a predictor of
    anything." Tom Kelly, Cisco, March 2002,
    e-Learning.

38
What Can OL Evaluation Measure?
  • Categories of Evaluation Info (Woodley and
    Kirkwood, 1986)
  • Measures of activity
  • Measures of efficiency
  • Measures of outcomes
  • Measures of program aims
  • Measures of policy
  • Measures of organizations

39
Typical Evaluation Frameworks for OL
  • Commonly used frameworks include
  • CIPP Model
  • Objectives-oriented
  • Marshall & Shriver's 5 levels
  • Kirkpatrick's 4 levels
  • Plus a 5th level
  • AEIOU
  • Consumer-oriented

40
3. CIPP Model Evaluation
  • CIPP is a management-oriented model:
  • C = context
  • I = input
  • P = process
  • P = product
  • Examines the OL within its larger system/context

41
CIPP & OL: Context
  • Context: Addresses the environment in which OL
    takes place.
  • How does the real environment compare to the
    ideal?
  • Uncovers systemic problems that may dampen OL
    success.
  • Technology breakdowns
  • Inadequate computer systems

42
CIPP & OL: Input
  • Input: Examines what resources are put into OL.
  • Is the content right?
  • Have we used the right combination of media?
  • Uncovers instructional design issues.

43
CIPP & OL: Process
  • Process: Examines how well the implementation
    works.
  • Did the course run smoothly?
  • Were there technology problems?
  • Was the facilitation and participation as
    planned?
  • Uncovers implementation issues.

44
CIPP & OL: Product
  • Product: Addresses outcomes of the learning.
  • Did the learners learn? How do you know?
  • Does the online training have an effect on
    workflow or productivity?
  • Uncovers systemic problems.

45
4. Objectives-Oriented Evaluation
  • Examines OL training objectives as compared to
    training results
  • Helps determine if objectives are being met
  • Helps determine if objectives, as formally
    stated, are appropriate
  • Objectives can be used as a comparative benchmark
    between online and other training methods

46
Evaluating Objectives & OL
  • An objectives-oriented approach can examine two
    levels of objectives
  • Instructional objectives for learners (did the
    learners learn?)
  • Systemic objectives for training (did the
    training solve the problem?)

47
Objectives & OL
  • Requires:
  • A clear sense of what the objectives are (always
    a good idea anyway)
  • The ability to measure whether or not objectives
    are met
  • Some objectives may be implicit and hard to state
  • Some objectives are not easy to measure

48
5. Marshall & Shriver's Five Levels of Evaluation
  • Performance-based evaluation framework
  • Each level examines a different area of
    performance
  • Requires demonstration of learning

49
Marshall & Shriver's 5 Levels
  • Level I: Self (instructor)
  • Level II: Course Materials
  • Level III: Course Curriculum
  • Level IV: Course Modules
  • Level V: Learning Transfer

50
6. Bonk's Evaluation Plan
51
What to Evaluate?
  1. Learner: attitudes, learning, use, performance.
  2. Instructor: popularity, course enrollments.
  3. Training: internal and external components.
  4. Task: relevance, interactivity, collaborative.
  5. Tool: usable, learner-centered, friendly,
    supportive.
  6. Course: interactivity, participation, completion.
  7. Program: growth, long-range plans.
  8. Organization: cost-benefit, policies, vision.

52
RIDIC5-ULO3US Model of Technology Use
  • 4. Tasks (RIDIC)
  • Relevance
  • Individualization
  • Depth of Discussion
  • Interactivity
    Collaboration-Control-Choice-Constructivistic-Community

53
RIDIC5-ULO3US Model of Technology Use
  • 5. Tech Tools (ULOUS)
  • Utility/Usable
  • Learner-Centeredness
  • Opportunities with Outsiders Online
  • Ultra Friendly
  • Supportive

54
7. Kirkpatrick's 4 Levels
  • A common training framework.
  • Examines training on 4 levels.
  • Not all 4 levels have to be included in a given
    evaluation.

55
The 4 Levels
  • Reaction
  • Learning
  • Behavior
  • Results

56
8. Return on Investment (ROI): A 5th Level
  • Return on Investment is a 5th level
  • It is related to results, but is more clearly
    stated as a financial calculation
  • How to calculate ROI is the big issue here

57
Is ROI the answer?
  • Elise Olding of CLK Strategies suggests that we
    shift from looking at ROI to looking at time to
    competency.
  • ROI may be easier to calculate since concrete
    dollars are involved, but time to competency may
    be more meaningful in terms of actual impact.

58
Example Call Center Training
  • Traditional call center training can take 3
    months to complete
  • Call center employees typically quit within one
    year
  • When OL was implemented, the time to train (time
    to competency) was reduced
  • Benchmarks for success: time per call, number of
    transfers

59
Example Circuit City
  • Circuit City provided online product/sales
    training
  • What is more useful to know?
  • The overall ROI or break-even point?
  • How much employees liked the training?
  • How many employees completed the training?
  • That employees who completed 80% of the training
    saw an average increase of 10% in sales?

60
Matching Evaluation Levels with Objectives: Pretest
  • Instructions: For each statement below, indicate
    the level of evaluation at which the objective is
    aimed.
  • 1.      ___ Show a 15 percent decrease in errors
    made on tax returns by staff accountants
    participating in the e-learning certificate
    program.
  • 2.      ___ Increase use of conflict resolution
    skills, when warranted, by 80 percent of
    employees who had completed the first eight
    modules of the online training. (see handout for
    more)

61
9. A 6th Level? Clark Aldrich (2002)
  • Adding Level 6, which relates to the budget and
    stability of the e-learning team.
  • Just how respected and successful is the
    e-learning team?
  • Have they won approval from senior management for
    their initiatives?
  • Aldrich, C. (2002). Measuring success: In a
    post-Maslow/Kirkpatrick world, which metrics
    matter? Online Learning, 6(2), 30-32.

62
10. And Even a 7th Level? Clark Aldrich (2002)
  • At Level 7: whether the e-learning sponsor(s) or
    champion(s) are promoted in the organization.
  • While both of these additional levels address the
    people involved in the e-learning initiative or
    plan, such recognitions will likely hinge on the
    results of evaluation of the other five levels.

63
11. ROI Alternative: Cost/Benefit Analysis (CBA)
  • ROI may be ill-advised since not all impacts hit
    the bottom line, and those that do take time.
  • Shifts the attention to more long-term results
    and to quantifying impacts with numeric values,
    such as
  • increased revenue streams,
  • increased employee retention, or
  • reduction in calls to a support center.
  • Reddy, A. (2002, January). E-learning ROI
    calculations: Is a cost/benefit analysis a better
    approach? e-learning, 3(1), 30-32.

64
Cost/Benefit Analysis (CBA)
  • Attends to both qualitative and quantitative
    measures:
  • job satisfaction ratings,
  • new uses of technology,
  • reduction in processing errors,
  • quicker reactions to customer requests,
  • reduction in customer call rerouting,
  • increased customer satisfaction,
  • enhanced employee perceptions of training,
  • global post-test availability.
  • Reddy, A. (2002, January). E-learning ROI
    calculations: Is a cost/benefit analysis a better
    approach? e-learning, 3(1), 30-32.

65
Cost/Benefit Analysis (CBA)
  • In effect, CBA asks how the sum of the benefits
    compares to the sum of the costs.
  • Yet, it often leads to or supports ROI and other
    more quantitatively-oriented calculations.
  • Reddy, A. (2002, January). E-learning ROI
    calculations: Is a cost/benefit analysis a better
    approach? e-learning, 3(1), 30-32.

66
Other ROI Alternatives
  • 12. Time to competency (need benchmarks)
  • online databases of frequently asked questions
    can help employees in call centers learn skills
    more quickly and without requiring temporary
    leaves from their position for such training
  • 13. Time to market
  • might be measured by how e-learning speeds up the
    training of sales and technical support
    personnel, thereby expediting the delivery of a
    software product to the market
  • Raths, D. (2001, May). Measure of success.
    Online Learning, 5(5), 20-22, 24.

67
Still Other ROI Alternatives
  • 14. Return on Expectation
  • Asks employees a series of questions related to
    how training met expectations of their job
    performance.
  • When questioning is complete, they place a
    figure on that.
  • Correlate or compare such reaction data with
    business results or supplement Level 1 data to
    include more pertinent info about the
    applicability of learning to the employee's
    present job situation.
  • Raths, D. (2001, May). Measure of success.
    Online Learning, 5(5), 20-22, 24.

68
15. AEIOU
  • Provides a framework for looking at different
    aspects of an online learning program
  • Fortune & Keith, 1992; Sweeney, 1995; Sorensen,
    1996

69
A Accountability
  • Did the training do what it set out to do?
  • Data can be collected through
  • Administrative records
  • Counts of training programs (# of attendees, #
    of offerings)
  • Interviews or surveys of training staff

70
E Effectiveness
  • Is everyone satisfied?
  • Learners
  • Instructors
  • Managers
  • Were the learning objectives met?

71
I Impact
  • Did the training make a difference?
  • Like Kirkpatrick's level 4 (Results)

72
O Organizational Context
  • Did the organization's structures and policies
    support or hinder the training?
  • Does the training meet the organization's needs?
  • OC evaluation can help find when there is a
    mismatch between the training design and the
    organization
  • Important when using third-party training or
    content

73
U Unintended Consequences
  • Unintended consequences are often overlooked in
    training evaluation
  • May give you an opportunity to brag about
    something wonderful that happened
  • Typically discovered via qualitative data
    (anecdotes, interviews, open-ended survey
    responses)

74
16. Consumer-Oriented Evaluation
  • Uses a consumer point-of-view
  • Can be a part of vendor selection process
  • Can be a learner-satisfaction issue
  • Relies on benchmarks for comparison of different
    products or different learning media

See the vendors!
75
Part III
  • Applying Kirkpatrick's 4 Levels to Online
    Learning Evaluation & Evaluation Design

76
Why Use the 4 Levels?
  • They are familiar and understood
  • Highly referenced in the training literature
  • Can be used with 2 delivery media for comparative
    results

77
Conducting 4-Level Evaluation
  • You need not use every level
  • Choose the level that is most appropriate to your
    need and budget
  • Higher levels will be more costly and difficult
    to evaluate
  • Higher levels will yield more useful results

78
Kirkpatrick Level 1 Reaction
  • Typically involves "smile sheets" or
    end-of-training evaluation forms.
  • Easy to collect, but not always very useful.
  • Reaction-level data on online courses has been
    found to correlate with ability to apply learning
    to the job.
  • Survey ideally should be Web-based, keeping the
    medium the same as the course.

79
Kirkpatrick Level I Reaction
  • Types of questions
  • Enjoyable?
  • Easy to use?
  • How was the instructor?
  • How was the technology?
  • Was it fast or slow enough?

80
Kirkpatrick Level 2 Learning
  • Typically involves testing learners immediately
    following the training
  • Not difficult to do, but online testing has its
    own challenges
  • Did the learner take the test on his/her own?

81
Kirkpatrick Level 2 Learning
  • Higher-order thinking skills (problem solving,
    analysis, synthesis)
  • Basic skills (articulate ideas in writing)
  • Company perspectives and values (teamwork,
    commitment to quality, etc.)
  • Personal development

82
Kirkpatrick Level 2 Learning
  • Might include
  • Essay tests.
  • Problem solving exercises.
  • Interviews.
  • Written or verbal tests to assess cognitive
    skills.
  • Shepard, C. (1999b, July). Evaluating online
    learning. TACTIX from Fastrak Consulting.
    Retrieved February 10, 2002, from
    http://fastrak-consulting.co.uk/tactix/Features/evaluate/eval01.htm

83
Kirkpatrick Level 3 Behavior
  • More difficult to evaluate than Levels 1 & 2
  • Looks at whether learners can apply what they
    learned (does the training change their
    behavior?)
  • Requires post-training follow-up to determine
  • Less common than levels 1 & 2 in practice

84
Kirkpatrick Level 3 Behavior
  • Might include
  • Direct observation by supervisors or coaches
    (Wisher, Curnow, & Drenth, 2001).
  • Questionnaires completed by peers, supervisors,
    and subordinates related to work performance.
  • On the job behaviors, automatically logged
    performances, or self-report data.
  • Shepard, C. (1999b, July). Evaluating online
    learning. TACTIX from Fastrak Consulting.
    Retrieved February 10, 2002, from
    http://fastrak-consulting.co.uk/tactix/Features/evaluate/eval01.htm

85
Kirkpatrick Level 4 Results
  • Often compared to return on investment (ROI)
  • In e-learning, it is believed that the increased
    cost of course development ultimately is offset
    by the lesser cost of training implementation
  • A new way of training may require a new way of
    measuring impact

86
Kirkpatrick Level 4 Results
  • Might Include
  • Labor savings (e.g., reduced duplication of
    effort or faster access to needed information).
  • Production increases (faster turnover of
    inventory, forms processed, accounts opened,
    etc.).
  • Direct cost savings (e.g., reduced cost per
    project, lowered overhead costs, reduction of bad
    debts, etc.).
  • Quality improvements (e.g., fewer accidents,
    fewer defects, etc.).
  • Horton, W. (2001). Evaluating e-learning.
    Alexandria, VA: American Society for Training &
    Development.

Of course, this assumes you have all the
documents!
87
Kirkpatrick Evaluation Design
  • Kirkpatrick's 4 Levels may be achieved via
    various evaluation designs
  • Different designs help answer different questions

88
Pre/Post Control Groups
  • One group receives OL training and one does not
  • As a variation, try 3 groups:
  • No training (control)
  • Traditional training
  • OL training
  • Recommended because it may help neutralize
    contextual factors (see the sketch below)
  • Relies on random assignment as much as possible
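The comparison logic behind this design can be sketched in a few lines of Python; the group labels and pre/post scores below are hypothetical, purely for illustration:

    from statistics import mean

    # Hypothetical pre/post test scores for the three groups in the design.
    groups = {
        "control":     {"pre": [55, 60, 58], "post": [57, 61, 59]},
        "traditional": {"pre": [54, 59, 57], "post": [70, 74, 69]},
        "online":      {"pre": [56, 58, 60], "post": [72, 75, 73]},
    }

    # Compare the average gain (post minus pre) across groups; the control
    # group's gain estimates change that would have occurred without training.
    for name, scores in groups.items():
        gain = mean(scores["post"]) - mean(scores["pre"])
        print(f"{name}: average gain = {gain:.1f}")

The online and traditional groups are then compared against each other and against the control group's baseline gain.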

89
Multiple Baselines
  • Can be used for a program that is rolling out
  • Each group serves as a control group for the
    previous group
  • Look for improvement in subsequent groups
  • Eliminates need for tight control of control group

90
Time Series
  • Looks at benchmarks before and after training
  • Practical and cost-effective
  • Not considered as rigorous as other designs
    because it doesn't control for contextual factors

91
Single Group Pre/Post
  • Easy and inexpensive
  • Criticized for lack of rigor (absence of control)
  • Needs to be pushed into Kirkpatrick levels 3 and
    4 to see if there has been impact

92
Case Study
  • A rigorous design in academic practice, but often
    after-the-fact in corporate settings
  • Useful when no preliminary or baseline data have
    been collected

93
Matching Evaluation Levels with Objectives:
Posttest
  • Instructions: For each statement below, indicate
    the level of evaluation at which the objective is
    aimed.
  • 1. Union Pacific Railroad reported an increase in
    bottom-line performance--on-time delivery of
    goods--of over 35%, which equated to millions of
    dollars in increased revenues and savings.
  • 2. They also reported that learners showed a 40%
    increase in learning retention and improved
    attitudes about management and jobs.
  • (see handout for more)

94
Part IV
  • ROI and Online Learning

95
The Importance of ROI
  • OL requires a great amount of money and other
    resources up front
  • It gives the promise of financial rewards later
    on
  • ROI is of great interest because of the
    investment and the wait period before the return

96
Calculating ROI
  • Look at
  • Hard cost savings
  • Hard revenue impact
  • Soft competitive benefits
  • Soft benefits to individuals
  • See Calculating the Return on Your eLearning
    Investment (2000) by Docent, Inc.

97
Possible ROI Objectives
  • Better Efficiencies
  • Greater Profitability
  • Increased Sales
  • Fewer Injuries on the Job
  • Less Time off Work
  • Faster Time to Competency

98
Factors Impacting ROI
  • # of employees
  • Travel costs
  • Opportunity costs (e.g., what does it cost to
    pull employees off the job)
  • Online course development costs
  • Infrastructure costs

99
Hard Cost Savings
  • Travel
  • Facilities
  • Printed material costs (printing, distribution,
    storage)
  • Reduction of costs of business through increased
    efficiency
  • Instructor fees (sometimes)

100
The Cost of E-learning
  • Brandon-hall.com estimates that an LMS for
    8,000 learners costs $550,000
  • This price doesn't include the cost of buying or
    developing content
  • Bottom line: getting started in e-learning isn't
    cheap

101
Hard Revenue Impact
  • Consider
  • Opportunity cost of improperly trained or
    untrained personnel
  • Shorter time to productivity through shorter
    training times with OL
  • Increased time on job (no travel time)
  • Ease of delivering same training to partners and
    customers (for fee?)

102
Soft Competitive Benefits
  • Just-in-time capabilities
  • Consistency in delivery
  • Certification of knowledge transfer
  • Ability to track users and gather data easily
  • Increased morale from simultaneous roll-out at
    different sites

103
Individual Values
  • Less wasted time
  • Support available as needed
  • Motivation from being treated as an individual

104
Talking about ROI
  • As a percentage:
  • ROI = (Payback - Investment) / Investment x 100
  • As a ratio:
  • ROI = Return / Investment
  • As time to break even:
  • Break-even time = (Investment / Return) x Time Period
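Read as code, the three expressions above look like this; a minimal sketch in which the function and parameter names are ours and the formulas are the slide's:

    def roi_percent(payback, investment):
        # ROI as a percentage: (Payback - Investment) / Investment x 100
        return (payback - investment) / investment * 100

    def roi_ratio(total_return, investment):
        # ROI as a ratio: Return / Investment
        return total_return / investment

    def break_even_time(investment, total_return, time_period):
        # Time to break even: (Investment / Return) x Time Period,
        # where the return is earned over the given time period.
        return investment / total_return * time_period

For example, roi_percent(150_000, 100_000) is 50.0, i.e., a 50 percent return.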

105
Net Present Value
  • Need to discount the return to present dollars: a
    $100,000 project that yields $30,000/year for 5
    years would have a net present value of $29,364
    at 8% interest (Horton, 2001, ASTD)
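As a check on that figure, a short sketch of the discounting arithmetic. It reproduces $29,364 if each year's return is credited at the start of the year; that timing is our assumption, since Horton's calculation details are not shown here:

    def net_present_value(investment, annual_return, years, rate):
        # Discount each year's return back to present dollars and subtract
        # the up-front investment. Returns are assumed to arrive at the
        # start of years 1..N (i.e., discounted at t = 0 .. N-1).
        pv_returns = sum(annual_return / (1 + rate) ** t for t in range(years))
        return pv_returns - investment

    print(round(net_present_value(100_000, 30_000, 5, 0.08)))  # about 29364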

106
Benefit-Cost Ratio
  • A project cost of $100,000 that yields $150,000
    of benefits would have a benefit-cost ratio of
    1.5 (Horton, 2001, ASTD)

107
Time to Payback
  • If cost is $100,000 and ROI is $10,000/month,
    then the time to payback is 10 months (Horton,
    2001, ASTD)

108
Learners to Payback
  • Training costs $100,000 to develop and
    $100/person to offer. Assuming each person
    trained benefits the organization $300 (or $200
    net), development costs are repaid by training
    500 people (Horton, 2001, ASTD)
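The same arithmetic as a tiny function; the numbers are the slide's, the function name is ours:

    def learners_to_payback(development_cost, cost_per_learner, benefit_per_learner):
        # Each trained learner nets (benefit - per-learner cost);
        # development cost is repaid once those net amounts cover it.
        net_per_learner = benefit_per_learner - cost_per_learner
        return development_cost / net_per_learner

    print(learners_to_payback(100_000, 100, 300))  # 500.0 learners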

109
Classroom Training vs. ROI (William Horton)
  1. Per-course costs (course development costs)
  2. Per-class costs (instructor/facilitator, travel,
    and facilities)
  3. Per-learner costs (travel, salary,
    instructor/facilitator salary)
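One way to read these three cost categories is as a simple total-cost model; a hedged sketch in which the breakdown into terms follows the slide and the function and parameter names are ours:

    def total_training_cost(per_course, per_class, per_learner, classes, learners):
        # Per-course costs (development) are paid once; per-class costs scale
        # with the number of classes run; per-learner costs scale with enrollment.
        return per_course + per_class * classes + per_learner * learners

E-learning typically shifts weight toward the per-course (development) term and away from the per-class and per-learner terms.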

110
What is ROI Good For?
  • Prioritizing Investment
  • Ensuring Adequate Financial Support for Online
    Learning Project
  • Comparing Vendors

111
The Changing Face of ROI
  • "Return-on-investment isn't what it used to be.
    The R is no longer the famous bottom line and the
    I is more likely a subscription fee than a
    one-time payment." (Cross, 2001)

112
More Calculations
  • Total Admin Costs of Former Program - Total
    Admin Costs of OL Program = Projected Net Savings
  • Total Cost of Training / # of Students = Cost Per
    Student (CPS)
  • (Total Benefits x 100) / Total Program Cost = ROI
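The same three calculations as small functions; a sketch in which the names are ours and the formulas are the slide's:

    def projected_net_savings(former_admin_costs, ol_admin_costs):
        # Total admin costs of the former program minus those of the OL program.
        return former_admin_costs - ol_admin_costs

    def cost_per_student(total_training_cost, num_students):
        # Total cost of training divided by number of students (CPS).
        return total_training_cost / num_students

    def roi_percent_from_benefits(total_benefits, total_program_cost):
        # (Total Benefits x 100) / Total Program Cost = ROI as a percentage.
        return total_benefits * 100 / total_program_cost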

113
Pause: How are costs calculated in online
programs?
114
ROI Calculators
  1. Mediapro (www.mediapro.com/roi)
  2. Mentergy (www.mentergy.com/roi)
  3. BNH Expert Software www.bnhexpertsoft.com (free
    trial version available)

115
(No Transcript)
116
ROI Calculators
117
Success Story 1 (Sitze, March 2002, Online
Learning): EDS and GlobalEnglish
  • Charge: Reduce money spent on English training
  • Goal: 80% online in 3 months
  • Result: 12% use in 12 months
  • Prior Costs: $1,500-$5,000/student
  • New Cost: $150-$300/user
  • Notes: Email to participants was helpful in
    expanding use; rolling out additional languages.

118
Success Story 2 (Overby, Feb 2002, CIO): Dow
Chemical and Offensive Email
  • Charge: Train 40,000 employees across 70
    countries; 6 hours of training on workplace
    respect and responsibility.
  • Specific Results: 40,000 passed
  • Savings: Saved $2.7 million ($162,000 on record
    keeping, $300,000 on classrooms and trainers,
    $1,000,000 on handouts, $1,200,000 in salary
    savings due to less training time).

119
Success Story 3 (Overby, Feb 2002, CIO): Dow
Chemical and Safety/Health
  • Charge: Train 27,000 employees on environmental
    health and safety work processes.
  • Results: Saved $6 million; safety incidents have
    declined while the number of Dow employees has
    grown.

120
Success Story 4 (Overby, Feb 2002, CIO): Dow
Chemical and e-learning system
  • Charge: $1.3 million e-learning system
  • Savings: $30 million in savings ($850,000 in
    manual record-keeping, $3.1 million in training
    delivery costs, $5.2 million in reduced classroom
    materials, $20.8 million in salaries since the
    Web required 40-60% less training time).

121
Success Story 5 (Ziegler, e-learning, April
2002): British Telecom sales training
  • Charge: Train 17,000 sales professionals to sell
    Internet services using an Internet simulation.
  • Result: Customer service rep training reduced
    from 15 days to 1 day; sales training reduced
    from 40 days to 9 days.
  • Savings: Millions of dollars saved; sales
    conversion went up 102 percent; customer
    satisfaction up 16 points.

122
And Blended Learning Results???
123
Blended Learning Advantages
  1. Course access at one's convenience and flexible
    completion
  2. Reduction in physical class time
  3. Promotes independent learning
  4. Multiple ways to accomplish course objectives
  5. Increased opportunities for human interaction,
    communication, contact among students
  6. Less time commuting and parking
  7. Introverts participate more

124
Blended Learning Disadvantages
  1. Procrastination, procrastination, procrastination
  2. Students have trouble managing time
  3. Problems with technology at the beginning (try
    too much)
  4. Can be overwhelming or too novel
  5. Poor integration or planning
  6. Resistance to change
  7. Good ideas but lack of time, money, support

125
Success Story 6. Infusing E-Learning (Elliott
Masie, March 2002, e-learning Magazine)
  • A manufacturing company transformed a week-long
    safety program into a three-part offering:
  • 1. One day in classroom.
  • 2. Multiple online simulations and lessons.
  • 3. One final day of discussions and exams.
  • Learners must complete the online work before
    phase 3; this raised success rates and transfer
    of skills, and lowered hours away from the job.

126
Success Story 7. Raytheon, Build Own LMS (John
Hartnett, Online Learning, Summer 2002)
  • SAP Training Choice: Vendor ($390,000) or Build
    Internally ($136,000) or Instructor-led
    Training ($388,000).
  • Note: Saved $252,000
  • Five Training Components in 18 Weeks (within 6
    weeks, 4,000 courses were taken by 1,400
    students)
  • Role-based simulations
  • Audio walk-throughs
  • Online quick reference system
  • Live training support (special learning labs)
  • Online enrollment and tracking

127
Success Story 8: IBM (Special E-Learning Issue,
April 2001)
  • 33,000 IBM managers have taken online courseware.
  • 5 times as much content at one-third the cost.
  • IBM reported $200 million in savings in one year.
  • Avoided $80 million in travel and housing
    expenses during 1999 by deploying online
    learning.

128
IBM Training of 6,600 New First-Line Managers
(Basic Blue)
  • Phase I: 26 Weeks of Self-paced Online Learning
  • Cohorts of 24 managers
  • Lotus LearningSpace Forum
  • 2 hours/week; 5 units/week
  • 18 mandatory and elective management topics
  • Need minimum score on mandatory topics
  • 14 real-life interactive simulations
  • LearningSpace tutor guides behavior
  • Karen Mantyla (2001), ASTD.

129
IBM Training of 6,600 New First-Line Managers
(Basic Blue)
  • Phase II: In-class 5-day learning lab
  • Experiential higher-order learning
  • Bring real-life activities from the job
  • Focus on self-knowledge and on understanding
    their roles as leaders and members of IBM
  • Harvard Business cases, leadership competency
    surveys, managerial style questionnaires, brain
    dominance inventories
  • Coached by a learner-colleague (teaming important!)
  • Less than 1 hour of the 5 days is lecture

130
IBM Training of 6,600 New First-Line Managers
(Basic Blue)
  • Phase III: 25 Weeks of Online Learning
  • Similar to Phase I but more complex and focused
    on application
  • Creates individual development plan and
    organizational action plan
  • Manager reviews and signs off on these plans

131
IBM Training Results (Kirkpatrick Model)
  • Level 1
  • High satisfaction and enthusiasm for blended
  • Coaching and climate rated highest
  • Level 2
  • 96% displayed mastery in all 15 subject areas; 5
    times as much content covered in this program
    compared to 5 days of live training
  • 150 Web page requests/learner

132
IBM Training Results (Kirkpatrick Model)
  • Level 3
  • Significant behavior change (in particular in
    coaching, styles, competencies, and climate)
  • Graduates had high self-efficacy and believed
    that they could make a difference
  • Level 4
  • Linkage between leadership & customer satisfaction
  • Leadership led to teamwork and satisfaction
  • Managers reported improvement on job
  • Improved morale and productivity reported

133
IBM Training Results (Kirkpatrick Model)
  • Level 5
  • Asked graduates to estimate the impact on their
    departments in dollars
  • $415,000, or an ROI of 47 to 1.
  • Perceived real and lasting leadership increases

134
Blended Learning Advantages for IBM
  1. Greater consistency of language, knowledge, and
    corporate culture across the globe
  2. Blended approach to training now replicated in
    other units
  3. Market its e-learning design
  4. Cross-functional understanding & teamwork
  5. No-risk trials and simplicity help

135
Success Story 9. Three Phases of AC3-DL
  1. Asynchronous Phase: 240 hours of instruction or 1
    year to complete; must score 70% or better on
    each gate exam
  2. Synchronous Phase: 60 hours of asynchronous and
    120 hours of synchronous
  3. Residential Phase: 120 hours of training in 2
    weeks at Fort Knox

136
AC3-DL Course Tools
  • Asynchronous
  • Learning Management System
  • E-mail
  • Synchronous: Virtual Tactical Operations Center
    (VTOC) (7 rooms; 15 people/extension)
  • Avatar
  • Audio conference by extension/room (voice over
    IP)
  • Text Chat Windows: global and private
  • Special tools for collaboration

137
(No Transcript)
138
(No Transcript)
139
Success 10: Microsoft Excel Training (Jeff
Barbian, Blended Works, Summer 2002, Online
Learning)
  • Group One: 5 scenario-based exercises that
    offered live use of Excel on real-world tasks,
    online mentors, FAQs, relevant Web sites, NETg
    Excel Fundamentals Learning Objects.
  • Group Two: Same as Group One but without
    scenarios; the info in the 5 scenarios was
    embedded in the learning objects.
  • Group Three: No-training control.

140
Success 10: Microsoft Excel Training (Thompson
Learning Company Study; Jeff Barbian, Blended
Works, Summer 2002, Online Learning)
  • Group One (the blended group): 30 percent
    increase in accuracy over Group Two (the
    e-learning group) and 41 percent faster
  • Group Two performed 159% more accurately than
    Group Three
  • Groups 1 and 2 relied on the online mentors for
    support
  • (Note: with these results, Lockheed Martin became
    a blended learning convert.)

141
Success 11: NCR Blended Approaches (Thompson
Learning Company Study; Jeff Barbian, Blended
Works, Summer 2002, Online Learning)
  • Design of E-Learning (various methods: Web
    articles; synchronous points for team exercises)
  • Field Guide Binders (Web site guidance, live
    feedback on case studies, live kick-off that
    promotes collaboration, hands-on role play)
  • Over 71 percent of learners were responding to
    customers more effectively (Kirkpatrick Level 3)

142
Success 12: Convergys Blended Learning (Jeff
Barbian, Blended Works, Summer 2002, Online Learning)
  • Leadership development, succession planning,
    performance management, etc.
  • LMS from Knowledge Planet, 3 e-learning
    libraries, virtual classroom tools to 50
    locations in North America & Europe
  • New managers received: readings, job aids,
    meeting checklists, 5 off-the-shelf courses from
    SkillSoft, virtual classes via LearnLinc (new
    recruits talk to experienced managers), and a
    4-day instructor-led seminar at HQ.

143
Success 13: Sallie Mae/USA Group (blended
student loan provider program) (Jeff Barbian,
Blended Works, Summer 2002, Online Learning)
  • LEAD (Leadership and Education Development):
    Grooms internal staff to fill supervisory-level
    positions
  • 4 hours/week in class with internal and external
    instructors; learn trust, role of managers, etc.
  • First must complete 3 online management courses
    from SkillSoft and 6 online project management
    courses (includes panel presentation by IT
    Project Team to illustrate how projects are
    handled in the company's culture)
  • Findings: increased teamwork, camaraderie, shared
    understanding of concepts, respect for individual
    differences, social interaction, and
    reinforcement of class concepts.

144
Success 14: Procter & Gamble (Jeff Barbian,
Blended Works, Summer 2002, Online Learning)
  • 1999: 100,000 employees; 20,000 trained/year
  • LMS from Saba, live training from Centra
  • CD-based training using Authorware,
    CourseBuilder, Dreamweaver
  • 2002: 1,200 learning items (34% Web, 54% CD)
  • Global English saved $2.5 million per year
  • Off-the-shelf courses in time management and
    managing for success

145
Procter & Gamble (Jeff Barbian, Blended Works,
Summer 2002, Online Learning)
  • "Given our learning objectives and needs, should
    we select Web-based live training, versus
    classroom, versus video-based, versus CBT, or
    some blended solution? It depends on the
    resources you have, how far geographically you
    have to reach, or whether you can get your arm
    around them and pull them into a classroom." Art
    DiMartile, Senior IT Manager, Procter & Gamble

146
The Worldwide Expansion of E-Learning!!!
  • Success 15: Circuit City is training 50,000
    employees from 600 stores using customized
    courses that are short, fun, flexible,
    interactive, and instantly applicable on the job.
  • Success 16: The Army's virtual university
    offered online college courses to more than
    12,000 students located anywhere in the world in
    2001, the first year of a $42 million
    e-learning program.
  • Dr. Sylvia Charp, Editor-in-Chief, T.H.E.
    Journal, March 2002.

147
Success 17: Community Health Network of Indiana,
www.ehealthindiana.com (July 15, 2002, American
Hospital Association)
  • Named one of the most wired hospitals and the most
    improved hospital system nationwide in the use of
    technology in health care
  • Virtual nurse recruitment Web site (live chats
    with recruiters)
  • Video streams of nursing leaders
  • Virtual tours of individual nursing units
  • Online application and interactive job-posting
    databases
  • Web portal for physicians
  • First in the nation to offer a live Web cast of an
    in vitro fertilization procedure
  • Real-time clinical data repository

148
(No Transcript)
149
Success 18: Cisco and DigitalThink Course
(employees)
  • Sales training self-assessment
  • Ask via survey to estimate how much time training
    saved them on the job
  • Ask whether it improved performance
  • Select a percentage for each
  • ROI of 900%: for every $1 spent on training,
    Cisco sees a $9 gain in productivity

150
Success 18: Cisco and DigitalThink Course (Cisco
vendors)
  • Most saw significant growth in productivity
  • 74% reported improvement in ability to sell or
    service clients
  • Customer satisfaction jumped 50%

151
And What about Higher Ed???
152
Success 19: Higher Education: Student survey
results after a hybrid course
  • Student feedback (N = 282)
  • 69% felt they could control the pace of their own
    learning
  • 77% felt they could organize their time better
  • 16% felt the time spent online would have been
    better spent in class
  • 61% felt there should be more courses like this
  • www.uwsa.edu/ttt/articles/garnham.htm

153
At the End of the Day...
  • Are all training results quantifiable?
  • NO! Putting a price tag on some costs and
    benefits can be very difficult
  • NO! Some data may not have much meaning at face
    value
  • What if more courses are offered and annual
    student training hours drop simultaneously? Is
    this bad?

154
Evaluation Cases (homework)
  1. General Electric Case
  2. Financial Services Company
  3. Circuit Board Manufacturing Plant Safety
  4. Computer Company Sales Force
  5. National HMO Call Center

155
Part V
  • Collecting Evaluation Data: Online Evaluation
    Tools

156
Collecting Evaluation Data
  • Learner Reaction
  • Learner Achievement
  • Learner Job Performance
  • Manager Reaction
  • Productivity Benchmarks

157
Forms of Evaluation
  • Interviews and Focus Groups
  • Self-Analysis
  • Supervisor Ratings
  • Surveys and Questionnaires
  • ROI
  • Document Analysis
  • Data Mining (changes in pre- and post-training
    data, e.g., sales, productivity)

158
How to Collect Data?
  • Direct Observation in Work Setting
  • By supervisor, co-workers, subordinates, clients
  • Collect Data By Surveys, Interviews, Focus Groups
  • Supervisors, Co-workers, Subordinates, Clients
  • Self-Report by learners or teams
  • Email and Chat

159
Learner Data
  • Online surveys are the most effective way to
    collect online learner reactions
  • Learner performance data can be collected via
    online tests
  • Pre and post-tests can be used to measure
    learning gains
  • Learner post-course performance data can be used
    for Level 3 evaluation
  • May look at on-the-job performance
  • May require data collection from managers

160
Multiple Assessment Example Naval Training
Follow-Up Evaluation
  • A naval training unit uses an online
    survey/database system to track performance of
    recently trained physiologists
  • Learners self-report performance
  • Managers report on learner performance
  • Unit heads report on overall productivity

161
Learning System Data
  • Many statistics are available, but which are
    useful?
  • Number of course accesses
  • Log-in times/days
  • Time spent accessing course components
  • Frequency of access for particular components
  • Quizzes completed and quiz scores
  • Learner contributions to discussion (if
    applicable)
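Most of these statistics are simple tallies over the LMS access log. A minimal sketch, assuming a hypothetical log format; real field names and export formats depend on the LMS in use:

    from collections import Counter

    # Hypothetical access-log records exported from an LMS.
    log = [
        {"learner": "a01", "component": "module-1", "minutes": 35},
        {"learner": "a01", "component": "quiz-1",   "minutes": 12},
        {"learner": "b02", "component": "module-1", "minutes": 20},
    ]

    # Number of course accesses per learner.
    accesses = Counter(rec["learner"] for rec in log)

    # Time spent and access frequency per course component.
    minutes_per_component = Counter()
    hits_per_component = Counter()
    for rec in log:
        minutes_per_component[rec["component"]] += rec["minutes"]
        hits_per_component[rec["component"]] += 1

    print(accesses, minutes_per_component, hits_per_component)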

162
Computer Log Data (Chen, G. D., Liu, C. C., & Liu,
B. J. (2000). Discovering decision knowledge from
Web log portfolio for managing classroom
processes by applying decision tree and data cube
technology. Journal of Educational Computing
Research, 23(3), 305-332.)
  • In a corporate training situation, computer log
    data can correlate online course completions with
  • actual job performance improvements such as
  • fewer violations of safety regulations,
  • reduced product defects,
  • increased sales, and
  • timely call responses.
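A hedged sketch of the kind of correlation described here, pairing per-employee completion counts with one job-performance metric; the numbers are invented for illustration, and statistics.correlation requires Python 3.10 or later:

    from statistics import correlation  # Python 3.10+

    # Hypothetical per-employee data: online modules completed vs. product defects.
    modules_completed = [2, 5, 8, 3, 10, 7]
    product_defects   = [9, 6, 3, 8, 1, 4]

    # A strongly negative coefficient would suggest that more completed
    # training goes with fewer defects (an association, not proof of cause).
    print(correlation(modules_completed, product_defects))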

163
Learner System Data
  • If learners are being evaluated based on number
    and length of accesses, it is only fair that they
    be told
  • Much time can be wasted analyzing statistics that
    don't tell much about the actual impact of the
    training
  • Bottom line: Easy data to collect, but not always
    useful for evaluation purposes
  • Still useful for management purposes

164
Benchmark Data
  • Companies need to develop benchmarks for
    measuring performance improvement
  • Managers typically know the job areas that need
    performance improvement
  • Both pre-training and post-training data need to
    be collected and compared

165
Online Survey Tools for Assessment
166
Web-Based Survey Advantages
  • Faster collection of data
  • Standardized collection format
  • Computer controlled branching and skip sections
  • Easy to answer by clicking
  • Wider distribution of respondents

167
Sample Survey Tools
  • Zoomerang (http://www.zoomerang.com)
  • IOTA Solutions (http://www.iotasolutions.com)
  • QuestionMark (http://www.questionmark.com/home.html)
  • SurveyShare (http://SurveyShare.com, from
    Courseshare.com)
  • Survey Solutions from Perseus
    (http://www.perseusdevelopment.com/fromsurv.htm)
  • Infopoll (http://www.infopoll.com)

168
(No Transcript)
169
(No Transcript)
170
(No Transcript)
171
Online Testing Tools (see
http://www.indiana.edu/~best/)
172
(No Transcript)
173
Test Selection Criteria (Hezel, 1999; Perry &
Colon, 2001)
  • Easy to Configure Items and Test
  • Handle Symbols, Timed Tests
  • Scheduling of Feedback (immediate?)
  • Flexible Scoring and Reporting
  • (first, last, average, by individual or group)
  • Easy to Pick Items for Randomizing
  • Randomize Answers Within a Question
  • Weighting of Answer Options
  • Web Resource: http://www.indiana.edu/~best/
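Two of these criteria, randomized item selection and randomized answer order within a question, amount to a few lines on the test-delivery side; a minimal sketch with a hypothetical item-bank format:

    import random

    # Hypothetical item bank; each item carries its answer options and a weight.
    bank = [
        {"stem": "Q1", "options": ["A", "B", "C", "D"], "weight": 1.0},
        {"stem": "Q2", "options": ["A", "B", "C", "D"], "weight": 2.0},
        {"stem": "Q3", "options": ["A", "B", "C", "D"], "weight": 1.0},
    ]

    def build_test(bank, n_items, seed=None):
        rng = random.Random(seed)
        picked = rng.sample(bank, n_items)       # randomize which items appear
        test = []
        for item in picked:
            options = item["options"][:]
            rng.shuffle(options)                 # randomize answer order within a question
            test.append({**item, "options": options})
        return test

    print(build_test(bank, 2, seed=42))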

174
Tips on Authentication
  • Check e-mail access against list
  • Use password access
  • Provide keycode, PIN, or ID
  • (Futuristic/Other: palm print, fingerprint, voice
    recognition, iris scanning, facial scanning,
    handwriting recognition, picture ID)

175
Ziegler, April 2002, e-Learning
  • "... the key is not to measure every possible
    angle, but rather to focus on metrics that are
    pragmatic and relevant to both human and business
    performance at the same time."

176
E-Learning Evaluation Measures
  • So which of the 16 methods would you use???
  • Something ridiculous???

177
Some Final Advice
Or Maybe Some Questions???