Title: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002)
1. Session P16: Evaluating Online Learning: Frameworks and Perspectives (Workshop, Sunday Feb 17th, Training 2002)
- Dr. Curtis J. Bonk
- President, CourseShare.com
- Associate Professor, Indiana University
- http://php.indiana.edu/cjbonk
- cjbonk@indiana.edu
- Dr. Vanessa Paz Dennen
- Assistant Professor, San Diego State University
- vdennen@mail.sdsu.edu
- http://edweb.sdsu.edu/people/vdennen
2. Workshop Overview
- Part I. The State of Online Learning
- Part II. Evaluation Purposes, Approaches, and Frameworks
- Part III. Applying Kirkpatrick's 4 Levels
- Part IV. ROI and Online Learning
- Part V. Collecting Evaluation Data and Online Evaluation Tools
- (Time: 8:30-11:30 and 12:30-3:30)
3. Part I. The State of Online Learning
- Survey of Corporate Settings
- What's Going On?
- And How Are We Evaluating It?
4. Free Corporate Reports
- Corporate E-Learning: Exploring a New Frontier, Hambrecht and Co. (2000, March), http://www.wrhambrecht.com/research/coverage/elearning/ir/ir_explore.pdf (95 pages)
- Training Magazine Special Issue, September 2000, 37(9), The State of Online Learning
- Fortune Special Issue, 142(13), Nov. 27, 2000, Special Insert: E-learning strategies for executive education and corporate training. http://www.fortuneelearning.com/topics/
5. Survey of 201 Trainers, Instructors, Managers, Instructional Designers, CEOs, CLOs, etc.
6. Among the Key Goals
- To identify the resources, tools, and activities desired in e-learning.
- To document gaps between tools and resources deemed useful and actual use.
- To survey commitment to e-learning.
- To document practices related to e-learning training and support.
- To document pedagogical practices and motivational techniques supported in e-learning.
7. Survey Limitations
- Sample pool: PostDirect
- The Web is changing rapidly
- Lengthy survey, low response rate
- No password or keycode
- Many backgrounds; hard to generalize
- Does not address all issues (e.g., ROI calculations, how trained and supported, specific assessments)
12. Primary Job Function
- 84% Training (e.g., trainers, training managers, training directors, or training evaluators)
- 30% Instructors or Trainers
- 27% Training Managers
- 20% Training Evaluators
- 14% Training Directors
- 45% Instructional Designers and Program Developers
- 5% Human Resources, 5% Performance Managers, and 4% CLOs
13. Categorized Job Titles
- 26% Trainers, Educators, or Instructors
- 20% Managers (e.g., Training, IT Programs, Instructional Designers, or Quality Assurance)
- 19% Directors (Director of Corporate Education, E-Learning, Professional Development, etc.)
- 13% Instructional Designers or Technologists
- 13% High-Ranking Administrators (CEO, President, CLO, CTO)
- 9% Consultants
14. Professional Reading Interests
- 80% read magazines or journals related to e-learning.
- Nearly 100% read training-related publications.
19. Why Interested in E-Learning?
- Mainly cost savings
- Reduced travel time
- Greater flexibility in delivery
- Timeliness of training
- Better allocation of resources, speed of delivery, convenience, course customization, lifelong learning options, personal growth, greater distribution of materials
20. Why Interested in E-Learning?
- "Exploit the technology to deliver our intellectual capital."
- "Reduce time to learn, reduce time to productivity."
- "Cost reduction (write once, publish on different platforms)."
- "Invest less in expensive trips to train for 3 days without apparent results."
22. Blended Approach Is Most Common (Ganzel, May 2001, Online Learning Magazine)
23. Corporate Web Integration Continuum
- Level 1: Blended course, self-paced
- Level 2: Entire course online, self-paced
- Level 3: Tutored or mentored course
- Level 4: Blended course, instructor-led
- Level 5: Entire course online, synchronous
- Level 6: Entire course online, asynchronous
- Level 7: Entire course online, synchronous and asynchronous
- Level 8: Certificate program online
- Level 9: Degree online
- Level 10: Corporate university online
26. Current Courseware System: Negatives
- "Slow development time."
- "Not interactive."
- "Low interactivity, boring."
- "Lack of bookmarking, tracking, evaluation, etc."
- "Don't support the instructional design process; are course management systems."
- "XYZ presents obstacles in moving course content from one server to another."
27. Current Courseware System: Negative and Positive
- "Does provide a number of excellent features, yet development time is very clumsy; it is not very intuitive."
- "XYZ is powerful and intuitive. It is not always reliable."
- "Fairly reliable, but not always. At times have had to stop training and go back to the beginning to start again as it seizes up."
- "From a cost posture, they are, quite simply, unbeatable. Limitations: Can't save whiteboard presentations developed in virtual classroom."
28. Current Courseware System: Positives
- "It is comprehensive, scalable, and intuitive."
- "Seems to be flexible."
- "XYZ is simple to use and clean in design."
- "Can modify to suit individual course needs."
- "It's reasonably inexpensive; there is a Web-based template to design customized courses, easily added to existing courseware."
29. Delivery System
- 17% developed own systems or tools
- 15% did not know what system they were using
- 30% used Internet application tools (e.g., Designer's Edge, Dreamweaver, Authorware)
- 35% used presentation tools (e.g., Astound, WebEx)
- Many used existing courseware systems and tools (e.g., WebBoard, LearningSpace)
30. What Vendors to Select and Why?
- Standardization vs. Innovation
- Standard Tool Advantages:
- Training easier, jump-started, common framework, fixed costs
- Disadvantages:
- Tools do not fit all needs, need technical training, lose control
31. Web-Based Content
- Capella
- Click 2 Learn
- Colleges/Universities
- Digital Think
- Docent, Inc.
- Eduprise
- Element K
- eMind.com
- eSocrates
- ExecuTrain
- Freeskills.com
- Headlight.com
- Jones International University
- KnowledgeNet
- Knowledge Planet
- Mentergy (includes LearnLinc products)
- Microsoft Training and Service
- Netg
- Prime Learning
- Saba
- Smart Force
- ThinQ (i.e., Trainingnet)
- TrainSeek
- Vcampus
- Viviance New Education
- Walden Univ./Institute
34. Why Evaluate?
- Cost-savings
- Becoming a less important reason to evaluate as more people recognize that the initial expense is balanced by long-term financial benefits
- Performance improvement
- A clear place to see the impact of online learning
- Competency advancement
35. Pause: How are costs calculated in online programs?
36. The Cost of E-learning
- Brandon-hall.com estimates that an LMS system for 8,000 learners costs $550,000
- This price doesn't include the cost of buying or developing content
- Bottom line: getting started in e-learning isn't cheap
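- (A rough per-learner calculation, not from the slide: $550,000 / 8,000 learners ≈ $69 per learner for the license alone, before any content costs.)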
37. Evaluation Process
- Can be likened to the ADDIE instructional design model
- ANALYSIS is needed to determine the purpose of the evaluation
- A DESIGN is needed to guide the process
- Instruments must be DEVELOPED
- Without IMPLEMENTATION you have no data
- In the end, the data are analyzed and EVALUATED
38. A Few Assessment Comments
39. Level 1 Comments: Reactions
- "We assess our courses based on participation levels and online surveys after course completion. All of our courses are asynchronous."
- "I conduct a post-course survey of course material, delivery methods and mode, and instructor effectiveness. I look for suggestions and modify each course based on the results of the survey."
- "We use the Halo Survey process of asking them when the course is concluding."
40. Level 2 Comments: Learning
- "We use online testing and simulation frequently for testing student knowledge."
- "Do multiple choice exams after each section of the course."
- "We use online exams and use Level 2 evaluation forms."
41. Level 3 Comment: Job Performance
- "I feel strongly there is a need to measure the success of any training in terms of the implementation of the new behaviors on the job. Having said that, I find there is very limited interest by our clients in spending the dollars required."
42. More Assessment Comments: Multiple-Level Evaluation
- "Using Level One Evaluations for each session followed by a summary evaluation. Thirty days post-training, conversations occur with learners' managers to assess Level 2 (actually Level 3)."
- "We do Level 1 measurements to gauge student reactions to online training using an online evaluation form. We do Level 2 measurements to determine whether or not learning has occurred."
- "Currently, we are using online teaching and following up with manager assessments that the instructional material is being put to use on the job."
43. Who is Evaluating Online Learning?
- 59% of respondents said they did not have a formal evaluation program
- At Reaction level: 79%
- At Learning level: 61%
- At Behavior/Job Performance level: 47%
- At Results or Return on Investment: 30%
45. Assessment Lacking or Too Early
- "We are just beginning to use Web-based technology for education of both associates and customers, and do not have the metric to measure our success. However, we are putting together a focus group to determine what to measure (and) how."
- "We have no online evaluation for students at this time."
- "We lack useful tools in this area."
46. Limitations with Current System
- "I feel strongly there is a need to measure the success of any training in terms of the implementation of the new behaviors on the job. Having said that, I find there is very limited interest by our clients in spending the dollars required."
- "We are looking for better ways to track learner progress, learner satisfaction, and retention of material."
- "Have had fairly poor ratings on reliability, customer support, and interactivity."
47. Pause: How and What Do You Evaluate?
48. What else did the corporate training survey show?
57. Sample Reasons for Obstacles
- "Skepticism on the benefits within the healthcare environment."
- "Ignorance about the advantages of using the Internet to save money."
- "Generation gap and bias against anything not face to face."
- "Poor support from IT managers to support organizational goals."
- "Lack of foresight in the industry/no ability to see the big pic!"
59. Just Why is Bandwidth So Darn Important???
61. Obstacles: Technology Comments
- "Lack of hardware to efficiently use Web-based technology."
- "Systems infrastructure."
- "Huge diversity in hardware."
- "Reliable Web access of our training audiences."
- "Caught up in the tech, not the instruction!"
64. Obstacles: Problems in Delivery Methods
- "Students need hands-on."
- "High rate of change in IT materials; never mature."
- "Effectiveness of this method."
- "Some courses are better delivered in traditional classrooms."
76. Issues Raised in Survey
- Increases in Web instruction anticipated
- Better tools needed
- Perceived high cost
- Need clearer vision and management support
- Lots of money being spent
- Low course completion rates
- Limited organizational support
77. So, any questions about the state of things?
78. What do we need???
- Part II
- Evaluation Purposes, Approaches, and Frameworks
79. One Area in Need of Frameworks is Evaluation of Online Learning
80. What is Evaluation???
- Simply put, an evaluation is concerned with "judging the worth of a program and is essentially conducted to aid in the making of decisions by stakeholders" (e.g., does it work as effectively as the standard instructional approach?).
- (Champagne & Wisher, in press)
81. But who are the evaluators?
- The level of evaluation will depend on articulation of the stakeholders. Stakeholders of evaluation in corporate settings may range from???
82. What is assessment?
- Assessment refers to "efforts to obtain info about how and what students are learning in order to improve teaching efforts and/or to demonstrate to others the degree to which students have accomplished the learning goals for a course" (Millar, 2001, p. 11).
- It is a way of using info obtained through various types of measurement to determine a learner's performance or skill on some task or situation (Rosenkrans, 2000).
83. Why Evaluate?
84. Evaluation Purposes
- Assessing learner progress
- What did they learn?
- Assessing learning impact
- How well do learners use what they learned?
- How much do learners use what they learn?
85. Evaluation Purposes
- Efficiency
- Was online learning more effective than another medium?
- Was online learning more cost-effective than another medium? What was the return on investment (ROI)?
- Improvement
- How do we do this better?
86. Evaluation Purposes
- An evaluation plan can evaluate the delivery of e-learning, identify ways to improve the online delivery of it, and justify the investment in the online training package, program, or initiative (Champagne & Wisher, in press).
87. Evaluation Purposes
- Evaluation can help quantify the return on investment, allowing one to compare the costs of acquiring, developing, and implementing e-learning to actual savings, revenue impact, and other competitive advantages that are translatable into monetary values.
88. Contextual Factors
- Learner progress, impact of training, and efficiency all may be affected by other contextual factors
- Contextual factors unique to online learning:
- Technology breakdowns
- Inadequate computer systems (learners can't access multimedia components and don't know that they're missing anything)
89. Evaluation Plans
- Does your company have a training evaluation plan?
90. Formal Evaluation Programs
- Most training evaluation data are not used for evaluation or performance improvement purposes.
- Why? There is no plan for using the data, and no one has the time.
- Why does it matter in online learning? Need to be sure that the development expense is justified.
91. Steps to Developing an OL Evaluation Program
- Select a purpose and framework
- Develop benchmarks
- Develop online survey instruments:
- For learner reactions
- For learner post-training performance
- For manager post-training reactions
- Develop a data analysis and management plan
92. What Are Your Evaluation Questions?
- What does your employer want to know about online learning's impact?
- How interested is your employer in evaluation results?
93. Formative Evaluation
- Formative evaluations focus on improving the online learning experience.
- A formative focus will try to find out what worked or did not work.
- Formative evaluation is particularly useful for examining instructional design and instructor performance.
94. Formative Questions
- How can we improve our OL program?
- How can we make our OL program more efficient?
- More effective?
- More accessible?
95. Summative Evaluation
- Summative evaluations focus on the overall success of the OL experience (should it be continued?).
- A summative focus will look at whether or not objectives are met, the training is cost-effective, etc.
96. What Can OL Evaluation Measure?
- Categories of Evaluation Info (Woodley and Kirkwood, 1986):
- Measures of activity
- Measures of efficiency
- Measures of outcomes
- Measures of program aims
- Measures of policy
- Measures of organizations
97. Typical Evaluation Frameworks for OL
- Commonly used frameworks include:
- CIPP Model
- Objectives-oriented
- Marshall & Shriver's 5 levels
- Kirkpatrick's 4 levels
- Plus a 5th level
- AEIOU
- Consumer-oriented
98. CIPP Model Evaluation
- CIPP is a management-oriented model
- C = context
- I = input
- P = process
- P = product
- Examines the OL within its larger system/context
99. CIPP: OL Context
- Context: Addresses the environment in which OL takes place.
- How does the real environment compare to the ideal?
- Uncovers systemic problems that may dampen OL success.
100. CIPP: OL Input
- Input: Examines what resources are put into OL.
- Is the content right?
- Have we used the right combination of media?
- Uncovers instructional design issues.
101. CIPP: OL Process
- Process: Examines how well the implementation works.
- Did the course run smoothly?
- Were there technology problems?
- Were the facilitation and participation as planned?
- Uncovers implementation issues.
102. CIPP: OL Product
- Product: Addresses outcomes of the learning.
- Did the learners learn? How do you know?
- Does the online training have an effect on workflow or productivity?
- Uncovers systemic problems.
103. Objectives-Oriented Evaluation
- Examines OL training objectives as compared to training results
- Helps determine if objectives are being met
- Helps determine if objectives, as formally stated, are appropriate
- Objectives can be used as a comparative benchmark between online and other training methods
104. Evaluating Objectives and OL
- An objectives-oriented approach can examine two levels of objectives:
- Instructional objectives for learners (did the learners learn?)
- Systemic objectives for training (did the training solve the problem?)
105. Objectives and OL
- Requires:
- A clear sense of what the objectives are (always a good idea anyway)
- The ability to measure whether or not objectives are met
- Some objectives may be implicit and hard to state
- Some objectives are not easy to measure
106. Marshall & Shriver's 5 Levels of Evaluation
- Performance-based evaluation framework
- Each level examines a different area of performance
- Requires demonstration of learning
107. Marshall & Shriver's 5 Levels
- Level I: Self (instructor)
- Level II: Course Materials
- Level III: Course Curriculum
- Level IV: Course Modules
- Level V: Learning Transfer
108. Kirkpatrick's 4 Levels
- A common training framework.
- Examines training on 4 levels.
- Not all 4 levels have to be included in a given evaluation.
109. The 4 Levels
- Reaction
- Learning
- Behavior
- Results
110. A 5th Level
- Return on Investment is a 5th level
- It is related to results, but is more clearly stated as a financial calculation
- How to calculate ROI is the big issue here
111. Is ROI the answer?
- Elise Olding of CLK Strategies suggests that we shift from looking at ROI to looking at time to competency.
- ROI may be easier to calculate since concrete dollars are involved, but time to competency may be more meaningful in terms of actual impact.
112. Example: Call Center Training
- Traditional call center training can take 3 months to complete
- Call center employees typically quit within one year
- When OL was implemented, the time to train (time to competency) was reduced
- Benchmarks for success: time per call and number of transfers
113. Example: Circuit City
- Circuit City provided online product/sales training
- What is more useful to know:
- The overall ROI or break-even point?
- How much employees liked the training?
- How many employees completed the training?
- That employees who completed 80% of the training saw an average increase of 10% in sales?
114. A 6th Level? Clark Aldrich (2002)
- Adds a Level 6, which relates to the budget and stability of the e-learning team.
- Just how respected and successful is the e-learning team?
- Have they won approval from senior management for their initiatives?
- Aldrich, C. (2002). Measuring success: In a post-Maslow/Kirkpatrick world, which metrics matter? Online Learning, 6(2), 30-32.
115. And Even a 7th Level? Clark Aldrich (2002)
- Level 7 asks whether the e-learning sponsor(s) or champion(s) are promoted in the organization.
- While both of these additional levels address the people involved in the e-learning initiative or plan, such recognitions will likely hinge on the results of evaluation at the other five levels.
116. ROI Alternative: Cost/Benefit Analysis (CBA)
- ROI may be ill-advised since not all impacts hit the bottom line, and those that do take time.
- Shifts the attention to more long-term results and quantifying impacts with numeric values, such as:
- increased revenue streams,
- increased employee retention, or
- reduction in calls to a support center.
- Reddy, A. (2002, January). E-learning ROI calculations: Is a cost/benefit analysis a better approach? e-learning, 3(1), 30-32.
117. Cost/Benefit Analysis (CBA)
- Attends to both qualitative and quantitative measures:
- job satisfaction ratings,
- new uses of technology,
- reduction in processing errors,
- quicker reactions to customer requests,
- reduction in customer call rerouting,
- increased customer satisfaction,
- enhanced employee perceptions of training,
- global post-test availability.
- Reddy, A. (2002, January). E-learning ROI calculations: Is a cost/benefit analysis a better approach? e-learning, 3(1), 30-32.
118. Cost/Benefit Analysis (CBA)
- In effect, CBA asks how the sum of the benefits compares to the sum of the costs.
- Yet it often leads to or supports ROI and other more quantitatively-oriented calculations.
- Reddy, A. (2002, January). E-learning ROI calculations: Is a cost/benefit analysis a better approach? e-learning, 3(1), 30-32.
119. Other ROI Alternatives
- Time to competency (need benchmarks)
- Online databases of frequently asked questions can help employees in call centers learn skills more quickly and without requiring temporary leaves from their position for such training
- Time to market
- Might be measured by how e-learning speeds up the training of sales and technical support personnel, thereby expediting the delivery of a software product to the market
- Raths, D. (2001, May). Measure of success. Online Learning, 5(5), 20-22, 24.
120. Still Other ROI Alternatives
- Return on Expectation
- Asks employees a series of questions related to how training met expectations of their job performance.
- When questioning is complete, they place a figure on that.
- Correlate or compare such reaction data with business results, or supplement Level 1 data to include more pertinent info about the applicability of learning to the employee's present job situation.
- Raths, D. (2001, May). Measure of success. Online Learning, 5(5), 20-22, 24.
121. AEIOU
- Provides a framework for looking at different aspects of an online learning program
- Fortune & Keith, 1992; Sweeney, 1995; Sorensen, 1996
122. A = Accountability
- Did the training do what it set out to do?
- Data can be collected through:
- Administrative records
- Counts of training programs (# of attendees, # of offerings)
- Interviews or surveys of training staff
123. E = Effectiveness
- Is everyone satisfied?
- Learners
- Instructors
- Managers
- Were the learning objectives met?
124. I = Impact
- Did the training make a difference?
- Like Kirkpatrick's Level 4 (Results)
125. O = Organizational Context
- Did the organization's structures and policies support or hinder the training?
- Does the training meet the organization's needs?
- OC evaluation can help find when there is a mismatch between the training design and the organization
- Important when using third-party training or content
126. U = Unintended Consequences
- Unintended consequences are often overlooked in training evaluation
- May give you an opportunity to brag about something wonderful that happened
- Typically discovered via qualitative data (anecdotes, interviews, open-ended survey responses)
127. Consumer-Oriented Evaluation
- Uses a consumer point of view
- Can be a part of the vendor selection process
- Can be a learner-satisfaction issue
- Relies on benchmarks for comparison of different products or different learning media
128. What About Evaluation Issues in Higher Education???
129. My Evaluation Plan
130. What to Evaluate?
- Student: attitudes, learning, jobs.
- Instructor: popularity, survival.
- Training: effectiveness, integratedness.
- Task: relevance, interactivity, collaboration.
- Tool: usable, learner-centered, friendly, supportive.
- Course: interactivity, completion.
- Program: growth, model(s), time to build.
- University: cost-benefit, policies, vision.
131. Measures of Student Success (Focus groups, interviews, observations, surveys, exams, records)
- Positive Feedback, Recommendations
- Increased Comprehension, Achievement
- High Retention in Program
- Completion Rates or Course Attrition
- Jobs Obtained, Internships
- Enrollment Trends for Next Semester
132. 1. Student Basic Quantitative
- Grades, Achievement
- Number of Posts
- Participated
- Computer Log Activity: peak usage, messages/day, time on task or in system
- Attitude Surveys
133. 1. Student High-End Success
- Message complexity, depth, interactivity, questioning
- Collaboration skills
- Problem finding/solving and critical thinking
- Challenging and debating others
- Case-based reasoning, critical thinking measures
- Portfolios, performances, PBL activities
134. Focus of Assessment?
- Basic Knowledge, Concepts, Ideas
- Higher-Order Thinking Skills, Problem Solving, Communication, Teamwork
- Both of the Above!!!
- Other
135. Assessments Possible
- Online Portfolios of Work
- Discussion/Forum Participation
- Online Mentoring
- Weekly Reflections
- Tasks Attempted or Completed, Usage, etc.
136. More Possible Assessments
- Quizzes and Tests
- Peer Feedback and Responsiveness
- Cases and Problems
- Group Work
- Web Resource Explorations and Evaluations
137. Increasing Cheating Online ($7-30/page; http://www.syllabus.com/, January 2002, Phillip Long, Plagiarism: IT-Enabled Tools for Deceit?)
- http://www.academictermpapers.com/
- http://www.termpapers-on-file.com/
- http://www.nocheaters.com/
- http://www.cheathouse.com/uk/index.html
- http://www.realpapers.com/
- http://www.pinkmonkey.com/
- (you'll never buy Cliffnotes again)
141. Reducing Cheating Online
- Ask yourself, why are they cheating?
- Do they value the assignment?
- Are tasks relevant and challenging?
- What happens to the task after it is submitted: reused, woven in, posted?
- Due at end of term? Real audience?
- Look at pedagogy before calling the plagiarism police!
142. Reducing Cheating Online
- Proctored exams
- Vary items in exam
- Make course too hard to cheat
- Try Plagiarism.com ($300)
- Use mastery learning for some tasks
- Random selection of items from an item pool
- Use test passwords, rely on IP screening
- Assign collaborative tasks
143. Reducing Cheating Online ($7-30/page; http://www.syllabus.com/, January 2002, Phillip Long, Plagiarism: IT-Enabled Tools for Deceit?)
- http://www.plagiarism.org/ (resource)
- http://www.turnitin.com/ (software, $100, free 30-day demo/trial)
- http://www.canexus.com/ (software: essay verification engine, $19.95)
- http://www.plagiserve.com/ (free database of 70,000 student term papers and Cliff Notes)
- http://www.academicintegrity.org/ (association)
- http://sja.ucdavis.edu/avoid.htm (guide)
145. Turnitin Testimonials
- "Many of my students believe that if they do not submit their essays, I will not discover their plagiarism. I will often type a paragraph or two of their work in myself if I suspect plagiarism. Every time, there was a 'hit.' Many students were successful plagiarists in high school. A service like this is needed to teach them that such practices are no longer acceptable and certainly not ethical!"
146. Part III
- Applying Kirkpatrick's 4 Levels to Online Learning Evaluation and Evaluation Design
147. Why Use the 4 Levels?
- They are familiar and understood
- Highly referenced in the training literature
- Can be used with 2 delivery media for comparative results
148. Conducting 4-Level Evaluation
- You need not use every level
- Choose the level that is most appropriate to your need and budget
- Higher levels will be more costly and difficult to evaluate
- Higher levels will yield more useful results
149. Kirkpatrick Level 1: Reaction
- Typically involves "smile sheets" or end-of-training evaluation forms.
- Easy to collect, but not always very useful.
- Reaction-level data on online courses have been found to correlate with the ability to apply learning to the job.
- The survey ideally should be Web-based, keeping the medium the same as the course.
150. Kirkpatrick Level 1: Reaction
- Types of questions:
- Enjoyable?
- Easy to use?
- How was the instructor?
- How was the technology?
- Was it fast or slow enough?
151. Kirkpatrick Level 2: Learning
- Typically involves testing learners immediately following the training
- Not difficult to do, but online testing has its own challenges
- Did the learner take the test on his/her own?
152. Kirkpatrick Level 2: Learning
- Higher-order thinking skills (problem solving, analysis, synthesis)
- Basic skills (articulate ideas in writing)
- Company perspectives and values (teamwork, commitment to quality, etc.)
- Personal development
153. Kirkpatrick Level 2: Learning
- Might include:
- Essay tests.
- Problem solving exercises.
- Interviews.
- Written or verbal tests to assess cognitive skills.
- Shepard, C. (1999b, July). Evaluating online learning. TACTIX from Fastrak Consulting. Retrieved February 10, 2002, from http://fastrak-consulting.co.uk/tactix/Features/evaluate/eval01.htm
154. Kirkpatrick Level 3: Behavior
- More difficult to evaluate than Levels 1 and 2
- Looks at whether learners can apply what they learned (does the training change their behavior?)
- Requires post-training follow-up to determine
- Less common than Levels 1 and 2 in practice
155. Kirkpatrick Level 3: Behavior
- Might include:
- Direct observation by supervisors or coaches (Wisher, Curnow, & Drenth, 2001).
- Questionnaires completed by peers, supervisors, and subordinates related to work performance.
- On-the-job behaviors, automatically logged performances, or self-report data.
- Shepard, C. (1999b, July). Evaluating online learning. TACTIX from Fastrak Consulting. Retrieved February 10, 2002, from http://fastrak-consulting.co.uk/tactix/Features/evaluate/eval01.htm
156. Kirkpatrick Level 4: Results
- Often compared to return on investment (ROI)
- In e-learning, it is believed that the increased cost of course development ultimately is offset by the lower cost of training implementation
- A new way of training may require a new way of measuring impact
157. Kirkpatrick Level 4: Results
- Might include:
- Labor savings (e.g., reduced duplication of effort or faster access to needed information).
- Production increases (faster turnover of inventory, forms processed, accounts opened, etc.).
- Direct cost savings (e.g., reduced cost per project, lowered overhead costs, reduction of bad debts, etc.).
- Quality improvements (e.g., fewer accidents, fewer defects, etc.).
- Horton, W. (2001). Evaluating e-learning. Alexandria, VA: American Society for Training and Development.
158. Kirkpatrick: Evaluation Design
- Kirkpatrick's 4 Levels may be achieved via various evaluation designs
- Different designs help answer different questions
159. Pre/Post Control Groups
- One group receives OL training and one does not (see the sketch after this list)
- As a variation, try 3 groups:
- No training (control)
- Traditional training
- OL training
- Recommended because it may help neutralize contextual factors
- Relies on random assignment as much as possible
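Below is a minimal sketch (in Python, not from the workshop) of how gains from a pre/post control-group design might be scored; the group names and test scores are hypothetical.

```python
# A sketch of scoring a pre/post control-group design. The group names
# and test scores below are hypothetical, made up for illustration.

def mean(xs):
    return sum(xs) / len(xs)

def mean_gain(pre, post):
    """Average per-learner improvement from pre-test to post-test."""
    return mean([after - before for before, after in zip(pre, post)])

# Hypothetical test scores (0-100) for the three-group variation.
groups = {
    "no training (control)": ([52, 48, 60], [53, 50, 59]),
    "traditional training":  ([50, 55, 47], [68, 70, 61]),
    "OL training":           ([49, 53, 58], [71, 69, 74]),
}

for name, (pre, post) in groups.items():
    print(f"{name}: mean gain = {mean_gain(pre, post):+.1f} points")

# Comparing each training group's gain against the control group's gain
# helps separate the training effect from contextual factors.
```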
160. Multiple Baselines
- Can be used for a program that is rolling out
- Each group serves as a control group for the previous group
- Look for improvement in subsequent groups
- Eliminates need for tight control of a control group
161. Time Series
- Looks at benchmarks before and after training
- Practical and cost-effective
- Not considered as rigorous as other designs because it doesn't control for contextual factors
162. Single Group Pre/Post
- Easy and inexpensive
- Criticized for lack of rigor (absence of control)
- Needs to be pushed into Kirkpatrick Levels 3 and 4 to see if there has been impact
163. Case Study
- A rigorous design in academic practice, but often after-the-fact in corporate settings
- Useful when no preliminary or baseline data have been collected
164. Part IV
- ROI and Online Learning
165. The Importance of ROI
- OL requires a great amount of money and other resources up front
- It gives the promise of financial rewards later on
- ROI is of great interest because of the investment and the wait period before the return
166. Calculating ROI
- Look at:
- Hard cost savings
- Hard revenue impact
- Soft competitive benefits
- Soft benefits to individuals
- See Calculating the Return on Your eLearning Investment (2000) by Docent, Inc.
167. Possible ROI Objectives
- Better Efficiencies
- Greater Profitability
- Increased Sales
- Fewer Injuries on the Job
- Less Time off Work
- Faster Time to Competency
168. Hard Cost Savings
- Travel
- Facilities
- Printed material costs (printing, distribution, storage)
- Reduction of costs of business through increased efficiency
- Instructor fees (sometimes)
169. Hard Revenue Impact
- Consider:
- Opportunity cost of improperly trained or untrained personnel
- Shorter time to productivity through shorter training times with OL
- Increased time on the job (no travel time)
- Ease of delivering the same training to partners and customers (for a fee?)
170. Soft Competitive Benefits
- Just-in-time capabilities
- Consistency in delivery
- Certification of knowledge transfer
- Ability to track users and gather data easily
- Increased morale from simultaneous roll-out at different sites
171. Individual Values
- Less wasted time
- Support available as needed
- Motivation from being treated as an individual
172. Talking about ROI
- As a percentage:
- ROI = (Payback - Investment) / Investment × 100
- As a ratio:
- ROI = Return / Investment
- As time to break even:
- Break-even time = (Investment / Return) × Time Period (see the sketch below)
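A minimal sketch (not from the slides) showing the three formulations side by side; the dollar figures are hypothetical.

```python
# Hypothetical figures: a $100,000 e-learning investment that pays back
# $150,000 in savings over a 12-month period.
investment = 100_000
payback = 150_000
period_months = 12

roi_percent = (payback - investment) / investment * 100    # percentage form
roi_ratio = payback / investment                           # ratio form (Return / Investment)
break_even = (investment / payback) * period_months        # time to break even

print(f"ROI: {roi_percent:.0f}%")                   # 50%
print(f"ROI ratio: {roi_ratio:.2f}")                # 1.50
print(f"Break even after {break_even:.1f} months")  # 8.0 months
```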
173. What is ROI Good For?
- Prioritizing Investment
- Ensuring Adequate Financial Support for OL Project
- Comparing Vendors
174. The Changing Face of ROI
- "Return-on-investment isn't what it used to be. The R is no longer the famous bottom line, and the I is more likely a subscription fee than a one-time payment" (Cross, 2001).
175. More Calculations
- Total Admin Costs of Former Program - Total Admin Costs of OL Program = Projected Net Savings
- Total Cost of Training / Number of Students = Cost Per Student (CPS)
- Total Benefits × 100 / Total Program Cost = ROI
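- (A hypothetical worked example, not from the slides: $200,000 former admin costs - $120,000 OL admin costs = $80,000 projected net savings; $120,000 total training cost / 800 students = $150 CPS; $180,000 total benefits × 100 / $120,000 program cost = 150% ROI.)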
176. At the End of the Day...
- Are all training results quantifiable?
- NO! Putting a price tag on some costs and benefits can be very difficult
- NO! Some data may not have much meaning at face value
- What if more courses are offered and annual student training hours drop simultaneously? Is this bad?
177. Part V
- Collecting Evaluation Data and Online Evaluation Tools
178. Collecting Evaluation Data
- Learner Reaction
- Learner Achievement
- Learner Job Performance
- Manager Reaction
- Productivity Benchmarks
179. Forms of Evaluation
- Interviews
- Focus Groups
- Self-Analysis
- Supervisor Ratings
- Surveys and Questionnaires
- ROI
- Document Analysis
- Data Mining (changes pre- and post-training, e.g., sales, productivity)
180. How to Collect Data?
- Direct Observation in Work Setting
- By supervisor, co-workers, subordinates, clients
- Collect Data by Surveys, Interviews, Focus Groups
- Supervisors, Co-workers, Subordinates, Clients
- Self-Report by learners or teams
181. Learner Data
- Online surveys are the most effective way to collect online learner reactions
- Learner performance data can be collected via online tests
- Pre- and post-tests can be used to measure learning gains
- Learner post-course performance data can be used for Level 3 evaluation
- May look at on-the-job performance
- May require data collection from managers
182. Example: Naval Physiology Training Follow-Up Evaluation
- A naval training unit uses an online survey/database system to track performance of recently trained physiologists
- Learners self-report performance
- Managers report on learner performance
- Unit heads report on overall productivity
183. Learning System Data
- Many statistics are available, but which are useful? (A tallying sketch follows this list.)
- Number of course accesses
- Log-in times/days
- Time spent accessing course components
- Frequency of access for particular components
- Quizzes completed and quiz scores
- Learner contributions to discussion (if applicable)
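A minimal sketch of turning raw access records into a few of the statistics above; the log format and records are hypothetical, not any particular LMS's schema.

```python
# Aggregating hypothetical LMS access records into per-learner statistics.
from collections import defaultdict

# Hypothetical records: (learner, component, minutes_spent)
log = [
    ("kim", "module-1", 12), ("kim", "quiz-1", 8),
    ("lee", "module-1", 20), ("kim", "module-2", 15),
    ("lee", "discussion", 9), ("lee", "quiz-1", 11),
]

accesses = defaultdict(int)        # number of course accesses per learner
minutes = defaultdict(int)         # total time spent per learner
component_hits = defaultdict(int)  # frequency of access per component

for learner, component, mins in log:
    accesses[learner] += 1
    minutes[learner] += mins
    component_hits[component] += 1

for learner in accesses:
    print(f"{learner}: {accesses[learner]} accesses, {minutes[learner]} min total")
print("Most accessed component:", max(component_hits, key=component_hits.get))
```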
184. Learner System Data
- IF learners are being evaluated based on number and length of accesses, it is only fair that they be told
- Much time can be wasted analyzing statistics that don't tell much about the actual impact of the training
- Bottom line: Easy data to collect, but not always useful for evaluation purposes
- Still useful for management purposes
185. Benchmark Data
- Companies need to develop benchmarks for measuring performance improvement
- Managers typically know the job areas that need performance improvement
- Both pre-training and post-training data need to be collected and compared
- Must also look for other contextual factors
186. Online Testing Tools (see http://www.indiana.edu/~best/)
189. Test Selection Criteria (Hezel, 1999)
- Easy to Configure Items and Test
- Handles Symbols
- Scheduling of Feedback (immediate?)
- Provides Clear Input of Dates for Exam
- Easy to Pick Items for Randomizing
- Randomize Answers Within a Question
- Weighting of Answer Options
190. More Test Selection Criteria
- Recording of Multiple Submissions
- Timed Tests
- Comprehensive Statistics
- Summarize in Portfolio and/or Gradebook
- Confirmation of Test Submission
191. More Test Selection Criteria (Perry & Colon, 2001)
- Supports multiple item types: multiple choice, true-false, essay, keyword
- Can easily modify or delete items
- Incorporate graphic or audio elements?
- Control over number of times students can submit an activity or test
- Provides feedback for each response
192. More Test Selection Criteria (Perry & Colon, 2001)
- Flexible scoring: score first, last, or average submission
- Flexible reporting: by individual or by item, and cross tabulations
- Outputs data for further analysis
- Provides item analysis statistics (e.g., test item frequency distributions); a small sketch follows
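As one concrete example of an item analysis statistic, here is a minimal sketch (hypothetical data, not any vendor's tool) of item difficulty, the proportion of learners answering each item correctly.

```python
# Item difficulty from hypothetical response data.
# Each row is one learner's responses; True = correct on that item.
responses = [
    [True,  True,  False, True],
    [True,  False, False, True],
    [False, True,  False, True],
    [True,  True,  True,  True],
]

n_items = len(responses[0])
for item in range(n_items):
    correct = sum(r[item] for r in responses)
    difficulty = correct / len(responses)  # 1.0 = everyone correct (easy item)
    print(f"Item {item + 1}: difficulty = {difficulty:.2f}")
# Very high or very low difficulty flags items worth reviewing.
```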
193. Computer Log Data (Chen, G. D., Liu, C. C., & Liu, B. J. (2000). Discovering decision knowledge from Web log portfolio for managing classroom processes by applying decision tree and data cube technology. Journal of Educational Computing Research, 23(3), 305-332.)
- Determine student behavior patterns:
- students posting opinions,
- asking questions,
- replying to opinions,
- posting articles, etc.
- Web logs can also help instructors make informed pedagogical decisions. For instance, does a particular teaching strategy or task improve student interaction?
194. Computer Log Data (Chen, G. D., Liu, C. C., & Liu, B. J. (2000). Discovering decision knowledge from Web log portfolio for managing classroom processes by applying decision tree and data cube technology. Journal of Educational Computing Research, 23(3), 305-332.)
- In a corporate training situation, computer log data can correlate online course completions with actual job performance improvements such as:
- fewer violations of safety regulations,
- reduced product defects,
- increased sales, and
- timely call responses.
195. Email and Chat
- Chats and email messages might provide data about the effectiveness of the training event.
196. Online Survey Tools for Assessment
197. Sample Survey Tools
- Zoomerang (http://www.zoomerang.com)
- IOTA Solutions (http://www.iotasolutions.com)
- QuestionMark (http://www.questionmark.com/home.html)
- SurveyShare (http://SurveyShare.com, from CourseShare.com)
- Survey Solutions from Perseus (http://www.perseusdevelopment.com/fromsurv.htm)
- Infopoll (http://www.infopoll.com)
200. Survey Tool Features
- Maintain email lists and email invitations
- Conduct polls
- Adaptive branching and cross tabulations
- Modifiable templates
- Maintain library of past surveys
- Publish reports
- Technical support, chat advice
- Different types of accounts: hosted, corporate, professional, etc.
206. Web-Based Survey Advantages
- Faster collection of data
- Standardized collection format
- Computer graphics may reduce fatigue
- Computer-controlled branching and skip sections
- Easy to answer by clicking
- Wider distribution of respondents
207. Web-Based Survey Problems: Why Lower Response Rates?
- Low response rate
- Lack of time
- Unclear instructions
- Too lengthy
- Too many steps
- Can't find URL
- Perceived as aggressive
208. Web-Based Survey Solutions: Some Tips
- Send second request
- Make URL link prominent
- Offer incentives near top of request
- Shorten survey; make it attractive and easy to read
- Credible sponsorship (e.g., university)
- Disclose purpose, use, and privacy
- E-mail cover letters
- Prenotify of intent to survey
209. Tips on Authentication
- Check e-mail access against list
- Use password access
- Provide keycode, PIN, or ID
- (Futuristic/Other: palm print, fingerprint, voice recognition, iris scanning, facial scanning, handwriting recognition, picture ID)
210. Some Final Advice
211.
- As venture capital dries up and state funding is cut, evaluation and accountability take center stage in e-learning decision-making and discussion.
212. Questions? Comments? Concerns?