Measuring Human Performance

Transcript and Presenter's Notes


1
Measuring Human Performance
2
Introduction
  • Kirkpatrick (1994) provides a very usable model
    for measurement across four levels: Reaction,
    Learning, Behavior, and Results. These categories
    are discrete and can be measured. The goal of this
    presentation is to bring to light many of the
    topics, concerns, and issues that must be
    understood before carrying out the business of
    testing, measuring, or evaluating the success of
    training in the workforce today.

3
What is a test? What is testing?
  • The instrument used to collect data
  • A process of collecting quantifiable information
    about the degree to which a competence or ability
    is present in the test taker. (Anderson, BC)

4
Reasons for Testing
  • Prerequisite tests
  • Entry test
  • Diagnostic test
  • Post test
  • Equivalency test

5
Norm-Referenced vs. Criterion-Referenced Testing
6
Norm Referenced Testing
  • Test items are written to separate test takers
    from one another
  • Normal distribution curve

7
Criterion Referenced Testing
  • Test items based on specific objectives
  • Mastery Curve / Skewed from Normal Distribution
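
  A minimal sketch (Python, with an assumed score list and an assumed
  80-percent mastery cutoff) contrasting the two philosophies: a
  norm-referenced result reports where a test taker stands relative to
  the group, while a criterion-referenced result reports whether a fixed
  competency cutoff was met.

    # Illustrative only: both the score list and the cutoff are assumed.
    scores = [55, 62, 70, 74, 78, 81, 85, 88, 92, 96]   # percent correct

    def percentile_rank(score, group):
        """Norm-referenced view: percent of the group scoring below this score."""
        return 100 * sum(s < score for s in group) / len(group)

    def mastery(score, cutoff=80):
        """Criterion-referenced view: pass/fail against a fixed cutoff."""
        return "mastered" if score >= cutoff else "not yet mastered"

    for s in (74, 85):
        print(f"{s}: {percentile_rank(s, scores):.0f}th percentile, {mastery(s)}")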

8
SKA
  • Skill
  • Knowledge
  • Attitude

9
Domains of Learning
  • Cognitive
  • Affective
  • Psychomotor

10
Bloom's Taxonomy for Cognitive Levels
  • Knowledge
  • Comprehension
  • Application
  • Analysis
  • Synthesis
  • Evaluation

11
Krathwohl's Taxonomy for Affective Levels
  • Receiving
  • Responding
  • Valuing
  • Organization
  • Characterization by a value or value complex

12
Simpson's Taxonomy for Psychomotor Levels
  • Perception
  • Set
  • Guided Response
  • Mechanism
  • Complex Overt Response
  • Adaptation
  • Origination

13
Test Items Related to Bloom's Taxonomy
  • Multiple Choice
  • Most flexible across the taxonomy spectrum,
    especially the first three levels
  • Advantages
  • Low guessing probability
  • Diagnostic capabilities
  • Easy to grade
  • Statistical Analysis (see the item-analysis
    sketch below)
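
  The statistical-analysis advantage can be made concrete with a small
  item-analysis sketch. The response matrix below is assumed for
  illustration; the sketch computes each item's difficulty index
  (proportion correct) and a simple upper-minus-lower discrimination
  index.

    # Hypothetical multiple-choice item analysis.
    # Rows are test takers, columns are items; 1 = correct, 0 = incorrect.
    responses = [
        [1, 1, 0, 1],
        [1, 0, 0, 1],
        [1, 1, 1, 1],
        [0, 0, 0, 1],
        [1, 1, 0, 0],
        [0, 0, 0, 0],
    ]

    totals = [sum(row) for row in responses]
    order = sorted(range(len(responses)), key=lambda i: totals[i])
    half = len(responses) // 2
    low, high = order[:half], order[-half:]          # lowest and highest scorers

    for item in range(len(responses[0])):
        p = sum(row[item] for row in responses) / len(responses)     # difficulty
        d = (sum(responses[i][item] for i in high)
             - sum(responses[i][item] for i in low)) / half          # discrimination
        print(f"item {item + 1}: difficulty p = {p:.2f}, discrimination D = {d:+.2f}")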

14
Multiple Choice (cont.)
  • Disadvantages
  • Difficult to write
  • Provides keys for recall
  • Doesn't work well for evaluating higher-level
    cognition

15
True/False
  • Could be used at all levels, but...
  • Advantages
  • Easy to write
  • Easy to score
  • Can do item analysis

16
T/F (cont.)
  • Disadvantages
  • 50/50 guess factor (quantified in the sketch
    below)
  • Often used when M/C seems too hard to write
  • Reliability is so poor that there is very little
    evaluation value
  • So why do teachers often include T/F?
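
  The guess factor can be quantified. A short sketch, assuming a 10-item
  true/false test with a pass mark of 7, showing the expected score from
  blind guessing and the chance of passing by guessing alone (binomial
  distribution).

    from math import comb

    # Assumed test parameters for illustration.
    n, p, pass_mark = 10, 0.5, 7       # items, guess probability, items needed to pass

    expected = n * p                   # expected number correct from pure guessing
    p_pass = sum(comb(n, k) * p**k * (1 - p)**(n - k)
                 for k in range(pass_mark, n + 1))

    print(f"expected correct by guessing: {expected:.1f} of {n}")
    print(f"chance of passing by guessing alone: {p_pass:.1%}")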

17
Matching
  • Best suited for the Application level, but not
    recommended by me for any level
  • Advantages
  • Easy to write
  • Easy to grade
  • Statistical Analysis

18
Matching (cont.)
  • Disadvantages
  • Requires only the two lower learning levels
  • Process of elimination inflates the odds of a
    correct guess (see the sketch below)
  • Low reliability
  • Why would a teacher use Matching?
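
  A brief simulation (assumed 5-pair matching item) of why elimination
  matters: with equal lists and each option used once, even blind
  guessing yields about one correct match on average, and every pair the
  test taker actually knows narrows the remaining choices further.

    import random

    random.seed(0)
    pairs, trials = 5, 100_000          # assumed item size and simulation length

    hits = 0
    for _ in range(trials):
        guess = random.sample(range(pairs), pairs)        # one random assignment
        hits += sum(guess[i] == i for i in range(pairs))  # count lucky matches

    print(f"average correct matches from pure guessing: {hits / trials:.2f} of {pairs}")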

19
Fill in the Blank
  • Best suited for the lower levels
  • Advantage
  • Recall is essential, few clues
  • Disadvantages
  • Limited to a single word or phrase
  • Grading beyond a single word or phrase is in
    trouble
  • Enters the realm of subjective grading, hence
    poor reliability

20
Short Answer
  • Can get at higher-order thinking
  • Advantages
  • Easy to write
  • Produces original responses
  • Disadvantages
  • Basically the same as fill-in-the-blank:
    subjective grading hurts reliability

21
Essay
  • The best fit for higher-order thinking
  • Advantages
  • Higher-order thinking
  • Creative ability
  • Writing ability

22
Essay (cont.)
  • Disadvantages
  • Tough to grade
  • Forget statistical analysis
  • You'll see this often in Master's and Ph.D.
    classes

23
Validity
  • Does the test measure what it is supposed to
    measure?
  • How close to the bull's-eye did it hit?

24
Reliability
  • How consistent is the test?
  • Is there a tight pattern of hits?


25
(No Transcript)
26
Types of Validity
  • Concurrent Validity
  • Content Validity
  • Criterion Related Validity
  • Predictive Validity
  • Construct Validity

27
Types of Reliability
  • Test-Retest Reliability
  • Inter-Rater Reliability
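
  A minimal sketch of how each type is commonly estimated, using assumed
  data: test-retest reliability as the correlation between two
  administrations of the same test, and inter-rater reliability
  simplified here to percent agreement between two raters.

    # Illustrative data only; real studies would use many more cases.
    def pearson(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sum((a - mx) ** 2 for a in x) ** 0.5
        sy = sum((b - my) ** 2 for b in y) ** 0.5
        return cov / (sx * sy)

    # Test-retest: the same six people tested twice.
    first  = [72, 85, 64, 90, 78, 69]
    second = [75, 83, 66, 92, 74, 71]
    print(f"test-retest reliability r = {pearson(first, second):.2f}")

    # Inter-rater: two raters judging the same six essays.
    rater_a = ["pass", "fail", "pass", "pass", "fail", "pass"]
    rater_b = ["pass", "fail", "pass", "fail", "fail", "pass"]
    agreement = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)
    print(f"inter-rater agreement = {agreement:.0%}")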

28
What is the real score of a test?
  • An error factor must be considered
  • True score = test score ± error factor
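
  The error factor is usually expressed as the standard error of
  measurement: observed score = true score + error, with
  SEM = SD x sqrt(1 - reliability). The figures below are assumed for
  illustration.

    # Assumed test statistics for illustration.
    sd, reliability = 10.0, 0.85    # score standard deviation and test reliability
    observed = 78                   # one test taker's observed score

    sem = sd * (1 - reliability) ** 0.5
    low, high = observed - sem, observed + sem
    print(f"standard error of measurement: {sem:.1f} points")
    print(f"true score likely within about {low:.0f}-{high:.0f} (+/- one SEM, ~68% band)")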

29
Ten Evaluation Instruments for Technical Training
  • Interviews
  • Questionnaires
  • Group Discussion
  • Critical Incident
  • Work Diaries

30
Instruments (cont.)
  • Performance Records
  • Simulation / Role-Play
  • Observation
  • Written Test
  • Performance Test

31
Designing Tests
  • Questions you must ask yourself
  • Who is the test designed for?
  • What do you want to know?
  • How many Questions will be required?
  • How will it be administered?
  • How will it be scored?

32
3 Methods of Test Construction
  • Topic Based
  • Statistical Based
  • Objective Based

33
Topic-Based Test
  • Selection done by chapter
  • Selection done by topic
  • Selection done by the importance of the topic

34
Limitations of the Topic System
  • Procedure lacks precision
  • Doesn't identify test takers
  • Not designed around the learner's level
  • Doesn't specify competencies

35
Statistical Selection
  • Items statistically selected
  • Standardized
  • Norm Referenced

36
Limitations of Statistical Selection
  • What is measured is not specific
  • Lacks the precision of CRT
  • Difficult to select items

37
Objectives-Based Test
  • Based on defined competencies
  • Applies to criterion-referenced tests and scores

38
Testing and Kirkpatrick's Four Levels
  • The further down you move, from the performance
    of the company to the performance of the
    individuals, the more difficult the information
    is to obtain
  • The further down you move, the more usable the
    information

39
Four Levels
  • REACTION
  • LEARNING
  • BEHAVIOR
  • RESULTS

40
Reaction
  • Checking individuals' reactions often means
    measuring customer satisfaction
  • Happy rating sheets
  • Observations
  • Other
  • How can you quantify the responses?
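
  One way to quantify the responses: map the happy-sheet ratings onto a
  numeric Likert scale and report simple summary figures. The ratings
  below are assumed for illustration.

    # Hypothetical happy-sheet ratings on a 1-5 Likert scale.
    responses = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]

    mean = sum(responses) / len(responses)
    top_box = sum(r >= 4 for r in responses) / len(responses)   # rated 4 or 5

    print(f"average reaction rating: {mean:.1f} / 5")
    print(f"top-box (rated 4 or 5): {top_box:.0%}")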

41
Learning
  • Measurable behavior changes in the three SKA
    Dimensions

42
Behavior
  • Behavior change due to the training program,
    measured through:
  • Surveys
  • Interviews
  • Other

43
Results
  • Measurable by looking at changes in:
  • Production
  • Quality
  • Safety
  • Sales
  • Other