1
Models of Human Performance
  • CSCI 4800
  • Spring 2006
  • Kraemer

2
Objectives
  • Introduce theory-based models for predicting
    human performance
  • Introduce competence-based models for assessing
    cognitive activity
  • Relate modelling to interactive systems design
    and evaluation

3
What are we trying to model?
4
Seven Stage Action Model (Norman, 1990)
[Diagram: Norman's seven-stage action cycle, starting from the goal of the person]
5
Describing Problem Solving
  • Initial State
  • Goal State
  • All possible intervening states
  • Problem Space
  • Path Constraints
  • State Action Tree
  • Means-ends analysis

6
Problem Solving
  • A problem is something that doesn't solve easily
  • A problem doesn't solve easily because
  • you don't have the necessary knowledge or,
  • you have misrepresented part of the problem
  • If at first you don't succeed, try something else
  • Tackle one part of the problem and other parts
    may fall into place

7
Conclusion
  • More than one solution
  • Solution limited by boundary conditions
  • Representation affects strategy
  • Active involvement and testing

8
Functional Fixedness
  • Strategy developed in one version of the problem
  • Strategy might be inefficient
  • X ) XXXX
  • Convert numerals or just see 4

9
Data-driven perception
  • Activation of neural structures of sensory
    system by pattern of stimulation from environment

10
Theory-driven perception
  • Perception driven by memories and expectations
    about incoming information.

11
KEYPOINT
  • PERCEPTION involves a set of active processes
    that impose
  • STRUCTURE,
  • STABILITY,
  • and MEANING
  • on the world

12
Visual Illusions
http://www.genesishci.com/illusions2.htm
Rabbit or duck?
Old Woman or Young girl?
13
Interpretation
  • Knowledge of what you are looking at can aid
    in interpretation
  • JA CKAN DJI
  • LLW ENTU PTH
  • EHI LLT OFE
  • TCH APA ILO
  • FWA TER
  • Organisation of information is also useful

14
Story Grammars
  • Analogy with sentence grammars
  • Building blocks and rules for combining
  • Break story into propositions
  • Margie was holding tightly to the string of
    her beautiful new balloon. Suddenly a gust of
    wind caught it, and carried it into a tree. It
    hit a branch, and burst. Margie cried and cried.

15
Story Grammar
[Tree diagram: Story splits into Setting and Episode; the Episode contains Events and a Reaction, with the Reaction split into an Internal response (sadness) and an Overt response; numbered nodes 1-6 index the story's propositions, including a change of state at node 5]
16
Inferences
  • Comprehension typically requires our active
    involvement in order to supply information that
    is not explicit in the text
  • 1. Mary heard the ice-cream van coming
  • 2. She remembered her pocket money
  • 3. She rushed into the house.

17
Inference and Recall
  • Thorndyke (1976): recall of sentences from the Mary story
  • 85% correct sentence
  • 58% correct inference (sentence not presented)
  • 6% incorrect inference

18
Mental Models
  • Van Dijk and Kintsch (1983)
  • Text processed to extract propositions, which are
    held in working memory
  • When sufficient propositions in WM, then linking
    performed
  • Relevance of propositions to linking proportional
    to recall
  • Linking reveals gist

19
Semantic Networks
[Hierarchical network: ANIMAL (has skin, can move, eats, breathes) with subordinates BIRD (can fly, has wings, has feathers) and FISH (has fins, can swim, has gills); CANARY (is yellow, can sing) sits under BIRD]
Collins & Quillian, 1969
20
Levels and Reaction time
A canary can sing
A canary can fly
A canary has gills
A canary has skin
A canary is a canary
A canary is a bird
A canary is a fish
A canary is an animal
Collins & Quillian, 1969
21
Canaries
  • Different times to verify the statements
  • A canary is a bird
  • A canary can fly
  • A canary can sing
  • Time proportional to movement through the network (a small sketch follows)
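
To make the idea of distance in the network concrete, here is a small Python sketch (not from the slides): it stores the canary/bird/animal hierarchy shown earlier and counts how many ISA links must be crossed before a property is found, with verification time assumed to grow with that count.

```python
# Tiny semantic network in the Collins & Quillian (1969) style: verification
# time is assumed proportional to the number of ISA links traversed.
ISA = {"canary": "bird", "bird": "animal", "fish": "animal"}
PROPERTIES = {"canary": {"is yellow", "can sing"},
              "bird": {"can fly", "has wings", "has feathers"},
              "fish": {"has fins", "can swim", "has gills"},
              "animal": {"has skin", "can move", "eats", "breathes"}}

def levels_to_property(concept, prop):
    """Return how many levels up the hierarchy the property is stored, or None."""
    level = 0
    while concept is not None:
        if prop in PROPERTIES[concept]:
            return level
        concept, level = ISA.get(concept), level + 1
    return None

for prop in ("can sing", "can fly", "has skin", "has gills"):
    print(prop, "->", levels_to_property("canary", prop), "level(s)")
```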

22
Scripts, Schema and Frames
  • Schema: chunks of knowledge
  • Slots for information: fixed, default, optional
  • Scripts: action sequences
  • Generalised event schema (Nelson, 1986)
  • Frames: knowledge about the properties of things

23
Mental Models
  • Partial
  • Procedures, Functions or System?
  • Memory or Reconstruction?

24
Concepts
  • How do you know a chair is a chair?

A chair has four legs... does it? A chair has a seat... does it?
25
Prototypes, Typical Features, and Exemplars
  • Prototype
  • Rosch (1973): people do not use feature sets, but imagine a PROTOTYPE for an object
  • Typical Features
  • Rosch & Mervis (1975): people use a list of features, weighted in terms of CUE VALIDITY
  • Exemplars
  • Smith & Medin (1981): people use an EXAMPLE to imagine an object

26
Representing Concepts
  • BARSALOU (1983)
  • TAXONOMIC
  • Categories that are well known and can be
    recalled consistently and reliably
  • E.g., Fruit, Furniture, Animals
  • Used to generate overall representation of the
    world
  • AD HOC
  • Categories that are invented for specific purpose
  • E.g., How to make friends, Moving house
  • Used for goal-directed activity within specific
    event frames

27
Long Term Memory
  • Procedural
  • Knowing how
  • Declarative
  • Knowing that
  • Episodic vs. Semantic
  • Personal events
  • Language and knowledge of world

28
Working Memory
  • Limited Capacity
  • 7 ± 2 items (Miller, 1956)
  • 4 ± 2 chunks (Broadbent, 1972)
  • Modality dependent capacity
  • Strategies for coping with limitation
  • Chunking
  • Interference
  • Activation of Long-term memory

29
Baddeley's (1986) Model of Working Memory
[Diagram: a central executive with slave systems - the visual cache, inner scribe, phonological store, and articulatory control process - fed by auditory and visual word presentation]
30
Slave Systems
  • Articulatory loop
  • Memory Activation
  • Rehearsal capacity
  • Word length effect and Rehearsal speed
  • Visual cache
  • Visual patterns
  • Complexity of pattern, number of elements etc
  • Inner scribe
  • Sequences of movement
  • Complexity of movement

31
Typing
  • Eye-hand span related to expertise
  • Expert: 9; novice: 1
  • Inter-key interval
  • Expert: 100ms
  • Strategy
  • Hunt-and-peck vs. touch typing
  • Keystroke
  • Novice: highly variable keystroke time
  • Novice: very slow on unusual letters, e.g., X or Z

32
Salthouse (1986)
  • Input
  • Text converted to chunks
  • Parsing
  • Chunks decomposed to strings
  • Translation
  • Strings into characters and linked to movements
  • Execution
  • Key pressed

33
Rumelhart & Norman (1982)
  • Perceptual processes
  • Perceive text, generate word schema
  • Parsing
  • Compute codes for each letter
  • Keypress schemata
  • Activate schema for letter-keypress
  • Response activation
  • Press defined key through activation of
    appropriate hand / finger

34
Schematic of Rumelhart and Norman's connectionist model of typing
[Diagram: a word node (e.g. 'jazz'), activated from a visual or auditory stimulus, excites and inhibits keypress nodes that break the word into typed letters (j, a, z, z); the response system routes each keypress node to a hand (left or right) and finger (thumb, index, middle, ring, little)]
35
Automaticity
  • Norman and Shallice (1980)
  • Fully automatic processing controlled by SCHEMATA
  • Partially automatic processing controlled by either Contention Scheduling or the Supervisory Attentional System (SAS)

36
Supervisory Attentional System Model
[Diagram: the Supervisory Attentional System biases contention scheduling, which selects among control schemata triggered from a trigger database, linking the perceptual system to the effector system]
37
Contention Scheduling
  • Gear changing when driving involves many routine
    activities but is performed automatically
    without conscious awareness
  • When routines clash, relative importance is used to determine which to perform: Contention Scheduling
  • e.g., right foot on brake or clutch

38
SAS activation
  • Driving on roundabouts in France
  • Inhibit 'look right'; activate 'look left'
  • SAS to over-ride habitual actions
  • SAS active when
  • Danger, Choice of response, Novelty etc.

39
Attentional Slips and Lapses
  • Habitual actions become automatic
  • SAS inhibits habit
  • Perseveration
  • When SAS does not inhibit and the habit proceeds
  • Distraction
  • Irrelevant objects attract attention
  • Utilisation behaviour: patients with frontal lobe damage will reach for an object close to hand even when told not to

40
Performance Operating Characteristics
  • Resource-dependent trade-off between performance
    levels on two tasks
  • Task A and Task B performed several times, with
    instructions to allocate more effort to one task
    or the other

41
Task Difficulty
  • Data limited processes
  • Performance related to quality of data and will
    not improve with more resource
  • Resource limited processes
  • Performance related to amount of resource
    invested in task and will improve with more
    resource

42
POC
[Two performance operating characteristic plots of performance (P) against cost for Task A and Task B, with M marking the performance maximum: one for data-limited processes, one for resource-limited processes]

43
Why Model Performance?
  • Building models can help develop theory
  • Models make assumptions explicit
  • Models force explanation
  • Surrogate user
  • Define benchmarks
  • Evaluate conceptual designs
  • Make design assumptions explicit
  • Rationale for design decisions

44
Why Model Performance?
  • Human-computer interaction as Applied Science
  • Theory from cognitive sciences used as basis for
    design
  • General principles of perceptual, motor and
    cognitive activity
  • Development and testing of theory through models

45
Types of Model in HCI (Whitefield, 1987)
46
Task Models
  • Researcher's Model of the User, in terms of tasks
  • Describe typical activities
  • Reduce activities to generic sequences
  • Provide basis for design

47
Pros and Cons of Modelling
  • PROS
  • Consistent description through (semi) formal
    representations
  • Set of typical examples
  • Allows prediction / description of performance
  • CONS
  • Selective (some things don't fit into models)
  • Assumption of invariability
  • Misses creative, flexible, non-standard activity

48
Generic Model Process?
  • Define system goals, activity, tasks, entities,
    parameters
  • Abstract to semantic level
  • Define syntax / representation
  • Define interaction
  • Check for consistency and completeness
  • Predict / describe performance
  • Evaluate results
  • Modify model

49
Device and Task Models
50
Device Models
  • Buxton's 3-state device model

[Diagram: the three states of the model - State 0, State 1, State 2]
51
Application
[Diagram: State 0 (out of range) to State 1 via 'pen on', back via 'pen off'; State 1 (tracking / select) to State 2 (drag) via 'button down', back via 'button up'. A minimal state-machine sketch of these transitions follows.]
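
A minimal state-machine sketch of the applied model is shown below; the event names follow the slide, while the table encoding is just one convenient way to write it down.

```python
# Minimal state machine for Buxton's 3-state model as applied on slide 51:
# state 0 = out of range, state 1 = tracking (pen on / button up),
# state 2 = dragging (button down). Event names follow the slide.
TRANSITIONS = {
    (0, "pen on"): 1,
    (1, "pen off"): 0,
    (1, "button down"): 2,   # select / start drag
    (2, "button up"): 1,     # end drag
}

state = 0
for event in ["pen on", "button down", "button up", "pen off"]:
    state = TRANSITIONS.get((state, event), state)
    print(f"{event:12s} -> state {state}")
```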
52
Different pointing devices
53
Conclusions
  • Models abstract aspects of interaction
  • User, task, system
  • Models play a variety of roles in design

54
Hierarchical Task Analysis
  • Activity assumed to consist of TASKS performed in
    pursuit of GOALS
  • Goals can be broken into SUBGOALS, which can be
    broken into tasks
  • Hierarchy (Tree) description

55
Hierarchical Task Description
56
The Analysis comes from plans
  • PLANS: conditions for combining tasks (one way to encode a plan is sketched after this list)
  • Fixed Sequence
  • P0: 1 - 2 - exit
  • Contingent Fixed Sequence
  • P1: 1 - when state X achieved - 2 - exit
  • P1.1: 1.1 - 1.2 - wait for X time - 1.3 - exit
  • Decision
  • P2: 1 - 2 - if condition X then 3, elseif condition Y then 4 - 5 - exit
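
Sketched below is one possible way to hold an HTA fragment in code. The goal and subgoal names are illustrative placeholders (loosely echoing the VCR example later in the deck), not tasks from an actual analysis.

```python
# One way to hold an HTA fragment in code: subgoals plus a plan string.
# The goal/task names are placeholders, not taken from the deck.
hta = {
    "goal": "0. Record a programme",
    "subgoals": ["1. Set date", "2. Set channel", "3. Set start and end times"],
    "plan": "P0: 1 - 2 - 3 - exit",        # fixed sequence, as in plan P0 above
}

def walk(node, depth=0):
    print("  " * depth + node["goal"] + "   [" + node.get("plan", "") + "]")
    for sub in node.get("subgoals", []):
        if isinstance(sub, dict):
            walk(sub, depth + 1)
        else:
            print("  " * (depth + 1) + sub)

walk(hta)
```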

57
Reporting
  • HTA can be constructed using Post-it notes on a
    large space (this makes it easy to edit and also
    encourages participation)
  • HTA can be difficult to present in a succinct
    printed form (it might be useful to take a
    photograph of the Post-it notes)
  • Typically a Tabular format is used

58
Redesigning the Interface to a medical imaging
system
59
Original Design
  • Menu driven
  • Menus accessed by first letter of command
  • Menus arranged in hierarchy

60
Problems with original design
  • Lack of consistency
  • D: DOS commands, Delete, Data file, Date
  • Hidden hierarchy
  • Only experts could use
  • Inappropriate defaults
  • Setting up a scan required correction of
    default settings three or four times

61
Initial design activity
  • Observation of non-technology work
  • Cytogeneticists inspecting chromosomes
  • Developed model of task
  • Hierarchical task analysis
  • Developed design principles, e.g.,
  • Cytogeneticists as picture people
  • Task flow
  • Task mapping

62
Task Model
  • Work flows between specific activities

[Diagram: workflow between activities - Administration, Patient details, Reporting, Cell sample, Set up, Analysis, Microscope]
63
First prototype
Layout related to the task model. Sketch kept very simple. Annotations show modifications.
64
Second prototype
Refined layout. Prototype built using HyperCard. Initial user trials compared this with a mock-up of the original design.
65
Final Product
Picture taken from the company brochure. Initial concepts retained. Further modifications possible.
66
Predicting Transaction Time
67
Predicting Performance Time
  • Time and error are standard measures of human
    performance
  • Predict transaction time for comparative
    evaluation
  • Approximations of human performance

68
Unit Times
  • From task model, define sequence of tasks to
    achieve a specific goal
  • For each task, define average time

69
Quick Exercise
  • Draw two parallel lines about 4cm apart and about
    10cm long
  • Draw, as quickly as possible, a zig-zag line for
    5 seconds
  • Count the number of lines and the number of times
    you have crossed the parallel lines

70
Predicted result
  • About 70 lines
  • About 20 cross-overs

71
Why this prediction?
  • Movement speed limited by biomechanical constraints
  • Motor subsystem changes direction @ 70ms
  • So 5000 / 70 ≈ 71 oscillations
  • Cognitive / Perceptual system cycles
  • Perceptual @ 70ms
  • Cognitive @ 100ms
  • Correction takes 70 + 70 + 100 = 240ms
  • 5000 / 240 ≈ 21 (the arithmetic is repeated in the sketch below)
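
The same arithmetic as a few lines of Python, using the cycle times quoted on the slide:

```python
# Reproduce the slide's back-of-the-envelope prediction for 5 seconds of zig-zagging.
trial_ms = 5000

motor_cycle_ms = 70                    # motor subsystem changes direction roughly every 70 ms
lines = trial_ms // motor_cycle_ms     # 5000 / 70 ~= 71 oscillations

perceptual_ms, cognitive_ms = 70, 100  # cycle times used on the slide
correction_ms = motor_cycle_ms + perceptual_ms + cognitive_ms   # 240 ms per corrected cross-over
crossovers = trial_ms // correction_ms # 5000 / 240 ~= 20

print(lines, crossovers)               # -> 71 20
```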

72
Fitts' Law
  • Paul Fitts, 1954
  • Information-theoretic account of simple movements
  • Define the number of bits processed in
    performing a given task

73
Fitts' Tapping Task
[Diagram: reciprocal tapping between two targets of width W separated by amplitude A]
74
Fitts' Law
  • A = 62, W = 15
  • A = 112, W = 7
  • A = 112, W = 21

[Plot: number of hits (approximately 54, 43, and 21) against log2(2A/W) for the three conditions, with fitted intercept a = 10 and slope b = 27.5]
  • Movement Time = a + b log2(2A/W)

75
Alternate Versions
  • MT = a + b log2(2A/W)
  • MT = b log2(A/W + 0.5)
  • MT = a + b log2(A/W + 1)
  • (a small computational sketch of these follows)
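
A small computational sketch of the three formulations. The constants a and b in the example calls are hypothetical placeholders (they would normally be fitted to tapping data); the A and W values are the conditions listed on the previous slide.

```python
import math

def fitts_mt(a, b, A, W, form="shannon"):
    """Predict movement time from amplitude A and target width W.

    Three formulations listed on the slide:
      classic : MT = a + b * log2(2A / W)
      welford : MT = b * log2(A / W + 0.5)
      shannon : MT = a + b * log2(A / W + 1)
    a, b are empirical constants fitted from tapping data.
    """
    if form == "classic":
        index = math.log2(2 * A / W)
    elif form == "welford":
        return b * math.log2(A / W + 0.5)
    elif form == "shannon":
        index = math.log2(A / W + 1)
    else:
        raise ValueError(form)
    return a + b * index

# Hypothetical constants a=50, b=150 (would normally be fitted to observed data)
print(fitts_mt(a=50, b=150, A=112, W=7))   # small, distant target -> longer MT
print(fitts_mt(a=50, b=150, A=62, W=15))   # wider, nearer target -> shorter MT
```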

76
a and b are constants
  • Data derived from plot
  • Data as predictors?

77
Potential Problems
  • Data-fitter rather than law
  • Generic values a, b ≈ 100
  • Variable predictive power for devices?
  • From mouse data we get
  • (assume A = 5 and W = 10): log2(2A/W) ≈ 0.3
  • 339ms, 150.5ms and 34.9ms (!!)

78
Hick-Hyman Law
  • William Hick, 1952
  • Selection time, from a set of items, is proportional to the log of the number of items
  • T = k log2(n + 1), where k is a constant (intercept/slope)
  • Approximately 150ms added to T for each item (a minimal sketch follows)
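
A minimal sketch of the law; k = 0.15 s per bit is an assumed constant chosen to echo the slide's rough 150ms figure, not a value taken from Hick's data.

```python
import math

def hick_hyman_time(n_items, k=0.15):
    """Selection time T = k * log2(n + 1); k is in seconds per bit."""
    return k * math.log2(n_items + 1)

for n in (2, 4, 8, 12):
    print(f"{n:2d} items -> {hick_hyman_time(n):.2f} s")
```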

79
Example of the Hick-Hyman Law
[Plot: search time (s, 0 - 4) against number of items (2 - 12) for words and for numbers; Landauer and Nachbar, 1985]
80
Keystroke Level Models
  • Developed from 1950s ergonomics
  • Human information processor as linear executor of
    specified tasks
  • Unit-tasks have defined times
  • Prediction: summing of times for a sequence of unit-tasks

81
Building a KLM
  • Develop task model
  • Define task sequence
  • Assign unit-times to tasks
  • Sum times

82
Example: cut and paste
  • Task Model: select line - cut - select insertion point - paste
  • Task One: select line
  • move cursor to start of line
  • press (hold) button
  • drag cursor to end of line
  • release button

83
Times for Movement
  • H: homing, e.g., hand from keyboard to mouse
  • Range: 214ms - 400ms
  • Average: 320ms
  • P: pointing, e.g., move cursor using mouse
  • Range defined by Fitts' Law
  • Average: 1100ms
  • B: button pressing, e.g., hitting a key on the keyboard
  • Range: 80ms - 700ms
  • Average: 200ms

84
Times for Cognition / Perception
  • M: mental operation
  • Range: 990ms - 1760ms
  • Average: 1350ms
  • A: switch attention between parts of the display
  • Average: 320ms
  • R: recognition of items
  • Range: 314ms - 1800ms
  • Average: 340ms
  • Perceive change
  • Range: 50 - 300ms
  • Average: 100ms (these averages are used in the summing sketch below)
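
A minimal summing sketch that uses the average unit times above. The operator sequence for 'select a line' is my reading of the cut-and-paste example on slide 82, not a sequence spelled out in the deck.

```python
# Minimal keystroke-level estimate for the "select a line of text" subtask on
# slide 82, using the average unit times listed above.
AVG_MS = {"H": 320, "P": 1100, "B": 200, "M": 1350}

select_line = ["M",        # decide where the line starts
               "H",        # hand from keyboard to mouse
               "P", "B",   # point to start of line, press (hold) button
               "P", "B"]   # drag to end of line, release button

total_ms = sum(AVG_MS[op] for op in select_line)
print(total_ms / 1000.0, "seconds")    # -> 4.27 s
```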

85
Rules for Summing Times
  • How to handle multiple Mental units
  • M before Ks in new argument strings
  • M at start of cognitive unit
  • M before Ps that select commands
  • Delete M if K redundant terminator

86
Alternative
  • What if we use accelerated scrolling on the
    cursor keys?
  • Press and hold a cursor key and read the scrolling numbers
  • Release key at or near number
  • Select correct number

87
Critical Path Models
  • Used in project management
  • Map dependencies between tasks in a project
  • Task X is dependent on task Y if it is necessary to wait until the end of task Y before task X can commence

88
Procedure
  • Construct task model, taking into account
    dependencies
  • Assign times to tasks
  • Calculate critical path and transaction time
  • Run forward pass
  • Run backward pass

89
Example
[Critical path network of six numbered tasks - R, M, H, P, P, P - with unit times M = 1.35s, H = 0.32s, P = 0.2s, R = 0.34s]
90
Critical Path Table
91
Comparison
  • Summing of times result: 2.61s
  • Critical path result: 2.47s
  • R allowed to float (a generic forward-pass sketch follows)
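
The forward pass is easy to sketch. The unit times below are those from the example, but the dependency structure is hypothetical (the slide's own network, with R floating differently, gives the 2.47s quoted above), so the point is the mechanics of the calculation rather than the exact figure.

```python
# Forward-pass sketch over a small, hypothetical task network using
# the unit times from slide 89 (M=1.35, H=0.32, P=0.2, R=0.34 seconds).
durations = {"M": 1.35, "H": 0.32, "P1": 0.2, "P2": 0.2, "P3": 0.2, "R": 0.34}
# Hypothetical dependencies: R (recognition) runs in parallel with H and P1.
deps = {"M": [], "H": ["M"], "R": ["M"], "P1": ["H"], "P2": ["P1", "R"], "P3": ["P2"]}

# Forward pass: earliest finish time of each task.
finish = {}
for task in ["M", "H", "R", "P1", "P2", "P3"]:            # already in dependency order
    start = max((finish[d] for d in deps[task]), default=0.0)
    finish[task] = start + durations[task]

print("critical-path time:", round(finish["P3"], 2))              # 2.27 with these assumptions
print("simple sum of times:", round(sum(durations.values()), 2))  # 2.61
```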

92
Other time-based models
  • Task-network models
  • MicroSAINT
  • Unit-times and probability of transition

[Task network: Prompt (50ms) -> Speak word (300 ± 9ms) -> System response (1000 ± 30ms), with transition probabilities p and 1 - p; a Monte Carlo sketch follows]
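
A Monte Carlo sketch of a task network of this kind. Treating 1 - p as "repeat the speak/response loop" and setting p = 0.9 are assumptions on my part; the slide gives the unit times but not the branching structure or a value for p.

```python
import random

# Unit times are drawn from the slide (mean +/- sd); with probability 1-p the
# speak/response loop repeats (an assumed topology, not given on the slide).
def run_trial(p=0.9):
    t = 50                                           # prompt, 50 ms
    while True:
        t += random.gauss(300, 9)                    # speak word, 300 +/- 9 ms
        t += random.gauss(1000, 30)                  # system response, 1000 +/- 30 ms
        if random.random() < p:                      # succeed with probability p
            return t

times = [run_trial() for _ in range(10_000)]
print(round(sum(times) / len(times), 1), "ms on average")
```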
93
Models of Competence
94
Performance vs. Competence
  • Performance Models
  • Make statements and predictions about the time,
    effort or likelihood of error when performing
    specific tasks
  • Competence Models
  • Make statements about what a given user knows and
    how this knowledge might be organised.

95
Sequence vs. Process vs. Grammar
  • Sequence Models
  • Define activity simply in terms of sequences of
    operations that can be quantified
  • Process Models
  • Simple model of mental activity but define the
    steps needed to perform tasks
  • Grammatical Models
  • Model required knowledge in terms of sentences

96
Process Models
  • Production systems
  • GOMS

97
Production Systems
  • Rules: (procedural) knowledge
  • Working memory: the state of the world
  • Control strategies: the way of applying knowledge

98
Production Systems
  • Architecture of a production system

99
The Problem of Control
  • Rules are useless without a useful way to apply
    them
  • Need a consistent, reliable, useful way to
    control the way rules are applied
  • Different architectures / systems use different
    control strategies to produce different results

100
Forward Chaining
[Diagram: forward chaining over facts C and A using rules such as 'If not C then GOAL'; a small sketch follows the next slide]
101
Backward Chaining
Need GOAL
If not C then GOAL
Need not C
If A and B then not C
Need B
If A then B
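
A small forward-chaining sketch using the three rules that appear on these two slides; the starting fact set is an assumption for illustration.

```python
# Forward chaining with the three rules from the slides: starting from fact A,
# rules fire whenever their conditions are satisfied until GOAL is derived.
rules = [
    ({"A", "B"}, "not C"),       # If A and B then not C
    ({"A"}, "B"),                # If A then B
    ({"not C"}, "GOAL"),         # If not C then GOAL
]

facts = {"A"}
changed = True
while changed:
    changed = False
    for conditions, conclusion in rules:
        if conditions <= facts and conclusion not in facts:
            facts.add(conclusion)
            print("derived:", conclusion)
            changed = True

print("GOAL reached:", "GOAL" in facts)
```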
102
Production Systems
  • A simple metaphor

[Illustration: ships and docks]
103
Production Systems
  • Ships must fit the correct dock
  • When one ship is docked, another can be launched

104
Production Systems
105
Production Systems
106
Production Rules
  • IF condition
  • THEN action
  • e.g.,
  • IF ship is docked
  • And free-floating ships
  • THEN launch ship
  • IF dock is free
  • And Ship matches
  • THEN dock ship (a minimal match-fire sketch follows)
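
A toy match-fire cycle in the spirit of the ships-and-docks rules. The specific working-memory facts and the way each action rewrites them are invented for illustration.

```python
# Working memory is a set of facts; each cycle, the first rule whose
# conditions are all present fires and rewrites working memory.
wm = {"dock is free", "ship matches dock", "free-floating ships"}

rules = [
    ("dock ship",
     {"dock is free", "ship matches dock"},
     lambda m: (m - {"dock is free"}) | {"ship is docked"}),
    ("launch ship",
     {"ship is docked", "free-floating ships"},
     lambda m: (m - {"free-floating ships"}) | {"ship launched"}),
]

fired = True
while fired:
    fired = False
    for name, conditions, action in rules:
        if conditions <= wm:        # all conditions present in working memory
            print("fire:", name)
            wm = action(wm)
            fired = True
            break

print("final working memory:", sorted(wm))
```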

107
The Parsimonious Production Systems Rule Notation
  • On any cycle, any rule whose conditions are
    currently satisfied will fire
  • Rules must be written so that a single rule will
    not fire repeatedly
  • Only one rule will fire on a cycle
  • All procedural knowledge is explicit in these rules rather than implicit in the interpreter

108
Worked Example The Tower of Hanoi
[Diagram: pegs A, B, C with discs 1 - 5 stacked on peg A]
109
Possible Steps 1
  • Disc 1 from a to c
  • Disc 2 from a to b
  • Disc 1 from c to a
  • Disc 3 from a to c
  • Disc 2 from b to c
  • Disc 1 from a to c

110
Worked Example The Tower of Hanoi
[Diagram: pegs A, B, C with the discs in an intermediate configuration]
111
Possible Steps 2
  • Disc 4 from a to b
  • Disc 1 from c to b
  • Disc 2 from c to a
  • Disc 1 from b to a
  • Disc 2 from a to b
  • Disc 3 from a to b

112
Worked Example The Tower of Hanoi
[Diagram: pegs A, B, C with the discs in a later configuration]
113
Possible Steps 3
  • Disc 5 from a to c
  • Disc 1 from b to a
  • Disc 2 from b to c
  • Disc 1 from a to c
  • Disc 3 from b to a
  • Disc 1 from c to b
  • Disc 2 from c to a

  • Disc 4 from b to c
  • Disc 1 from a to c
  • Disc 2 from a to b
  • Disc 1 from c to b
  • Disc 3 from a to c
  • Disc 1 from b to a
  • Disc 2 from b to c
  • Disc 1 from a to c
114
Simon's (1975) goal-recursive logic (a short recursive sketch follows this list)
  • To get the 5-tower to Peg C, get the 4-tower to
    Peg B, then move
  • The 5-disc to Peg C, then move the 4-tower to
    Peg C
  • To get the 4-tower to Peg B, get the 3-tower to
    Peg C, then move
  • The 4-disc to Peg B, then move the 3-tower to
    Peg B
  • To get the 3-tower to Peg C, get the 2-tower to
    Peg B, then move
  • The 3-disc to Peg C, then move the 2-tower to
    Peg C,
  • To get the 2-tower to Peg B, move the 1-disc to
    Peg C, then move
  • The 2-disc to Peg B, then move the 1-disc to Peg
    A
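
The same recursion written out in a few lines; for five discs it prints the standard 31-move solution.

```python
def move_tower(n, source, target, spare):
    """Simon's (1975) goal recursion: to move an n-tower, move the (n-1)-tower
    to the spare peg, move disc n, then move the (n-1)-tower onto disc n."""
    if n == 0:
        return
    move_tower(n - 1, source, spare, target)
    print(f"Disc {n} from {source} to {target}")
    move_tower(n - 1, spare, target, source)

move_tower(3, "A", "C", "B")   # 7 moves; move_tower(5, "A", "C", "B") gives 31
```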

115
Production Rule 1
  • SUBGOAL_DISCS
  • IF the goal is to achieve a particular
    configuration of discs
  • And Di is on Px but should go to Py in the
    configuration
  • And Di is the largest disc out of place
  • And Dj is on Py
  • And Dj is smaller than Di
  • And Pz is clear OR has a disc larger than Dj
  • THEN set a subgoal to move the Dj tower to Pz
    and Di to Py

116
Production Rule 2
  • SUBGOAL_MOVE_DISC
  • IF the goal is to achieve a particular
    configuration of discs
  • And Di is on Px but should go to Py in the
    configuration
  • And Di is the largest disc out of place
  • And Py is clear
  • THEN move Di to Py

117
Goals, Operators, Methods, Selection rules (GOMS) - Card, Moran and Newell, 1983
  • Human activity modelled by Model Human Processor
  • Activity defined by GOALS
  • Goals held in Stack
  • Goals pushed onto stack
  • Goals popped from stack

118
Goals
  • Symbolic structures to define desired state of
    affairs and methods to achieve this state of
    affairs
  • GOAL: EDIT-MANUSCRIPT (top-level goal)
  • GOAL: EDIT-UNIT-TASK (specific subgoal)
  • GOAL: ACQUIRE-UNIT-TASK (get next step)
  • GOAL: EXECUTE-UNIT-TASK (do next step)
  • GOAL: LOCATE-LINE (specific step)

119
Operators
  • Elementary perceptual, motor or cognitive acts
    needed to achieve subgoals
  • Get-next-line
  • Use-cursor-arrow-method
  • Use-mouse-method

120
Methods
  • Descriptions of procedures for achieving goals
  • Conditional upon contents of working memory and
    state of task
  • GOAL ACQUIRE-UNIT-TASK
  • GET-NEXT-PAGE if at end of manuscript
  • GET-NEXT-TASK

121
Selection
  • Choose between competing Methods, if more than one
  • GOAL: EXECUTE-UNIT-TASK
  • GOAL: LOCATE-LINE
  • select: if hands on keyboard
  • and less than 5 lines to move
  • USE CURSOR KEYS
  • else
  • USE MOUSE (a minimal sketch of this selection rule follows)
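
The selection rule above as a tiny function; the 5-line threshold and the method names come straight from the slide.

```python
def locate_line_method(hands_on_keyboard, lines_to_move):
    """Selection rule from slide 121: prefer cursor keys for short moves
    when the hands are already on the keyboard, otherwise use the mouse."""
    if hands_on_keyboard and lines_to_move < 5:
        return "USE-CURSOR-KEYS"
    return "USE-MOUSE"

print(locate_line_method(True, 3))    # -> USE-CURSOR-KEYS
print(locate_line_method(True, 12))   # -> USE-MOUSE
```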

122
Example
  • Withdraw cash from ATM
  • Construct task model
  • Define production rules

123
Task Model
  • Method for goal: obtain cash from ATM
  • Step 1: access ATM
  • Step 2: select cash option
  • Step 3: indicate amount
  • Step 4: retrieve cash and card
  • Step 5: end task

124
Production Rules
  • ((GOAL USE ATM TO OBTAIN CASH)
  • ADD-UNIT-TASK (access ATM)
  • ADD-WM-UNIT-TASK (access ATM)
  • ADD-TASK-STEP (insert card in slot)
  • SEND-TO-MOTOR(place card in slot)
  • SEND-TO-MOTOR (eyes to slot)
  • SEND-TO-PERCEPTUAL (check card in)
  • ADD (WM performing card insertion)
  • ADD-TASK-STEP (check card insertion)
  • DELETE-UNIT-TASK (access ATM)
  • ADD-UNIT-TASK (enter PIN)

125
Problems with GOMS
  • Assumes error-free performance
  • Even experts make mistakes
  • MHP grossly simplifies human information processing
  • Producing a task model of non-existent products
    is difficult

126
Task Action Grammar
  • GOMS assumes expert knows operators and methods
    for tasks
  • TAG assumes expert knows simple tasks, i.e.,
    tasks that can be performed without
    problem-solving

127
TAG and competence
  • Competence
  • Defines what an ideal user would know
  • TAG relies on world knowledge
  • up vs down
  • left vs right
  • forward vs backward

128
Task-action Grammar
  • Grammar relates simple tasks to actions
  • Generic rule schema covering combinations of
    simple tasks

129
TAG
  • A grammar
  • maps
  • Simple tasks
  • Onto
  • Actions
  • To form
  • an interaction language
  • To investigate
  • consistency

130
Consistency
  • Syntactic: use of expressions
  • Lexical: use of symbols
  • Semantic-syntactic alignment: order of terms
  • Semantic: principle of completeness

131
Procedure
  • Step 1: Write out commands and their structures
  • Step 2: Determine if commands have a consistent structure
  • Step 3: Place command items into variable/feature relationships
  • Step 4: Generalise commands by separating into task features, simple tasks, task-action rule schemas
  • Step 5: Expand parts of the task into primitives
  • Step 6: Check to ensure all names are unique

132
Example
  • Setting up a recording on a video-cassette
    recorder (VCR)
  • Assume that all control is via the front panel and that the user can only use the up and down arrows

133
Feature list for a VCR
  • Property: Date, Channel, Start, End
  • Value: number
  • Frequency: Daily, Weekly
  • Record: on, off

134
Simple tasks
  • SetDate: Property = Date, Value = US, Frequency = Daily
  • SetDate: Property = Date, Value = US, Frequency = Weekly
  • SetProg: Property = Prog, Value = US
  • SetStart: Property = Start, Value = US, Record = on
  • SetEnd: Property = End, Value = US, Record = off

135
Rule Schema
  • 1. Task[Property = US, Value] → SetValue[Value]
  • 2. Task[Property = Date, Value, Frequency = US] → SetValue[Value] + press up/down until Frequency = US
  • 3. Task[Property = Start, Value] → SetValue[Value] + press Rec
  • 4. SetValue[Value = US] → press up/down until Value = US
  • 5. SetValue[Value = US] → use up/down until Value = US
  • (a rough mechanisation of these schemas follows)
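
A rough mechanisation of schemas 1-4, sketched under the assumption that a simple task can be written as a dictionary of features. The function names and example values are hypothetical; this only illustrates how the schemas rewrite tasks into action sequences and is not formal TAG notation.

```python
# A simple task is a dict of features; expansion rewrites it into a flat
# action sequence, loosely following the rule schemas above.
def set_value(value):
    # Rule 4: SetValue[Value = US] -> press up/down until Value = US
    return [f"press up/down until value = {value}"]

def expand(task):
    if task.get("Property") == "Date":                       # Rule 2
        return set_value(task["Value"]) + \
               [f"press up/down until frequency = {task['Frequency']}"]
    if task.get("Property") == "Start":                      # Rule 3
        return set_value(task["Value"]) + ["press Rec"]
    return set_value(task["Value"])                          # Rule 1

# Hypothetical example values
print(expand({"Property": "Date", "Value": "12/05", "Frequency": "Weekly"}))
print(expand({"Property": "Start", "Value": "19:30"}))
```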

136
Architectures for Cognition
137
Why Cognitive Architecture?
  • Computer architectures
  • Specify components and their connections
  • Define functions and processes
  • Cognitive Architectures could be seen as the
    logical conclusion of the human-brain-as-computer
    hypothesis

138
Why do this?
  • Philosophy: provide a unified understanding of the mind
  • Psychology: account for experimental data
  • Education: provide cognitive models for intelligent tutoring systems and other learning environments
  • Human-Computer Interaction: evaluate artifacts and help in their design
  • Computer Generated Forces: provide cognitive agents to inhabit training environments and games
  • Neuroscience: provide a framework for interpreting data from brain imaging

139
General Requirements
  • Integration of cognition, perception, and action
  • Robust behavior in the face of error, the
    unexpected, and the unknown
  • Ability to run in real time
  • Ability to Learn
  • Prediction of human behavior and performance

140
Architectures
  • Model Human Processor (MHP)
  • Card, Moran and Newell (1983)
  • ACT-R
  • Anderson (1993)
  • EPIC
  • Meyer and Kieras (1997)
  • SOAR
  • Laird, Rosenbloom and Newell (1987)

141
Model Human Processor
  • Three interacting subsystems
  • Perceptual
  • Auditory image store
  • Visual image store
  • Cognitive
  • Working memory
  • Long-term memory
  • Motor

142
Parameters of MHP
143
Average data for MHP
  • Long-term memory: effectively unlimited capacity and duration
  • Working memory: 3 - 7 chunks, 7s
  • Visual image store: 17 letters, 200ms
  • Auditory image store: 5 letters, 1500ms
  • Cognitive processor: 100ms
  • Perceptual processor: 70ms
  • Motor processor: 70ms

144
Conclusions
  • Simple description of cognition
  • Uses standard times for prediction
  • Uses production rules for defining and combining
    tasks (with GOMS formalism)

145
Adaptive Control of Thought - Rational (ACT-R)
http://act.psy.cmu.edu
146
Adaptive Control of Thought, Rational (ACT-R)
  • ACT-R: a symbolic level realised over a subsymbolic mechanism
  • Symbolic aspect in two parts
  • Production memory
  • Symbolic memory (declarative memory)
  • Theory of rational analysis

147
Theory of Rational Analysis
  • Evidence-based assumptions about environment
    (probabilities)
  • Deriving optimal strategies (Bayesian)
  • Assuming that optimal strategies reflect human
    cognition (either what it actually does or what
    it probably ought to do)

148
Notions of Memory
  • Procedural
  • Knowing how
  • Described in ACT by Production Rules
  • Declarative
  • Knowing that
  • Described in ACT by chunks
  • Goal Stack
  • A sort of working memory
  • Holds chunks (goals)
  • Top goal pushed (like GOMS)
  • Writeable

149
Production Rules
  • Knowing how to do X
  • Production rule set of conditions and an action
  • IF it is raining
  • And you wish to go out
  • THEN pick up your umbrella

150
(Very simple) ACT
  • Network of propositions
  • Production rules selected via pattern matching.
    Production rules coordinate retrieval of chunks
    from symbolic memory and link to environment.
  • If information in working memory matches
    production rule condition, then fire production
    rule

151
ACT
[Diagram: declarative memory and procedural memory linked to working memory via retrieval/storage and match/execution; working memory connects to the outside world through encoding and performance]
152
Knowledge Representation

[Diagram: chunk structure for the column addition 16 + 18 = 34, with slots addend1 = six, addend2 = eight, and the sum broken into columns U (4), T (1), H (0); buffer contents - Goal: add numbers in the right-most column; Visual: 6, 8; Retrieval: 14]
153
Symbolic / Subsymbolic levels
  • Symbolic level
  • Information as chunks in declarative memory, and
    represented as propositions
  • Rules as productions in procedural memory
  • Subsymbolic level
  • Chunks given parameters which are used to
    determine the probability that the chunk is
    needed
  • Base-level activation (relevance)
  • Context activation (association strengths)

154
Conflict resolution
  • Order production rules by preference
  • Select top rule in list
  • Preference defined by
  • Probability that rule will lead to goal
  • Time associated with rule
  • Likely cost of reaching goal when using sequence
    involving this rule

155
Example
  • Activity: find target and then use mouse to select target
  • Hunt_Feature
  • IF goal is to find target with feature F
  • AND there is an object X on screen
  • THEN move attention to object X
  • Found_target
  • IF goal is to find target with feature F
  • AND target with F is in location L
  • THEN move mouse to L and click

156
Example
  • Model reaction time to target
  • Assume attention switching increases linearly with each new position
  • Assume probability of feature X in location y = 0.53
  • Assume attention switch = 185ms
  • Therefore, reaction time = 185 × 0.53 ≈ 98ms per position
  • Empirical data has RT of 103ms per position

157
Example
  • Assume target in a field of distractors
  • P = 0.42
  • Therefore, 185 × 0.42 ≈ 78ms per position
  • Empirical data: 80ms per position (the arithmetic is repeated in the sketch below)
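
The arithmetic from these two examples, with the empirical values from the slides alongside the predictions:

```python
# Predicted reaction time per position = attention-switch time x probability
# that the feature is in the attended location (slides 156-157).
switch_ms = 185

for p, observed_ms in ((0.53, 103), (0.42, 80)):
    predicted = switch_ms * p
    print(f"p={p}: predicted {predicted:.0f} ms/position, observed {observed_ms} ms")
```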

158
Learning
  • Symbolic level
  • Learning defined by adding new chunks and
    productions
  • Subsymbolic level
  • Adjustment of parameters based on experience

159
Conclusions
  • ACT uses simple production system
  • ACT provides some quantitative prediction of
    performance
  • Rationality: optimal adaptation to the environment

160
Executive Process Interactive Control (EPIC)
ftp://ftp.eecs.umich.edu/people/kieras
161
Executive Process Interactive Control (EPIC)
  • Focus on multiple task performance
  • Cognitive Processor runs production rules and
    interacts with perceptual and motor processors

162
EPIC parameters
  • FIXED
  • Connections and mechanisms
  • Time parameters
  • Feature sets for motor processors
  • Task-specific production rules and perceptual
    encoding types
  • FREE
  • Production rules for tasks
  • Unique perceptual and motor processors
  • Task instance set
  • Simulated task environment

163
EPIC
[Diagram: EPIC architecture - auditory and visual perceptual processors feed working memory; a production rule interpreter, drawing on production memory and long-term memory, drives speech, manual, and tactile motor processors; perception and action connect to the task environment and its display]
Production Memory
  • Perceptual processors controlled by production
    rules
  • Production Rules held in Production Memory
  • Production Rule Interpreter applies rules to
    perceptual processes

165
Working Memory
  • Limited capacity (or duration of 4s) and holds
    current production rules
  • Cognitive processor updates every 50ms
  • On each update, the perceptual input, an item from production memory, and the next action are held in working memory

166
Resolving Conflict
  • Production rules applied to executive tasks to
    handle resource conflict and scheduling
  • Conflict dealt with in production rule
    specification
  • Lockout
  • Interleaving
  • Strategic response deferment

167
Example
[Diagram: two concurrent task threads (stimulus -> perceptual process -> cognitive process / response selection -> memory process -> response) coordinated by an executive process: move eyes to stimulus two, enable task one and task two, wait for task one to complete, give task two permission, end of trial]
168
Conclusions
  • Modular structure supports parallelism
  • EPIC does not have a goal stack and does not
    assume sequential firing of goals
  • Goals can be handled in parallel (provided there
    is no resource conflict)
  • Does not support learning

169
States, Operators, And Reasoning (SOAR)
http://www.isi.edu/soar/soar.html
170
States, Operators, And Reasoning (SOAR)
  • Successor to the General Problem Solver (Newell and Simon, 1960)
  • SOAR seeks to apply operators to states within a problem space to achieve a goal
  • SOAR assumes that the actor uses all available knowledge in problem-solving

171
Soar as a Unified Theory of Cognition
  • Intelligence = problem solving + learning
  • Cognition seen as search in problem spaces
  • All knowledge is encoded as productions
  • → a single type of knowledge
  • All learning is done by chunking
  • → a single type of learning

172
Young, R.M., Ritter, F., & Jones, G. (1998) "Online Psychological Soar Tutorial", available at http://www.psychology.nottingham.ac.uk/staff/Frank.Ritter/pst/pst-tutorial.html
173
SOAR Activity
  • Operators: transform a state via some action
  • State: a representation of possible stages of progress in the problem
  • Problem space: the states and operators that can be used to achieve a goal
  • Goal: some desired situation

174
SOAR Activity
  • Problem solving: applying an Operator to a State in order to move through a Problem Space to reach a Goal
  • Impasse: where an Operator cannot be applied to a State, so it is not possible to move forward in the Problem Space; this becomes a new problem to be solved
  • Soar can learn by storing solutions to past problems as chunks and applying them when it encounters the same problem again

175
SOAR Architecture
[Diagram: production memory (pattern -> action rules) and a chunking mechanism connected to working memory (objects, preferences) via a working memory manager, with a decision procedure operating over the conflict/goal stack]
176
Explanation
  • Working Memory
  • Data for current activity, organized into objects
  • Production Memory
  • Contains production rules
  • Chunking mechanism
  • Collapses successful sequences of operators into
    chunks for re-use

177
3 levels in Soar
  • Symbolic: the programming level
  • Rules programmed into Soar that match circumstances and perform specific actions
  • Problem space: states and goals
  • The set of goals, states, operators, and context
  • Knowledge: embodied in the rules
  • The knowledge of how to act on the problem/world, how to choose between different operators, and any learned chunks from previous problem solving

178
How does it work?
  • A problem is encoded as a current state and a
    desired state (goal)
  • Operators are applied to move from one state to
    another
  • There is success if the desired state matches the
    current state
  • Operators are proposed by productions, with
    preferences biasing choices in specific
    circumstances
  • Productions fire in parallel

179
Impasses
  • If no operator is proposed, or if there is a tie
    between operators, or if Soar does not know what
    to do with an operator, there is an impasse
  • When there are impasses, Soar sets a new goal
    (resolve the impasse) and creates a new state
  • Impasses may be stacked
  • When one impasse is solved, Soar pops up to the
    previous goal

180
Learning
  • Learning occurs by chunking the conditions and
    the actions of the impasses that have been
    resolved
  • Chunks can be used immediately in further problem-solving behaviour

181
The Switchyard video
182
Conclusions
  • It may be too "unified"
  • Single learning mechanism
  • Single knowledge representation
  • Uniform problem state
  • It does not take neuropsychological evidence into
    account (cf. ACT-R)
  • There may be non-symbolic intelligence, e.g., neural nets, that is not abstractable to the symbolic level

183
Comparison of Architectures
184
The Role of Models in Design
185
User Models in Design
  • Benchmarking
  • Human Virtual Machines
  • Evaluation of concepts
  • Comparison of concepts
  • Analytical prototyping

186
Benchmarking
  • What times can users be expected to take to perform a task?
  • Training criteria
  • Evaluation criteria (under ISO9241)
  • Product comparison

187
Human Virtual Machine
  • How might the user perform?
  • Make assumptions explicit
  • Contrast views

188
Evaluation of Concepts
  • Which design could lead to better performance?
  • Compare concepts using models prior to building
    prototype
  • Use performance of existing product as benchmark

189
Reliability of Models
  • Agreement of predictions with observations
  • Agreement of predictions by different analysts
  • Agreement of model with theory

190
Comparison with Theory
  • Approximation of human information processing
  • Assumes linear, error-free performance
  • Assumes strict following of correct procedure
  • Assumes the correct procedure is the only way
  • Assumes actions can be timed

191
KLM Validity
Predicted values lie within 20% of observed values
192
Comparison of KLM predictions with times from user trials
[Plot of total time (s) over trials 1 - 7: CUI - predicted 15.84s, observed mean 15.37s, error 2.9%; GUI - predicted 11.05s, observed mean 8.64s, error 22%]
193
Inter / Intra-rater Reliability
  • Inter-rater
  • Correlation between several analysts: 0.754
  • Intra-rater
  • Correlation for the same analyst on several occasions: 0.916
  • Validity
  • Correlation with actual performance: 0.769

Stanton and Young, 1992
194
How to compare single data points?
  • Models typically produce a single prediction
  • How can one value be compared against a set of
    data?
  • How can a null hypothesis be proved?

195
Liao and Milgram (1991)
[Number line showing the derived value D relative to the actual value A, with bands of ± δ·sd around the points of comparison]
196
Defining terms
  • A: actual values, with observed standard deviation (sd)
  • D: derived values
  • δ = 5 (P
  • δ = 20 (P

197
Acceptance Criteria
  • Accept H0 if |A - D| ≤ δ·sd
  • Reject H0 if D < A - δ·sd
  • Reject H0 if D > A + δ·sd

198
Analytical Prototyping
  • Functional analysis
  • Define features and functions
  • Development of design concepts, e.g., sketches
    and storyboards
  • Scenario-based analysis
  • How people pursue defined goals
  • State-based descriptions
  • Structural analysis
  • Predictive evaluation
  • Testing to destruction

199
Analytical Prototyping
  • Functional analysis
  • Scenario-based analysis
  • Structural analysis

200
Rewritable Routines
  • Mental models
  • Imprecise, incomplete, inconsistent
  • Partial representations of product and procedure
    for achieving subgoal
  • Knowledge recruited in response to system image

201
Simple Architecture
202
Global Prototypical Routines
  • Stereotyped Stimulus-Response compatibilities
  • Generalisable product knowledge

203
State-specific Routines
  • Interpretation of system image
  • Feature evolution
  • Expectation of procedural steps
  • Situated / Opportunistic planning

204
Describing Interaction
  • State-space diagrams
  • Indication of system image
  • Indication of user action
  • Prediction of performance

205
State-space Diagram
  • State number
  • System image
  • Waiting for
  • Transitions

206
Defining Parameters
207
Developing Models
[Transition network. Nodes and unit times: Start 0ms; Recall plan 1380ms; Wrong plan 1380ms; Press Play 200ms; Cycle through menu 800ms; Press Playmode 200ms; Press Enter 0ms; Switch off 300ms; Press Other Key 200ms; Press Play 0ms. Transition probabilities include 0.997/0.003, 0.74/0.26, 0.9996/0.0004, and 1.]
208
Results
209
What is the point?
  • Are these models useful to designers?
  • Are these models useful to theorists?

210
Task Models - problems
  • Task models take time to develop
  • They may not have high inter-rater reliability
  • They cannot deal easily with parallel tasks
  • They ignore social factors

211
Task Models - benefits
  • Models are abstractions: you always leave something out
  • The benefits of the process of creating a task model might outweigh these problems
  • Task models highlight task sequences and can be
    used to define metrics

212
Task Models for Theorists
  • Task models are engineering approximations
  • Do they actually describe how human information
    processing works?
  • Do they need to?
  • Do they describe cognitive operations, or just
    actions?

213
Some Background Reading
  • Dix, A. et al., 1998, Human-Computer Interaction (chapters 6 and 7), London: Prentice Hall
  • Anderson, J.R., 1983, The Architecture of Cognition, Cambridge, MA: Harvard University Press
  • Card, S.K. et al., 1983, The Psychology of Human-Computer Interaction, Hillsdale, NJ: LEA
  • Carroll, J., 2003, HCI Models, Theories and Frameworks: Toward a Multidisciplinary Science (chapters 1, 3, 4, 5), San Francisco, CA: Morgan Kaufmann