Transcript and Presenter's Notes

Title: A case study in UI design and evaluation for computer security


1
A case study in UI design and evaluation for
computer security
  • Rob Reeder
  • January 30, 2008

2
Memogate: A user interface scandal!!
3
Overview
  • Task domain: Windows XP file permissions
  • Design of two user interfaces: native XP interface, Salmon
  • Evaluation: Which interface was better?
  • Analysis: Why was one better?

4
Part 1: File permissions in Windows XP
  • File permissions task: Allow authorized users access to resources, deny unauthorized users access to resources
  • Resources: Files and folders
  • Users: People with accounts on the system
  • Access: 13 types, such as Read Data, Write Data, Execute, Delete (a bit-flag sketch follows below)
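As a hypothetical sketch of how a program might model those access types, Windows-style permissions are commonly represented as bit flags. The enum below names only the four types mentioned above and is illustrative, not the actual Win32 constants:

  from enum import IntFlag, auto

  # Illustrative model of XP-style access types as bit flags; names and
  # values are hypothetical, not the real Win32 ACCESS_MASK constants.
  # XP defines 13 such types in all; only the four named above are shown.
  class Access(IntFlag):
      READ_DATA = auto()
      WRITE_DATA = auto()
      EXECUTE = auto()
      DELETE = auto()

  granted = Access.READ_DATA | Access.EXECUTE
  print(Access.WRITE_DATA in granted)   # False: WRITE_DATA was not granted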

5
Challenges for file permissions UI design
  • Maybe thousands of users: impossible to set permissions individually for each
  • Thirteen access types: hard for a person to remember them all

6
Grouping to handle users
  • Administrators
  • Power Users
  • Everyone
  • Admin-defined

7
A problematic user grouping
[Diagram: seven users (Xu, Ari, Miguel, Bill, Yasir, Cindy, Zack) split across overlapping Group A and Group B]
8
Precedence rules
  • No setting: Deny by default
  • Allow > No setting
  • Deny > Allow
  • (> means "takes precedence over"; see the sketch below)
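A minimal sketch of these precedence rules in code, assuming each check gathers the Allow/Deny settings that apply to a user (their own entry plus all group entries) for one access type; illustrative, not Windows' implementation:

  def effective_access(entries):
      """entries: the "allow"/"deny" settings gathered from the user's own
      entry and all of the user's groups, for a single access type."""
      if "deny" in entries:
          return False   # Deny > Allow
      if "allow" in entries:
          return True    # Allow > No setting
      return False       # No setting: deny by default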

9
Grouping to handle access types
[Diagram: grouping of the 13 access types; e.g., Execute]
10
Moral
  • Setting file permissions is quite complicated
  • But a good interface design can help!

11
The XP file permissions interface
12
The Salmon interface
13
Expandable Grid
14
Example task: Wesley
  • Initial state:
  • Wesley allowed READ & WRITE from a group
  • Final state:
  • Wesley allowed READ, denied WRITE
  • What needs to be done:
  • Deny Wesley WRITE (worked through in code below)
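Using the hypothetical effective_access sketch from the precedence slide, the Wesley task plays out like this:

  # Before the change, Wesley's WRITE carries one Allow entry (from a group).
  print(effective_access(["allow"]))          # True: WRITE allowed

  # The fix adds a Deny entry for Wesley himself; Deny > Allow.
  print(effective_access(["allow", "deny"]))  # False: WRITE denied

  # READ still carries only the group's Allow entry, so it stays allowed.
  print(effective_access(["allow"]))          # True: READ allowed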

15
What's so hard?
  • Conceptually: Nothing!
  • Pragmatically:
  • User doesn't know initial group membership
  • Not clear what changes need to be made
  • Checking work is hard

16
Learning Wesley's initial permissions (screenshot walkthrough)
  1. Click Advanced
  2. Click Effective Permissions
  3. Select Wesley
  4. View Wesley's Effective Permissions
17
Learning Wesley's group membership (screenshot walkthrough)
  5. Bring up Computer Management interface
  6. Click on Users
  7. Double-click Wesley
  8. Click Member Of
  9. Read Wesley's group membership
18
Changing Wesley's permissions (screenshot walkthrough)
  10. Click Add
  11. Deny Write
  12. Click Apply
19
Checking work (screenshot walkthrough)
  13. Click Advanced
  14. Click Effective Permissions
  15. Select Wesley
  16. View Wesley's Effective Permissions
20
XP file permissions interface: Poor
21
Part 2: Common security UI design problems
  • Poor feedback
  • Ambiguous labels
  • Violation of conventions
  • Hidden options
  • Omission errors

22
Problem 1: Poor feedback (screenshot walkthrough)
  1. Click Advanced
  2. Click Effective Permissions
  3. Select Wesley
  4. View Wesley's Effective Permissions
23
Salmon: immediate feedback
24
Grid: consolidated feedback
25
Problem 2: Labels (1/3)
26
Problem 2: Labels (2/3)
27
Salmon: clearer labels
28
Grid: fewer, clearer labels
29
Problem 3: Violating interface conventions
30
Problem 3: Violating interface conventions
31
Salmon: better checkboxes
32
Grid: direct manipulation
33
Problem 4: Hidden options
34
Problem 4: Hidden options (screenshot walkthrough)
  1. Click Advanced
  2. Double-click entry
  3. Click Delete checkbox
35
Salmon: All options visible
36
Grid: Even more visibility
37
Problem 5: Omission errors
38
Salmon: Feedback helps prevent omission errors
39
Grid: No omission errors
40
FLOCK: Summary of design problems
  • Feedback poor
  • Labels ambiguous
  • Omission error potential
  • Convention violation
  • Keeping options visible

41
Part 3: Evaluation of XP and Salmon
  • Conducted laboratory-based user studies
  • Formative and summative studies for Salmon
  • I'll focus on summative evaluation

42
Advice for user studies
  • Know what you're measuring!
  • Maintain internal validity
  • Maintain external validity

43
Common usable security metrics
  • Accuracy: with what probability do users correctly complete tasks?
  • Speed: how quickly can users complete tasks?
  • Security: how difficult is it for an attacker to break into the system?
  • Etc.: satisfaction, learnability, memorability

44
Measure the right things!
  • Speed is often useless without accuracy (e.g.,
    setting file permissions)
  • Accuracy may be useless without security (e.g.,
    easy-to-remember passwords)

45
Measurement instruments
  • Speed: Easy; use a stopwatch, time users (see the sketch below)
  • Accuracy: Harder; need unambiguous definitions of success and failure
  • Security: Very hard; may require serious math, or lots of hackers
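For the speed instrument, a stopwatch really can be this simple. A minimal sketch of a timing harness (hypothetical, not the study's actual instrumentation):

  import time

  # Hypothetical stopwatch for the speed metric: wall-clock time from
  # task start until the participant declares completion.
  start = time.perf_counter()
  input("Press Enter when you have finished the task...")
  elapsed = time.perf_counter() - start
  print(f"Time to task completion: {elapsed:.1f}s")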

46
Internal validity
  • Internal validity: Making sure your results are due to the effect you are testing
  • Manipulate one variable (in our case, the interface: XP or Salmon)
  • Control or randomize other variables:
  • Use same experimenter
  • Experimenter reads directions from a script
  • Tasks presented in same text to all users
  • Assign tasks in different order for each user
  • Assign users randomly to one condition or other (see the sketch below)
47
External validity
  • External validity: Making sure your experiment can be generalized to the real world
  • Choose real tasks
  • Sources of real tasks:
  • Web forums
  • Surveys
  • Your own experience
  • Choose real participants
  • We were testing novice or occasional file-permissions users with technical backgrounds (so CMU students & staff fit the bill)

48
User study compared Salmon to XP
  • Seven permissions-setting tasks; I'll discuss two:
  • Wesley
  • Jack
  • Metrics for comparison:
  • Accuracy (measured as deviations in users' final permission bits from correct permission bits; see the sketch below)
  • Speed (time to task completion)
  • Not security: left that to Microsoft
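A minimal sketch of that accuracy measure, assuming a final permission state can be read out as a mapping from (user, access type) to a setting; the representation is illustrative, not the study's actual scoring code:

  # Hypothetical accuracy instrument: count permission bits in the
  # participant's final state that deviate from the correct target state.
  def deviations(final_bits, correct_bits):
      """Both arguments map (user, access_type) -> "allow" | "deny" | None."""
      keys = set(final_bits) | set(correct_bits)
      return sum(final_bits.get(k) != correct_bits.get(k) for k in keys)

  # Wesley task: a correct answer leaves READ allowed and WRITE denied.
  correct = {("Wesley", "READ"): "allow", ("Wesley", "WRITE"): "deny"}
  final   = {("Wesley", "READ"): "allow", ("Wesley", "WRITE"): None}  # forgot the deny
  print(deviations(final, correct))  # 1 deviation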

49
Study design
  • Between-participants comparison of interfaces
  • 12 participants per interface, 24 total
  • Participants were technical staff and students at
    Carnegie Mellon University
  • Participants were novice or occasional file
    permissions users

50
Wesley and Jack tasks

Wesley task:
  • Initial state: Wesley allowed READ & WRITE
  • Final state: Wesley allowed READ, denied WRITE
  • What needs to be done: Deny Wesley WRITE

Jack task:
  • Initial state: Jack allowed READ, WRITE, & ADMINISTRATE
  • Final state: Jack allowed READ, denied WRITE & ADMINISTRATE
  • What needs to be done: Deny Jack WRITE & ADMINISTRATE

51
Salmon outperformed XP in accuracy
[Bar charts: accuracy on the Wesley and Jack tasks, Salmon vs. XP]
52
Salmon outperformed XP in accuracy
[Bar charts: accuracy, Salmon vs. XP, annotated with significance: p = 0.09 and p < 0.0001]
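The deck reports p-values without naming the statistical test. As one hedged illustration only: if per-participant accuracy were scored as task success or failure, two 12-person groups could be compared with Fisher's exact test. The counts below are made up for the example, not study data:

  from scipy.stats import fisher_exact

  # Hypothetical illustration: the deck does not name the test behind its
  # p-values. These success/failure counts are invented, not study data.
  salmon = {"success": 11, "failure": 1}
  xp = {"success": 4, "failure": 8}

  table = [[salmon["success"], salmon["failure"]],
           [xp["success"], xp["failure"]]]
  oddsratio, p = fisher_exact(table)
  print(f"p = {p:.4f}")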
53
Salmon did not sacrifice speed
[Bar charts: time to task completion on the Wesley and Jack tasks, XP vs. Salmon]
54
Salmon did not sacrifice speed
[Bar charts: time to task completion, XP vs. Salmon, annotated with significance: p = 0.35 and p = 0.20]
55
Part 4: Analysis
  • What led Salmon users to better performance?

56
How users spent their time - Wesley
57
Where Salmon did better - Wesley
58
Where XP did better - Wesley
59
How users spent their time - Jack
60
Where Salmon did better - Jack
61
Where XP did better - Jack
62
Common UI problems summary
  • Feedback poor
  • Labels ambiguous
  • Omission error potential
  • Convention violation
  • Keeping options visible

63
User interface evaluation summary
  • Know what you're measuring
  • Internal validity: Control your experiment
  • External validity: Make your experiment realistic

64
Rob Reeder, reeder@cs.cmu.edu, CMU Usable Privacy and Security Laboratory, http://cups.cs.cmu.edu/
65
x-x-x-x-x-x-x END x-x-x-x-x-x-x
66
Results
[Table: accuracy (%) and time (s) for the Grid vs. Windows interfaces, on small- and large-size permission settings, for nine task types: View simple, View complex, Change simple, Change complex, Compare groups, Conflict simple, Conflict complex, Memogate simulation, Precedence rule test; some cells had insufficient data]
67
Good UI design → Peace on Capitol Hill?
68
Measure the right thing!
  • "Keystroke dynamics analysis poses a real threat to any computer user. Hackers can easily determine a user's password by recording the sounds of the user's keystrokes. We address this issue by introducing a new typing method we call 'Babel Type', in which users hit random keys when asked to type in their passwords. We have built a prototype and tested it on 100 monkeys with typewriters. We discovered that our method reduces the keystroke attack by 100%. This approach could potentially eliminate all risks associated with keystroke dynamics and increase user confidence. It remains an open question, however, how to let these random passwords authenticate the users."