Title: Quick wins with usability testing
Quick wins with usability testing
- Dey Alexander
- Usability Specialist
- Web Resources and Development
- Ph ext 54740
Overview
- What is usability testing?
- Why we do usability testing
- How usability testing is conducted
- Designing usability tests: some important considerations
- Quick wins with usability testing
- Monash International site navigation
- ITS mobile phone procurement
What is usability?
- It is about being able to use a web site or application easily, quickly and with confidence, so that you are able to complete your task or achieve your goal
- 3 main dimensions:
- Effectiveness: can the user get the job done?
- Efficiency: how much time and effort was involved?
- Satisfaction: was the user satisfied? Will they want to use it again?
What is usability testing?
- A method for assessing the usability of a web site or application
- Involves real users
- Users perform real tasks
- Performance is observed and measured
- Measurements and other data are recorded
- Data are analysed
Broad goals of usability testing
- To improve usability
- See what works well and what causes users problems
- Make design changes to overcome these problems
- To compare usability
- Decide between competing products
- Decide between two (or more) interfaces
- See if a new design is better than the old design
- See if our competitors are better than us
Usability testing is not
- Functional testing, beta testing or user acceptance testing
- Not checking to see if the site/application functions as intended
- Not testing to see if the application is bug-free
- Not testing the robustness of the site/application
- Not testing to see if the site/application allows the user to perform the required tasks/functions as per the project specification
Usability testing is not
- Usability inspections
- Experts can tell you what they think might cause problems for users, but this is a professional opinion at best
- High usage
- Lots of hits does not mean the site/application is usable
Why we do usability testing
- We build web sites and applications to:
- Promote Monash to the world, generate business
- Provide information to staff, current students and other users
- Help staff do their work, and students do their study/research
- Reduce the number of phone calls received for help or information
- Reduce the work of staff by moving face-to-face work online
- Etc.
Why we do usability testing
- None of these objectives can be fully realised unless the site or application is usable:
- Effective
- Efficient
- Satisfying
- Only usability testing can tell us if this is so, and if not, what we need to improve
Basic steps in usability testing
- Determine your test objectives and concerns
- Design the test
- Which users will you recruit for testing?
- What tasks will the users perform?
- What will you measure?
- Run the test(s) and collect data
- Analyse data
- Report results and recommendations
Where is usability testing done?
- Often done in a high-tech usability lab:
- Multiple video cameras
- Recording equipment
- Scan converter showing the user's monitor display
- Observation room
- Data logging equipment
- Test monitor/facilitator
Classic usability lab setup
Usability lab photos
Discount usability testing
- Usability testing can be done without the technology
- Cheaper
- Can be done anywhere
- Don't even need a computer if you're testing a paper prototype
- Discount approach still requires planning
Formal usability testing methodology
- Used for research projects
- Formulate a hypothesis
- e.g. "Interface A will result in shorter task times and fewer errors than interface B."
- Participants randomly chosen
- Tight controls in place
- All participants should have a nearly identical experience
- Control groups used
- User sample of sufficient size
Use of formal methodology
- Not feasible in a commercial environment
- Purpose is different (not research-oriented)
- Fast-paced development environment is not compatible with a time-consuming, rigorous research approach
- Organisational constraints, including budget and resources
- Prerequisite knowledge of experimental method and statistics is considerable
Standard usability testing methodology
- Used for all other testing
- Problem statements or concerns replace hypotheses
- Representative samples of end users are selected, not necessarily at random
- Controls in the test environment are in place
- Control groups are usually not used
- Sample sizes are not statistically significant
Important considerations in usability testing
- Defining test goals and concerns
- Selecting tasks and writing task scenarios
- Deciding what data to collect
- Role of the test facilitator
- When to test
- We will be running training through SDU to cover a broader range of issues/techniques relevant to usability testing
Why we must define test goals and concerns
- Testing can only be successful when goals and concerns have been identified
- They help guide:
- Selection of users to participate in testing
- You can't test everyone
- Selection of tasks to use in testing
- You can't test every task
- What will be measured through testing
- You can't focus on collecting and measuring everything
Defining goals and concerns
- Goals must be specific
- "We want to see if the web site is easy to use" is too general
- What aspects of the site/application are of concern?
- What tasks do you think might be difficult?
- Which groups of users are you worried about?
Sources of test goals and concerns
- Feedback from users
- Email, phone calls, requests for help
- Concerns of your client/management
- Problems raised by designers/developers
- Issues indicated by web server log file analysis, e.g. poor clickthrough rates
Sources of goals and concerns
- Problems identified by usability inspections
- An inspection prior to testing can be a good idea
- Time-dependent issues
- Early in development you may be trying different interaction styles that you'll want to test
- Issues arising during the development lifecycle, e.g. wanting to test a feature that has just been added
Sources of goals and concerns
- Task analysis
- Concerns might relate to common or frequently performed tasks, or to tasks that are critical to successful use of the system
- Problems identified in earlier tests
- In an iterative design cycle, several tests in a row could address the same set of concerns
- Usability goals
- Testing should focus on progress towards meeting the usability goals established for the system
Examples: Monash Web Redevelopment
- IA based on organisational structures: would users unfamiliar with the org structure be able to find info?
- Source: feedback from users, usability inspection
- How this affected test design:
- Recruited staff with < 6 months' work at Monash
- Designed a task to see if this would cause a problem
- Task asked users to find semester dates; most searched, and many failed to identify the page labelled "University Secretariat Principal Dates"
Screenshot: search results
A search on semester dates produced the right result, but it wasn't recognised by users
Examples: Monash Web Redevelopment
- Poor navigation design: would users see level 2 navigation?
- Source: usability inspection, casual observations of users
- How this affected test design:
- Recruited staff with < and > 6 months' work at Monash
- Designed a task to see if this would cause a problem
- Task asked users to find out the history of Monash campuses; many went straight to the right page, but did not see the link
Screenshot: Monash Info
Users often failed to notice the level 2 navigation on Monash pages
Selecting tasks
- Not (usually) possible to test every task, so you need a process to select tasks:
- Tasks that are suggested by test goals and concerns, that probe potential or suspected usability problems
- Tasks that are critical for use of the system
- Tasks that are likely to be frequently performed
- Tasks that can be performed on the system in its current stage of development
Test tasks need to be written as scenarios
- The test situation needs to be made as realistic as possible
- Tasks need to be written as scenarios because scenarios are more realistic
- Task: Create a personal signature
- Scenario: You want your name and address to appear at the bottom of all the messages you send. Is there any way of doing this when using Hotmail?
Writing task scenarios
- Scenarios should be short
- Use the user's language, not the system's
- Should be unambiguous
- Provide no hidden clues
- Give participants enough information to do the task
- Provide the motivation for doing the task
- Are directly linked to the test concerns and goals
Typical data collected during usability tests
- Performance data
- Actions and behaviours that can be observed
- Quantitative: can be counted, assigned a value
- Subjective data
- Perceptions, opinions, judgements
- Quantitative: ratings can be assigned a value
- Qualitative: spontaneous user comments, or those elicited during debriefing or in post-test questionnaires
Performance measures
- Time to finish task
- Number of pages clicked through
- Number of tasks successfully completed
- Number of tasks abandoned by user
- Number of navigation errors
- Number of search errors
- Number of form input errors
- Number of other errors
- Observations of user frustration
- Observations of user confusion
- Expressions of satisfaction by user
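Counts like these can be tallied straight from simple per-attempt session logs. A minimal sketch in Python; the log fields and values below are hypothetical examples, not data from the Monash tests:

```python
# Tally basic performance measures from per-task observation logs.
# Each record is one user attempting one task; values are made-up examples.
sessions = [
    {"seconds": 95,  "clicks": 7,  "completed": True,  "errors": 2},
    {"seconds": 180, "clicks": 12, "completed": False, "errors": 4},
    {"seconds": 60,  "clicks": 4,  "completed": True,  "errors": 0},
    {"seconds": 140, "clicks": 9,  "completed": True,  "errors": 1},
]

def completion_rate(records):
    """Proportion of attempts that ended in task success."""
    return sum(r["completed"] for r in records) / len(records)

def mean(records, key):
    """Average of one numeric measure across all attempts."""
    return sum(r[key] for r in records) / len(records)

print(f"Task completion rate: {completion_rate(sessions):.0%}")
print(f"Mean time on task:    {mean(sessions, 'seconds'):.0f} s")
print(f"Mean errors per task: {mean(sessions, 'errors'):.2f}")
```

Even a tally this simple gives numbers that can be compared across test rounds, which is all a discount test usually needs.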
Subjective measures
- Spontaneous comments by user
- Ratings of ease of navigation
- Ratings of ease of use of search engine
- Ratings of usefulness of content
- Ratings of visual appeal
- Preferences over competitors' systems
- Preferences over a previous version
- Predictions of behaviour ("I would buy this product", "pay extra for this feature", etc.)
Deciding what to measure
- Data collected during the test must focus on:
- Concerns that are being evaluated, test goals
- Usability goals: are they close to being met?
- Data collection will be limited by:
- Stage of development
- Some measures won't be collectable when testing early prototypes
- Equipment used
- Personnel available
Examples: Monash Web Redevelopment
- A number of measures were important because:
- The site had never been tested
- Usability was believed to be poor, but we couldn't quantify just how poor
- We were beginning a redesign process and wanted to ensure we made it better rather than worse
Examples: Monash Web Redevelopment
- Measuring efficiency
- Measured time to complete task
- Wanted to see if users took the optimal navigation path
- Not possible to note the path (no video taping, multiple paths possible)
- Noted entry point(s) instead
- And counted clicks
Examples: Monash Web Redevelopment
- Measuring effectiveness
- Measured task completion rates
- Measured task success rates
- Measured number of errors
- Measured a range of error types
Examples: Monash Web Redevelopment
- Measuring satisfaction
- Noted expressions of satisfaction or otherwise (used audio tape to collect these)
- Measured user satisfaction using a questionnaire after the test:
- Ease of finding information
- Ease and efficiency of navigation
- Page layout
- Visual appeal
- Quality of content, etc.
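Post-test questionnaire ratings are easy to summarise per dimension. A small sketch, assuming a 1-5 rating scale; the dimension names follow the slide, but the responses are invented:

```python
# Average post-test questionnaire ratings (1 = poor, 5 = excellent).
# Responses are hypothetical; one list of user scores per dimension.
ratings = {
    "ease of finding information":   [4, 3, 5, 2],
    "ease/efficiency of navigation": [3, 3, 4, 2],
    "page layout":                   [4, 4, 3, 4],
    "visual appeal":                 [5, 4, 4, 5],
}

averages = {dim: sum(scores) / len(scores) for dim, scores in ratings.items()}
# Report lowest-rated dimensions first, since those need attention.
for dim, avg in sorted(averages.items(), key=lambda kv: kv[1]):
    print(f"{dim}: {avg:.2f}/5")
```

Sorting by average puts the weakest-rated dimensions at the top of the report, a useful default when the point of the test is to find what to fix next.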
When to test
- Ideally usability testing should happen early in the design phase of a project, and should continue throughout it
- Best not to test only at the end
- If you find a major problem, it might be too late or too expensive to redesign
- But better late than never; at least test results can set the agenda for improvements
- Any testing is better than no testing at all
Role of the test monitor is critical
- Needs to be a "people person"
- Understands and respects the rights of participants
- Puts test participants at ease
- Good communicator
- Handles difficult situations
- Able to persuade designers to make changes
- Good observer of human behaviour, with knowledge of human factors
- Sees what happens
- Understands why it happened
Role of the test monitor is critical
- Knowledge/experience of research methods
- Knows how to avoid biasing data collection
- Understanding of the system being tested
- The bigger the site, the more vital this becomes
- Needs to know what a user might do in a test
- Able to see the big picture
- Focus on the most significant data
- Form a cohesive picture of the test and test series
Starting with simple testing
- It takes time to learn how to design, conduct and report on usability testing
- Start with simple testing:
- Practise your skills of observation
- Have users perform just one task
- Don't try to collect lots of data
- Just look for showstoppers
- Here are two examples of simple testing.
Monash International site navigation
- Example of simple usability testing
- Used a paper prototype
- Users involved did two simple tasks
- Tests conducted by an MI staff member
- MI was about to rework the site design and navigation
- About to go to a graphic designer to get navigation buttons created
- Agreed to do some usability testing first
Monash International site navigation
- Created the navigation structure on sheets of A4 paper
- First page had the main navigation options
- Subsequent pages contained the links accessed by selecting one of the main navigation options
- Tested with 20 prospective international students
- Each user attempted 2 tasks from a total of 10 different tasks
- Tasks were taken from the top 10 questions to the call centre
Monash International site navigation
- No video or audio tape
- Only data collected was users' navigation choices
- Data showed:
- Success rate only 26%
- And only 15% took the optimal route
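With only navigation choices recorded, each attempt can be coded as one of three outcomes and both rates computed in a few lines. A sketch with made-up outcome counts (not the actual Monash International data):

```python
# Code each paper-prototype attempt as "optimal" (success via the best
# route), "detour" (success via a longer route) or "fail", then compute
# the two rates a test like this reports. Counts here are hypothetical:
# 20 users x 2 tasks = 40 attempts.
outcomes = ["optimal"] * 6 + ["detour"] * 4 + ["fail"] * 30

success_rate = sum(o != "fail" for o in outcomes) / len(outcomes)
optimal_rate = outcomes.count("optimal") / len(outcomes)
print(f"Success rate:  {success_rate:.0%}")  # 25% with these example counts
print(f"Optimal route: {optimal_rate:.0%}")  # 15% with these example counts
```

Distinguishing "success" from "success via the optimal route" matters: a task can be completed even when the navigation design is steering users the long way round.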
Mobile phone procurement - background
The site was completed and had just gone live prior to testing
Mobile phone procurement - test overview
- Client Services raised concerns about usability
- A quick usability inspection was conducted
- One major problem identified
- Developer approached and agreed to take part in usability testing
- One task designed to test the main concern:
- "You need to order two mobile phones: one for yourself and one for your manager. Your manager wants a high-end phone, and you must order a mid-range phone for yourself."
Mobile phone procurement - test overview
- No video or audio recording used
- No quantitative measures were taken
- Only looked for major problems
- Took rough notes on a sheet of note paper
- Test took 5-10 minutes per person
- Test conducted at the user's workplace
- No report written
- Developer witnessed testing
- Solutions discussed verbally
Mobile phone catalogue page
- 3 of 4 users did not notice the buttons at the top of the page
- Users suggested that having phone prices listed here would be helpful
- Named anchors confused some users
- Note: show video!
Description page
No problems noted on this page
Shopping cart
Major problem here: 3 of 4 users lost cart contents
Note: show video!
Shopping cart
- Only the user who'd seen the top buttons on the catalogue page did not make an error on this page
- Only one user read the text, MISREAD it, and used the browser BACK button
Checkout procedure
Major problem here: users did not enter their full name. It took some time to realise what was required. One user tried several times before giving up
Design revisions and retesting
- Revisions were made to pages where major problems occurred
- Also changed the catalogue page to show phone prices
- Retested with 4 more users
- 5-10 mins per person
- Tested in my office
- Videotaped tests
- No report written
- Results and possible solutions discussed verbally with the developer
Revised catalogue page
Re-testing showed a problem with the mouseover not producing the hyperlink hand icon. This confused all 4 users, and one gave up, saying it wasn't possible to order a phone
Revised shopping cart
Re-testing showed this problem was almost resolved. Text said "Do not use the Back button"
Revised checkout procedure
Re-testing showed this problem was almost resolved too. Text instructions needed to be closer to the input box, and needed to say that the name required was the phone owner's name
Found one additional problem
Need to identify required fields on this form
Error message encountered
These fields should have been filled in
Quick wins with usability testing
- Test as you design
- Test before you build
- Many things can be tested on paper first
- A series of small tests is easier to plan, manage and report on
- Testing after you've built an entire site or application requires more time and resources
- Redesigning after you've built will cost more
- Just look for showstoppers
- If you're able, note other things you can do to improve the user's experience
Further reading
- Joseph Dumas & Janice Redish, A Practical Guide to Usability Testing, revised ed., 1999
- Jeffrey Rubin, Handbook of Usability Testing, 1994
- Jakob Nielsen, "Usability Testing", chap. 6 of Usability Engineering, 1993