1
Chapter 12: Introducing Evaluation
2
The aims
  • To illustrate how observation, interviews and
    questionnaires that you encountered in Chapters 7
    and 8 are used in evaluation.
  • To explain the key concepts and terms used in
    evaluation.
  • To introduce the three main evaluation
    approaches and key evaluation methods within the
    context of real evaluation studies.

3
Six evaluation case studies
  • Evaluating early design ideas for a mobile device
    for rural nurses in India.
  • Evaluating cell phones for different markets.
  • Evaluating affective issues: challenge and
    engagement in a collaborative immersive game.
  • Improving a design: the HutchWorld patient
    support system.
  • Multiple methods help ensure good usability: the
    Olympic Messaging System (OMS).
  • Evaluating a new kind of interaction: an ambient
    system.

4
Why, what, where and when to evaluate
  • Iterative design and evaluation is a continuous
    process that examines:
  • Why: to check that users can use the product and
    that they like it.
  • What: a conceptual model, early prototypes of a
    new system and, later, more complete prototypes.
  • Where: in natural and laboratory settings.
  • When: throughout design; finished products can be
    evaluated to collect information to inform new
    products.
  • Designers need to check that they understand
    users' requirements.

5
Bruce Tognazzini tells you why you need to
evaluate
  • Iterative design, with its repeating cycle of
    design and testing, is the only validated
    methodology in existence that will consistently
    produce successful results. If you don't have
    user-testing as an integral part of your design
    process you are going to throw buckets of money
    down the drain.
  • See AskTog.com for topical discussions about
    design and evaluation.

6
The language of evaluation
  • Analytical evaluation
  • Controlled experiment
  • Field study
  • Formative evaluation
  • Heuristic evaluation
  • Predictive evaluation
  • Summative evaluation
  • Usability laboratory
  • User studies
  • Usability studies
  • Usability testing
  • User testing

7
Evaluation approaches
  • Usability testing
  • Field studies
  • Analytical evaluation
  • Combining approaches
  • Opportunistic evaluations

8
Characteristics of approaches
            Usability testing   Field studies   Analytical
Users       do task             natural         not involved
Location    controlled          natural         anywhere
When        prototype           early           prototype
Data        quantitative        qualitative     problems
Feedback    measures, errors    descriptions    problems
Type        applied             naturalistic    expert
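
The "Data" and "Feedback" rows can be made concrete: usability testing typically yields quantitative measures such as task completion times and error counts. A minimal sketch in Python of summarizing such measures; the participant data is invented for illustration:

```python
from statistics import mean

# Hypothetical usability-test log: one entry per participant, with
# task completion time in seconds and number of errors observed.
sessions = [
    {"participant": "P1", "time_s": 42.0, "errors": 1},
    {"participant": "P2", "time_s": 55.5, "errors": 3},
    {"participant": "P3", "time_s": 38.2, "errors": 0},
    {"participant": "P4", "time_s": 61.0, "errors": 2},
]

# Typical quantitative feedback measures from a usability test.
mean_time = mean(s["time_s"] for s in sessions)
mean_errors = mean(s["errors"] for s in sessions)
completion_under_60s = sum(1 for s in sessions if s["time_s"] <= 60) / len(sessions)

print(f"mean task time: {mean_time:.1f} s")
print(f"mean errors:    {mean_errors:.1f}")
print(f"completed within 60 s: {completion_under_60s:.0%}")
```

Field studies, by contrast, would produce qualitative descriptions that resist this kind of numeric summary.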
9
Evaluation approaches and methods
Method           Usability testing   Field studies   Analytical
Observing        x                   x
Asking users     x                   x
Asking experts   x                                   x
Testing          x
Modeling                                             x
10
Evaluation to design a mobile record system for
Indian AMWs
  • A field study using observations and interviews
    to refine the requirements.
  • It would replace a paper system.
  • It had to be easy to use in rural environments.
  • Basic information would be recorded to identify
    each household: head of house, number of members,
    age and medical history of members, etc.
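
The fields listed above suggest a simple record structure. A sketch of how such a household record might be modeled; the field names and sample values are assumptions for illustration, not the actual system's schema:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Member:
    """One household member: age and medical history, as the slide lists."""
    name: str
    age: int
    medical_history: List[str] = field(default_factory=list)

@dataclass
class HouseholdRecord:
    """Basic information the mobile system would record per household."""
    household_id: str
    head_of_house: str
    members: List[Member] = field(default_factory=list)

    @property
    def num_members(self) -> int:
        return len(self.members)

# Hypothetical example record.
record = HouseholdRecord(
    household_id="H-001",
    head_of_house="A. Devi",
    members=[Member("A. Devi", 47, ["hypertension"]), Member("R. Devi", 12)],
)
print(record.household_id, record.num_members)
```

Keeping the structure this flat mirrors the paper forms it would replace, which matters for ease of use in rural settings.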

11
Could these icons be used with other cultures?
For more interesting examples of mobile
designs for the developing world see Gary
Marsden's home page:
http://people.cs.uct.ac.za/gaz/research.html
12
Evaluating cell phones for different world markets
  • An already existing product was used as a
    prototype for a new market.
  • Observation and interviews.
  • Many practical problems needed to be overcome
    Can you name some?
  • Go to www.nokia.com and select a phone, or
    imagine evaluating this one in a country that
    Nokia serves.

13
Challenge and engagement in a collaborative
immersive game
  • Physiological measures were used.
  • Players were more engaged when playing against
    another person than when playing against a
    computer.
  • What were the precautionary measures that the
    evaluators had to take?
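
As a rough illustration of the kind of comparison involved, the sketch below summarizes invented paired engagement scores (e.g. normalized physiological readings) for the two conditions; the actual study used proper statistical tests, and none of these numbers come from it:

```python
from statistics import mean

# Hypothetical per-player engagement scores (e.g. normalized skin
# conductance) in two conditions: playing against another person
# and playing against the computer. Values are invented.
vs_person   = [0.71, 0.65, 0.80, 0.74, 0.69]
vs_computer = [0.52, 0.58, 0.61, 0.55, 0.60]

# Paired differences: positive values mean higher engagement
# when playing against a person.
diffs = [p - c for p, c in zip(vs_person, vs_computer)]

print(f"mean engagement vs person:   {mean(vs_person):.2f}")
print(f"mean engagement vs computer: {mean(vs_computer):.2f}")
print(f"mean paired difference:      {mean(diffs):.2f}")
```

Pairing the scores per player, rather than comparing group averages alone, controls for individual differences in baseline physiological response.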

14
What does this data tell you?
15
The HutchWorld patient support system
  • This virtual world supports communication among
    cancer patients.
  • Privacy, logistics, patients' feelings, etc. had
    to be taken into account.
  • Designers and patients speak different languages.
  • Participants in this world can design their own
    avatar. Look at the My appearance slide that
    follows. How would you evaluate it?

16
My Appearance
17
Multiple methods to evaluate the 1984 OMS
  • Early tests of printed scenarios and user guides.
  • Early simulations of the telephone keypad.
  • An Olympian joined the team to provide feedback.
  • Interviews and demos with Olympians outside the US.
  • Overseas interface tests with friends and family.
  • Free coffee and donut tests.
  • Usability tests with 100 participants.
  • A "try to destroy it" test.
  • Pre-Olympic field test at an international event.
  • Reliability tests of the system under heavy traffic.

18
Something to think about
  • Why was the design of the OMS a landmark in
    interaction design?
  • Today cell phones replace the need for the OMS.
    What are some of the benefits and losses of cell
    phones in this context? How might you compensate
    for the losses that you thought of?

19
Evaluating an ambient system
  • The Hello Wall is a new kind of system that is
    designed to explore how people react to its
    presence.
  • What are the challenges of evaluating systems
    like this?

20
Key points
  • Evaluation and design are closely integrated in
    user-centered design.
  • Some of the same techniques are used in
    evaluation as for establishing requirements, but
    they are used differently (e.g. observation,
    interviews, questionnaires).
  • The three main evaluation approaches are
    usability testing, field studies, and analytical
    evaluation.
  • The main methods are observing, asking users,
    asking experts, user testing, inspection, and
    modeling users' task performance.
  • Different evaluation approaches and methods are
    often combined in one study.
  • Triangulation involves using a combination of
    techniques to gain different perspectives, or
    analyzing data using different techniques.
  • Dealing with constraints is an important skill
    for evaluators to develop.