Evaluation Overview (CS 4750, Fall 2002)
1
Evaluation Overview and Questionnaire Design
2
Agenda
  • Questions?
  • Thursday: prototype implementation decision
  • Thursday: sign up for a mini-prototype demo with
    Rod for the week of Nov 5-8
  • Review - Evaluating Input and Output Devices
  • Overview of Evaluation
  • Questionnaire design

3
Project Part 3: Making an evaluation plan
  • What criteria are important?
  • What resources are available?
  • evaluators, prototype, subjects
  • Required authenticity of the prototype

4
Input and Output Devices
  • Look at the example of "Using Low-Cost Sensing
    to Support Nutritional Awareness"
  • DFAB Interaction Framework
  • Input
  • Output
  • Alternative I/O devices and trade-offs
  • Evaluate

5
Interaction framework
[Interaction framework diagram; recoverable label: observation (task language)]
6
Selecting the Devices
  • Look at the users
  • Consider the task
  • Consider the environment
  • Use the Interaction Framework to assess ease of
    use and task coverage: directness of task,
    actions, and interpretation

8
Evaluation Overview
  • Explain key evaluation concepts and terms.
  • Describe the evaluation paradigms and techniques
    used in interaction design.
  • Discuss the conceptual, practical and ethical
    issues that must be considered when planning
    evaluations.
  • Introduce the DECIDE framework.

9
User studies
  • User studies involve looking at how people behave
    in their natural environments, or in the
    laboratory, both with old technologies and with
    new ones.

10
Some Definitions
  • Objective vs. subjective: quantitative measure
    vs. opinion
  • Quantitative vs. qualitative: measurement vs.
    descriptions/anecdotes
  • Laboratory vs. field/naturalistic: controlled
    environment vs. real-world context of use

11
Evaluation paradigm
  • Any kind of evaluation is guided, explicitly or
    implicitly, by a set of beliefs, which are often
    underpinned by theory. These beliefs and the
    methods associated with them are known as an
    evaluation paradigm.
  • Example: usability testing is carried out in a
    controlled environment.

12
Four evaluation paradigms
  • quick and dirty: informal feedback evaluation
  • usability testing: measure users' performance,
    strongly controlled
  • field studies: natural settings; outsider vs.
    insider
  • predictive evaluation: experts apply heuristics;
    models predict performance

13
Quick and dirty
  • Quick and dirty evaluation describes the common
    practice in which designers informally get
    feedback from users or consultants to confirm
    that their ideas are in line with users' needs
    and are liked.
  • Quick and dirty evaluations can be done at any
    point in the design cycle.
  • The emphasis is on fast input to the design
    process rather than carefully documented
    findings.

14
Usability testing
  • Usability testing involves recording typical
    users' performance on typical tasks in controlled
    settings. Field observations may also be used.
  • As the users perform these tasks they are watched
    and recorded on video, and their key presses are
    logged.
  • This data is used to calculate performance times,
    identify errors, and help explain why the users
    did what they did (see the sketch after this
    list).
  • User satisfaction questionnaires and interviews
    are used to elicit users' opinions.
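As a minimal sketch of what "calculate performance times, identify errors" can mean in practice (not from the original deck; the log format, timestamps, and participant IDs are invented for illustration), task times and error counts could be derived from a session log like this:

from datetime import datetime

# Hypothetical event log captured during a usability test:
# (timestamp, participant, event) tuples.
log = [
    ("2002-11-05 10:00:00", "P1", "task_start"),
    ("2002-11-05 10:00:41", "P1", "error"),
    ("2002-11-05 10:02:10", "P1", "task_end"),
    ("2002-11-05 10:05:00", "P2", "task_start"),
    ("2002-11-05 10:06:02", "P2", "task_end"),
]

def summarize(entries):
    """Return task completion time (seconds) and error count per participant."""
    results = {}
    for timestamp, participant, event in entries:
        t = datetime.strptime(timestamp, "%Y-%m-%d %H:%M:%S")
        r = results.setdefault(participant, {"start": None, "end": None, "errors": 0})
        if event == "task_start":
            r["start"] = t
        elif event == "task_end":
            r["end"] = t
        elif event == "error":
            r["errors"] += 1
    return {p: {"seconds": (r["end"] - r["start"]).total_seconds(),
                "errors": r["errors"]}
            for p, r in results.items() if r["start"] and r["end"]}

print(summarize(log))
# {'P1': {'seconds': 130.0, 'errors': 1}, 'P2': {'seconds': 62.0, 'errors': 0}}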

15
Field studies
  • Field studies are done in natural settings.
  • The aim is to understand what users do naturally
    and how technology impacts them.
  • In product design, field studies can be used to:
  • identify opportunities for new technology
  • determine design requirements
  • decide how best to introduce new technology
  • evaluate technology in use

16
Predictive evaluation
  • Experts apply their knowledge of typical users,
    often guided by heuristics, to predict usability
    problems.
  • Another approach involves theoretically based
    models.
  • A key feature of predictive evaluation is that
    users need not be present.
  • Relatively quick and inexpensive

17
Evaluation Paradigm Characteristics
  • Role of Users
  • Who controls
  • Location
  • When used
  • Type of Data
  • Fed back into design by
  • Philosophy

18
Overview of techniques
  • observing users
  • asking users their opinions
  • asking experts their opinions
  • testing users' performance
  • modeling users' task performance

19
Relationship between Paradigms and Techniques
20
DECIDE A framework to guide evaluation
  • Determine the goals the evaluation addresses.
  • Explore the specific questions to be answered.
  • Choose the evaluation paradigm and techniques to
    answer the questions.
  • Identify the practical issues.
  • Decide how to deal with the ethical issues.
  • Evaluate, interpret and present the data.

21
Determine the goals
  • What are the high-level goals of the evaluation?
  • Who wants it and why?
  • The goals influence the paradigm for the study.
  • Some examples of goals:
  • Identify the best metaphor on which to base the
    design.
  • Check to ensure that the final interface is
    consistent.
  • Investigate how technology affects working
    practices.
  • Improve the usability of an existing product.

22
Explore the questions
  • All evaluations need goals and questions to guide
    them so time is not wasted on ill-defined
    studies.
  • For example, the goal of finding out why many
    customers prefer to purchase paper airline
    tickets rather than e-tickets can be broken down
    into sub-questions:
  • What are customers' attitudes to these new
    tickets?
  • Are they concerned about security?
  • Is the interface for obtaining them poor?
  • What questions might you ask about the design of
    a cell phone?

23
Choose the evaluation paradigm and techniques
  • The evaluation paradigm strongly influences the
    techniques used and how data is analyzed and
    presented.
  • E.g., field studies do not involve testing or
    modeling.

24
Identify practical issues
  • For example, how to:
  • select users
  • stay on budget
  • stay on schedule
  • find evaluators
  • select equipment

25
Decide on ethical issues
  • Develop an informed consent form.
  • Participants have a right to:
  • know the goals of the study
  • know what will happen to the findings
  • privacy of personal information
  • not be quoted without their agreement
  • leave when they wish
  • be treated politely

26
Evaluate, interpret, and present data
  • The following also need to be considered:
  • Reliability: can the study be replicated?
  • Validity: is it measuring what you thought?
  • Biases: is the process creating biases?
  • Scope: can the findings be generalized?
  • Ecological validity: is the environment of the
    study influencing it? (e.g., the Hawthorne
    effect)
  • How data is analyzed and presented depends on the
    paradigm and techniques used.

27
Pilot studies
  • A small trial run of the main study.
  • The aim is to make sure your plan is viable.
  • Pilot studies check:
  • that you can conduct the procedure
  • that interview scripts, questionnaires,
    experiments, etc. work appropriately
  • It's worth doing several to iron out problems
    before doing the main study.
  • Ask colleagues if you can't spare real users.

28
Making an evaluation plan
  • What criteria are important?
  • What resources are available?
  • evaluators, prototype, subjects
  • Required authenticity of the prototype

29
Evaluation techniques
  • Questionnaire
  • Interviews
  • Think aloud (protocol analysis)
  • Cognitive walkthrough
  • Predictive modeling
  • Heuristic evaluation
  • Empirical user studies

30
Classifying Techniques
  • How/when is it used?
  • Formative
  • Summative
  • What data is obtained?
  • Quantitative
  • Qualitative

31
Questionnaire Design
  • Summative or formative
  • Quantitative or qualitative
  • Usually an inexpensive way to get lots of information

32
Goals of Questionnaire
  • A good questionnaire requires careful design
  • High-level goals
  • What questions are you trying to answer?
  • Who are you trying to get answers from?

33
Contents of a questionnaire
  • General/Background info
  • name, experience
  • Objective
  • Open-ended/subjective

34
Advice on survey design
  • Take your own survey first
  • Know what answers you are trying to elicit
  • Too long, and you'll be sorry

35
Background examples
  • What is your age?
  • What is your major course of study?
  • Have you ever worked at a restaurant?

Potential problems?
36
Objective questions
  • Good for gathering quantitative trends
  • When taking notes in class, what percentage of
    what the instructor writes do you write in your
    own notes?

Potential problems?
37
Form of response
  • Questionnaire formats can include (see the
    sketch below):
  • yes/no checkboxes
  • checkboxes that offer many options
  • Likert rating scales
  • semantic scales
  • open-ended responses
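These formats map naturally onto simple data structures. A minimal sketch (not from the original deck; the field names are invented, and the question texts are borrowed from examples on the surrounding slides) of how a small questionnaire mixing these formats might be defined:

# Hypothetical questionnaire definition illustrating the response formats above.
questionnaire = [
    {"id": "q1", "type": "yes_no",
     "text": "Have you ever worked at a restaurant?"},
    {"id": "q2", "type": "multiple_choice",   # choose one of several options
     "text": "How much of what the lecturer writes in class do you record in your notes?",
     "options": ["More than what (s)he writes", "Everything (s)he writes",
                 "Some of what (s)he writes", "None of what (s)he writes",
                 "Other, please specify"]},
    {"id": "q3", "type": "likert",            # 5-point rating scale
     "text": "I always write down everything the instructor writes on the board.",
     "scale": ["Strongly Agree", "Agree", "Neutral", "Disagree", "Strongly Disagree"]},
    {"id": "q4", "type": "open_ended",
     "text": "Describe how you signify important points in your notes."},
]

for item in questionnaire:
    print(item["id"], item["type"], "-", item["text"])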

38
Possible improvement
  • Multiple choice (choose one)
  • How much of what the lecturer writes in class do
    you record in your notes?
  • ___ More than what (s)he writes
  • ___ Everything (s)he writes
  • ___ Some of what (s)he writes
  • ___ None of what (s)he writes
  • ___ Other, please specify

39
A Likert version
  • In taking notes in this class, I always write
    down everything the instructor writes down on the
    board.
  • 1 = Strongly Agree, 2 = Agree, 3 = Neutral,
    4 = Disagree, 5 = Strongly Disagree
    (see the scoring sketch below)
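For the quantitative side, responses on a scale like this are usually coded numerically and summarized. A minimal sketch (the responses are made-up illustration data; the 1-5 coding simply follows the scale above):

from collections import Counter
from statistics import mean

# Numeric coding for the 5-point scale above.
SCALE = {"Strongly Agree": 1, "Agree": 2, "Neutral": 3,
         "Disagree": 4, "Strongly Disagree": 5}

# Hypothetical responses from ten participants.
responses = ["Agree", "Strongly Agree", "Neutral", "Agree", "Disagree",
             "Agree", "Strongly Disagree", "Neutral", "Agree", "Strongly Agree"]

scores = [SCALE[r] for r in responses]
print("mean score:", mean(scores))          # central tendency (many analysts prefer the median)
print("distribution:", Counter(responses))  # how many respondents chose each option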

40
Subjective
  • Good for exploring richer explanations
  • When something important is presented in class,
    describe how you signify its occurrence in your
    notes.

Potential problems?
Improvements?
41
More advice
  • see Web primers
  • on-line questionnaire design resource

42
Clarity is Important
  • Questions must be clear, succinct, and
    unambiguous
  • To be eligible, your mother and your father must
    both be living and you must maintain regular
    contact with both of them.

43
Avoid Question Bias
  • Leading questions unnecessarily force certain
    answers.
  • Do you think parking on campus can be made
    easier?
  • What is your overall impression of
  • 1. Superb
  • 2. Excellent
  • 3. Great
  • 4. Not so Great

44
Be Aware of Connotations
  • Do you agree with the NFL owners' decision to
    oppose the referees' pay request?
  • Do you agree with the NFL owners' decision in
    regards to the referees' pay demand?
  • Do you agree with the NFL owners' decision in
    regards to the referees' suggested pay?

45
Handle Personal Info Carefully
  • Ask questions subjects would not mind answering
    honestly.
  • What is your waist size?
  • All men wear a 32!!!
  • If subjects are uncomfortable, you will lose
    their trust.
  • Ask only what you really need to know.

46
Avoid Hypotheticals
  • Avoid gathering information on uninformed
    opinions.
  • Subjects should not be asked to consider
    something they've never thought about (or don't
    know or understand).
  • Would a device aimed at making cooking easier
    help you?

47
Thursday
  • Prototype implementation plan due
  • Interviews, Think Aloud and Cognitive Walkthroughs