
1
An Introduction to
Interaction Design
  • Guohua Bai (Ph.D.)
  • Kalmar University
  • Department of Technology
  • 391 82 Kalmar
  • gba@bth.se
  • +46 733 967108

2
The goals of the course
  • The objective of the program is to provide
    students with knowledge, methods, and skills in the
    creation, design, and management of digital
    interactive products for applications in social
    services, business management, the home, the work
    environment, and industry.
  • The goals of this course are:
  • Practical issues of the Interaction Design program
  • Knowledge about different kinds of interactive
    systems and the general development process
  • General knowledge about human cognitive
    behaviours such as social ability (team work),
    memory, problem solving, learning, decision
    making, perception, visualisation, etc.
  • Introduction to task analysis, evaluation, and
    methodologies of interaction design

3
Contents
  • The course comprises an introduction to the
    program, theoretical lectures, study visits, and
    practical work within the area of interactive
    system design. The introduction will provide
    students with specific knowledge about the overall
    education program in interaction design. The
    theoretical part aims at providing students with
    general knowledge about the subject of interaction
    design. Through study visits, students will
    experience different kinds of practical design
    problems as background for future study.

4
Teaching, Examination
  • Teaching
  • The method of instruction is lectures, seminars
    and compulsory practical assignments.
  • Examination
  • Assessment of student achievement is normally
    made with the aid of a written and/or oral
    examination, as well as presentation (at least 80%)
    of compulsory assignments on either an individual
    or a group basis. The following grades are awarded:
    Pass or Fail.
  • Compulsory moments:
  • (1) Lectures and visits 1p (2) Theoretical
    examination 1p (3) Project report 1p (4) Seminar 1p
    (5) Oral presentation technique 1p

5
Literature
  • Newman W. M. & Lamming M. G. (1995) Interactive
    System Design, Addison-Wesley Pub. Ltd., ISBN
    0-201-63162-8
  • Preece J. (2002) Interaction Design: Beyond
    Human-Computer Interaction, John Wiley & Sons, ISBN
    0-471-49278-7
  • http://www.interaction-design.org/
  • http://www2.iicm.edu/hci

6
Lab. Task
1. Decide your acting role in the experiment
(designer, end-user, or project leader). 2. Find
your partners to form a group (each group should
have no more than two designers, two users, and one
project leader). 3. The user(s) in the group must
have a task to be supported by a computer
system. 4. The designer(s) must be able to find a
quick prototyping tool (e.g., Visual Basic, C,
Progress, Excel) which can be used during the
prototyping process. (Cont.)
7
Lab. Work (cont.)
5. Work in your group to identify functional
requirements, interface design, etc., documenting
all your communication (as a project diary). 6.
At the end of the course you have to analyse your
design process (via the project diary) and write a
report (no less than 1500 words) reflecting on the
process. 7. On 22/09 and 23/09, each group will
defend (oral presentation) its prototype (20 min;
can be on an overhead projector or as a computer
demo) to all participants, and meanwhile act as
another group's opponent (you have to read their
report first and prepare at least three questions).
8
Other Courses You Will Study
  • Interaction Design 5p
  • User Interface Design 5p
  • User study for Interactive Systems Design 5p
  • Research Methodology 5p
  • Project work 5p
  • Thesis Work 10p

9
What is Interaction Design?
  • Designing interactive products to support people
    in their everyday and work lives.
  • The design of spaces for human communication and
    interaction (interaction design is not software
    design, just as an architect is not a civil
    engineer)

10
Activities and technologies.
11
Interaction Design and its Concerns
  • The problem of objective (usability and context
    of work)
  • The problem of development process (design
    methodology)
  • The problem of human and computer (man-machine
    fit)

(Diagram: the concerns of interaction design)
  • Work and context: organisational design, social
    and ethical aspects, work structure, task analysis
  • Human: cognitive processes, communication,
    learning, ergonomics
  • Computer: interface design, input and output
    devices, dialogue, processing
  • Development and design: life-cycle design method,
    prototyping, participatory design, evaluation,
    usability test, implementation
12
Activities in interactive systems design
13
Characteristics of Interaction Design
  • Multidisciplinary knowledge (computer sciences,
    psychology, ergonomics, arts, sociology, design)
  • Integrity of theories, methodologies, skills, and
    experiences.
  • Multidisciplinary team work (computer supported
    cooperative work, CSCW)

14
Related studies
(Diagram: disciplines related to Interaction Design -
Systems Design, Cognitive Psychology, Ergonomics and
Human Factors, Artificial Intelligence, Linguistics,
Computer Engineering, Social and Organizational
Psychology, Philosophy, Sociology)
15
Norman's Interaction Model
(Diagram: seven-stage model of (individual) interaction,
from Norman (1986). Mental activity: forming the goal,
intention, action specification; physical activity:
execution; then perception, interpretation, and
evaluation against expectations.)
  • Home work (1)
  • Preece's book: Chapter 1, Ch. 2
  • Newman's book: Ch. 1
  • Search the Web for "interaction design"
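The seven stages form an execution-evaluation loop. A toy sketch of that loop, assuming an invented thermostat task (the function and variable names are illustrative, not from Norman):

```python
# A toy sketch of Norman's execution-evaluation cycle: the
# seven stages run as a loop until the perceived state of
# the world matches the goal. The thermostat task and all
# names here are invented for illustration.

def seven_stage_cycle(goal_temp, room, max_cycles=10):
    """Repeat perceive-interpret-evaluate, then
    intend-specify-execute, until the goal is met."""
    for _ in range(max_cycles):
        perceived = room["display"]          # perception
        satisfied = perceived == goal_temp   # interpretation + evaluation
        if satisfied:
            return "goal achieved"
        # intention and action specification
        step = 1 if perceived < goal_temp else -1
        room["display"] += step              # execution acts on the world
    return "gave up"

room = {"display": 18}
result = seven_stage_cycle(21, room)
```

The gulfs of execution and evaluation correspond to the two halves of the loop body: choosing an action that moves toward the goal, and judging the world's response against the goal.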
16
Types of Interactive Systems and Design Focuses
  • 1. Life-critical systems
  • Examples: control of air traffic, nuclear
    reactors, power utilities, police or fire
    dispatch, military operations, and medical
    instruments
  • Characteristics: high cost, high reliability and
    efficiency, error-free operation, long training
    time. Subjective satisfaction is not
    important. Retention is obtained by frequent use
    of common functions.

17
  • 2. Industrial and commercial uses
  • Examples: banking, insurance, order entry,
    inventory management, airline and hotel
    reservations, car rentals, credit card
    management, point-of-sale terminals
  • Characteristics: cost must be in balance with
    reliability, speed of performance, and error
    rate; training personnel is costly, so ease of
    learning is important; subjective satisfaction is
    of modest importance; retention is obtained by
    frequent use

18
  • 3. Office, home, and entertainment
  • Examples: word processing, automated teller
    machines, video games, education packages,
    information retrieval, e-mail, teleconferencing.
  • Characteristics: must be easy to learn, with low
    error rates; subjective satisfaction is very
    important.

19
  • 4. Explorative, strategic, creative, and
    co-operative systems
  • Examples: business decision making, architectural
    design, CSCW, artist programs and music
    composition
  • Characteristics: users are knowledgeable in the
    task but novices with computers; high motivation
    and high expectations; difficult to design; must
    be easy to use, e.g., providing direct
    manipulation of objects representing their
    real-world task. Trial-and-error learning,
    participatory design.

20
Understanding Users - Cognitive Theory and
Application
21
Why do we need to understand users?
  • Interacting with technology is cognitive
  • We need to take into account cognitive processes
    involved and cognitive limitations of users
  • We can provide knowledge about what users can and
    cannot be expected to do
  • Identify and explain the nature and causes of
    problems users encounter
  • Supply theories, modelling tools, guidance and
    methods that can lead to the design of better
    interactive products

22
Cognitive Human
  • The man in the computer - a standard figure
  • The man in the reality of work - motive to
    operation

(Diagram: the activity hierarchy - Why? a motive drives
the activity; What? a goal drives each action; How?
conditions determine the operations)
23
Attention
  • Selecting things to concentrate on from the mass
    around us, at a point in time
  • Focussed and divided attention enables us to be
    selective in terms of the mass of competing
    stimuli but limits our ability to keep track of
    all events
  • Information at the interface should be structured
    to capture users' attention, e.g., using perceptual
    boundaries (windows), colour, reverse video,
    sound, and flashing lights

24
Human Memory
(Diagram: the modal model of memory)
  • Environmental input enters sensory information
    buffers (SIB): visual, auditory, haptic, taste,
    smell
  • Short-term memory (STM) / working memory, governed
    by control processes: rehearsal, coding,
    decisions, retrieval strategies
  • Long-term memory (LTM): knowledge, mental models,
    experiences in episodic and semantic forms
  • Response output
25
Short-term Memory
  • Short-term memory is the memory of the present,
    used as working or temporary memory.
  • Information is retained in STM automatically and
    is retrieved without effort.
  • However, the amount of information in STM is
    severely limited: 7 ± 2 items.
  • STM is extremely fragile - the slightest
    distraction and its contents are gone. For
    example, STM can hold a seven digit phone number
    from the time you look it up until the time you
    use it, as long as no distractions occur.

26
Long Term Memory (LTM)
  • Long-term memory is the memory of permanent
    knowledge, experience.
  • It takes time to put stuff into LTM and time and
    effort to get stuff out.
  • Capacity is estimated at about 100 million items.

27
Learning vs. Efficiency
28
Forms of Representation
  • Propositional (linguistic), pictorial, and
    auditory

Pictorial:
1. No discrete symbols
2. Implicit; no separate symbol for the relation
3. No clear rules of combination or symbol types
4. Concrete

Propositional ("She is talking."):
1. Discrete symbols
2. Explicit; needs a symbol for the relation
3. Grammatical; clear rules of combination for types
   of symbol
4. Abstract
29
Types of Knowledge
  • Experience (learning by doing, prototyping)
  • Production rules (If ... Then ...)
  • Distributed network
  • Recall and recognition
  • Context

30
Knowledge in the Head and in the World
  • Not all of the knowledge required for precise
    behaviour has to be in the head. It can be
    distributed
  • partly in the head (memorise code)
  • partly in the world (e.g., default input)
  • and partly in the constraints of the world
    (e.g, inactive fields to omit input).

31
  • Semantic and episodic knowledge
  • Semantic knowledge is mostly in the form of
    concepts, general facts, statements, and
    judgements.
  • Episodic knowledge is mostly in the form of
    personal experience, specified by time and
    location.
  • Relationship (Vygotsky's theory): a semantic
    network with nodes for various concepts and
    activation links.

(Diagram: a semantic network with nodes for concepts
such as Nation, Europe, France, Italy, capital, Paris,
Marseille, Rome, Milan, connected by weighted
activation links (e.g., 0.5, 0.3), with weak links
(0.1) to unrelated concepts such as pencil, cat, dog,
fish)
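A semantic network like this can be queried by spreading activation along the weighted links. A minimal sketch, where the node names echo the figure but the exact links and weights are illustrative assumptions:

```python
# A minimal sketch of spreading activation over a weighted
# semantic network like the one in the diagram. The node
# names echo the figure, but the links and weights here
# are illustrative assumptions.

network = {
    "Nation": {"Europe": 0.5, "pencil": 0.1},
    "Europe": {"France": 0.3, "Italy": 0.3},
    "France": {"Paris": 0.9, "Marseille": 0.4},
    "Italy": {"Rome": 0.9, "Milan": 0.4},
}

def spread(start, depth=2):
    """Propagate activation outward from a start concept;
    activation decays by each link's weight."""
    activation = {start: 1.0}
    frontier = [start]
    for _ in range(depth):
        next_frontier = []
        for node in frontier:
            for neighbour, weight in network.get(node, {}).items():
                value = activation[node] * weight
                if value > activation.get(neighbour, 0.0):
                    activation[neighbour] = value
                    next_frontier.append(neighbour)
        frontier = next_frontier
    return activation

# Strongly linked concepts (Europe, France) end up more
# active than weakly linked ones (pencil).
acts = spread("Nation")
```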
32
Communication (1)
  • Communication means the exchange of information
    between system and environment, and within the
    system, in order to control the system's function
    with respect to changes in the environment (by way
    of feedback) (Wiener).
  • Any viable system includes communication
    channels to knit the entire system together into
    one coherent whole and to ensure that the
    sub-units are working together appropriately and
    contributing to the objectives of the whole.

33
Communication (2)
  • The ultimate significance of communication is
    that it serves to bind societies together, a
    "World State".
  • Communication is a key point for understanding the
    processes of human perception, attention, memory,
    language, learning, problem solving, and decision
    making.
  • Shannon's communication model (the syntactic and
    technical problems of communication).

34
Face-to-face communication
HCI - Support communication
35
A Mental Model of Communication
36
  • Task and Context Analysis
  • Task Analysis
  • Context Analysis

37
Task Analysis
  • A task is a human activity designed to achieve
    some goal, e.g., checking credit, checking an
    order, shipping a product, etc.
  • The output of task analysis is a hierarchical
    task model (general structure) and task scenarios
    (concrete cases).
  • The input of task analysis can be observation of
    users at work, interviews or workshops, analysis
    of business documents, or analysis of a proposed
    system.

38
Process of Task Analysis
  • 1. Identify tasks: observation, interviews,
    analysis of documents and business activities
  • 2. Select task scenarios: sample from real,
    representative business processes.
  • 3. Analyse and model tasks: by GOMS model, HTA.
  • 4. Describe task context: work environment,
    social/interpersonal task context, constraints or
    pressure, errors made, organisation types
    (dynamic, stable, market, production, services...)
  • 5. Identify subtasks requiring computer support:
    manual (unstructured tasks), interactive
    (semi-structured tasks), automatic (well
    structured). Also consider the benefit of using a
    computer, and cost/benefit analysis.
  • 6. Express scenarios as actions on user objects:
    what concrete objects are involved, attributes
    used, relation types.
  • 7. Validate task model and scenarios: check with
    task performers, observe other scenarios, ask
    managers and other operators.
  • Source: Redmond-Pyle, D. (1995) Graphical User
    Interface Design and Evaluation, Prentice Hall,
    London.

39
Example of a Task Analysis
  • Identified Task
  • Help desk for customers' queries about booking
    train tickets

40
Identify tasks
  • Resolve customer query: identify the customer,
    listen to their query, attempt to resolve it, and
    record details of the query
  • Provide query status
  • Reproduce the problem
  • Update known problem
  • Provide customer information
  • Provide problem information
  • Monitor advisor workload
  • Monitor unresolved queries
  • Source: Redmond-Pyle, D. (1995) Graphical User
    Interface Design and Evaluation, Prentice Hall,
    London.

41
Select task scenarios
  • Scenario 1: resolve by using system support
  • Scenario 2: need further information
  • Scenario 3: resolve without system support

42
Task Scenarios
  • A task scenario is an example of a task. It must
    be representative, realistic, usable for HCI
    design, and consistent with the task model.
  • Example of a task scenario (booking a train
    ticket):
  • Operator: answers phone
  • Customer: I'd like three tickets to Stockholm on
    4th April
  • Operator: checks train departure list, notes time
  • Operator: We have three tickets to Stockholm on
    4th April, at 10:00 am. Price 600 kr per person.
  • Customer: OK, we'll take it
  • Operator: How do you pay?
  • Customer: Visa
  • Operator: puts it into the list, ...

43
Task Model
Hierarchic task description and diagram
(buy a train ticket):
  • 1. Find time of next train
  • 1.1 study list of train departures
  • 1.2 mentally note time and platform number of
    next train
  • 2. Purchase ticket
  • 2.1 stand in line at ticket counter
  • 2.2 on reaching counter, state destination and
    journey type
  • 2.3 receive quote for price of ticket
  • 2.4 pay money
  • 2.5 receive ticket and change

(Diagram: the same hierarchy drawn as a tree. "Find
time of train": study list, note time. "Purchase
ticket": stand in line, state destination, get quote,
pay money, receive ticket.)
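The hierarchic task model above maps naturally onto a nested data structure. A minimal sketch, where the task names follow the slide but the representation itself (a dict of lists plus a numbering walker) is an assumption for illustration:

```python
# A minimal sketch of the hierarchic task model above as a
# nested data structure plus a walker that regenerates the
# outline numbering. Task names follow the slide; the
# representation itself is an assumption for illustration.

hta = {
    "Find time of next train": [
        "study list of train departures",
        "note time and platform of next train",
    ],
    "Purchase ticket": [
        "stand in line at ticket counter",
        "state destination and journey type",
        "receive quote for price of ticket",
        "pay money",
        "receive ticket and change",
    ],
}

def flatten(node, prefix=""):
    """Yield (outline number, task name) pairs depth-first,
    numbering children 1, 2, ... under their parent."""
    if isinstance(node, dict):
        children = list(node.items())
    else:  # a list of leaf subtasks
        children = [(leaf, None) for leaf in node]
    for i, (name, sub) in enumerate(children, start=1):
        number = f"{prefix}{i}"
        yield number, name
        if sub:
            yield from flatten(sub, prefix=number + ".")

plan = list(flatten(hta))
```

Walking the structure reproduces the slide's numbering (1, 1.1, 1.2, 2, 2.1, ...), which is useful for printing a task model or cross-referencing scenarios against it.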
44
Goals, Operators, Methods and Selection Rules (P.
Johnson)
  • Goals: symbolic structures that define a state of
    affairs to be achieved and determine a set of
    possible methods by which it may be accomplished.
  • Operators: these are elementary perceptual,
    motor, or cognitive acts whose execution is
    necessary to change any aspect of the user's
    mental state or to affect the task environment.
  • Methods: descriptions of procedures for achieving
    goals. They are conditional sequences of goals and
    operators.
  • Selection rules: there may be more than one
    method for achieving a given goal. The selection
    rules handle the process of choosing between
    methods, in the form IF X THEN use method M.
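Selection rules can be sketched directly as IF-X-THEN-use-method-M pairs. In this sketch the goal ("delete a file"), the conditions, and the method names are all invented for illustration:

```python
# A minimal sketch of a GOMS selection rule set in the
# IF-X-THEN-use-method-M form described above. The goal
# ("delete a file"), the conditions, and the method names
# are invented for illustration.

selection_rules = [
    # (IF: predicate on the task context, THEN: method)
    (lambda ctx: ctx.get("hands_on_mouse", False), "drag-to-trash"),
    (lambda ctx: ctx.get("expert_user", False), "keyboard-shortcut"),
    (lambda ctx: True, "menu-command"),  # fallback method
]

def select_method(context):
    """Return the first method whose IF-condition holds."""
    for condition, method in selection_rules:
        if condition(context):
            return method

# An expert user without a hand on the mouse gets the
# keyboard shortcut; with no special context, the menu.
chosen = select_method({"expert_user": True})
```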

45
More to consider in task analysis
  • Choose scenarios which happen frequently, are
    time critical, or where user errors are
    significant (dangerous, expensive, frequent)
  • Enrich the task model by asking:
  • What happens if...
  • some input information is missing or incorrect?
  • someone else has borrowed the folder?
  • there is an interruption from another urgent task?
  • the users make an error?
  • Subdivide the task into a proper level of detail
    (the logical action level, not the computer-system
    level)

46
Analyses and Model Tasks
Goal: customer satisfied. Volume: 60 per day. Est.
time: 5-30 min. Errors: 1 in 20.
(Diagram: task hierarchy - 0. Resolve customer query,
with subtasks 1. Identify customer, 2. Identify
problem, 3. Advise customer, 4. Log query; lower-level
subtasks include identify customer organisation,
identify customer contact, check on maintenance,
record query info., associate query to problem, set
status and priority, set responsibility and action
date)
(Context information here)
47
Describe Task Context and Identify Subtasks
Requiring Computer Support
Goal: customer satisfied. Volume: 60 per day. Est.
time: 5-30 min. Errors: 1 in 20.
(Diagram: the same task hierarchy as on the previous
slide, used here to describe the task context and to
identify subtasks requiring computer support)
48
Express Scenarios as Actions on User Objects and
Validate Task Model and Scenarios
  • Exercise to identify objects based on the script
    of the scenarios.
  • Build up the object model and relationships
  • (ref. object modelling)

49
Context of Work
To understand anything, you need a model, not
just of the thing itself, but of the context in
which it operates.
What is this? And that?
50
Context tells you what it is!
51
The Organisational Aspects
  • The nature of organisations:
  • People, technology, work, culture.
  • The impact of information technology on
    organisations:
  • Two lists of advantages and disadvantages of
    computerisation.
  • Methods for organisational analysis:
  • Scientific management (Taylor, 1911): division of
    labour, training unskilled workers, monitoring
    working load.
  • Sociotechnical systems approach
  • Activity theory
  • Ethnomethodology (details of work with
    technology, social practices, participation,
    evaluation of technology use)
  • CSCW and organisational considerations
  • Source: Preece J. (1994) Human-Computer
    Interaction, Addison-Wesley, New York.

52
Scientific Management
  • Separation of planning and working
  • Choose the best person for the job that have been
    planned and designed
  • Determine how a task can be performed the most
    efficiently
  • Train the workers
  • Determine the best form of reward for the
    different task
  • Monitor worker performance to ensure that the
    prescribed methods are followed and the set goals
    achieved

53
The Sociotechnical Approach
  • Step 1: Initial scanning briefing
  • Step 2: Identification of unit operations
  • Step 3: Identification of variances
  • Step 4: Analysis of the social system
  • Step 5: Workers' perception of their roles
  • Step 6: The maintenance systems
  • Step 7: The supply and user systems
  • Step 8: The corporate environment and development
    plans
  • Step 9: Proposals for change
  • (Source: Mumford (1987) Sociotechnical systems
    design: evolving theory and practice. In
    Computers and Democracy, Bjerknes G., Ehn P. and
    Kyng M., eds., pp. 59-77, Aldershot: Avebury)

54
Methodologies for Interactive System Design
  • Why important:
  • Difficult and expensive - more than 75% of the
    work; involves the study of psychology,
    communication, task analysis, etc.
  • Methodologies of design:
  • Waterfall model
  • Prototyping approach
  • Participatory design

55
Introduction: who and what are involved in the design
(Diagram: the design process)
  • Tools: user-centred design methods, guidelines,
    evaluation methods, software tools, principles,
    prototyping tools, experimentation, creative ideas
  • HCI design expertise: in-house HCI group, HCI
    consultants, ergonomics experts
  • Design team: managers, system analysts, software
    engineers, programmers, graphic designers, others
  • User group: organization managers, end users
  • The product: application, software, hardware,
    documentation, training, implementation
56
Perspectives on Interface Design
  • Design of an interface must consider:
  • 1. Functional perspective: concerned with whether
    or not the design is serviceable for its intended
    purpose.
  • 2. Aesthetic perspective: concerned with whether
    or not the design is pleasing in its appearance
    and artistic design.
  • 3. Structural perspective: whether the structure
    is adaptive and currently useful.

57
Software Engineering Methods
  • Software engineering:
  • The development and use of principles, methods,
    and tools to design and develop, economically and
    optimally, software systems that are
    aesthetically pleasing, efficient, reliable, and
    usable for the purposes for which they were
    designed.
  • Principles: include principles of usability,
    design, and construction. They should lead to the
    formulation of criteria by which the quality of
    the design can be tested.
  • Methods: provide the process model for software
    development.
  • Tools: languages, notations, semi-libraries, and
    toolkits.

58
Lifecycle or Waterfall Approach
59
Prototyping Approach
  • Introduction
  • Experimenting with prototypes has become
    prominent due to a number of claimed advantages in
    cases where users' requirements are difficult to
    specify. In principle, users should be highly
    motivated to take part, since they are given more
    chances to improve their work, to verify that
    their needs are taken care of, and that the terms
    used in the interface and the functions of the
    designed system are consistent with their work.

60
  • Participatory Design Approach

PD: the Scandinavian tradition of a socio-technical
approach to the design of work and artefacts
61
Usability Engineering
62
Defining Usability
63
  • Usability Design Principles
  • What are principles
  • They apply to any user interface, more or less
    independently of the supported activity, the
    user, and the form of solution.
  • Why do we need Principles
  • Because of varieties of tasks and new technique,
    designers constantly involve in unfamiliar design
    problems. Principles can provide a source of help
    by referencing of general principles which are
    based upon cognitive theory.

64
Principles for Design (The Windows Interface: An
Application Design Guide, Microsoft Press,
Redmond, Washington, 1992)
  • User control: be interactive quickly (see limits
    on response time) and easily, easy in and easy
    out; customisation; good defaults; support task
    (activity) performance (not how to use the
    computer)
  • Directness: manipulate objects directly (not,
    e.g., by typing commands), e.g., resize a window
    by dragging, not by coordinates in a dialog box.

65
Bad design: indirect operation
66
Bad design: indirectness
67
Principles for Design
  • Consistency: consistency with the real world (see
    "Avoid Breaking a Metaphor") and consistency
    within and among applications in the use of
    concepts, language, visuals, and functions. (Give
    priority to consistency within the application if
    there is a conflict between consistency among
    applications and within the application.)
  • Clarity: simple metaphors, unambiguous terms,
    transparent functions, process display.
  • Aesthetics: pleasing appearance

68
Avoid Breaking a Metaphor
  • As a means of deleting files and documents, the
    Macintosh trashcan is a perfectly intuitive
    metaphor.
  • Unfortunately, the designers decided to extend
    the trashcan metaphor to include the completely
    counterintuitive function of ejecting diskettes.
  • (Figure: ejecting a diskette on the Mac.)

69
Affordance
  • Affordances are the perceived and actual
    properties of an artefact, which determine how it
    might possibly be used.
  • Appearance indicates how to use something:
  • A chair affords (suggests) sitting.
  • Knobs are for turning.
  • Slots are for inserting things.
  • A button affords pushing.
  • A menu affords choosing.
  • When affordances are taken advantage of, the user
    knows what to do just by looking.
  • When simple things need pictures, labels, or
    instructions, the design has failed!

70
Constraints
  • The difficulty of dealing with a novel situation
    is directly related to the number of
    possibilities.
  • Constraints are physical, semantic, cultural, and
    logical limits on the number of possibilities.
  • Physical constraints such as pegs and holes limit
    possible operations.
  • Semantic constraints rely upon our knowledge of
    the situation and of the world.
  • Cultural constraints rely upon accepted cultural
    conventions.
  • Logical constraints exploit logical
    relationships. For example a natural mapping
    between the spatial layout of components and
    their controls.
  • Where affordances suggest the range of
    possibilities, constraints limit the number of
    alternatives.

71
Cultural Constraints
  • Beware that cultural constraints can vary
    enormously, for example:
  • Light switches
  • America: down is off
  • Britain: down is on
  • Water taps
  • America: anti-clockwise is on
  • Britain: anti-clockwise is off
  • The colour red
  • America: danger
  • Egypt: death
  • India: life
  • China: happiness

72
Bad design: affordance vs. constraints?
73
The PC Cup Holder - an example of affordance vs.
constraints
  • A supposedly true story from a Novell NetWire
    SysOp:
  • Caller: "Hello, is this Tech Support?"
  • Tech Rep: "Yes, it is. How may I help you?"
  • Caller: "The cup holder on my PC is broken and
    I am within my warranty period. How do I go about
    getting that fixed?"
  • Tech Rep: "I'm sorry, but did you say a cup
    holder?"
  • Caller: "Yes, it's attached to the front of my
    computer."
  • Tech Rep: "Please excuse me if I seem a bit
    stumped, it's because I am. Did you receive this
    as part of a promotional, at a trade show? How
    did you get this cup holder? Does it have any
    trademark on it?"
  • Caller: "It came with my computer, I don't
    know anything about a promotional. It just has
    '4X' on it."
  • At this point the Tech Rep had to mute the
    caller, because he couldn't stand it. The caller
    had been using the load drawer of the CD-ROM
    drive as a cup holder, and had snapped it off the
    drive.
  • This story was found at Greenberg (1997) and is
    attributed there to George Wagner
    (g.wagner@sylvania.sev.org).

74
Principles for Design
  • Feedback: immediate feedback to users' actions
    (e.g., dragging, typing, clicking, moving, etc.)
    by visual or auditory response (see limits on
    response times)
  • Forgiveness: allow trial-and-error learning
    (learning by doing), regret or undo functions,
    error messages
  • Awareness of human strengths and limitations:
    e.g., human perception, recall and recognition,
    memory, reasoning

75
Bad design: too much content, colors, flashing
76
Bad design: bad clarity - too much content, colors,
text graphics
77
Bad design: unnecessary and confusing functions
(content)
78
Other Principles
  • Allow quick ways for experienced users (see the
    learning curve)
  • Provide navigation support (home, back, bookmark,
    etc.)
  • Special actions should be signalled with a special
    sound, colour, or blinking.
  • Do not use more than four different kinds of
    colour in your system.
  • Use colour as information (red = warning, green =
    OK)
  • Use a black-and-white interface first (colour
    later)

79
Other Principles
  • The user should be required to know as little as
    possible to use the system
  • Non-disruptiveness of thought (working memory
    consistency, see limits on response time)
  • All error messages should be helpful and fully
    self-explanatory
  • The dialogue should never put the user in a
    situation where he does not know what to do next
  • The most likely selection should be made the
    default option.

80
Other Principles
  • Consider Organization dynamics
  • Use menus, natural language, diagrams, window
    systems (icons, fields, mouse), touch screens, and
    panel selection to avoid the use of a computer
    language
  • The dialogue should completely avoid forcing the
    user to remember mnemonics or an alien syntax
    (short-term memory limitation)

81
Other Principles
  • Graphics techniques should be used as an aid to
    clear thinking (e.g., choosing a proper icon to
    indicate a function, consequence, action, etc.)
  • Self-teaching and computer-aided instruction
    that can be invoked at any point during the
    interaction.

82
Bad design: keeping unnecessary design items
Spinning logos
3D graphics
Large graphics
Music
83
Bad design: splash pages
84
Typical Ways of Measuring Usability
  • Learnability: pick novice users of the system and
    measure the time to perform certain tasks.
    Distinguish between no/some general computer
    experience.
  • Efficiency: decide on a definition of expertise,
    get a sample of expert users (difficult), and
    measure the time to perform typical tasks.
  • Memorability: get a sample of casual users (away
    from the system for a certain time) and measure
    the time to perform typical tasks.
  • Errors: count minor and catastrophic errors made
    by users while performing some specified task.
  • Satisfaction: ask users' subjective opinion
    (questionnaire, interview) after they have tried
    the system for a real task.
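The measures above reduce to simple statistics over observed sessions. A minimal sketch, where all the timing and error data are invented for illustration:

```python
# A minimal sketch of turning usability measurements into
# numbers: average task time for learnability/efficiency
# samples and an error count per session. All data here
# are invented for illustration.

from statistics import mean

# seconds to complete the same task, per user group
novice_times = [310, 280, 345, 298]   # learnability sample
expert_times = [62, 58, 71]           # efficiency sample
errors_per_session = [1, 0, 2, 1]     # novice error counts

learnability = mean(novice_times)     # lower = easier to learn
efficiency = mean(expert_times)       # expert steady-state time
error_rate = sum(errors_per_session) / len(errors_per_session)
```

Comparing the novice and expert means also gives a crude picture of the learning curve for the task.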

85
Ten Usability Heuristics (1)
  • Visibility of system status
  • The system should always keep users informed
    about what is going on, through appropriate
    feedback within reasonable time.
  • Match between system and the real world
  • The system should speak the users' language, with
    words, phrases and concepts familiar to the user,
    rather than system-oriented terms. Follow
    real-world conventions, making information appear
    in a natural and logical order.
  • User control and freedom
  • Users often choose system functions by mistake
    and will need a clearly marked "emergency exit"
    to leave the unwanted state without having to go
    through an extended dialogue. Support undo and
    redo.
  • Consistency and standards
  • Users should not have to wonder whether different
    words, situations, or actions mean the same
    thing. Follow platform conventions.
  • Error prevention
  • Even better than good error messages is a careful
    design which prevents a problem from occurring in
    the first place.

86
Ten Usability Heuristics (2)
  • Recognition rather than recall
  • Make objects, actions, and options visible. The
    user should not have to remember information from
    one part of the dialogue to another. Instructions
    for use of the system should be visible or easily
    retrievable whenever appropriate.
  • Flexibility and efficiency of use
  • Accelerators -- unseen by the novice user -- may
    often speed up the interaction for the expert
    user such that the system can cater to both
    inexperienced and experienced users. Allow users
    to tailor frequent actions.
  • Aesthetic and minimalist design
  • Dialogues should not contain information which is
    irrelevant or rarely needed. Every extra unit of
    information in a dialogue competes with the
    relevant units of information and diminishes
    their relative visibility.
  • Help users recognise, diagnose, and recover from
    errors
  • Error messages should be expressed in plain
    language (no codes), precisely indicate the
    problem, and constructively suggest a solution.
  • Help and documentation
  • Even though it is better if the system can be
    used without documentation, it may be necessary
    to provide help and documentation. Any such
    information should be easy to search, focused on
    the user's task, list concrete steps to be
    carried out, and not be too large.

87
Apple Mac UI Design Principles
  • 1. Use of Metaphors
  • 2. Aesthetic Integrity
  • 3. Consistency
  • 4. Perceived Stability
  • 5. Direct Manipulation
  • 6. See and Point
  • 7. WYSIWYG
  • 8. Feedback
  • 9. Forgiveness
  • 10. User Control

88
Users' and Designers' Conceptual Systems

89
Designer-Centred Design
  • The design is based upon
  • What can be built easily on this platform?
  • What can I create from the tools available?
  • What do I as a developer find interesting to work
    on?

90
User-Centred Design
  • The design is based upon a user's
  • abilities and needs
  • context
  • work
  • tasks

91
Usability Engineering Lifecycle
  • 1. Know the User
  • 2. Competitive Analysis
  • 3. Set Usability Goals
  • 4. Parallel Design
  • 5. Participatory Design
  • 6. Co-ordinated Design of Total Interface
  • 7. Applying Guidelines
  • 8. Prototyping
  • 9. Usability Evaluation (Inspection and Testing)
  • 10. Iterative Design
  • 11. Follow-up Studies

92
1. Know the User
  • Observe Users in Working Environment: site
    visits, unobtrusive observation. Don't believe
    their superiors!
  • Individual User Characteristics: classify users
    by experience (see Figure), educational level,
    age, amount of prior training, etc.
  • Task Analysis: users' overall goals, current
    approach, model of the task, prerequisite
    information, exceptions to the normal work flow.
  • Functional Analysis: the functional reason for the
    task. What really needs to be done, and what are
    merely surface procedures?
  • Evolution of User and Task: users change as they
    use the system, then use the system in new ways.

93
Categories of User Experience
94
2. Competitive Analysis
  • Competitive analysis of software components
  • Use an existing interface framework as far as
    possible (Motif, MS-Windows, Java AWT) - saves a
    lot of work.
  • Use existing components and applications rather
    than re-inventing the wheel.
  • Competitive analysis of competing systems
  • Analyse competing products heuristically or
    empirically.
  • "Intelligent borrowing" of ideas from other
    systems.

95
3. Set Usability Goals
  • Decide in advance on usability metrics and the
    desired level of measured usability.
  • Financial impact analysis: estimate savings based
    on the loaded cost of users, compared to the cost
    of the usability effort.
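A back-of-the-envelope version of such a financial impact analysis might look like this (all figures are invented for the sketch):

```python
# Hypothetical financial impact estimate: projected time savings from a
# redesign, valued at the loaded cost of the users, compared against the
# cost of the usability effort. Every number here is an assumption.
n_users = 500
tasks_per_day = 40
seconds_saved_per_task = 3          # expected gain from the redesign
working_days_per_year = 230
loaded_cost_per_hour = 60.0         # fully loaded cost of one user-hour

hours_saved = (n_users * tasks_per_day * seconds_saved_per_task
               * working_days_per_year) / 3600
annual_saving = hours_saved * loaded_cost_per_hour
usability_effort_cost = 80_000.0    # assumed budget for the usability work

print(f"hours saved/year: {hours_saved:.0f}")
print(f"annual saving: {annual_saving:.0f} vs effort cost "
      f"{usability_effort_cost:.0f}")
```

With these assumed figures the effort pays for itself well within a year; the point of the exercise is to make the trade-off explicit before committing resources.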

96
4. Parallel Design
  • Explore design alternatives - designers should
    work independently, then compare draft designs
    (see Figure).
  • Brainstorm with whole team (engineers, graphic
    designer, writer, marketing types, usability
    specialist, one or two representative users).

97
Parallel Design
98
How to Brainstorm
  • Meet away from usual workplace (different
    building, hut in the mountains).
  • Use plenty of paper. Cover the walls with it!
  • Draw. Scribble. Use lots of coloured pens.
  • Three rules during brainstorming
  • 1. No one is allowed to criticise another's
    ideas.
  • 2. Engineers must not say "it can't be
    implemented".
  • 3. The graphic designer must not laugh at the
    engineers' drawings.
  • Only after brainstorming, organise ideas and
    consider their practicality and viability.

99
5. Participatory Design
  • Have access to pool of representative users.
  • Guided discussion of prototypes, paper mock-ups,
    screen designs with representative users.

100
6. Co-ordinated Design of Total Interface
  • Consistency across the total interface:
    documentation, online help, tutorials,
    videotapes, training classes as well as screens
    and dialogues. By means of:
  • Interface standards: specific rules as to how the
    interface should look and feel.
  • Widget libraries: shared code implementing
    standard UI functionality.
  • Shared culture: training, meetings, an interface
    "evangelist".

101
7. Applying Guidelines
  • Smith, S. & Mosier, J.: Guidelines for Designing
    User Interface Software. The MITRE Corp., 1986.
    944 guidelines. ISBN 9992080418
  • ftp://ftp.cis.ohio-state.edu/pub/hci/Guidelines
  • Brown, C.: Human-Computer Interface Design
    Guidelines. Ablex, NJ, 1988. 302 guidelines.
    ISBN 0893913324
  • Mayhew, D.: Principles and Guidelines in
    Software User Interface Design. Prentice-Hall,
    1991. 288 guidelines. ISBN 0137219296

102
8. Prototyping
  • Perform usability evaluation as early as possible
    in the design cycle by building and evaluating
    prototypes.
  • Prototypes cut down either on the number of
    features or on the depth of functionality of the
    features:
  • Vertical Prototype: in-depth functionality for a
    few selected features.
  • Horizontal Prototype: full interface features,
    but no underlying functionality.
  • Scenario: only the features and functionality
    along a pre-specified scenario (task) or path
    through the interface.
  • These varieties of prototype are illustrated in
    the figure.

103
Dimensions of Prototyping
104
Techniques for Implementing Prototypes
  • Verbal prototyping: verbal description of choices
    and results.
  • Paper mock-ups: printouts or sketches of screen
    designs.
  • Wizard of Oz: a human expert operating behind the
    scenes.
  • Fake data: similar data, images instead of video,
    etc. Simple algorithms ignore special cases.
  • Prototyping tools: e.g. HyperCard, ToolBook.
  • UIMS (User Interface Management Systems):
    interactive interface builders such as Visual
    C++.
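The Wizard-of-Oz technique, for example, can be sketched as a loop in which every "system" response is actually typed by a hidden human operator. The function below is a hypothetical illustration; input and output are passed in as functions so the same loop can be driven by real consoles or by scripted test data:

```python
# Wizard-of-Oz sketch (hypothetical): the "system" output is supplied by a
# hidden human operator. I/O is injected as functions so the loop works
# with live consoles or with canned data.
def wizard_of_oz_session(get_user, get_wizard, show):
    transcript = []
    while True:
        user_msg = get_user()
        if user_msg is None:                  # user ended the session
            break
        wizard_msg = get_wizard(user_msg)     # hidden operator fakes reply
        show(f"SYSTEM: {wizard_msg}")         # the user sees only this
        transcript.append((user_msg, wizard_msg))
    return transcript

# Driving the loop with canned data (in a live study, get_user would read
# from the user's console and get_wizard from the wizard's console):
user_inputs = iter(["open my calendar", "add meeting at 10", None])
session = wizard_of_oz_session(
    get_user=lambda: next(user_inputs),
    get_wizard=lambda msg: f"ok, handling '{msg}'",
    show=lambda line: None,
)
print(len(session), "exchanges recorded")
```

The transcript doubles as evaluation data: it records exactly what users tried to say to the "system" before any of it is implemented.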

105
9. Usability Evaluation
  • Usability Inspection
  • Inspection of interface design using heuristics
    and judgement (no user tests).
  • Usability Testing
  • Empirical testing of interface design with real
    users.

106
10. Iterative Design
  • Assign severity ratings to the usability problems
    discovered.
  • Fix problems: produce a new version of the
    interface.
  • Capture design rationale: record the reasons why
    changes were made.
  • Evaluate the new version of the interface.
  • Repeat: design, test, redesign.

107
11. Follow-Up Studies
  • Important usability data can be gathered after
    the release of a product, for the next version:
  • Specific field studies (interviews,
    questionnaires, observation).
  • Standard marketing studies.
  • Instrumented versions of the software: log usage
    data.
  • Analyse user complaints, modification requests,
    bug reports.

108
Systems evaluation
  • Objectives of evaluation:
  • to determine the effectiveness or potential
    effectiveness of the system
  • to provide a means for suggesting improvements
  • The context of evaluation includes:
  • the user's experience
  • the type of task
  • the system being used
  • the environment in which the study takes place

109
Measurable Human Factors to Evaluation
  • Time to learn: How long does it take for typical
    members of the user community to learn how to use
    some specified functions?
  • Speed of performance: How long does it take to
    carry out the specified task?
  • Rate of errors by users: How many and what kinds
    of errors do users make in carrying out the
    specified task?
  • Retention over time: How well do users maintain
    their knowledge after an hour, a day, or a week?
  • Subjective satisfaction: How much did the users
    like using various aspects of the system?
  • (Ben Shneiderman, 1998, Designing the user
    interface, Addison-Wesley)

110
When to do evaluation
  • Formative evaluation takes place before
    implementation, in order to influence the product
    that will be produced.
  • Summative evaluation takes place after
    implementation, with the aim of testing the
    proper functioning of the final system.

111
Methods, techniques and tools
  • Analytic evaluation uses a formal or semi-formal
    interface description to predict user
    performance.
  • Expert evaluation involves experts in assessing
    an interface.
  • Observational evaluation involves observing or
    monitoring users' behaviour while they are using
    a system.
  • Survey evaluation seeks to elicit users'
    subjective opinions of the system.
  • Experimental evaluation uses scientific
    experimental practice to test hypotheses about
    the system.

112
Evaluation at Stages of Design
(Diagram) Evaluation at each stage of the lifecycle:
  • Systems engineering and analysis: previous
    evaluation
  • Design and coding: formative evaluation
  • Implementation and maintenance: summative
    evaluation
113
Analytic evaluation
  • Task analysis and task structure (HTA)
  • user interactions (input - process - output)
  • users' operations, scenarios

114
Expert evaluation
  • It is essential to select experts who have
    knowledge of general design principles and
    experience in design.
  • The experts should not have been involved in
    previous versions of the system under evaluation.
  • The materials and interface presented to the
    experts should be representative of the real
    users' use situation.

115
Observational evaluation
  • Direct observation: observing users' task
    execution, making notes.
  • Video recording: record the visible aspects of
    users' activity, to be replayed and analysed
    (with users, as participatory evaluation).
  • Software logging: record the dialogue between the
    user and the system in the time sequence of the
    user-computer interaction.
  • Interactive observation: a hidden operator
    simulates all the output from the system.
  • Verbal protocols (think-aloud): record users'
    spoken thoughts while they perform the task,
    coupled with video recording.
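Software logging can be sketched in a few lines (the event names here are invented): each user-system exchange is appended with a timestamp, so the dialogue can later be replayed in time sequence.

```python
# Minimal software-logging sketch (hypothetical event names): every
# user-system exchange is recorded with a timestamp for later replay.
import time

log = []

def record(actor, event):
    """Append one dialogue event with a timestamp."""
    log.append((time.time(), actor, event))

# Simulated fragment of a user-computer dialogue:
record("user", "menu: File > Open")
record("system", "dialog: file chooser shown")
record("user", "select: report.txt")
record("system", "document opened")

# Replay the logged dialogue in time sequence:
for timestamp, actor, event in log:
    print(f"{timestamp:.3f} {actor:>6}: {event}")
```

A real instrumented system would write such records to a file and include richer context (screen, widget, task id), but the principle is the same.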

116
Survey evaluation
  • Structured interview: easy to conduct, easy to
    analyse, but important details of the user's
    situation may not be recorded.
  • Questionnaires:
  • open questions: answers are free-form
  • closed questions: answers are chosen from
    predefined multiple choices
  • Flexible interview

117
Rating scales in closed questions
  • Check list
  • multi-point rating scale

Example (checklist): "Can you use the following
text editing commands: DUPLICATE, PASTE?"
Answers: yes / no / don't know.
Example (multi-point rating scale): "Rate the
usefulness of the DUPLICATE command on the
following scale", ranging from "of no use" to
"very useful".
118
Rating scales in closed questions
  • Likert scale
  • semantic differential scale

Example (Likert scale): "Computers can simplify
complex problems". Answers: strongly agree /
agree / slightly agree / neutral / slightly
disagree / disagree / strongly disagree.
Example (semantic differential scale): "Rate the
Diagram package on the following dimensions":
easy vs. difficult, clear vs. confusing, fun vs.
dreary, each rated extremely / quite / slightly /
neutral / slightly / quite / extremely.
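Once collected, Likert responses are typically mapped onto a numeric scale for analysis. A minimal sketch, with invented responses:

```python
# Scoring a Likert item (invented responses): map the seven verbal
# categories onto a -3..+3 numeric scale and summarise.
SCALE = {
    "strongly disagree": -3, "disagree": -2, "slightly disagree": -1,
    "neutral": 0,
    "slightly agree": 1, "agree": 2, "strongly agree": 3,
}

responses = ["agree", "strongly agree", "neutral", "agree",
             "slightly disagree", "agree"]

scores = [SCALE[r] for r in responses]
mean_score = sum(scores) / len(scores)
print(f"mean attitude score: {mean_score:+.2f} (on a -3..+3 scale)")
```

Note that Likert data are strictly ordinal (see the measurement scales slide below), so medians and frequency counts are often preferred to means.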
119
Rating scales in closed questions
  • Ranked order questionnaire

Example: "Place the following commands in order of
usefulness (use a scale of 1 to 4, where 1 is the
most useful): Paste, Duplicate, Group, Clear."
120
Experimental evaluation
  • In an experimental evaluation, evaluators
    systematically manipulate the factors associated
    with the design (input, or independent,
    variables) and measure the effects on users'
    performance (output, or dependent, variables).
  • Example: to test whether different rates of
    learning (output) exist for three different types
    of interface (input: command, ask/answer, window
    based). This examination is shown in the figure.

121
Steps in experimental evaluation
  • 1. Formulating the goal of the evaluation, or
    hypotheses (based on principles)
  • 2. Developing predictions from the hypotheses
    (possible results, or outputs)
  • 3. Choosing methods to test the predictions
  • 4. Identifying all the variables that might
    affect the result of the examination
  • 5. Deciding which are the input variables, the
    output variables, and the environment
  • 6. Selecting subjects
  • 7. Conducting the evaluation (examination) and
    collecting data
  • 8. Performing statistical analysis and synthesis
  • 9. Drawing conclusions from the evaluation
    (according to the hypotheses)
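Steps 5-9 can be sketched as follows (all data invented): comparing task-completion times for two interface styles in a between-group design, with a two-sample t statistic computed from the standard library only.

```python
# Between-group sketch (invented data): input variable = interface style,
# output variable = task-completion time. Welch's t statistic is computed
# from scratch; no external statistics package is assumed.
from statistics import mean, variance

group_a = [41.0, 38.5, 44.2, 40.1, 39.7, 42.3]  # structured interface
group_b = [52.4, 49.8, 55.1, 50.6, 53.9, 51.2]  # unstructured interface

na, nb = len(group_a), len(group_b)
ma, mb = mean(group_a), mean(group_b)
va, vb = variance(group_a), variance(group_b)   # sample variances

# Welch's t statistic for two independent samples:
t = (ma - mb) / ((va / na + vb / nb) ** 0.5)

# A large |t| supports rejecting the null hypothesis that the two
# interfaces yield equal mean completion times.
print(f"mean A = {ma:.1f}s, mean B = {mb:.1f}s, t = {t:.2f}")
```

In a real study the t value would be compared against the t distribution at the chosen significance level, with degrees of freedom from the Welch-Satterthwaite formula.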

122
Three criteria for a valid evaluation
  • 1. The examiner must systematically
    manipulate/observe one or more independent
    variables in the domain under investigation.
  • 2. The examination must be made under controlled
    conditions, such that all variables which could
    affect the outcome of the experiment are
    controlled.
  • 3. The examiner must measure the output as a
    function of the input variables.

123
Hypotheses
  • Hypotheses are conditional statements of the
    form:
  • If <conditions>, then <result>.
  • The examination is set up to manipulate the input
    variables and observe the output. The final
    result of the examination will either confirm or
    reject the hypotheses.
  • Hypothesis example:
  • If users are provided with a structured
    interface, then they will learn more quickly than
    with an unstructured interface.

124
Conditions for experimental evaluation
  • The system must be observable: to transform
    quality into quantity.
  • The system must be stable: it must be able to
    remain at some determined status under constant
    experimental conditions.
  • The system must be controllable: the system
    output (dependent variables) must be sensitive to
    changes in the input (independent variables).
  • (The selected dependent variables and
    independent variables must have an observable
    cause-effect relationship.)
  • You must have a goal for an experiment, e.g., to
    test some predefined hypotheses, to gain more
    information about the problem space, or to
    identify variables in relation to a problem under
    investigation.

125
Evaluation measuring
  • 1. Nominal scale: a classification of categories;
    qualitative differences with no quantitative
    meaning (e.g., group 1, group 2, ...).
  • 2. Ordinal scale: ordering according to
    quantitative differences, though it does not tell
    exactly how large the differences are (e.g.,
    good, bad, ...).
  • 3. Interval scale: equidistant intervals are
    assigned to different states of the measured
    phenomenon (e.g., the Fahrenheit temperature
    scale).
  • 4. Ratio scale: there is an absolute zero point,
    so ratio calculations can be made (e.g.,
    distance, mass, time).

126
Subject selection
  • A subject is a person who participates in the
    examination.
  • Factors to consider in subject selection:
  • 1. the subject's previous experience
  • 2. the level of skill
  • 3. the number of subjects needed (n > 6)
  • Single case study (n = 1) when:
  • time is limited, or the type of subject is in
    short supply
  • general knowledge about human factors is of
    interest and generalisation is not important.

127
Subject grouping
  • Within-group design: every subject experiences
    both conditions.
  •   With DSS: s1, s2, s3, s4, s5
  •   No DSS:  s1, s2, s3, s4, s5
  • efficient in use of time and subjects; good for
    testing learning.
  • but: carry-over effects.
  • Between-group design: each subject experiences
    only one condition.
  •   With DSS: s1, s2, s3, s4, s5
  •   No DSS:  s6, s7, s8, s9, s10
  • no carry-over effects (no order or experience
    problem).
  • but: costly in subjects, and the members of the
    two groups must share the same properties.
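For a within-group design, the analysis works on per-subject differences between the two conditions (a paired design) rather than on two independent samples. A sketch with invented timings:

```python
# Within-group sketch (invented data): each subject performs the task
# both with and without the DSS, so we analyse per-subject differences
# (a paired design) instead of comparing two independent groups.
from statistics import mean, stdev

with_dss    = [30.2, 28.7, 33.1, 29.5, 31.8]  # s1..s5, seconds
without_dss = [36.4, 33.9, 38.0, 35.2, 37.1]  # same subjects, same order

diffs = [w - wo for w, wo in zip(with_dss, without_dss)]
d_mean = mean(diffs)
d_se = stdev(diffs) / len(diffs) ** 0.5   # standard error of mean diff
t = d_mean / d_se                         # paired t, n - 1 = 4 d.o.f.
print(f"mean difference = {d_mean:.2f}s, paired t = {t:.2f}")
```

Because each subject is their own control, the pairing removes between-subject variability, which is why within-group designs need fewer subjects; the price is the carry-over effects noted above.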

128
Subject grouping
  • Mixed factorial design - an example:
  • Group 1 (interface style: picture), trials
    1, 2, 3, 4: subjects s1, s2, s3, s4
  • Group 2 (interface style: text), trials
    1, 2, 3, 4: subjects s5, s6, s7, s8
  • efficient where carry-over effects are not great
    and where measurement of learning performance is
    important.
  • but: complicated when many input variables are
    involved (in this example, two kinds of input
    variables: interface style = picture or text,
    between groups; and trials 1-4, within group,
    i.e. repeated measures).

129
Analysis and synthesis
  • Analysis
  • Dissect conceptually or physically
  • Learn the properties or behaviour of the
    separate parts
  • From the properties of the parts, deduce the
    properties/behaviour of the whole.
  • Synthesis
  • Identify the system of which the unit in focus is
    a part
  • Explain the properties or behaviour of the system
  • Explain the properties or behaviour of the unit
    in focus as a part or function of the system.

130
Which methods and when?
  • Differences among the five evaluation methods

  • Analytic: specification; no users, tasks
    specified; quantitative data.
  • Expert: specification or prototype; role playing,
    no task restrictions; qualitative data.
  • Observational: simulation or prototype; real
    users, no task restrictions; quantitative and
    qualitative data.
  • Survey: simulation or prototype; real users, no
    task restrictions; quantitative and qualitative
    data.
  • Experimental: normally a full prototype; real
    users, no task restrictions; quantitative and
    qualitative data.
131
Which methods and when?
  • Advantages and disadvantages of the evaluation
    methods

  • Analytic. Advantages: usable early in design;
    few resources required. Disadvantages: narrow
    focus; lack of diagnostic output for redesign;
    broad assumptions about users' cognitive
    operations; complex.
  • Expert. Advantages: strongly diagnostic;
    overview of the whole; few resources.
    Disadvantages: restrictions of role playing; no
    real user behaviour; difficult to find experts.
  • Observational. Advantages: quickly highlights
    difficulties; supports rapid iterative design;
    verbal protocols useful for design.
    Disadvantages: observation side-effects on
    users' performance; time- and resource-consuming
    data analysis.
  • Survey. Advantages: addresses users' opinions;
    diagnostic; rating scales give quantitative
    results; reaches large groups. Disadvantages:
    biased by users' experience; low response rates;
    complicated and lengthy analysis; interviews are
    time-consuming.
  • Experimental. Advantages: powerful method;
    quantitative data for analysis; good reliability
    and validity; replicable. Disadvantages: high
    resource demands; high knowledge requirements;
    difficult to integrate into design.
132
Homework
133
Lifecycle or Waterfall Approach
134
System Engineering Activities
  • 1. Analysis of task, domain, users, hardware,
    software, all at a general level.
  • 2. Establishing user requirements
  • 3. Establishing system requirements
  • 4. Allocation of requirements to software
  • Purpose: To identify the scope of the system and
    the general design area.
  • Output: A general requirements document that
    addresses each of the areas mentioned above.

135
Analysis Activities
  • Tasks
  • Users
  • Domain
  • Software
  • Purpose: To understand the information exchange,
    domain entities, required functions,
    characteristics of the interface, and criteria of
    evaluation.
  • Output: A specific requirements document.

136
Design Activities
  • Data structures
  • Software architecture
  • Procedural detail
  • User interface
  • Purpose: To translate the requirements
    specification into a model for the software.
  • Output: A software design specification.

137
Coding Activities
  • Translation of design specification into
    machine-runnable form.
  • Purpose: To produce a runnable version of the
    design specification.
  • Output: Software based on the requirements
    specification.

138
Testing Activities
  • Logical testing of the software
  • Testing of the functionality
  • Testing of the usability
  • Testing of the efficiency of the design and
    implementation.
  • Purpose: To assess the quality of the design and
    the coding.
  • Output: An assessment report on the design
    quality and recommendations for redesign.

139
Maintenance Activities
  • Repairing any errors or faults in the design or
    the coding.
  • Updating the design because of changes in the
    requirement.
  • Updating of the design because of change in the
    environment.
  • Purpose: To allow the software to be adapted to
    change.
  • Output: A revised set of requirements, design and
    software.

140
Problems with the Lifecycle Model
  • 1. Real projects are not sequential in the rigid
    way that this model assumes; they proceed in a
    different order, with iteration.
  • 2. It is not possible to elicit or identify all
    the requirements at the start of the project,
    because of unpredictable changes.
  • 3. It is often expensive to correct design and
    coding errors late in the process, during testing
    and maintenance.

141
Advantages of the Lifecycle/Waterfall Model
  • 1. It provides a comprehensive template (like
    DNA) in which many important aspects of design
    can be placed.
  • 2. It provides generic steps that are found in
    most software engineering paradigms.
  • 3. It is the most widely used model, at least in
    large software projects.
  • 4. It is superior to unplanned (haphazard)
    design.

142
Prototyping Approach
  • Introduction
  • Experimenting with prototypes has become
    prominent because of a number of claimed
    advantages in cases where users' requirements
    are difficult to specify. In principle, users
    should be highly motivated to participate, since
    they are given more chances to improve their
    work, and to verify that their needs are taken
    care of and that the terms used in the interface
    and the functions of the designed system are
    consistent with their work.

143
Types of Prototypes
  • Throwaway prototype: to initiate user interest,
    develop builder skills, and reduce risk and
    investment.
  • Evolving prototype: adaptive; the prototype
    becomes the product.
  • Co-operative prototype: Participatory Design
    (PD).
  • Embryonic prototype: feedback learning, organic
    development.

144
Model of Prototyping Approach
145
Steps in Prototyping
  • Requirements Gathering: Designers and the
    customer define the overall objectives of the
    system and some known requirements.
  • Quick Design: Focuses on the design of the
    interface.
  • Build Prototype: Choose the type of prototype and
    build it quickly.
  • Evaluate and Refine: Involve the designers, the
    customers, and the users in experiencing the
    prototype, and elicit requirements more specific
    than those from requirements gathering.
  • Engineer Product: Based on the prototype, the
    final product is engineered.