Transcript and Presenter's Notes

Title: EDSS Program Review: June 2002


1
EDSS Program Review June 2002
  • David Jones, APL-UW
  • Jim Ballas, NRL

2
Outline
  • David
  • Introduction
  • Project Overview
  • HCI Experience
  • Progress Report

3
Outline
  • Jim
  • UCD Process
  • Task Analysis
  • Evaluations
  • Future Plans

4
APL-UW Team Members
  • Bob Miyamoto - PI
  • David Jones
  • Troy Tanner
  • Bill Kooiman

5
APL-UW Background
  • Miyamoto's Group
  • Env Effects on Sensors
  • TDA development
  • Training Tools
  • Me

6
Project Overview
  • General philosophy: the design model fits the
    user's mental model (UCD)
  • Perform task analyses that feed into the
    interface design and its evaluation
  • Support EDSS developers with UCD and HCI
    standards and guidelines
  • Provide iterative feedback

7
Example of UCD Application: DMARS
  • Study the user's information needs
  • Study how the user performs given tasks
  • Create an intuitive process
  • Involve users in design

8
High Seas Workflow System
  • Helps produce the High Seas Warning
  • Heard from the supervisors
  • Then heard from actual users
  • Different stories
  • Created a system for the users
  • Flexible design

9
HSW (cont.)
  • Funded by DARA & SPAWAR
  • Written in Java
  • Uses Polexis XIS for map functions
  • Running operationally at the San Diego METOC
    Center

10
FNC Project: EVIS
  • Studying human-system component of METOC support
  • Worked closely with users and customers
  • Conducted experiments
  • Performed Cognitive Task Analysis
  • Went to sea for evaluation

11
Progress Report
  • Initial Task Analysis - Jim
  • Gaining Domain Knowledge
  • Training Observations - David
  • Organizing a UCD Workshop

12
Gaining Domain Knowledge
  • EDSS User's Guide
  • Draft Mission Needs Statement for Distributed
    Collaborative Planning Systems for Expeditionary
    Forces
  • COMPHIBGRU THREE 041605Z OCT99
  • 4.X Tiger Team User Input Spreadsheet
  • Pubs: NWP 3-02.1, ATP 3 ch. 6, JP 3-02

13
Training on the USS Tarawa
  • Attended training in San Diego in April
  • Enthusiastic users
  • Hands-on training well received
  • Great audience for usability evaluation

14
Example of User Reactions
  • Staff personnel were excited about creating
    overlays for planning
  • But...
  • Menu headings caused some confusion
  • Are Assault Plans part of AOA Mgmt?
  • When do I use the Env DB?

15
User reactions
  • Navigation among the different windows was
    difficult at times
  • Some windows require expertise that all users
    might not have or may have forgotten

16
Quick Thoughts after Training
  • Most users want it to look like Windows
  • DII UIS provides HCI guidance
  • Some ideas
  • Back arrows & Undo command
  • Web-based and searchable user's guide
  • Tooltip help on mouse-over of a menu title (see
    the sketch after this list)
  • Workflow wizards or web-based training
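
As a concrete sketch of the tooltip idea above, the Java Swing snippet below shows help text appearing on mouse-over of a menu title. The menu names echo the user questions on slide 14; the tooltip wording, and the use of Swing itself, are illustrative assumptions, not the EDSS implementation.

    import javax.swing.*;

    public class MenuTooltipSketch {
        public static void main(String[] args) {
            SwingUtilities.invokeLater(() -> {
                JFrame frame = new JFrame("Menu tooltip sketch");
                JMenuBar bar = new JMenuBar();

                // Tooltip appears when the mouse hovers over the menu
                // title, before the user commits to opening the menu
                JMenu aoa = new JMenu("AOA Mgmt");
                aoa.setToolTipText("Amphibious Objective Area management, "
                        + "including Assault Plans");
                aoa.add(new JMenuItem("Assault Plans"));

                JMenu envDb = new JMenu("Env DB");
                envDb.setToolTipText("Environmental database: METOC data "
                        + "for the operating area");

                bar.add(aoa);
                bar.add(envDb);
                frame.setJMenuBar(bar);
                frame.setSize(480, 200);
                frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
                frame.setVisible(true);
            });
        }
    }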

17
UCD Workshop
  • Scheduled for 30 July 2002
  • At SAIC's Tysons Corner office
  • Our ideas
  • UCD Processes
  • DII HCI Standards
  • HCI Design Principles with examples
  • HCI Evaluation
  • SAIC Ideas?

18
Jim Ballas
  • UCD & HCI
  • Task Analysis
  • Evaluations
  • Future Plans

19
NRL-WDC
  • Team Members
  • Jim Ballas
  • Ph.D. in Applied Experimental Psychology
  • Derek Brock
  • M.S. in Computer Science, HCI emphasis
  • Beth Kramer
  • M.S. in Human Factors Psychology
  • Janet Stroup
  • B.A., some graduate CS coursework

20
NRL-WDC Interface Design and Evaluation Section
  • 4 Ph.D.s on a staff of 10
  • Expertise in HCI, human factors, cognitive
    science, computer science, auditory perception
  • Projects include AEGIS (with LMC), DDX (with
    Raytheon), NATO S&T, KSA & EVIS Management
  • 6.1 to 6.3 projects

21
NRL-WDC Interface Design and Evaluation Section
  • HCI research cited in major reference documents
  • Handbook of Human-Computer Interaction
  • ACM CHI Conference
  • 2001 paper, "Demystifying Direct Manipulation"
  • Wrote and revised the Operator Workstation
    Evaluation section for the IUSW-21 At-Sea Test
    this September

22
User Centered Design
  • Following the approach outlined in the NATO
    COADE document
  • Additional principle: HCI as an instance of
    language use

23
User Centered Design: COADE
24
User Centered Design: COADE
25
Viewing HCI as an Instance of Language Use
  • The design and implementation of an effective
    software application and its user interface is
    ultimately a communication problem, one that
    always involves both the designer's meaning and
    the user's understanding
  • The principles at work in people's use of
    language form a comprehensive framework for the
    design of human-computer interaction

26
Principles of Language Use
  • Any form of communication between people is an
    instance of language use; language is used to do
    things together
  • Language use requires people to coordinate their
    actions and their attention (cognition); it
    always involves speakers' meaning and
    addressees' understanding
  • Meaning and understanding require common ground

27
Common Ground
  • Common ground is knowledge that people establish
    and can use with each other on the basis of
    shared experience
  • When common ground is missing, meaning and
    understanding break down
  • Building common ground is always a serial process
    - even though the resulting shared knowledge may
    contain gaps

28
Layers in Language Use and in HCI
  • Language use frequently involves more than one
    conceptual layer of activity: telling a story,
    for instance, involves at least two layers
  • The storyteller and the listener participate as
    themselves in the first layer
  • In the second layer, the events of the story take
    place
  • Similarly, HCI has two principal layers of
    activity
  • The designer and user participate as themselves
    in the first layer
  • In the second layer, the user interacts with the
    computer as if it (and not the designer) were the
    user's counterpart
  • Each layer in an instance of language use makes
    different demands of the user's language-use
    skills

29
Language Use Issues in HCI
  • In the first layer of HCI, designers, through the
    software's presentation, must help users to
    compensate for gaps that direct access (through
    menus, etc.) imposes on the process of building
    coherent common ground
  • In the second layer, wherever possible,
    interfaces should be designed to allow users to
    establish and use common ground with the
    interface itself as a regular part of their
    interaction with the computer

30
Guidelines, Standards, and Relevant Literature
  • DII User Interface Standards & MIL-STD-2525
  • Research on distributed planning (Klein & Miller)
  • Work directed by NRL
  • Cited in MCDP-5
  • General human factors and HCI literature

31
Work to Date: Initial Task Analysis Partially
Complete
  • Initial Task List: EDSS planning (a
    data-structure sketch follows this list)
  • Make Basic Decisions
  • Create Operational Area
  • Determine Landing Craft
  • Complete Default Craft Parameters Table
  • Make Navigation Decisions
  • Design Sea Echelon Areas
  • Free Hand
  • 4W Grid
  • Select HLZ
  • Select Beach Center and Boat Lane
  • Design Routes
  • Design Display
  • Determine Ship-to-Shore Movement
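
As an aside, a hierarchical task list like the one above is easy to capture in a small tree structure during analysis. The Java sketch below is purely illustrative: the task names come from this slide, but the class and its layout are hypothetical, not the team's actual format.

    import java.util.List;

    public class EdssTaskTree {
        // One node of the task hierarchy: a name plus its subtasks
        record Task(String name, List<Task> subtasks) {
            static Task leaf(String name) { return new Task(name, List.of()); }

            void print(int depth) {
                System.out.println("  ".repeat(depth) + name);
                subtasks.forEach(t -> t.print(depth + 1));
            }
        }

        public static void main(String[] args) {
            Task planning = new Task("EDSS planning", List.of(
                    Task.leaf("Make Basic Decisions"),
                    Task.leaf("Create Operational Area"),
                    new Task("Design Sea Echelon Areas", List.of(
                            Task.leaf("Free Hand"),
                            Task.leaf("4W Grid"))),
                    Task.leaf("Determine Ship-to-Shore Movement")));
            planning.print(0);
        }
    }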

32
Task Analysis: Other Tasks
  • Administrative/file operation tasks
  • Log On And Initializing
  • Log Off
  • Installing New EDSS Software
  • Exporting Plans
  • Importing Plans
  • GCCS-M Tasks
  • Saving Slides
  • Deleting Slides
  • Exporting Slides
  • Installing Maps And Charts
  • Retrieving Maps Or Charts
  • Removing Maps Or Charts
  • Uninstalling Maps, Charts, And Imagery
  • Line Of Sight (LOS) Profile

33
Future Work
  • Complete task analysis
  • Including cognitive analysis using Critical
    Decision Method
  • Workshop
  • To include illustrations of design issues, e.g.
    the effect of lighting/filtering on color images

34
Future Work Evaluation
  • Approach: evaluate user performance and compare
    it to desired performance
  • Examples
  • DMARS
  • EVIS
  • Software Development Tools and Processes

35
DMARS Evaluation: Summary
  • Observe and compare use of three media
  • Paper (NAVOCEANO Mine Warfare Pilot publication)
  • WWW (based on the MWP, so-called RP-WEB)
  • UC-CD (User Centered Digital METOC Acoustic
    Reference Manual - DMARS)
  • Five METOC tasks: prepare a brief and answer 4
    questions
  • 12 METOC officers and enlisted personnel
  • NAS Patuxent River
  • NAS North Island
  • Each person was tested on five tasks using all
    three media (one possible ordering scheme is
    sketched below)
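
The slides do not say how the order of media was assigned across subjects; purely as a hypothetical illustration, a rotated (cyclic Latin square) ordering is one common way to balance serial position in a within-subjects design of this shape:

    public class MediaOrderSketch {
        static final String[] MEDIA = {"PAPER", "RP-WEB", "UC-CD"};

        public static void main(String[] args) {
            // Rotate the starting medium from subject to subject so each
            // medium appears in each serial position equally often
            for (int subject = 0; subject < 12; subject++) {
                StringBuilder line = new StringBuilder("Subject " + (subject + 1) + ": ");
                for (int pos = 0; pos < MEDIA.length; pos++) {
                    if (pos > 0) line.append(", ");
                    line.append(MEDIA[(subject + pos) % MEDIA.length]);
                }
                System.out.println(line);
            }
        }
    }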

36
DMARS Evaluation: Process Measures
  • Task timing logged with the Activity Catalog
    Tool, a NASA-sponsored tool
  • Coding scheme distinguished the following tasks
    (a logging sketch follows this list)
  • Acquire information
  • Browse: search for a topic in the METOC document
    by navigating from one section to another
    (Search For and Move To)
  • Interpret: read information from a specific
    section
  • Assemble briefing document
  • Compose/edit: generate and/or modify the document
    using a word processor or presentation software
  • Copy/paste: copy material from the METOC document
    into the briefing document
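
Illustration only: the sketch below logs timestamped episodes against the coding scheme above and sums time per code. It is not the Activity Catalog Tool; every name in it is hypothetical.

    import java.time.Duration;
    import java.time.Instant;
    import java.util.ArrayList;
    import java.util.List;

    public class TaskCodingLog {
        // Codes mirror the scheme on this slide
        enum Code { BROWSE, INTERPRET, COMPOSE_EDIT, COPY_PASTE }

        // One coded episode with its start and end timestamps
        record Event(Code code, Instant start, Instant end) {
            Duration duration() { return Duration.between(start, end); }
        }

        private final List<Event> events = new ArrayList<>();

        void log(Code code, Instant start, Instant end) {
            events.add(new Event(code, start, end));
        }

        // Total time spent in one coded activity, e.g. all browse episodes
        Duration total(Code code) {
            return events.stream()
                    .filter(e -> e.code() == code)
                    .map(Event::duration)
                    .reduce(Duration.ZERO, Duration::plus);
        }
    }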

37
DMARS Evaluation: Outcome Measures
  • Accuracy
  • On Problem 1, the subject presented a brief and
    the experimenter graded eight items
  • Required analysis, not just a picture
  • On Problems 2-5, the subject supplied answers
  • Number of images used in briefing
  • Preference

38
DMARS Evaluation: Time to Find Information to
Prepare Briefing
  • Significant effect of document type
  • F(2, 16) = 5.62, p < .01
  • RP-WEB > UC-CD, PAPER
  • Discussion
  • Effect on key problem
  • Only on browse time
  • No briefing preparation differences (when time to
    manually prepare slides is added to the PAPER
    condition)
  • No interpretation time differences
  • Magnitude
  • RP-WEB 160 s longer than UC-CD on a task that
    overall takes 1260 s (a 13% increase; arithmetic
    below)
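
For reference, the 13% figure is just the ratio of the two times reported on this slide:

    \[ \frac{160\ \text{s}}{1260\ \text{s}} \approx 0.127 \approx 13\% \]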

39
DMARS Evaluation: Accuracy
  • Significant effect of document type: F(2, 16) =
    3.48, p = .055
  • UC-CD > PAPER, RP-WEB at p = .087
  • Discussion
  • UC-CD errors on Problems 4 and 5 due to omission
    of location selection
  • UC-CD errors on winds in Problem 1 due to user
    inexperience with the new form of wind vector;
    only a short training period was used

40
Example of At-Sea Observations: USS CARL VINSON
during Battlegroup Training (COMPTUEX)
[Slide 40 is a floor-plan diagram of the shipboard
METOC office: doors, a porthole, desks, chairs,
document storage, file cabinets, a printer, a
copier, and coffee, along with equipment stations
for the SMQ-11 (satellite), SMOOS, wind sensor,
NITES servers (RAID), SPA-25 (radar), NITES NT
workstations, IT21, CCTV, and NOW. Legend: F =
Forecaster, T = Technician, O = Project Observer;
the O markers show where observers sat.]
41
At Sea Observation Methodology
  • Office Environment
  • Observed forecasters' workflow (on a
    non-interfering basis)
  • Two watches a day (12 on, 12 off)
  • Each session is about 12 hours long
  • Two observers for each watch; two observations
    per watch
  • Equipment
  • Three video cameras
  • Note taking
  • Palm Pilot for recording the timing of task
    performance
  • Questionnaires
  • Interviews

42
Future Work Summary
  • Workshop
  • Task Analysis
  • Design recommendations
  • Evaluation