1
What is evaluation?
  • Eliot Stern (1992): Evaluation is any activity
    that, throughout the planning and delivery of
    innovative programmes, enables those involved to
    learn and make judgements about the starting
    assumptions, implementation processes and
    outcomes of the innovation concerned.
  • Basic types
  • formative: studying something to make it better
  • summative: studying something after the event

2
Why evaluate?
  • Avoiding the junk cupboard
  • What effect does it have on the individual?
  • What effect does it have on the organisation?
  • as a working group
  • as a formal organisation
  • collective effects on the individuals
  • Does it meet the pre-defined criteria?
  • Checking functionality
  • Should I install it in my company?

3
Some evaluation techniques (list may look
familiar)
  • Heuristic evaluation (expert evaluation)
  • Cognitive walkthroughs
  • Questionnaires, interviews and focus groups
  • Cooperative design methods.
  • Laboratory experiments
  • GOMS and task analysis
  • Ethnomethodology / conversation analysis
  • Breakdown analysis
  • Distributed cognition
  • Activity theory

4
Problems of evaluation
  • Multiple disciplines, issues and techniques
  • Multiple stakeholder groups with their viewpoints
    and criteria for success
  • Multiple activities under the term evaluation (a
    disputed category)

5
Participatory Evaluation Through Redesign and
Analysis (PETRA)
  • Began life as two MSc projects, to evaluate a
    synchronous shared editor (ShrEdit)
  • Evaluator's perspective: analyse the activity
  • Participant's perspective: redesign the tool
  • Evaluator's perspective: used distributed
    cognition and ethnomethodology to consider shared
    understanding
  • Participant's perspective: used low-tech redesign
    methods (Playschool) to redesign collaborative
    writing software

6
Systemic Evaluation Through Stakeholder Learning
(SESL)
  • Don't make evaluation a matter of judgement of
    absolute right or wrong
  • Rather, take evaluation to be part of
    organisational learning
  • Focus on process, not outcomes
  • Consider many stakeholder viewpoints
  • Look at the whole system

7
SESL Overview
  • Determine the system
  • Identify stakeholders
  • Study and analyse around the key issues
  • Feedback results
8
Determining the system
  • What's a cooperative system?
  • "a combination of technology, people,
    organisations and processes that facilitates the
    communication and coordination necessary for a
    group to effectively work together in the pursuit
    of a shared goal, and to achieve gain for all its
    members"
  • What is this cooperative system?
  • Cf. Checkland's root definitions

9
What type are you?
  • Are you studying the socio-technical system, or
    just the social or technical?

[Diagram: evaluation types laid out along two
dimensions, ACTUAL / POTENTIAL and EFFECTS /
OBJECTIVES, with cell labels including People-Focus,
Formative, Conceptual and Buying]
10
On stakeholders
  • Ways of finding out who your stakeholders are
  • Who affects, depends on or can influence the
    system? Who is affected by it or is influenced by
    it?
  • Take a list of typical stakeholders and tailor
    it, e.g. those who use the software, their
    colleagues and managers, the software developers
    and retailers, the Information Systems department
    of their organisation (if appropriate), and
    perhaps the customers of the organisation.
  • Have a representative group collectively
    construct a stakeholder map (a sketch of such a
    map follows below).
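
As a purely illustrative aid (not part of the original
slides), the stakeholder map could be captured as a
small data structure. The role names below are the
typical stakeholders listed on this slide; the field
names and how each role is classified are assumptions.

    # A minimal sketch, assuming a Python representation of a stakeholder map.
    # Each stakeholder records how they relate to the system, following the two
    # guiding questions on this slide.
    from dataclasses import dataclass, field

    @dataclass
    class Stakeholder:
        name: str                         # e.g. "those who use the software"
        affects_system: bool = False      # affects, depends on or can influence it
        affected_by_system: bool = False  # is affected or influenced by it
        interests: list[str] = field(default_factory=list)

    # Typical stakeholders from the slide, to be tailored by a representative
    # group; the True/False classifications here are purely illustrative.
    stakeholder_map = [
        Stakeholder("those who use the software", affects_system=True,
                    affected_by_system=True),
        Stakeholder("their colleagues and managers", affected_by_system=True),
        Stakeholder("the software developers and retailers", affects_system=True),
        Stakeholder("the Information Systems department", affects_system=True,
                    affected_by_system=True),
        Stakeholder("customers of the organisation", affected_by_system=True),
    ]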

11
A sample stakeholder map
12
The study/analysis cycle (1)
  • Classical evaluation
  • get to the lab or the field, study use
  • analyse data (somehow), look again
  • Note: this appears quite late in SESL
  • Questions to guide the process (depending on
    type)...

13
The study/analysis cycle (2)
  • 1. What are the system's effects upon...
  • the work of the group using the system? The life
    of the group? The life of the people in the
    group? The life and work of the people outside
    the group? The organisation(s) of which the group
    is a part?
  • 2. What are the system's objectives (from the
    different perspectives of the various
    stakeholders)? To what extent are these being
    met?
  • 3. What are the potential effects of the system
    upon...
  • 4. What are the objectives for this new system
    (from different stakeholder perspectives)? How
    well are they likely to be met? (The four
    questions are sketched below.)
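
As an illustrative aside (not part of SESL itself),
the four questions above can be read along two
dimensions, actual vs. potential and effects vs.
objectives, matching the type diagram on slide 9. A
minimal sketch under that reading, in Python (the
dictionary keys and helper function are assumptions):

    # A hedged sketch: indexing the four guiding questions by whether the system
    # is already in use ("actual") or planned ("potential"), and whether the
    # focus is on effects or objectives. The mapping is an interpretation of
    # slides 9 and 13, not something stated explicitly in the deck.
    GUIDING_QUESTIONS = {
        ("actual", "effects"):
            "What are the system's effects upon the work and life of the group, "
            "its members, people outside the group, and the organisation(s) of "
            "which the group is a part?",
        ("actual", "objectives"):
            "What are the system's objectives, from the perspectives of the "
            "various stakeholders, and to what extent are these being met?",
        ("potential", "effects"):
            "What are the potential effects of the system upon the same parties?",
        ("potential", "objectives"):
            "What are the objectives for this new system, from different "
            "stakeholder perspectives, and how well are they likely to be met?",
    }

    def questions_for(status: str) -> list[str]:
        """Return the guiding questions for an 'actual' (in-use) or
        'potential' (planned) system."""
        return [q for (s, _), q in GUIDING_QUESTIONS.items() if s == status]

    # Example: a formative, in-use evaluation would start from the "actual" pair.
    print(questions_for("actual"))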

14
Feedback
  • To all stakeholders
  • As a guide for learning
  • Not just a result, but rather part of an ongoing
    process of organisational learning

15
SESL as questions...
16
SESL Q1. What do you want to get out of your
evaluation?
  • Why are you doing the evaluation?
  • What do you want to get out of it?
  • to make changes to the existing system?
  • to make judgements about the effects of a
    long-established system?
  • to decide which new software/hardware to buy?
  • How will the results be made available?

17
SESL Q2. What do you want to evaluate?
  • Main concern: technology, social processes or a
    mixture of the two?
  • Levels of the onion
  • Scope of the system
  • Function: What does the system do? What is it
    supposed to do?
  • Complexity: Is it 'high-tech' or 'low-tech'? What
    is the interaction with other systems and media?

18
SESL Q3. Who are the stakeholders, and what are
their interests?
  • Some kinds of stakeholders
  • users/customers/clients of the technology
  • users/customers/clients of facilities supported
    by the technology
  • those who pay
  • those who run / service the technology
  • Who are the key stakeholders, whose views most
    need to be met?
  • What are the system's effects upon them?
  • What are the interests of the various
    stakeholders involved?

19
SESL Q4. About the system - getting into the
nitty-gritty
  • Who do the main users co-operate with, and how?
  • What kinds of co-operation go on?
  • What are the key MIS technologies, and what is
    their role?
  • Do they help or hinder co-operation?
  • How do your MISs fit into the overall management
    of the institution?
  • How could the technologies be better used? What
    obstacles need to be overcome to achieve this?

20
An example of SESL...
21
SESL as pro-forma
  • System: cooperation within the Unit for
    Facilities Management Research (UFMR), Sheffield
    Hallam University, via a network drive
  • Main technology: filestore (I drive)
  • Evaluation type: formative, in-use,
    socio-technical
  • Stakeholders: members of the Unit; customers;
    support staff; rest of faculty; rest of
    university
  • Methods: interviews; hanging around
  • Analytic framework: Distributed Cognition, Common
    Information Spaces (the filled-in pro-forma is
    sketched below)
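
As an illustration only (the field names are
assumptions, not part of SESL), the filled-in
pro-forma above could be captured as a structured
record:

    # A minimal sketch of the SESL pro-forma as a record; field names are
    # assumed, and the values are the UFMR example from this slide.
    from dataclasses import dataclass

    @dataclass
    class SESLProForma:
        system: str
        main_technology: str
        evaluation_type: list[str]
        stakeholders: list[str]
        methods: list[str]
        analytic_framework: list[str]

    ufmr_evaluation = SESLProForma(
        system=("cooperation within the Unit for Facilities Management "
                "Research (UFMR), Sheffield Hallam University, via a "
                "network drive"),
        main_technology="filestore (I drive)",
        evaluation_type=["formative", "in-use", "socio-technical"],
        stakeholders=["members of the Unit", "customers", "support staff",
                      "rest of faculty", "rest of university"],
        methods=["interviews", "hanging around"],
        analytic_framework=["Distributed Cognition",
                            "Common Information Spaces"],
    )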

22
Key questions
  • Who do you co-operate with, and how?
  • What kinds of co-operation go on?
  • Who are the key stakeholders of co-operation
    within UFMR? Are their stakes met?
  • What's the role of email, the common filestore
    (I drive) and Web pages?
  • Do they help or hinder co-operation?
  • How could they be better used?

23
Findings: general
  • Five offices, two groups
  • What will happen after the move?
  • Most co-operation face-to-face
  • Little formal co-operation
  • Problems with scaling up?
  • Dissemination of information: 1) organisational
    memory; 2) top-down
  • Most people seem to work alone

24
Findings: technology
  • I drive
  • Used extensively by some, less so by others
  • General feeling of lack of structure
  • Importance of being-in-the-know
  • Scaling-up?
  • Email
  • Commonest use is around the university
  • Most external communication by phone/letter
  • A big problem with junk mail limits its use

25
Recommendations: general
  • The shift to the new office brings real
    opportunities
  • Is the SHU organisational culture the most
    appropriate?
  • Bringing the two sides of the Unit together
  • Why not change the way we work? (BT)
  • Information dissemination needs to be clearer

26
Recommendations: technology (I)
  • If possible, look at new technology
  • New PCs
  • Windows 95 (longer filenames, dates in 'open
    file', etc.)
  • External dialup possibilities
  • Lotus Notes (text search, calendars, classify
    documents, workflow etc.)
  • Email filtering
  • What possibilities come from UFMR having its own
    fileserver?

27
Recommendations: technology (II)
  • Whether or not new technology is adopted:
  • Need to develop clear filenames and locations
  • Make sure they're adhered to
  • Publicise what info is available/useful
  • Model of all information being public by default,
    unless there is good reason otherwise
  • Someone needs to take responsibility for the I
    drive filestore
  • Look at training options

28
References
  • Ramage (1999). The Learning Way: Evaluating
    Co-operative Systems. PhD thesis.
    http://phoenix.open.ac.uk/magnus_ramage/LearningWay.html
  • Ramage (1997). Developing a methodology for the
    evaluation of cooperative systems. Proceedings of
    IRIS 20.
    http://www.comp.lancs.ac.uk/computing/research/cseg/projects/evaluation/developing.html
  • Preece et al. (1994). Human-Computer Interaction.
  • Patton, M.Q. (1986). Utilization-Focused
    Evaluation (or other works by Patton).