1
5th Annual Conference on the Teaching of
Computing - Assessment methods employed in UK
Higher Education programmes
  • D. Graham
    School of Computing and Mathematical Sciences
    University of Greenwich
    30 Park Row
    London SE10 9LS
    UK
    E-mail: D.Graham@gre.ac.uk
    http://www.cms.gre.ac.uk

2
1. Introduction
  • Assessment is of fundamental importance in higher
    education.
  • Students take their cues from what is assessed
    rather than what is asserted by lecturers as
    being important [17].
  • "The purpose of assessment is to enable students
    to demonstrate that they have fulfilled the
    learning outcomes of the pathway and that they
    have achieved the standard required for the
    award(s) they seek." (University
    of Greenwich, 1998) [20].

3
  • Assessment can be described by the terms
    Measurement, Assessment, Evaluation, Test and
    Examination.
  • A number of other terms describe specific kinds
    of assessment, namely Formative, Summative,
    Norm-referenced and Criterion-referenced.
  • A further distinction can be made between the
    kinds of tests employed for assessment, i.e.
    written tests (which include essays and objective
    tests), practical assessments, and oral
    examinations.
  • The cardinal criteria for assessment are
    validity, reliability, discrimination, and
    practicality or utility.

4
  • Different research methods can be used within the
    positivist/interpretative approaches, including
    surveys using questionnaires; interviews;
    observation and participation; and documentary
    analysis [6].
  • Robson [18] defines a survey as "the collection
    of a small amount of data in a standardised form
    from a relatively large number of individuals".
    This standardised data is usually collected using
    a questionnaire.
  • The main criticism of survey research is
    that preset response categories determine
    the way respondents can answer a question,
    making it impossible to evaluate the validity of
    their answers [9].

5
2. Course Case Study
  • Aims
  • To provide an understanding of the cognitive
    psychology issues related to user interfaces
  • To provide the student with the knowledge of how
    user-centred design helps in building user
    interfaces that are easy to learn and friendly
    to use
  • To provide the student with the knowledge of how
    software engineering techniques such as formal
    specifications, task analysis and object-oriented
    design can be used for the development of user
    interfaces
  • To provide an understanding of how complex
    multimedia systems can be designed and
    implemented
  • To provide the student with an understanding of
    the main principles associated with virtual
    environments

6
  • Learning Outcomes
  • On completion of this unit, students will be able
    to
  • Demonstrate an understanding of the nature of
    cognitive psychology and how it influences the
    ways in which users interact with computer
    systems,
  • Formally specify, analyse, design and implement
    user interfaces for complex software systems,
  • Develop multimedia applications which incorporate
    advanced user interaction techniques,
  • Demonstrate an understanding of the theory and
    application of virtual environments,
  • Demonstrate a critical awareness of issues in
    Human-Computer Interaction.

7
  • Main Learning and Teaching Activities: Lectures,
    practical and classroom-based tutorials,
    programming and use of multimedia, graphics and
    virtual environment packages.
  • Assessment Details: In accordance with department
    guidelines [19, 20], the course is assessed as
    70% examination, with a 3-hour examination
    assessing learning outcomes A, B, D, and E, and
    30% coursework. Coursework is on designing and
    implementing a multimedia system, testing
    learning outcomes A, B and C (the weighting is
    sketched after this list).
  • Pre-requisites: None. Implicit through
    progression route.
  • Key Texts and Recommended Reading: Preece et al.
    (1994), Dix et al. (1997), Faulkner (1988, 2000),
    plus recent journals and websites on HCI.
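
  • As a minimal sketch of how the 70%/30% weighting above
    combines into a final course mark (Python; the function
    name and the 40% pass threshold are illustrative
    assumptions, not taken from the course documentation):

    EXAM_WEIGHT = 0.70        # 3-hour examination (outcomes A, B, D, E)
    COURSEWORK_WEIGHT = 0.30  # multimedia coursework (outcomes A, B, C)
    PASS_MARK = 40            # assumed threshold, for illustration only

    def final_mark(exam: float, coursework: float) -> float:
        """Weighted aggregate of the two assessed components (0-100)."""
        return EXAM_WEIGHT * exam + COURSEWORK_WEIGHT * coursework

    mark = final_mark(exam=55, coursework=68)   # -> 58.9
    print(f"{mark:.1f}", "pass" if mark >= PASS_MARK else "fail")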

8
3. Critical Evaluation of HCI Course Assessment
Practices
  • Assessment aims and strategy
  • The learning outcomes of the course emphasise
    practical skills as well as the acquisition of
    theoretical, conceptual and empirical knowledge.
    The assessment strategy includes formative as
    well as summative assessment tasks.

9
  • Assessment planning
  • Human-Computer Interaction (level 3) breakdown:
    1 individual coursework assignment early in
    semester 2 (30%); 1 examination in semester 2
    (70%).
  • Five learning outcomes are given for HCI. Skills
    include specification, analysis, design and
    implementation of user interfaces, and critical
    awareness of issues in HCI; these are implicit in
    the outcomes.
  • The method of assessment chosen is consonant with
    the learning outcomes.
  • The method is reasonably efficient in terms of
    student and staff time.
  • Greater reference should have been made to the
    cognitive domain of Bloom's [1] taxonomy in
    relation to the learning outcomes for the HCI
    course.
  • Grading scales used incorporate both percentage
    grading (0-100%) and literal grading (A-F), as
    sketched below.
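
  • A sketch of how a percentage mark might map onto the
    literal A-F scale above (Python; the band boundaries are
    illustrative assumptions, since the actual grade bands
    used on the programme are not given here):

    # Hypothetical percentage-to-letter mapping, in descending order.
    GRADE_BANDS = [(70, "A"), (60, "B"), (50, "C"),
                   (40, "D"), (30, "E"), (0, "F")]

    def to_letter(percentage: float) -> str:
        for lower_bound, grade in GRADE_BANDS:
            if percentage >= lower_bound:
                return grade
        raise ValueError("percentage must be in 0-100")

    print(to_letter(64))  # -> B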

10
  • Assessment of knowledge and understanding
  • The coursework for HCI is akin to a small
    project, with the inclusion of an application
    (website). Objective testing is applicable to
    large student numbers; however, we are not
    satisfied that it could be used to demonstrate
    the achievement of learning outcomes beyond level
    2.
  • Assessment and evaluation of group work
  • The current HCI course does not involve group
    work. Student numbers make self-assessment and
    peer-assessment impossible because of issues such
    as workload, collusion and plagiarism.

11
  • Assessment of practical skills and work-based
    learning
  • Laboratory work is usually assessed by both a
    report and a demonstration. There are criteria
    for assessing both components, in the form of an
    extended checklist. Demonstrations tend to
    provide a fast and reliable means of assessment
    and feedback. The halo effect does not occur,
    simply because of the number of students. Reports
    are harder to assess because of the wide
    variation between them, and are also very time
    consuming. There are major problems with
    plagiarism for both software and reports (written
    assessments). The checklists used are good and
    allow for extras beyond the required outcomes.
    For demonstrations the checklists tend to be both
    valid and reliable.
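
  • One way to picture the extended checklist described
    above, as it might be applied to a demonstration (Python;
    every item name and mark below is hypothetical):

    # Required outcomes plus "extras beyond the required outcomes".
    required = {"interface runs without errors": 30,
                "usability requirements met": 40,
                "design decisions explained": 30}
    extras = {"innovative interaction technique": 5}

    # One assessor's marks for a single demonstration.
    awarded = {"interface runs without errors": 30,
               "usability requirements met": 32,
               "design decisions explained": 25,
               "innovative interaction technique": 5}

    base = sum(awarded.get(item, 0) for item in required)
    bonus = sum(awarded.get(item, 0) for item in extras)
    print(min(base + bonus, 100))  # capped at 100 -> 92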

12
  • Assessment process management
  • The assessment regulations used mirror those
    suggested by Brown et al. [3].
  • The procedures for handing in assessments and
    requesting extensions are precisely those stated
    in Quinn [17].
  • The conduct of formal written examinations is
    also as described in Quinn [17]. For the HCI
    course in 2002/3, the small cohort of 30 students
    meant there were no problems with assessment
    process management.
  • Advice on the presentation of assessment work for
    HCI was given mainly during timetabled HCI
    workshops, as well as in the tutorials and labs.

13
4. Survey
  • A survey was conducted at the 6th HCI Educators'
    Workshop in April 2003 in Edinburgh [11, 12].
  • These delegates constituted a small but highly
    representative sample of HCI educators.
  • The questionnaire provided a mixture of mostly
    quantitative and some qualitative data. There
    were 29 workshop delegates, and 62% (18/29)
    responded. All of these responders teach HCI.
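
  • A sketch of the frequency-count analysis the survey
    relies on (Python; the 18/29 response figures are those
    reported above, while the sample answers are hypothetical
    stand-ins for the real questionnaire data):

    from collections import Counter

    delegates, responses = 29, 18
    print(f"Response rate: {responses}/{delegates}"
          f" = {responses / delegates:.0%}")  # 62%

    # Hypothetical "teaching methods" answers from the questionnaire:
    answers = ["lectures"] * 12 + ["tutorials"] * 4 + ["labs"] * 2
    for method, count in Counter(answers).most_common():
        print(f"{method}: {count} ({count / len(answers):.0%})")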

14-15
Courses taught by Responders: Levels and Years (chart slides)
16-17
Courses taught by Responders: Duration (chart slides)
18-19
Numbers of Students on Courses (chart slides)
20-21
Teaching Methods used by Responders (chart slides)
22-23
Assessment Methods used by Responders (chart slides)
24-25
Literature used by Responders (chart slides)
26
5. Conclusions
  • The Case Study was conducted in the summer of
    2002. Overall it can be concluded that the Case
    Study's programme and HCI course assessment
    policy and practices are well considered in
    terms of broad social and educational purposes,
    although several possible improvements have been
    identified.
  • The survey was conducted in April 2003. The
    problem described by Foddy [9] was prevalent
    with respect to responses on teaching and
    assessment methods, i.e. coursework.
  • Statistical evaluation of results beyond
    frequency counts and percentages was felt to be
    inappropriate for such a small sample size.

27
  • From the survey we can conclude, however, that
    HCI is taught mostly at level 3 for a single
    semester, to groups of between 1 and 50
    students. Lectures are still the most commonly
    used teaching method. For assessment, the main
    method used is Miscellaneous Coursework, followed
    closely by examination.
  • If the former is assumed to be assignment-based,
    then lab work, essays, reports, presentations,
    logbooks and peer assessment could all be added
    to the total, making 78% (31/40) of responses for
    assignment-based assessment, and 22% (9/40) for
    examination-based assessment (worked through
    after this list).
  • The most commonly used books were those adopted
    for the Case Study course, i.e. Preece et al.
    [14] and Dix et al. [8].
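
  • The split quoted above works out as follows (Python;
    only the 31 and 9 counts out of 40 come from the survey,
    and rounding to whole percentages is an assumption: 77.5
    rounds up to 78 while 22.5 rounds down to 22 under
    Python's round-half-to-even):

    total_responses = 40
    assignment_based = 31   # misc. coursework + lab work, essays, etc.
    examination_based = 9

    print(round(100 * assignment_based / total_responses))   # 78
    print(round(100 * examination_based / total_responses))  # 22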

28
  • Both the evaluation and survey have led to
    improvements, embodied in the revised
    documentation for the HCI course 2003/4 [11, 12].
  • The critical evaluation, achieved through
    documentary analysis, led to the current
    "Bloomed" version of the Course Specification in
    2003/4.
  • The subject coverage, coursework and examination
    assessment of the HCI course were found to be
    more than satisfactory, post workshop, through
    personal communications.

29
  • A follow-up survey was conducted at the 7th
    Educators' Workshop on HCI in Preston in April
    2004. The aims of this survey were to clarify
    some of the findings of the first.
  • There were only 10 participants: 5 had done the
    previous year's survey, 3 had not, and 2 did not
    specify. For these 10, HCI was taught mainly at
    level 2 and MSc (1st year).
  • As was found with the initial survey, lectures
    are the most commonly used teaching method for
    groups of 1-50 students.
  • Exams, however, were the most commonly used
    assessment method.

30
  • Practical assessment was found to have been
    included more than once in responses.
  • In reality, examination and coursework (a
    practical assignment) are likely to be the most
    common joint assessment method.
  • There was little change regarding books, except
    that Dix was ahead of Preece by one.

31
  • Future work could include the replication of this
    study for other courses and programmes in order
    to determine whether or not the assessment
    characteristics are particular or generic to
    individual courses and programmes. The temporal
    effects of this study, and of any future one,
    should be acknowledged.

32
Acknowledgements
  • We thank the delegates of the 6th Educators'
    Workshop on HCI in Edinburgh for participating
    in the initial survey and making it possible,
    and the returning and new delegates of the 7th
    Educators' Workshop on HCI in Preston for their
    participation in the follow-up survey.

33
Thank you. Questions?