Some lessons from researching practical science summative assessment

1
  • Some lessons from researching practical science
    summative assessment
  • Ros Roberts
  • STEM 2010: Beyond the Classroom
  • Cambridge Botanical Gardens, 17th May 2010

2
Starting points 1
  • Summative assessment structure should encourage
    fieldwork in teaching, including:
  • Observations: seeing the world through a
    biologist's / scientist's eyes
  • Solving problems / investigations
  • Assessment structures should not distort practice
    in teaching and learning (cf. the routinisation of
    a very limited range of Sc1 coursework)

3
Starting points 2
  1. Assessment in a high-stakes summative assessment
     system must be reliable
  2. There are tensions between the reliability and
     validity of performance assessment

4
Assessing performance
  • Assessing the ability to do an investigation
    outside the classroom
  • via written accounts
  • Need up to 10 investigations to get a reliable
    aggregate mark: valid and reliable, but impractical
  • Therefore reliability is increased, but validity
    reduced, in fewer assessments by:
  • restricting the type of investigation (which can
    then be practised / routinised)
  • not really assessing 'doing', but giving credit
    mainly for the substantive ideas (predictions,
    hypothesising, background, etc.) which are also
    given credit in other parts of the exam system
  • Research suggests that the ability to investigate
    is not directly correlated with other traditional
    measures of ability
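The claim that up to 10 investigations are needed for a reliable aggregate mark can be illustrated with the Spearman-Brown prophecy formula from classical test theory. This is a sketch, not the talk's own analysis: the single-investigation reliability of 0.3 and the 0.8 target are illustrative assumptions chosen here, reflecting the weak task-to-task correlations such performance-assessment research typically reports.

```python
import math

def spearman_brown(single_task_rel, n_tasks):
    """Predicted reliability of an aggregate mark over n_tasks tasks,
    given the reliability of one task (Spearman-Brown prophecy formula)."""
    return n_tasks * single_task_rel / (1 + (n_tasks - 1) * single_task_rel)

def tasks_needed(single_task_rel, target_rel):
    """Smallest number of tasks whose aggregate reaches target_rel."""
    ratio = (target_rel * (1 - single_task_rel)) / (single_task_rel * (1 - target_rel))
    return math.ceil(ratio)

# Illustrative (assumed) values: individual investigations correlate
# weakly with each other, so take a single-task reliability of 0.3 and
# aim for the 0.8 conventionally expected of high-stakes assessment.
print(tasks_needed(0.3, 0.8))             # 10 investigations
print(round(spearman_brown(0.3, 10), 2))  # aggregate reliability 0.81
```

With these assumed values the formula reproduces the order of magnitude on the slide: roughly ten investigations before the aggregate mark is trustworthy, which is exactly why the approach is impractical at scale.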

5
Alternatives?
  • Assessing the ability to do an investigation
    outside the classroom
  • Can't be done in a mass assessment system
  • So, what's the next best thing?
  • Could focus on the ideas employed when
    investigating

6
Ideas in investigations, e.g. factors affecting
shrimp distribution
7
What is to be assessed?
  • Substantive ecological / biological ideas
  • Concepts of evidence (the thinking behind the
    doing)
  • Observation
  • Skills, both written and practical
8
Domain specification
  • i.e. what to teach and what to assess
  • Clear articulation is essential for curriculum
    and assessment
  • e.g. Domain Specification for Procedural
    Understanding
  • Concepts of Evidence: the thinking behind the
    doing
  • These are the ideas that are needed to develop a
    procedural understanding
  • Validated against pure and applied workplace
    science
  • A sub-set is required at GCSE to enable pupils to
    solve practical problems and evaluate others'
    evidence
  • Validated by teachers attending our development
    sessions
  • http://www.dur.ac.uk/rosalyn.roberts/Evidence/cofev.htm

9
Starting points 1 (repeat of slide 2)

10
Bring own data
  • What did you investigate?
  • Pick one example from your investigation where
    you were careful to make your design valid.
    Describe what you did, and explain how and why it
    helped to make the design valid.
  • Pick one example from your investigation where
    you made a series of repeat measurements in order
    to take an average. Copy the series down. Explain
    how you decided that you had done enough
    measurements to believe that the average was
    reliable.
  • How did you present your data so that patterns
    could be easily identified? Explain why you chose
    this method.
  • What pattern did you observe in the data you
    generated? Did all the data fit the pattern? If
    not, do you have any idea why?
  • How did the evidence you obtained from the
    investigation relate to the original questions
    you were attempting to answer?
  • How much do you trust the data you obtained?
    Explain your answer.
  • Justify your choice of instruments in the
    investigation.

11
e.g. Questions
16
Possible response
23
Summary
  • Valid and reliable summative assessment in a mass
    system is not possible
  • Assessment should encourage the teaching of
    fieldwork and not distort practice
  • Assessing ideas that have been taught (including
    through fieldwork) will be more reliable and, if
    well specified in the curriculum and assessment,
    has validity