Title: Integrating on-line assessment with class-based learning: a preliminary study of the AiM marking system
Richard Walker, Gustav Delius
6 July 2004
Presentation Aims
- 1. Assessment and mathematics
- 2. Computer algebra based assessment systems (AiM)
- 3. Rationale for AiM; the York innovation
- 4. AiM implementation
- 5. Student and staff feedback (2003-04)
- 6. Challenges and future developments
Assessment and mathematics (1)
- The majority of tasks may be classified as either lower order or higher order activities:
- Lower order activities
  - 1. Factual recall
  - 2. Carry out a routine calculation / algorithm
  - 3. Classify some mathematical object
  - 4. Interpret a situation or answer
- Higher order activities
  - 5. Prove, show, justify (general argument)
  - 6. Extend a concept
  - 7. Criticize a fallacy
  - 8. Create an example
- (Sangwin, 2003)
Assessment and mathematics (2)
- For higher order activities:
  - there is often no one correct method
  - there is no unique correct answer
  - solutions are routine but time-consuming to mark
- Opportunity in some circumstances for marking to be performed by computer algebra systems
Computer algebra based assessment systems
- Advantages
  - can handle questions with no unique answers (identifying algebraic equivalence)
  - questions can be arbitrarily randomised
  - can ask students to supply examples
  - can give arbitrarily detailed feedback
  - allows detailed analysis of student attempts
- Disadvantages
  - time-consuming to set up well
  - marking routines can have bugs
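The "algebraic equivalence" advantage can be sketched in a few lines. AiM itself delegates the check to Maple; the snippet below uses the open-source SymPy library as a stand-in for the CAS, so the library choice and the `equivalent` function are illustrative assumptions, not AiM's actual code.

```python
# Sketch: mark a response correct if it is algebraically equal to the
# model answer, not merely identical as a string.  SymPy stands in for
# Maple here (an assumption for illustration, not AiM's implementation).
import sympy as sp

def equivalent(student_answer: str, model_answer: str) -> bool:
    """Return True when the two expressions differ by zero after
    symbolic simplification."""
    diff = sp.sympify(student_answer) - sp.sympify(model_answer)
    return sp.simplify(diff) == 0

# Two different-looking but equivalent forms are both accepted:
print(equivalent("2*x + 2", "2*(x + 1)"))  # True
print(equivalent("x**2", "2*x"))           # False
```

This is exactly why such systems "can handle questions with no unique answers": the string the student types is never compared directly, only its simplified symbolic value.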
AiM: Alice Interactive Mathematics
- Web-based assessment system
- Uses Maple (a computer algebra system) to check for equivalence between the student's answer and the model solution
- System is free / open source
- Working system at many universities (Birmingham 2000, York 2003)
Rationale for AiM
- Reduce the amount of routine coursework marking; redeploy GTA markers for seminar teaching
- Reduce waiting time between coursework submission and marking / feedback
- Get students to practise, focus on accuracy and reflect on solutions (opportunity to resubmit)
- Give students challenges exemplifying concepts
- Encourage collaboration without copying (randomised questions)
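The randomisation idea behind "collaboration without copying" can be illustrated with a small sketch: seeding a generator with the student's identifier gives each student a stable, personal variant of a question, so peers can discuss the method but not share the answer. The question template, parameter ranges and function name below are hypothetical, not AiM's actual format.

```python
# Sketch of per-student question randomisation (illustrative only).
import random

def randomised_question(student_id: str) -> tuple[str, str]:
    # Seeding with the student ID makes the variant deterministic:
    # the same student always sees the same numbers.
    rng = random.Random(student_id)
    a = rng.randint(2, 9)
    b = rng.randint(1, 9)
    question = f"Differentiate f(x) = {a}*x**2 + {b}*x with respect to x."
    answer = f"{2 * a}*x + {b}"  # model answer, checked by the CAS
    return question, answer

q, ans = randomised_question("student-042")
```

Because variants differ only in their coefficients, students can legitimately work through one person's version together, then each complete their own.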
The York innovation
- Integrate AiM questions with traditional homework questions (40% of coursework, over a range of courses: Calculus, Matrices, etc.)
- Students continue to receive (randomised) problem sheets to work on at home
- Marks for all assigned problems are collated and displayed on Moodle
AiM implementation
- 1st year students (n = 182)
- Introductory session on AiM (October 2003)
- Range of modules over two terms (2003-04): Calculus, Maple, Matrices
- Accounting for 40% of coursework, but not the final assessment
- 10% penalty per wrong answer
Student expectations: results of October 2003 survey
- Response rate: 74% (134/182)
- computerized marking will have a positive impact on maths education (Agree 54%, Disagree 12%)
- will be motivated to try again if the answer is wrong (Agree 69%, Disagree 7%)
- immediate feedback will encourage peer discussion of solutions (Agree 55%, Disagree 8%)
- feedback will help better preparation for seminars / lectures (Agree 72%, Disagree 6%)
Student experience: results of March 2004 survey (1)
- Response rate: 54% (98/182)
- computerized marking is relevant to maths education (Agree 62%, Disagree 8%)
- it complemented traditional class-based teaching methods (Agree 67%, Disagree 6%)
- class attendance became less important (Agree 13%, Disagree 74%)
- feedback encouraged students to reattempt questions (learn from mistakes) (Agree 86%, Disagree 4%)
- feedback encouraged reflection on solutions ("where I went wrong") (Agree 69%, Disagree 11%)
- feedback encouraged peer-based discussion of solutions / study methods (Agree 52%, Disagree 23%)
- marking frustrated students: it highlighted errors in their work, but not the reasons for the mistakes (Agree 57%, Disagree 16%)
Student experience: results of March 2004 survey (2)
- Convenience, ease of use and immediacy of feedback: "I can study in my own time. Immediate feedback on my performance is very useful." "AiM is extremely easy to access and the quick response to questions makes it very quick to know whether an answer is right or wrong."
- Peer-based collaboration: "Randomisation of problems makes it possible to work with peers to find the way through a problem, then complete it on your own. All in all a fantastic system with an intuitive and efficient front end."
- Frustration with system glitches: "Very good to have immediate feedback. Not good when there are faults in the system and points are deducted for giving correct answers. This throws doubts on the reliability of the marking."
Student experience: results of March 2004 survey (3)
- Method or solution? "What is frustrating is that there are no marks for method, which is especially annoying when the calculation involves a lot of algebra. You get penalised for absentmindedness, where if it was marked on paper the marker would see it was a trivial error."
- More guidance: "…if an attempt is incorrect, the absence of guidance as to what is wrong can be frustrating. It's impossible to know whether the answer is close or completely wrong. A hint button might be nice, available when you have made so many failed attempts. This would mean students who can't do a question can learn how to do it before a deadline, encouraging them to work more."
Staff observations on AiM (1)
- too early to judge impact on student learning
- teething problems: crafting of questions, anticipating student entries
- no evidence to suggest a positive effect on class participation
- but students will catch their own errors, be accurate, feel more secure
- "Even 3rd year students, when they leave, are very capable technically but are not so capable at knowing if they have done something correctly. They need reassurance. AiM might help us in this respect. This problem has vexed us for as long as I can remember. There is a tendency among students to want more and more information - spoon-feeding. The weaning process gets harder and harder."
Staff observations on AiM (2)
- evidence of a shift in student interaction patterns: peer-based problem solving; posting problems / solutions via the forum
- increased interaction with lecturers over homework, by email rather than office hours (a scaling-up risk)
- some student dissatisfaction (particularly among weaker students): "should be getting more of the marks for knowing what to do, rather than how to do it accurately"
- and frustration: answers marked wrong through mistyped formulas / syntax
- danger of over-dependence on the system / laziness: "Students are encouraged only to make a half-decent try, punching in an answer and getting feedback. They should be thinking before they submit an answer."
Aim for AiM
- The style of questions so far emphasises accuracy rather than self-learning: "matrix manipulation is part of the language, but not the poetry of maths"
- Development of the system / feedback to point out conceptual errors
- Challenge: to entice thinking, not training. "There is a risk that students will become technically competent, but not innovative and creative." "Maths teaching is not in the business of drill, but is all about exemplifying concepts, giving students challenges as well as opportunities to practice."
The Future
- Computer Algebra Based Learning and Evaluation System (Naismith & Sangwin, 2004)
  - open-source infrastructure for marking mathematical learning objects
- JISC project collaboration: authoring tools for the creation of assessment equations, taking account of user preferences and accessibility
  - partners: Sheffield, Birmingham, Durham, Edinburgh, Imperial (London)
References
- http://aiminfonet.net
- York and AiM (ALTC)
- http://maths.york.ac.uk/moodle/yorkmoodle/course/