Transcript and Presenter's Notes

Title: Medical Education Round


1
The state of the art in Student Assessment
  • Medical Education Round
  • 1 October 2001, McGill University, Montreal
  • Cees van der Vleuten
  • University of Maastricht
  • The Netherlands

2
My objectives
  • To teach you more about
  • Some research outcomes in assessment of
    competence
  • Making more informed assessment decisions

3
Overview of Presentation
  • Introduction to some terms
  • Issues of Reliability
  • Issues of Validity
  • Issues of Educational impact
  • You can't have it all!
  • Conclusions

4
Characteristics of instruments
5
Reliability
6
Validity
7
Educational impact
8
Reliability
  • Reliability is a matter of sampling
  • Across content

9
Problem 1
10
Domain of Interest
11
Content Specificity Problem of Clinical Competence
12
Reliability as a function of testing time
Sources: (1) Norcini et al., 1985; (2) Stalenhoef-Halling et al., 1990;
(3) Swanson, 1987; (4) Wass et al., under editorial review;
(5) Newble & Swanson, 1987; (6) Ram et al., 1999
13
Reliability
  • Reliability is a matter of sampling
  • Across content
  • Across other potential factors that cause error
    variance

14
Reliability of an oral examination (Swanson, 1987)
Testing Time in Hours              1      2      4      8
Number of Cases                    2      4      8      12
Same Examiner for All Cases        0.31   0.47   0.47   0.48
New Examiner for Each Case         0.50   0.69   0.82   0.90
Two New Examiners for Each Case    0.61   0.76   0.86   0.93
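The pattern in this table is roughly what the classical Spearman-Brown prophecy formula predicts when a test is lengthened. As a minimal illustration (not how Swanson derived these figures), the sketch below takes the 0.50 reliability observed for one hour with a new examiner per case and projects it to longer tests; Python is used only for convenience.

    # Illustrative sketch: Spearman-Brown prophecy formula applied to the
    # "New Examiner for Each Case" row above, treating 1 hour as the unit test.
    def spearman_brown(r_single, k):
        """Predicted reliability of a test lengthened by a factor k."""
        return k * r_single / (1 + (k - 1) * r_single)

    for hours in (1, 2, 4, 8):
        print(hours, round(spearman_brown(0.50, hours), 2))
    # Prints roughly 0.5, 0.67, 0.8, 0.89 -- close to the 0.50, 0.69,
    # 0.82, 0.90 reported in the table.
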
15
Reliability
  • Conclusion
  • Adequate reliability requires substantial
    sampling (and therefore resources: testing time,
    examiners, patients, etc.)
  • Efficiency is the hallmark

16
Efficiency strategies
  • Key feature approach
  • Test design strategies

17
Reliability
  • Practical suggestions
  • Do not rely on short tests
  • Sample broadly (across content, time, examiners,
    patients)
  • Consider efficiency in
  • the selection of the test format
  • the construction of test items
  • Be aware of (considerable) decision errors in
    terms of pass/fail decisions (see the sketch
    below)
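To see why such decision errors can be considerable, here is a minimal sketch. It assumes normally distributed measurement error and purely hypothetical numbers (score SD of 10, cut score of 60, test reliability of 0.70); it simply converts reliability into a standard error of measurement and asks how sure we can be that a borderline candidate truly passed.

    # Hedged illustration: pass/fail uncertainty from limited reliability.
    # All numbers are hypothetical and not taken from the presentation.
    from math import erf, sqrt

    def sem(sd, reliability):
        """Standard error of measurement."""
        return sd * sqrt(1 - reliability)

    def prob_truly_above_cut(observed, cut, sd, reliability):
        """P(true score > cut), treating the observed score as the true
        score plus normally distributed error."""
        z = (observed - cut) / sem(sd, reliability)
        return 0.5 * (1 + erf(z / sqrt(2)))

    # A candidate scoring 63 against a cut score of 60 on a test with
    # reliability 0.70 and SD 10: only about a 71% chance the pass is "real".
    print(round(prob_truly_above_cut(63, 60, 10, 0.70), 2))
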

18
A simple model of competence
Does
Shows how
Knows how
Knows
Miller GE. The assessment of clinical
skills/competence/performance. Academic Medicine
(Supplement) 1990; 65: S63-S67.
19
A simple model of competence
  • Does / Shows how: performance or hands-on assessment
  • Knows how / Knows: written, oral or computer-based assessment
Miller GE. The assessment of clinical
skills/competence/performance. Academic Medicine
(Supplement) 1990; 65: S63-S67.
20
Validity
  • Validity is a matter of climbing the pyramid

21
Climbing the pyramid......
Does
Shows how
Knows how
Knows
22
Knows/Knows how
  • The stimulus format is more important than the
    response format

23
Stimulus vs Response Format
Some cities have more bars than inhabitants. For
which of the following cities is this the case?
  • Dublin
  • Maastricht
  • Pilsner

24
Knows/Knows how
  • The stimulus format is more important than the
    response format
  • The stimulus format should be
  • Contextual/Authentic
  • Requiring judgment
  • Avoid complexity
  • Short static cases (as opposed to long dynamic)
  • Simple scoring systems

25
(No Transcript)
26
Shows how
  • OSCE-ology
  • Make stations as clinically authentic as possible
  • Global judgments do well in OSCEs
  • Content specificity is the problem

27
Does
  • Methods to assess in clinical practice
  • Emerging (promising) technology
  • A lot of research still needs to be done

28
Validity: is something missing?
29
New educational models
  • Situated learning
  • Project-based learning
  • Problem-based learning
  • Discovery learning
  • Student-centred learning
  • Authentic learning
  • Patient-based learning
  • Community-based learning
  • ...............

30
The learning pyramid
31
Three Cs of education
  • Situated learning
  • Project-based learning
  • Problem-based learning
  • Discovery learning
  • Student-centred learning
  • Authentic learning
  • Patient-based learning
  • .................

32
New skills emphasized
  • learning how to learn
  • self-appraisal
  • leadership
  • team skills
  • metacognition
  • skills of expression (writing, presenting)
  • reflectiveness/reflexiveness.

33
Extending the pyramid
34
How to assess meta-skills?
  • Self assessment
  • Peer assessment
  • Co-assessment (combined self, peer, teacher
    assessment)
  • Log book/diary
  • Learning process simulations/evaluations
  • Product-evaluations
  • Portfolio assessment
  • ..........

35
How to assess meta-skills?
  • Basic method: information gathering, relying more
    on descriptive and qualitative judgemental
    information

36
Validity
  • Conclusion
  • Educational or professional authenticity is the
    hallmark (within and across layers of the
    pyramid)

37
Validity
  • Practical suggestions
  • Don't be married to a single method (the method
    itself is less important than the content)
  • Select the tasks that resemble clinical
    practice most (worry less about the competency
    being measured)
  • Avoid complexity: KISS principle
  • To do a good job, you need a mixture of methods

38
Educational impact
  • Assessment drives learning

39
An alternative view
(Diagram: curriculum, assessment, teacher, student)
40
  • Assessment may drive learning through
  • Content
  • Format
  • Programming/scheduling
  • Regulations
  • ..........

41
Educational impact
  • Suggestions
  • Verify the educational consequences of your
    assessment
  • Use the assessment strategically to achieve
    desirable learning behaviors and outcomes
  • Learning task = assessment

42
Characteristics of instruments
  • Reliability (R)
  • Validity (V)
  • Educational impact (E)
  • Acceptability (A)
  • Cost (C)

43
Utility function
U = wrR x wvV x weE x waA x wcC
  • U = Utility
  • R = Reliability
  • V = Validity
  • E = Educational impact
  • A = Acceptability
  • C = Cost
  • w = Weight
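As a worked illustration of the formula, the sketch below compares two hypothetical assessment formats; all weights and scores are invented for the example and would in practice be set by the context, as the later slides argue. Because the model is multiplicative, a poor score on any heavily weighted characteristic drags the whole utility down.

    # Hypothetical worked example of U = wrR x wvV x weE x waA x wcC.
    # Weights and scores (0-1 scale) are invented, purely for illustration.
    def utility(scores, weights):
        """Multiply each weighted characteristic into a single utility."""
        u = 1.0
        for name, score in scores.items():
            u *= weights[name] * score
        return u

    weights  = {"R": 1.0, "V": 1.0, "E": 1.0, "A": 0.5, "C": 0.5}
    format_a = {"R": 0.8, "V": 0.8, "E": 0.7, "A": 0.7, "C": 0.4}
    format_b = {"R": 0.5, "V": 0.7, "E": 0.8, "A": 0.9, "C": 0.8}
    print(round(utility(format_a, weights), 3),
          round(utility(format_b, weights), 3))
    # Only the comparison between formats matters, not the absolute numbers.
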

44
Utility function
U = wrR x wvV x weE
  • U = Utility
  • R = Reliability
  • V = Validity
  • E = Educational impact
  • A = Acceptability
  • C = Cost
  • w = Weight

45
The assessment program
  • Reliability and validity (and other
    characteristics) are (also) parameters of the
    integral assessment programme rather than of
    individual instruments

46
Utility function
U = wrR x wvV x weE
  • U = Utility
  • R = Reliability
  • V = Validity
  • E = Educational impact
  • A = Acceptability
  • C = Cost

47
You can't have it all
  • Conclusions
  • Assessment always requires a compromise
  • The compromise fully depends on the context of
    the assessment
  • Quality of assessments is a matter of the
    integral assessment programme, rather than of the
    individual instruments (so it requires good
    coordination and planning)

48
Conclusions
  • Assessment is less a psychometric problem than an
    educational design problem (i.e. how to use
    assessment strategically for its educational
    effects)
  • Assessment requires careful planning and
    monitoring
  • To do a good job, we need to cover the entire
    competence pyramid (so we need a cocktail of
    methods; no single method is best or bad, that
    fully depends on the context)
  • Compromises are inevitable