programmatic assessment for learning (PowerPoint presentation transcript)

1
programmatic assessment for learning
an example of medical education design
2
assignment
  • Build an assessment programme for a
    workplace-based learning curriculum.
  • GP training
  • practice, assignments plus day-release education
  • 1 year, two semesters, 4 terms
  • supervisor, medical educator, on-line platform

3
overview
  • a bit of history
  • programmatic assessment
  • the assessment programme

4
old model of medical competence
knowledge → TEST
skills → TEST
problem solving → TEST
attitudes → TEST
(a separate battery of tests for each trait)
5
history
the quest for the best test
- oral versus written
- open versus closed items
- computer versus paper-and-pencil
- knowledge versus insight
- norm referenced versus criterion referenced
....and many more
6
typical approach to assessment
test
7
but...
[diagram: a single pass/fail cut on the incompetent-to-competent continuum]
8
major problems
9
quality elements of assessment
U = R × V × E × C × A
R = reliability
V = validity
E = educational impact
C = costs
A = acceptance
Van der Vleuten CPM. The assessment of professional competence: developments, research and practical implications. Advances in Health Sciences Education 1996;1(1):41-67.
10
reliability
[diagram: four candidates (Penny 85, Leonard 73, Amy 59, Howard 51) sit two parallel tests; their scores and rank order differ between the tests]
11
validity: Kane's view
example: practice observation for clinical reasoning
observation → observed score (expertise, observation standards, scales, forms)
observed score → universe score (reliability, reproducibility, saturation, expertise)
universe score → target domain (relation with multiple-choice, key-feature, EMQ)
target domain → construct (think aloud, CRT, SCT)
12
educational impact
content
format
scheduling
regulatory structure
13
educational impact
Cilliers, F. J., Schuwirth, L. W. T., Herman, N.,
Adendorff, H., van der Vleuten, C. P. M.
(2012). A model of the pre-assessment learning
effects of summative assessment in medical
education. Advances in Health Sciences Education,
17(1), 39-53.
14
quality elements of assessment
U = R^w × V^w × E^w × C^w × A^w
R = reliability
V = validity
E = educational impact
C = costs
A = acceptance
w = weight
Van der Vleuten CPM. The assessment of professional competence: developments, research and practical implications. Advances in Health Sciences Education 1996;1(1):41-67.
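The weighted utility model above can be sketched numerically. A minimal illustration, in which all element values and weights are made-up numbers, not figures from the presentation:

```python
# Van der Vleuten's utility model as a weighted multiplicative index:
# U = R^w * V^w * E^w * C^w * A^w. Because the form is a product,
# any element scored zero drives overall utility to zero.
from math import prod

def utility(elements: dict, weights: dict) -> float:
    """Weighted product of quality elements, each assumed in [0, 1]."""
    return prod(elements[k] ** weights[k] for k in elements)

# hypothetical ratings and weights for one assessment programme
elems = {"R": 0.8, "V": 0.7, "E": 0.9, "C": 0.6, "A": 0.9}
wts = {"R": 1.0, "V": 1.0, "E": 1.5, "C": 0.5, "A": 1.0}
u = utility(elems, wts)                    # around 0.33 for these numbers
zero = utility({**elems, "V": 0.0}, wts)   # no validity, no utility
```

The product form captures the slide's point that the elements compensate only up to a point: a zero on any one element cannot be bought back by the others.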
15
But.
  • testing requires some strange assumptions

16
underlying concepts
traits are stable and generic characteristics
17
underlying concepts: stable trait

         item 1   item 2   item 3   item 4    T
A          1        1        1        1       4
B         0.5      0.5      0.5      0.5      2
C          0        0        0        0       0
18
underlying concepts: stable trait

         item 1   item 2   item 3   item 4    T
A          0       0.5      0.5       0       1
B          1       0.5       0        1      2.5
C          1        1       0.5       1      3.5
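The two score matrices can be replayed in a few lines to make the point concrete: the total T is all that classical analysis keeps, and it discards the per-item pattern. A sketch using the student labels from the tables:

```python
# Two score matrices from the slides: in the first, every student's
# performance is stable across items; in the second it is erratic.
stable = {"A": [1, 1, 1, 1], "B": [0.5, 0.5, 0.5, 0.5], "C": [0, 0, 0, 0]}
erratic = {"A": [0, 0.5, 0.5, 0], "B": [1, 0.5, 0, 1], "C": [1, 1, 0.5, 1]}

# Summing collapses each profile to a single number; the item-level
# information (stable vs erratic) is gone from the totals.
stable_totals = {s: sum(items) for s, items in stable.items()}
erratic_totals = {s: sum(items) for s, items in erratic.items()}
```

Nothing in the totals reveals which matrix behaved like a stable trait, which is exactly the information a longitudinal programme wants to keep.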
19
underlying concepts
traits are stable and generic characteristics
individual items in themselves are meaningless
20
underlying concepts: meaningless items
Ms. Smit is 72 years old. She has angina pectoris. Several times her blood pressure is taken and found to be 170/100 mmHg. Which antihypertensive drug is most indicated for her?
a) captopril
b) chlorthalidone
c) metoprolol
21
underlying concepts: meaningless items
Mr. Johnson, 35 years old, consults his GP with complaints of chest pain. Without further information about Mr. Johnson, the most likely origin of his chest pain is:
a) the chest wall
b) the lungs
c) the myocardium
d) the esophagus
22
underlying concepts: meaningless items
resuscitation station in a skills test
23
underlying concepts: meaningless items
communication station in a skills test
24
underlying concepts
traits are stable and generic characteristics
individual items in themselves are meaningless
sum scores determine what the test measures
statistics are based on elimination of information
25
underlying concepts: reductionism
failed
26
underlying concepts
traits are stable and generic characteristics
individual items in themselves are meaningless
sum scores determine what the test measures
statistics are based on elimination of information
one single best instrument for each trait
27
old model of medical competence
knowledge → TEST
skills → TEST
problem solving → TEST
attitudes → TEST
(a separate battery of tests for each trait)
28
competencies
competencies are simple or more complex tasks that a successful candidate must be able to handle, during which s/he uses, at the right time, the correct and relevant knowledge, skills, attitudes and meta-cognitions to manage the situation successfully.
29
competency domains or roles
  • National Dutch blueprint:
  • medical expert
  • scientist
  • worker in the health care system
  • person

30
overview
  • a bit of history
  • programmatic assessment
  • the assessment programme

31
from building blocks
32
to buildings
33
from methods to programmes
  • multiple instruments, various formats
  • strengths and weaknesses combined
  • assessment moments ≠ decision moments
34
every assessment moment is a decision moment


competent
35
every assessment moment is NOT a decision moment
low stakes
medium stakes
high stakes
36
from methods to programmes
  • multiple instruments, various formats
  • strengths and weaknesses combined
  • assessment moment ≠ decision moment
  • multiple quality approaches
37
quality: reliability
- consistency
- saturation
- expertise
- organisation
38
reliability is sampling

testing time in hours      1     2     4     8
MCQ¹                     0.62  0.76  0.93  0.93
paper cases¹             0.36  0.53  0.69  0.82
orals³                   0.50  0.69  0.82  0.90
observation assessment⁴  0.43  0.60  0.76  0.86
short essay²             0.68  0.73  0.84  0.82
practice video test⁵     0.68  0.81  0.87  0.90

¹Norcini et al., 1985; ²Stalenhoef-Halling et al., 1990; ³Swanson, 1987; ⁴Newble & Swanson, 1987; ⁵Ram et al., 1999
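The pattern in this table (reliability climbing with testing time) is what the Spearman-Brown prophecy formula predicts for lengthened tests. A sketch; the table's coefficients are empirical, so they only roughly track the formula:

```python
def spearman_brown(r: float, k: float) -> float:
    """Predicted reliability when a test is lengthened by factor k,
    given reliability r at the original length."""
    return (k * r) / (1 + (k - 1) * r)

# Starting from the 1-hour MCQ figure of 0.62 and doubling repeatedly:
predicted = [round(spearman_brown(0.62, k), 2) for k in (1, 2, 4, 8)]
# predicted -> [0.62, 0.77, 0.87, 0.93]; the table reports 0.62, 0.76, 0.93, 0.93
```

The formula assumes the added testing time samples the same domain with items of comparable quality, which real instruments only approximate.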
39
generalisability: saturation
[sampling: orange, green, blue, red, yellow, purple, black ... "nothing new", "nothing new" → saturation]
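The colour metaphor translates into a simple stopping rule: keep sampling until several consecutive draws add nothing new. A sketch; the patience threshold and seed are arbitrary choices for illustration, not from the presentation:

```python
import random

def sample_until_saturation(population, patience=3, seed=7):
    """Draw from the population until `patience` consecutive draws
    yield nothing new, then report everything seen so far."""
    rng = random.Random(seed)  # seeded for reproducibility
    seen, repeats = set(), 0
    while repeats < patience:
        x = rng.choice(population)
        if x in seen:
            repeats += 1   # another draw with nothing new
        else:
            seen.add(x)
            repeats = 0    # a new observation resets the counter
    return seen

colours = ["orange", "green", "blue", "red", "yellow", "purple", "black"]
observed = sample_until_saturation(colours)
```

The same logic underlies saturation in qualitative assessment: sampling stops when further observations stop changing the picture, not at a fixed sample size.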
40
Steps in the year
introduction to students (overview) → mentors are trained → first portfolio submission → formative review → examiner training (benchmark portfolios) → 2nd portfolio submission → summative review → mentor/student recommendation (F/P/D) → exam committee decision
46
from methods to programmes
  • multiple instruments, various formats
  • strengths and weaknesses combined
  • assessment moment ≠ decision moment
  • multiple quality approaches
  • many instruments × many competency domains
47
1 role ↔ 1 instrument?
[matrix: instruments A, B, C, D mapped one-to-one onto the domains med expert, scientist, worker in HCS, person]
48
multi-modal assessment
[matrix: multiple instruments, each informing several of the domains med expert, scientist, worker in HCS, person]
49
from methods to programmes
  • multiple instruments, various formats
  • strengths and weaknesses combined
  • assessment moment ≠ decision moment
  • multiple quality approaches
  • many instruments × many competency domains
  • integrative → holistic, not reductionist
50
overview
  • a bit of history
  • programmatic assessment
  • the assessment programme

51
assignment
  • Build an assessment programme for a
    workplace-based learning curriculum.
  • GP training
  • practice, assignments plus day-release education
  • 1 year, two semesters, 4 terms
  • supervisor, medical educator, on-line platform

52
design
  • goals and stated purpose
  • programme in action
  • supporting the programme
  • documentation of the programme
  • improvement approaches to the programme
  • accounting for the programme

Dijkstra J, Van der Vleuten C, Schuwirth L. A new framework for designing programmes of assessment. Advances in Health Sciences Education 2010;15:379-93.
53
If incompetence were an illness, how would we
diagnose and treat it?
54
design
  • multiple instruments
  • meaningful collation
  • learning focused
  • self-regulation
  • assessment moment ≠ decision moment
  • longitudinal
  • feasible and efficient

55
purpose
56
safe independent practitioner
  • medical expert
  • worker in the healthcare system
  • person
  • scholar

57
what is safe?
58
what is safe?
  • mastery? skill? competence? ...

self-regulation
59
self-regulation
  • self-driven
  • analyses
  • external information seeking
  • goal orientation
  • prioritisation
  • realisation/attainment
  • time management

1. Bandura A. Social cognitive theory: an agentic perspective. Annual Review of Psychology 2001;52:1-26. 2. Dochy F, Segers M, Sluijsmans D. The use of self-, peer and co-assessment in higher education: a review. Studies in Higher Education 1999;24(3):331-50. 3. Eva KW, Cunnington JPW, Reiter HI, Keane D, Norman G. How can I know what I don't know? Poor self assessment in a well-defined domain. Advances in Health Sciences Education 2004;9:211-24.
60
The opposite of good is...
well intended
61
perfect assessment programme
62
relevant research findings
  • meaningfulness

1. Posner MI. What is it to be an expert? In: Chi MTH, Glaser R, Farr MJ, editors. The nature of expertise. Hillsdale, NJ: Lawrence Erlbaum Associates; 1988: xxix-xxxvi. 2. Schmidt HG, Boshuizen HP. On acquiring expertise in medicine. Special issue: European educational psychology. Educational Psychology Review 1993;5(3):205-21.
63
learning in context
  • a newspaper is better than a glossy magazine
  • the seashore is better than the street
  • first it is better to run than to walk
  • you will have to try several times
  • some skills are required but it is easy to learn
  • even small children can enjoy it
  • once successful the risk of complications is
    minimal
  • birds seldom get too close
  • rain soaks in very fast
  • a rock can serve as an anchor
  • once it breaks loose there is no second chance

64
learning in context: flying a kite
  • a newspaper is better than a glossy magazine
  • the seashore is better than the street
  • first it is better to run than to walk
  • you will have to try several times
  • some skills are required but it is easy to learn
  • even small children can enjoy it
  • once successful the risk of complications is
    minimal
  • birds seldom get too close
  • rain soaks in very fast
  • a rock can serve as an anchor
  • once it breaks loose there is no second chance

65
relevant research findings
  • meaningfulness
  • transfer and domain specificity

1. Eva K. On the generality of specificity. Medical Education 2003;37:587-8. 2. Eva KW, Neville AJ, Norman GR. Exploring the etiology of content specificity: factors influencing analogic transfer and problem solving. Academic Medicine 1998;73(10):S1-5.
66
analogous transfer
67
relevant research findings
  • meaningfulness
  • transfer and domain specificity
  • deliberate practice

Ericsson KA. An expert-performance perspective of research on medical expertise: the study of clinical performance. Medical Education 2007;41:1124-30.
68
deliberate practice
69
feedback
  • concrete
  • constructive
  • focused on improvement
  • connected
  • leading to learning goals/learning plans

Shute V. Focus on formative feedback. Review of Educational Research 2008;78(1):153-89.
70
loop
71
relevant research findings
  • meaningfulness
  • transfer and domain specificity
  • deliberate practice
  • self-regulated learning

72
self-regulated learning
Each phase plays out across four areas: cognition, motivation, behaviour, context.
1. forethought, planning, activation
2. monitoring
3. control
4. reaction, reflection
cf. Schunk DH (2005). Self-regulated learning: the educational legacy of Paul R. Pintrich. Educational Psychologist, 40, 85-94.
73
relevant research findings
  • meaningfulness
  • transfer and domain specificity
  • deliberate practice
  • self-regulated learning
  • reasoning and decision making

1. Boreham NC. The dangerous practice of thinking. Medical Education 1994;28:172-79. 2. Klein G. Naturalistic decision making. Human Factors 2008;50(3):456-60. 3. Plous S. The psychology of judgment and decision making. New Jersey: McGraw-Hill; 1993. 4. Schmidt HG, Machiels-Bongaerts M, Hermans H, ten Cate TJ, Venekamp R, Boshuizen HPA. The development of diagnostic competence: comparison of a problem-based, an integrated, and a conventional medical curriculum. Academic Medicine 1996;71(6):658-64.
74
relevant research findings
  • reliability
  • validity
  • quality frameworks
  • organisational reliability

1. Williams M, Klamen D, McGaghie W. Cognitive, social and environmental sources of bias in clinical performance ratings. Teaching and Learning in Medicine 2003;15(4):270-92. 2. Kane MT. Validation. In: Brennan RL, editor. Educational Measurement. Westport: ACE/Praeger; 2006: 17-64. 3. Govaerts MJB. Climbing the pyramid: towards understanding performance assessment. Maastricht University, 2011. 4. Dijkstra J, Van der Vleuten C, Schuwirth L. A new framework for designing programmes of assessment. Advances in Health Sciences Education 2010;15:379-93.
75
term 1 (26 weeks, with mid-term and end-term reviews)
[timeline: 1× CCA, 5× direct observation, 1× MCQ test, 1× MSF, 5× mini-releases, portfolio running throughout]
76
term 2 (26 weeks, with mid-term and end-term reviews)
[timeline: 1× CCA, 3× direct observation, 2× MCQ tests, 5× mini-releases, 2× audits, 1× MSF, portfolio running throughout]
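The two term timelines can be summarised as instrument counts. The exact week placements are not recoverable from the slides, so only totals are represented here:

```python
from collections import Counter

# Instrument counts per term, read off the term-1 and term-2 timelines.
term1 = Counter({"CCA": 1, "direct observation": 5, "MCQ test": 1,
                 "MSF": 1, "mini-release": 5, "portfolio": 1})
term2 = Counter({"CCA": 1, "direct observation": 3, "MCQ test": 2,
                 "mini-release": 5, "audit": 2, "MSF": 1, "portfolio": 1})

# Counter addition merges the terms into a running total for the year.
year = term1 + term2
# e.g. year["direct observation"] == 8, year["MCQ test"] == 3
```

A structure like this makes the programmatic idea visible at a glance: many low-stakes data points per instrument accumulate before any high-stakes decision is taken.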
77
critical case analysis
  • 5 write-ups of real patient consultations
  • relevance
  • analysis
  • learning activities
  • produce exam questions (EMI, KFP, MCQ)
  • increasingly original literature
  • any discussion minuted by registrar

78
directly observed consultations
  • 9 real patient consultations
  • relevance
  • analysis
  • learning goals (practical and theoretical)
  • learning activity
  • demonstration of success in next observed
    consultation
  • discussion minuted by registrar

79
clinical audit
  • analysis of the practice environment
  • determination of specific question
  • collection of data
  • draw conclusions
  • describe plan for change
  • after 3 months: look back and annotate
  • any discussion minuted by registrar

80
multiple-choice tests
  • 3 tests of 60 items each
  • blueprinted
  • sit and submit your answers
  • review items, answer key
  • comment on and criticise questions for correctness
  • present in mini-release
  • lodge appeal against questions
  • score calculation and feedback to registrars
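The appeal step implies rescoring once flawed items are withdrawn. A minimal sketch of that recalculation; the function and data names are illustrative, not from the presentation:

```python
def rescore(answers: dict, key: dict, withdrawn=frozenset()):
    """Fraction correct over the items that survive appeals."""
    valid = [q for q in key if q not in withdrawn]
    correct = sum(answers.get(q) == key[q] for q in valid)
    return correct / len(valid)

# hypothetical 4-item test: the registrar misses item 3
key = {1: "a", 2: "c", 3: "b", 4: "d"}
answers = {1: "a", 2: "c", 3: "a", 4: "d"}
before = rescore(answers, key)        # 3/4 correct -> 0.75
after = rescore(answers, key, {3})    # item 3 withdrawn -> 3/3 -> 1.0
```

Recomputing over the surviving items (rather than awarding free marks) keeps scores comparable across registrars who answered the withdrawn items differently.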

81
mini-releases
  • flexible agenda
  • building informal networks
  • discuss MCQ test items
  • compile
  • appeal against questions
  • list of informal network

82
multi-source feedback
  • 2 times per year
  • nurses, practice manager, receptionist, other
    practice staff and registrar
  • discussed with supervisor (end-term assessment)
    and with ME (minuted by registrar)
  • simple form dealing with tasks, other and
    yourself
  • simple ordinal scale
  • ample room for qualitative comments

83
mid and end-term assessment
  • integrative
  • reviewing all the information
  • learning goals and/or remediation plans
  • advisory to performance review committee
  • minuted by registrar

84
portfolio
  • complete dossier including minutes
  • individual extra information (only if relevant)
  • audit trail
  • basis for future CV or position applications

85
example of a line: CCA
[diagram: one assessment line linking meaning, learning, test-enhanced learning, feedback, MCQs, analysis, test, informal/social networks, appeal, group, group appeal, transformation, research narratives for feedback]
86
design
  • goals and stated purpose
  • programme in action
  • supporting the programme
  • documentation of the programme
  • improvement approaches to the programme
  • accounting for the programme

Dijkstra J, Van der Vleuten C, Schuwirth L. A new framework for designing programmes of assessment. Advances in Health Sciences Education 2010;15:379-93.
87
rules and regulations
  • self-responsibility comes with accountability
    (minutes, plagiarism, fraud)
  • focus on learning and remediation
  • information provision to the registrar
  • documentation
  • transparency
  • second opinion/appeals/switch of ME or supervisor
  • organisational reliability/credibility

88
staff development
  • efficiency
  • short analyses
  • concrete learning goals
  • focus on learning
  • training of staff (expertise → efficiency)
  • admin support by admin staff
  • division of advisory and decision roles

89
further requirements
  • goals and stated purpose
  • programme in action
  • supporting the programme: regulations
  • documentation of the programme: ELO, fact sheets
  • improvement approaches to the programme:
    systematic evaluation
  • accounting for the programme: research

90
Thank you