Transcript and Presenter's Notes

Title: Assessment in Education


1
Assessment in Education
  • Patricia O'Sullivan
  • Office of Educational Development
  • UAMS

2
  • Last month, Dr. Allen described a situation in
    which an individual asked her how to get to
    Freeway Medical Center. How would you determine
    how successful Dr. Allen was?

3
Objectives
  • Provide terminology and principles fundamental to
    assessment
  • Introduce various assessment approaches
  • Review tips for a common assessment: Multiple
    Choice Questions
  • Review elements for a less common assessment:
    the Portfolio

4
What is Assessment?
  • Appraising performance through measurement to
    make an evaluation
  • For what purpose?
  • Formative
  • Summative

5
What Assessments are Used at UAMS?
6
What Makes a Good Summative Assessment?
  • Fair and practical
  • Based on appropriate criteria that are shared
    with the learners
  • Valid
  • Reliable

7
Validity
  • Key to the interpretability and relevance of
    scores
  • Do the scores have a plausible meaning?
  • What are the implications of interpreting the
    scores?
  • What is the utility of the scores?
  • What are the actual social consequences?

8
Validity Evidence for Proposed Interpretation
  • Content/Construct Evidence
  • Content matches the teaching objectives
  • Score is shown to be meaningful
  • Criterion/Predictor Evidence
  • Score relates to future outcomes

9
Reliability
  • Assessment data must be reproducible
  • Written tests
  • Internal consistency
  • Cronbach's alpha or KR-20 (values range 0-1; see
    the sketch below)
  • Test-retest
  • Correlation ranging from 0-1
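Not from the slides: a minimal Python sketch of Cronbach's alpha on an examinees-by-items score matrix. The data are invented; on 0/1-scored items the same formula yields KR-20.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an examinees x items score matrix."""
    k = scores.shape[1]                         # number of items
    item_vars = scores.var(axis=0, ddof=1)      # per-item sample variance
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical data: 5 examinees x 4 items scored 0/1
# (on dichotomous items this same calculation gives KR-20)
scores = np.array([[1, 1, 1, 0],
                   [1, 0, 1, 1],
                   [0, 0, 1, 0],
                   [1, 1, 1, 1],
                   [0, 1, 0, 0]])
print(round(cronbach_alpha(scores), 2))
```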

10
Reliability (cont.)
  • Rater-based assessments
  • Interrater consistency or agreement (generally a
    correlation coefficient or intraclass
    correlation; see the sketch below)
  • OSCE performance assessments
  • Generalizability Theory
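An illustration, not from the slides: the sketch below computes the two flavors named above for two hypothetical raters, with consistency taken as a simple Pearson correlation and agreement as the proportion of identical ratings.

```python
import numpy as np

# Hypothetical ratings from two raters on the same 8 performances
rater_a = np.array([4, 3, 5, 2, 4, 3, 5, 4])
rater_b = np.array([4, 2, 5, 3, 4, 3, 4, 4])

# Consistency: do the raters rank performances similarly? (Pearson r)
consistency = np.corrcoef(rater_a, rater_b)[0, 1]

# Agreement: how often do they give the identical rating?
agreement = (rater_a == rater_b).mean()

print(f"consistency r = {consistency:.2f}, exact agreement = {agreement:.0%}")
```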

11
Improve Validity
  • Match assessment with objectives
  • Increase the sample of objectives and content on
    the assessment
  • Review the blueprint, or produce one (see the
    sketch below)
  • Use test methods appropriate to objectives
  • Ensure test security
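Purely illustrative (topics, levels, and counts are invented): a test blueprint expressed as data, used to check total exam length and the weight each objective receives.

```python
# Hypothetical blueprint: planned item counts per objective and cognitive level
blueprint = {
    "Cardiology":  {"recall": 4, "application": 6},
    "Pulmonology": {"recall": 3, "application": 5},
    "Renal":       {"recall": 2, "application": 4},
}

total = sum(sum(levels.values()) for levels in blueprint.values())
for topic, levels in blueprint.items():
    n = sum(levels.values())
    print(f"{topic}: {n} items ({n / total:.0%} of the exam)")
print("total items:", total)
```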

12
Improve Reliability
  • Clear questions
  • Appropriate time allotted
  • Simple, clear and unambiguous instructions
  • High quality scoring
  • Increase the number of observations or questions
    (see the sketch below)
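Why more questions helps can be quantified with the Spearman-Brown prophecy formula, which the slides do not name; the numbers below are invented for illustration.

```python
def spearman_brown(reliability, length_factor):
    """Predicted reliability when a test is lengthened by length_factor."""
    return (length_factor * reliability) / (1 + (length_factor - 1) * reliability)

# Doubling a test of reliability 0.70 with comparable items
print(round(spearman_brown(0.70, 2), 2))  # about 0.82
```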

13
Norm or Criterion Referenced
  • Norm-referenced (relative standard)
  • Use the results of the assessment to set the
    standards (e.g., the proportion to receive each
    grade); performance is judged by comparison with
    others
  • Criterion-referenced (absolute standard)
  • Learners achieve some minimal standard of
    competency (both standards are sketched below)
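A small illustrative sketch (scores and cutoffs invented): the same results graded under a relative standard versus an absolute one.

```python
import numpy as np

scores = np.array([55, 62, 68, 71, 74, 78, 81, 85, 90, 95])

# Norm-referenced: the cutoff comes from the cohort itself,
# e.g. the top 70% of this group passes
norm_cutoff = np.percentile(scores, 30)

# Criterion-referenced: a fixed cutoff of 75 set in advance
criterion_cutoff = 75

print(f"norm cutoff {norm_cutoff:.1f}: {(scores >= norm_cutoff).sum()} pass")
print(f"criterion cutoff {criterion_cutoff}: {(scores >= criterion_cutoff).sum()} pass")
```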

14
Standard Setting Procedures
  • Judgments based on
  • Holistic impressions of the exam or item pool
  • Content of individual test items
  • Examinees' test performance

15
Angoff Standard Setting Procedure
  • Judges are instructed to think of a group of
    minimally acceptable persons
  • For each item estimate the proportion of this
    group that will answer item correctly
  • Sum up proportions to get the minimum passing
    score for a single judge
  • Average across judges (see the sketch below)
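A minimal sketch of the arithmetic just described (all ratings invented): sum each judge's item proportions, then average across judges.

```python
# Hypothetical ratings: each row is one judge's estimate of the proportion
# of minimally acceptable examinees who would answer each of 5 items correctly
judge_ratings = [
    [0.6, 0.7, 0.5, 0.8, 0.6],
    [0.5, 0.8, 0.6, 0.7, 0.7],
    [0.7, 0.6, 0.5, 0.9, 0.6],
]

# Step 1: sum each judge's proportions -> that judge's minimum passing score
per_judge = [sum(ratings) for ratings in judge_ratings]

# Step 2: average across judges -> the exam's passing score
passing_score = sum(per_judge) / len(per_judge)
print(f"minimum passing score: {passing_score:.2f} out of 5 items")
```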

16
Assessment Methods
  • ACGME Toolbox
  • Additional methods
  • Other written assessments
  • Essay
  • Short answer and computation
  • Patient management problems
  • Modified essay questions
  • Create a game
  • Self-assessment

17
Matching Assessment Method and Objectives
  • Examine ACGME matrix

18
Alternative Testing Modes
  • Take-home tests
  • Open-book tests
  • Group exams
  • Paired testing

19
Reducing Test Anxiety
  • Make the first exam relatively easy
  • Give more than one exam
  • Give advice on how to study
  • Encourage study groups
  • Give a diagnostic test early

20
Reducing Test Anxiety
  • Before a test explain format
  • Provide sample items
  • Provide a pool of final exam items with the
    syllabus

21
Technology and Assessment
  • Computer based testing
  • Use of images and video clips
  • Computer Adaptive Testing (CAT; toy sketch below)
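For flavor only, a toy adaptive loop. The item pool, difficulties, and fixed-step ability update are all invented; real CAT engines select items by IRT-based information, not by this rule.

```python
import random

# Toy item pool: difficulty on an arbitrary scale (all values invented)
items = {"q1": -1.0, "q2": -0.5, "q3": 0.0, "q4": 0.5, "q5": 1.0}
ability, step = 0.0, 0.6

for _ in range(4):
    # Administer the unused item whose difficulty is closest to ability
    qid = min(items, key=lambda q: abs(items[q] - ability))
    items.pop(qid)
    correct = random.random() < 0.5  # stand-in for a real examinee response
    ability += step if correct else -step  # move estimate toward performance
    step *= 0.7  # shrink steps so the estimate settles
    print(qid, "correct" if correct else "wrong", f"ability={ability:+.2f}")
```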

22
Take a Test
23
Multiple Choice Questions Writing Tips
  • http://www.nbme.org/about/itemwriting.asp
  • Simple suggestions
  • Design for one best answer
  • Place hand over stem. Can you guess the answer?
  • Use a template
  • Attend to the Graveyard of Test Items

24
Template Examples
  • A (patient description) is unable to (functional
    disability). Which of the following is most
    likely to have been injured?
  • Following (procedure), a (patient description)
    develops (symptoms and signs). Laboratory
    findings show (findings). Which of the following
    is the most likely cause?

25
Graveyard
  • B-type: match a heading with words or phrases
    (a heading can be used more than once)
  • D-type: complex matching, a two-step process
  • K-type: a stem with four options, where the
    choices are A) 1, 2, 3 only; B) 1, 3 only;
    C) 2, 4 only; D) 4 only; E) all are correct

26
Portfolios
27
(No Transcript)
28
Portfolios
  • Definition
  • A purposeful collection of student work that
    exhibits to the student (and/or others) the
    student's efforts, progress, or achievement in
    (a) given area(s).

29
  • This collection must include student
    participation in
  • Selection of portfolio content
  • The criteria for selection
  • The criteria for judging merit
  • Evidence of student reflection

30
Portfolios
  • Learning
  • Showing progress
  • Showing effort
  • Showing remediation
  • Assessment
  • Showing competence

31
  • Presentations
  • Logs
  • Case Records
  • Evaluations
32
Criteria (blank rating grid)
33
Criteria
34
Portfolio Reflection
  • Why selected
  • How it reflects skills and abilities

35
Using Portfolios, the Learner Engages in Assessment
  • Making choices
  • Critically self-examining
  • Explaining what they are doing

36
Well-structured Portfolio
  • Menu
  • Selection criteria
  • Scoring rubric
  • Benchmarks
  • Trained raters

37
Determine Reliability and Validity
  • Reliability: Generalizability theory
  • Found that 6 entries with 2 raters were needed
  • Validity: evidence from a number of studies
  • Resident performance across years
  • Resident performance in areas
  • Resident perception
  • Resident performance when program made an
    intervention
  • Correlation of portfolio performance and other
    areas

38
10-Step Approach
  1. Define the purpose
  2. Determine competencies to be assessed
  3. Select portfolio materials
  4. Develop an evaluation system
  5. Select and train examiners

39
10-Step Approach (cont.)
  6. Plan the examination process
  7. Orient learners
  8. Develop guidelines for decisions
  9. Establish reliability and validity evidence
  10. Design evaluation procedures