An Innovative Approach to Field Assessments of Student Competence

1
An Innovative Approach to Field Assessments of
Student Competence
C. Regehr, M. Bogo, R. Power, G. Regehr
Faculty of Social Work, University of Toronto
Funding provided by the Social Sciences and Humanities
Research Council of Canada
2
Evaluating Competence to Practice
  • Global narratives
      • Highlighting strengths and areas for
        improvement
      • Based on implicit criteria and the practice
        wisdom of the evaluator
      • Unstandardized
  • Competency-based evaluation
      • Checklists explicitly defining practice in
        discrete behavioral terms
      • Questionable reliability and validity, limited
        variability, ceiling effects

3
Our Previous Studies
  • Evaluated the reliability of the school's
    competency-based evaluation (CBE) tool: good
    internal consistency, poor predictive validity
  • Designed a new Practice-Based Evaluation (PBE) tool
      • based on the dimensions and language field
        instructors use to describe various levels of
        student performance
  • Compared the two tools: did not find improved
    variability or predictive validity in the new PBE
    tool vs. the previous CBE tool

4
Purpose of Current Study
  • Design and test an innovative approach for field
    instructors to assess student competence
  • The new approach involves having field instructors
    represent students in a more holistic manner
  • Asks instructors to match their student to a set of
    standardized descriptions of typical students
    performing at various levels

5
Tool Development: Development of Exemplar Vignettes
  • In-depth interviews with 19 experienced field
    instructors from 3 sectors
  • Asked for detailed descriptions of an exemplary, an
    average, and a problematic student
  • From the 57 descriptions produced, 20 exemplar
    student vignettes were created
      • representing the full range of performance
        levels
      • representing typical manifestations

6
Tool Development: Ranking the Vignettes
  • 10 experienced field instructors
  • Independently
      • sort 20 vignettes into as many categories as
        needed to represent levels of performance
      • rank vignettes within each category
  • In two groups of 5
      • compare individual categories
      • achieve consensus on category membership
      • name the categories

7
Tool Development: Ranking Results
  • Individual rankings of the 20 vignettes: intraclass
    correlation coefficient 0.83 (see the computational
    sketch after this list)
  • Group ranking of the vignettes: intraclass
    correlation coefficient 0.99
  • Five categories generated:
      • exceptional
      • ready for practice
      • on the cusp
      • need more training
      • unsuitable for the profession
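
A hedged sketch of how such an agreement statistic could be computed: the
function below implements a standard ICC(2,1) (two-way random effects,
absolute agreement, single rater) in Python with numpy. This is illustrative
only, not the authors' analysis code, and the rating matrix is synthetic.

    import numpy as np

    def icc_2_1(x: np.ndarray) -> float:
        """Two-way random-effects, absolute-agreement, single-rater ICC.

        x has shape (n_targets, k_raters); here, 20 vignettes by 10 raters.
        """
        n, k = x.shape
        grand = x.mean()
        ms_rows = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)  # vignettes
        ms_cols = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)  # raters
        ss_err = ((x - grand) ** 2).sum() - (n - 1) * ms_rows - (k - 1) * ms_cols
        ms_err = ss_err / ((n - 1) * (k - 1))
        return (ms_rows - ms_err) / (
            ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
        )

    # Synthetic example only: 10 raters ranking 20 vignettes with some noise.
    rng = np.random.default_rng(0)
    true_order = np.arange(20, dtype=float)
    ranks = np.column_stack([true_order + rng.normal(0, 2.0, 20)
                             for _ in range(10)])
    print(f"ICC(2,1) = {icc_2_1(ranks):.2f}")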

8
Current Study: Procedure
  • Recall most recent student
  • Vignette matching process
      • Given a package of the 20 vignettes ordered
        randomly
      • Asked to read the vignettes and select those
        that are similar to their student
      • Select from the similar vignettes the one or two
        that are most similar to their student
  • Evaluate the same student using the Practice-Based
    Evaluation (PBE) tool and the Competency-Based
    Evaluation (CBE) tool

9
Current Study: Participants
  • 28 field instructors
      • experienced practitioners
      • experienced field instructors
      • range of settings
      • supervised a student in the previous year

10
Scoring of Student Performance
  • A vignette matching (VM) score was calculated for
    each student (a scoring sketch follows this list)
      • average of the scores for all vignettes selected
        by the instructor
      • "similar" vignettes given a weighting of 0.5
      • "most similar" vignettes given a weighting of 1.0
  • CBE and PBE scores calculated as the average across
    all scale dimensions
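
A minimal sketch of one plausible reading of this weighting: the VM score as
a weighted average of the scale values of the vignettes the instructor
selects, with "similar" matches weighted 0.5 and "most similar" matches
weighted 1.0. This is not the authors' scoring code, and the numeric values
in the example are hypothetical.

    def vignette_matching_score(similar, most_similar):
        """Weighted average of the scale values of the selected vignettes.

        similar: scale values of vignettes judged "similar" (weight 0.5)
        most_similar: scale values of the one or two "most similar"
            vignettes (weight 1.0)
        """
        weights = [0.5] * len(similar) + [1.0] * len(most_similar)
        values = list(similar) + list(most_similar)
        return sum(w * v for w, v in zip(weights, values)) / sum(weights)

    # Hypothetical example: three "similar" vignettes and one "most similar".
    print(vignette_matching_score(similar=[3.0, 3.5, 4.0], most_similar=[3.5]))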

11
Students' Matches to Vignettes
12
Quantitative Results
13
Distribution of Student Scores for the Three
Measures
14
Distribution of Student Scores for the Three
Measures
15
Distribution of Student Scores for the Three
Measures
16
Conclusions
  • The matching method produced greater variability
    in student evaluations than either the CBE or PBE
    tool (illustrated in the sketch after this list)
  • Field instructors were more likely to place
    students at both ends of the continuum:
      • poorly performing students
      • exceptional students
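
A hedged illustration of what "greater variability" means operationally:
comparing the spread (standard deviation) of the score distributions produced
by each measure. The numbers below are invented for illustration, not the
study data.

    import statistics

    # Hypothetical score lists for the same students on each measure.
    scores = {
        "VM":  [1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0],
        "CBE": [3.8, 4.0, 4.1, 4.2, 4.3, 4.4, 4.5, 4.6],
        "PBE": [3.5, 3.8, 4.0, 4.1, 4.2, 4.3, 4.4, 4.5],
    }

    # A larger standard deviation means the measure spreads students out more.
    for name, vals in scores.items():
        print(f"{name}: mean={statistics.mean(vals):.2f}, "
              f"sd={statistics.stdev(vals):.2f}")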

17
Discussion
  • Problems with differentiating students with CBE
    tools may not lie in the properties of individual
    tools, but rather in the existence of scales
    themselves
  • Continue to struggle with the problem of
    identifying students who need more assistance or
    who should never practice
  • Need to go back to the drawing board to find a
    balance between the two methods of evaluation