1
Implications of Program Evaluation for Faculty
Development Initiatives
  • Kevin Chin
  • Department of Educational and Counselling
    Psychology
  • McGill University, Montreal
  • Canada

2
Presentation Outline
  • Background
  • Specific projects
  • General evaluation benefits for all projects
  • Specific evaluation benefits from each project
  • Possible implications for DUO/ICTO
  • Comments, questions, discussion

3
Relevant Background
  • M.A. Educational Technology
  • Education Specialist with the Canadian Human
    Rights Foundation
  • Ph.D. Candidate, Educational Psychology
  • Consultant with McGill Molson Medical Informatics
    project

4
Specific Projects
  • Human rights education (classroom-based)
  • Medical student education (ICT)
  • Medical student/faculty/staff education
    (classroom-based/ICT)

5
International Human Rights Training Program
(IHRTP)
  • Conducted evaluation of an annual program
    involving 100 participants from 60 countries
  • Data collected through questionnaires,
    observation, informal feedback, debriefings,
    projects, and follow-up (regional meetings/e-mail)
  • Formal review of the program every 5 years
  • Facilitators saw the benefits of the evaluation
    process
  • The organization addressed any concerns with
    participants the next day
  • Federally funded

6
Computer-Based Patient Simulation
  • Conducted a formative evaluation of a virtual
    patient aimed at educating medical students
  • Used think-alouds, questionnaires, interviews
  • Changes were made to interface/content/structure
    based on feedback
  • Federally funded

7
Interprofessional Practice and Education
  • Developed an evaluation plan for an initiative on
    interprofessional practice and education of
    healthcare professionals
  • 10% of the budget mandated for evaluation
  • Collecting baseline data for evaluating any
    program-related changes
  • To date, have used interviews and a type of
    pre-/post-test
  • Planned: observation, questionnaires, interviews,
    video, audio
  • Federally funded

8
General Benefits
  • In all cases, funding agencies mandated
    evaluation, which helped to
    • get developers thinking about evaluation early on
    • promote a systematic approach to evaluation
    • influence program development
    • promote accountability
  • In two cases, subsequent actions were taken based
    on evaluation results, with positive outcomes

9
Design Cycle
  (Cycle diagram: Design, Analysis, Development,
  Evaluation, Implementation)
10
Specific Benefits (1/3)
  • IHRTP
    • Improved the program dramatically
    • Facilitators who saw the changes take place
      championed the idea with participants
    • Follow-up motivated participants
    • Provided an opportunity to educate about
      evaluation
  • Key points
    • Modeled an approach regarding the importance of
      program evaluation
    • Annual evaluation reports described changes to
      the program and promoted transparency
    • Established/promoted a culture of evaluation

11
Specific Benefits (2/3)
  • CBPS
    • Improved the product to better meet the needs of
      students
    • Triangulated methods to help validate student
      responses
    • Positive learning experience for group members
  • Key points
    • Facilitated the dynamic between teacher and
      students
    • Advocated the benefits of online technology to
      various stakeholders, e.g., senior
      administration, senior faculty, the work group

12
Specific Benefits (3/3)
  • IPP/IPE
    • Adopted an evaluation framework that offered a
      useful starting point for people new to the field
      of program evaluation
    • Emphasized the need to collect baseline data
    • Helps to show changes over time and validates
      the work being done
  • Key points
    • Highlighted the importance of evaluation
    • Forced people to think about how to collect
      relevant data for ethics approval

13
Possible Implications for DUO/ICTO
  • Sell the idea of evaluation to stakeholders and
    promote transparency
  • Collect different types of data from stakeholders
  • Allow sufficient time for evaluation
  • Provide rationale for acting (or not) based on
    data
  • Create/circulate reports that discuss changes
    made based on feedback

14
Questions, comments, concerns?