Quasi-Experimental Research Designs: Increasing Rigor in Evaluation
Elena Kirtcheva, M.A., Research for Better Schools, Philadelphia

1
Quasi-Experimental Research Designs: Increasing Rigor in Evaluation
Elena Kirtcheva, M.A.
Research for Better Schools, Philadelphia
  • Presented at The American Evaluation Association
    Annual Conference
  • November 14, 2009

2
Introduction
  • An application of two quasi-experimental
    approaches to increasing the rigor of evaluations
  • Evaluating science teacher professional
    development is of interest to policy makers,
    program makers, and evaluators alike
  • Focus on evaluating gains in content knowledge
    (Loucks-Horsley et al., 2003)

3
Existing Research Limitations
  • Around 90% of the studies on the topic
    identified by Weiss and Miller (2009) had one or
    more problems:
  • Not research-oriented
  • Studies of pre-service teachers only
  • Not measuring content knowledge gains
  • Had selection or context bias
  • Lacked comparison groups
  • Used idiosyncratic instrumentation without
    evidence of reliability and validity

4
Project Background
  • A Math and Science Partnership (MSP) grant
  • Four institutions of higher education and ten
    school districts
  • Professional development for elementary and
    middle school teachers over a period of 15
    months: content, pedagogical skills, use of
    technology
  • Four separate groups of teachers: two received
    physics training and two received chemistry
    training

5
Evaluation Design
  • Multiple levels of evaluation
  • Data collected from Summer 2008 through Summer
    2009:
  • Content assessments (teachers and students)
  • Surveys (teachers and students)
  • Observations (PD and classroom)
  • Interviews

6
Teacher Content Assessment
  • Three administrations:
  • Pre: Day 1 of the 2008 training week
  • Post: Day 5 of the 2008 training week
  • Follow-up: Day 5 of the 2009 training week
  • MOSART (Misconceptions-Oriented Standards-based
    Assessment Resources for Teachers) content
    assessments in chemistry and physics
  • Not perfectly aligned with the content covered,
    but chosen for its measurement properties

7
[Design diagram: Physics Academy — Physics Test expected to be affected by
treatment; Chemistry Test expected to be unaffected by treatment (NEDV
comparison within the academy). Chemistry Academy — Chemistry Test expected
to be affected by treatment (NEG comparison across academies).]
8
Participants With Content Assessment Data Available

  Group               n
  Chemistry A        26
  Chemistry B        29
  Total Chemistry    55
  Physics C          25
  Physics D          22
  Total Physics      47
9
Non-Equivalent Group
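The non-equivalent group (NEG) comparison can be sketched in a few lines: gains on the chemistry test for the academy that received chemistry training versus the academy that did not. This is a minimal illustration with invented scores and a hypothetical mean_gain helper — it is not the study's data or analysis.

```python
# NEG design sketch: compare chemistry-test gains across the two academies.
# All scores below are invented for illustration only.
from statistics import mean

# (pre, post) chemistry-test scores per teacher; made-up data
chemistry_academy = [(12, 19), (10, 17), (14, 20), (11, 18)]  # treated on chemistry
physics_academy   = [(13, 14), (12, 12), (10, 12), (11, 11)]  # non-equivalent comparison

def mean_gain(scores):
    """Average post-minus-pre gain for one group."""
    return mean(post - pre for pre, post in scores)

treated_gain = mean_gain(chemistry_academy)
comparison_gain = mean_gain(physics_academy)

# A larger gain in the treated group is consistent with a program effect,
# though selection differences between the groups remain a threat.
effect_estimate = treated_gain - comparison_gain
print(treated_gain, comparison_gain, effect_estimate)
```

In practice the comparison would be tested statistically (e.g., ANCOVA with the pretest as covariate), but the logic of the design is this difference in gains between non-randomly formed groups.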
10
Non-Equivalent Dependent Variable
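The non-equivalent dependent variable (NEDV) comparison works within a single group: the same teachers take both tests, but only one test covers treated content. The sketch below uses invented scores and an illustrative mean_gain helper — an assumption for exposition, not the study's results.

```python
# NEDV design sketch: within the Physics Academy, compare gains on the
# physics test (treated content) with gains on the chemistry test
# (untreated content). All scores below are invented for illustration.
from statistics import mean

# (pre, post) pairs for the same teachers on each instrument; made-up data
physics_test   = [(15, 22), (13, 21), (16, 23), (14, 20)]  # expected to rise
chemistry_test = [(12, 13), (14, 14), (11, 12), (13, 13)]  # expected to stay flat

def mean_gain(scores):
    """Average post-minus-pre gain on one instrument."""
    return mean(post - pre for pre, post in scores)

targeted_gain = mean_gain(physics_test)    # dependent variable tied to the PD
control_gain  = mean_gain(chemistry_test)  # non-equivalent dependent variable

# Gains on the targeted test but not on the control test argue against
# rival explanations such as testing or history effects, which should
# move both measures.
print(targeted_gain, control_gain)
```

The strength of the NEDV design is that each teacher serves as their own comparison, so group-composition threats that affect the NEG comparison do not apply here.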
11
Remaining Areas for Improvement
  • Better aligned instrument
  • Increased response rate
  • Controlling for additional threats to validity

12
Conclusion
  • If you have questions or comments, please feel
    free to call or e-mail.
  • Elena Kirtcheva
  • kirtcheva@rbs.org
  • 215-568-6150, ext. 327