Using a Yardstick to Measure a Meter: Growth, Projection, and Value-Added Models in the Context of School Accountability

Transcript and Presenter's Notes

1
Using a Yardstick to Measure a Meter: Growth,
Projection, and Value-Added Models in the Context
of School Accountability
  • Michael J. Weiss
  • University of Pennsylvania
  • January 14th, 2008

The research reported here was supported by the
Institute of Education Sciences, U.S. Department
of Education, through Grant R305C050041-05 to the
University of Pennsylvania. The opinions
expressed are those of the authors and do not
represent views of the U.S. Department of
Education.
2
Outline
  • Introduction
  • Background
  • NCLB's Accountability Measures
  • Growth Model Pilot Program (projection models)
  • Research Questions
  • Three States' Projection Model Proposals
  • Data Source
  • Analyses (Methods) / Results
  • Study Limitations
  • Conclusions

3
Introduction
  • Increased importance of state-wide assessments
    used to measure school performance
  • NCLB's current measures are highly criticized
  • Popular alternative
  • Individual-Level Growth Models
  • Value-Added Models
  • Projection Models (currently being piloted in
    several states)
  • My research examines the application of different
    growth models for the purpose of measuring school
    performance in education accountability systems

4
The Accountability Context
  • NCLB uses two measures of school performance to
    determine if a school made Adequate Yearly
    Progress (AYP)
  • Current Status: looks at the percent proficient
    in a school in any given year
  • Improvement: the Safe Harbor provision
    examines the change in percent not proficient
    from one year to the next

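Both measures reduce to simple percentages. A minimal sketch with a hypothetical cut score and cohorts (Safe Harbor requires at least a 10% relative reduction in the percent not proficient from the prior year):

```python
def percent_proficient(scores, cut):
    """Status measure: percent of tested students at or above the cut score."""
    return 100.0 * sum(s >= cut for s in scores) / len(scores)

def safe_harbor_met(last_year, this_year, cut, required_drop=0.10):
    """Safe Harbor: the percent NOT proficient must fall by at least 10%
    (relative) from the prior year. Note the two lists are different cohorts,
    which is one of the criticisms raised later in the deck."""
    not_prof_last = 100.0 - percent_proficient(last_year, cut)
    not_prof_now = 100.0 - percent_proficient(this_year, cut)
    return not_prof_now <= (1 - required_drop) * not_prof_last

# Hypothetical scale scores; the cut score of 1500 is illustrative only.
cohort_2004 = [1400, 1450, 1520, 1600, 1480]
cohort_2005 = [1510, 1450, 1530, 1610, 1490]
print(percent_proficient(cohort_2005, 1500))            # 60.0
print(safe_harbor_met(cohort_2004, cohort_2005, 1500))  # True
```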
5
The Accountability Context
  • NCLB's Measures Highly Criticized
  • Status
  • Fails to account for initial achievement levels
  • Outside influences
  • Safe Harbor
  • Compares different cohorts
  • Statistically unreliable

Lissitz et al., 2006; McCaffrey et al., 2003;
Linn, 2004; Kane & Staiger, 2002.
6
Growth Model Pilot Program (GMPP)
  • A response to requests by educators and
    policymakers that states be allowed to use growth
    models to recognize the progress schools are
    making
  • Must uphold the core principles of NCLB
  • All students must become proficient

U.S. Department of Education, 2005.
7
Projection Models
  • Models are used to project students' future
    proficiency status based on past achievement
  • Schools receive credit for those students who are
    not yet proficient, but are on track to become
    proficient
  • Gives schools 3 (or 4) years to bring low
    performing students up to proficiency, rather
    than 1 year
  • 8 states have been fully approved to use growth
    models (5 unique models)

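The mechanics can be illustrated with a generic straight-line projection. This is a sketch only: the approved state models differ (multivariate regression, value tables, etc.), and the scores below are hypothetical.

```python
def project_score(grades, scores, target_grade):
    """Ordinary least-squares line through a student's past (grade, score)
    points, extrapolated to the target grade. A generic sketch, not any
    particular state's approved model."""
    n = len(grades)
    mean_g = sum(grades) / n
    mean_s = sum(scores) / n
    slope = (sum((g - mean_g) * (s - mean_s) for g, s in zip(grades, scores))
             / sum((g - mean_g) ** 2 for g in grades))
    return mean_s + slope * (target_grade - mean_g)

def on_track(grades, scores, target_grade, cut):
    """A student is 'on track' if the projected score reaches the cut."""
    return project_score(grades, scores, target_grade) >= cut

# Hypothetical student: grade 3-5 scale scores projected to grade 6.
print(project_score([3, 4, 5], [1169, 1250, 1320], 6))
print(on_track([3, 4, 5], [1169, 1250, 1320], 6, 1400))
```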
8-13
Starting Scale Score (1169) of Hypothetical
Student
[Figure slides: chart of the hypothetical student's trajectory, built up step by step; no transcript text]
14
Research Question 1
  • How accurate are projection models at forecasting
    whether individual students will become
    proficient at a set point in the future?
  • Do certain state-proposed projection models
    forecast future proficiency more accurately than
    others?

15
Research Question 2
  • How accurate are state-proposed projection models
    at forecasting the percentage of students who
    will become proficient in the future at the
    school-level?

16
Research Question 3
  • Are growth expectations generally realistic under
    NCLB's measures of school performance? When
    using projection models?

17
Research Question 4
  • How similar or different are assessments of
    schools' performance under a
  • status model (percent proficient)
  • projection model (the percentage of students on
    track to become proficient)
  • value-added model (attempts to fairly compare
    schools' effectiveness)

18-22
[Figure slides: score gains required to reach proficiency for hypothetical students, e.g. 174 points (33), 523 points (100), 74 points (14), 232 points (44); no transcript text]
23
(No Transcript)
24
Tennessee Projection Model
25
Data Source
  • Florida Comprehensive Assessment Test (FCAT)
  • 2001-02 through 2004-05 school years
  • Large urban school district in Florida
  • N students = 10,007 3rd graders with achievement
    data in 2001-02
  • N schools = 96 elementary schools

26
Data (Attrition)
Frequency Counts for the 3rd Grade Cohort of
2001-2002
27
Data (Attrition Demographics)
28
Analysis Question 1
  • How accurate are individual-level projections?
  • Using 3rd, 4th, and 5th grade data, calculate
    whether a student is on track to become
    proficient by 6th grade using each state's model
    (a 1-year projection)
  • Compare projected 6th grade proficiency with
    observed 6th grade proficiency
  • Repeat using 3rd and 4th grade data to project
    6th grade proficiency (a 2-year projection)

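Comparing projected with observed proficiency is a 2x2 classification problem; a minimal sketch with hypothetical data:

```python
def projection_accuracy(projected, observed):
    """Cross-tabulate projected on-track status against observed proficiency."""
    pairs = list(zip(projected, observed))
    tp = sum(1 for p, o in pairs if p and o)
    tn = sum(1 for p, o in pairs if not p and not o)
    fp = sum(1 for p, o in pairs if p and not o)   # projected, but missed the cut
    fn = sum(1 for p, o in pairs if not p and o)   # not projected, but made it
    return {"accuracy": (tp + tn) / len(pairs),
            "false_positives": fp, "false_negatives": fn}

# Hypothetical projections vs. observed 6th-grade proficiency for 8 students.
proj = [True, True, False, True, False, False, True, True]
obs  = [True, False, False, True, False, True, True, True]
print(projection_accuracy(proj, obs))   # accuracy 0.75, 1 FP, 1 FN
```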
29
Analysis Question 1: Accuracy of
individual-level projections?
30
Analysis Question 2
  • How accurate are school-level projections?
  • Aggregate projections to the school-level to
    determine the percent of students in each school
    who were on track to become proficient
  • Compare the percent on track to become
    proficient to the percent who actually became
    proficient

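The aggregation step described above can be sketched as follows (school IDs and student records are hypothetical):

```python
def school_level_comparison(records):
    """records: (school_id, projected_on_track, actually_proficient) triples.
    Returns each school's percent projected on track, percent actually
    proficient, and the difference between the two."""
    by_school = {}
    for school, proj, actual in records:
        by_school.setdefault(school, []).append((proj, actual))
    result = {}
    for school, rows in by_school.items():
        pct_proj = 100.0 * sum(p for p, _ in rows) / len(rows)
        pct_actual = 100.0 * sum(a for _, a in rows) / len(rows)
        result[school] = {"projected": pct_proj, "actual": pct_actual,
                          "error": pct_proj - pct_actual}
    return result

# Hypothetical student records for two schools.
records = [("A", True, True), ("A", True, False), ("A", False, False),
           ("A", True, True), ("B", False, False), ("B", True, True)]
print(school_level_comparison(records))
```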
31
Analysis Question 2: Accuracy of school-level
projections (1-year)?
32
Analysis Question 2: Accuracy of school-level
projections (2-year)?
33
Study Limitations
  • Four years of data: can't test the accuracy of
    models that project 3 years into the future
  • One district, one state: results may not be
    generalizable
  • One assessment instrument: results may not be
    generalizable

34
Conclusions (Improving the Models)
  • If you have historical data, test your model's
    accuracy before implementing it!
  • If you don't have enough historical data to test
    accuracy, consider the shape of the developmental
    scale and regression to the mean.
  • Determine the likelihood of future proficiency,
    rather than a binary "on track" / "not on track"
    status:
  • Better for individual-level reporting
  • Likely to produce more accurate results when
    aggregated

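The last recommendation, reporting a likelihood of future proficiency rather than a binary on-track flag, can be sketched with a logistic transform. The slope parameter here is a hypothetical calibration; in practice it would be estimated from historical data.

```python
import math

def proficiency_probability(score_gap, slope=0.01):
    """Logistic transform of the gap between a student's projected score and
    the proficiency cut. slope=0.01 is a hypothetical calibration constant."""
    return 1.0 / (1.0 + math.exp(-slope * score_gap))

# A student projected 50 points above the cut vs. one 200 points below it.
print(round(proficiency_probability(50), 3))    # 0.622
print(round(proficiency_probability(-200), 3))  # 0.119
```

Aggregating these probabilities, rather than rounding each student to 0 or 1 first, is what makes the school-level estimates more accurate.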
35
Conclusions (Policy)
  • Projection models will help some schools get
    credit for students making progress towards
    proficiency, but the models are not ready yet
  • Moreover, the GMPP does not address the
    unfairness of the current system, where schools
    are judged in large part by that which is out of
    their control

36
(No Transcript)
37
Using a Yardstick to Measure a Meter: Growth,
Projection, and Value-Added Models in the Context
of School Accountability
Michael J. Weiss, University of Pennsylvania
weissmj@dolphin.upenn.edu, January 14, 2008
38
(No Transcript)
39
(No Transcript)
40
(No Transcript)
41
Value-Added Model
[Equation slide; the model specification was not captured in the transcript]
Lockwood, J. R., Doran, H. C., & McCaffrey, D. F.
(2003). Using R for estimating longitudinal
student achievement models. R News (The Newsletter
of the R Project), 3(3), 17.
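As an illustrative sketch only (an assumption, not the presenter's exact specification), longitudinal value-added models of the kind described in Lockwood et al. (2003) are often written in a "layered" mixed-model form:

```latex
% Illustrative layered value-added specification: student i's score in year t
% is a grade-year mean plus the accumulated effects of the schools that
% served the student, plus residual error.
y_{it} = \mu_t + \sum_{k \le t} \theta_{j(i,k)} + \varepsilon_{it}
```

Here $\mu_t$ is the grade-year mean, $\theta_{j(i,k)}$ is the effect of school $j$ serving student $i$ in year $k$ (school effects persist across years in layered models), and $\varepsilon_{it}$ is residual error.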
42
Analyses Question 3: Are Growth Expectations
Realistic (NCLB)?
NCLB
GMPP
43
Analyses Question 4: Comparing Measures of School
Performance
r = .99
44
Analyses Question 4: Comparing Measures of School
Performance
r = .47