Title: Rate of Improvement Version 2.0: Research-Based Calculation and Decision Making
1Rate of Improvement Version 2.0: Research-Based Calculation and Decision Making
- Caitlin S. Flinn, EdS, NCSP
- Andrew E. McCrea, MS, NCSP
- Matthew Ferchalk, EdS, NCSP
- ASPP Conference 2010
2Today's Objectives
- Explain what RoI is, why it is important, and how to compute it.
- Establish that simple linear regression should be the standardized procedure for calculating RoI.
- Discuss how to use RoI within a problem-solving/school improvement model.
3RoI Definition
- Algebraic term: slope of a line
- Vertical change over the horizontal change
- Rise over run
- m = (y2 - y1) / (x2 - x1)
- Describes the steepness of a line (Gall & Gall, 2007)
4RoI Definition
- Finding a student's RoI = finding the slope of a line
- Using two data points on that line, or
- Finding the line itself (see the sketch below):
- Linear regression
- Ordinary Least Squares
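For the spreadsheet-averse, a minimal sketch in plain Python (hypothetical scores, our own variable names) contrasting the two ideas on this slide: a slope from only two data points versus the slope of the full line of best fit via Ordinary Least Squares.

```python
# Two ways to get a student's RoI, per the definitions above.
# Hypothetical weekly CBM scores; week numbers are the x-values.

weeks = [1, 2, 3, 4, 5, 6]
scores = [41, 44, 43, 49, 52, 55]  # words correct per minute (made up)

# 1) Two-point slope: m = (y2 - y1) / (x2 - x1)
two_point = (scores[-1] - scores[0]) / (weeks[-1] - weeks[0])

# 2) Ordinary Least Squares slope of the line of best fit
n = len(weeks)
mean_x = sum(weeks) / n
mean_y = sum(scores) / n
ols = (sum((x - mean_x) * (y - mean_y) for x, y in zip(weeks, scores))
       / sum((x - mean_x) ** 2 for x in weeks))

print(f"two-point RoI: {two_point:.2f} wcpm/week")  # 2.80
print(f"OLS RoI:       {ols:.2f} wcpm/week")        # ~2.86
```

The two answers differ because the two-point slope ignores everything between the first and last probes; OLS uses every data point.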
5How does Rate of Improvement Fit into the Larger
Context?
6School Improvement/Comprehensive School Reform
Response to Intervention
Dual Discrepancy: Level & Growth
Rate of Improvement
7School Improvement/Comprehensive School Reform
- Grade-level content expectations (ELA, math, science, social studies, etc.)
- Work toward these expectations through classroom instruction.
- Understand the impact of instruction through assessment.
8Assessment
- Formative assessments / high-stakes tests
- Does the student have command of the content expectation (standard)?
- Universal screening using CBM
- Does the student have basic skills appropriate for age/grade?
9Assessment
- Q: For students who are not proficient on grade-level content standards, do they have the necessary basic reading/writing/math skills?
- A: Look at universal screening: if above criteria, gear intervention toward the content standard; if below criteria, gear intervention toward the basic skill.
10Progress Monitoring
- Frequent measurement of knowledge to inform our understanding of the impact of instruction/intervention.
- Measures of basic skills (CBM) have demonstrated reliability and validity (see the table at www.rti4success.org).
11Classroom Instruction (Content Expectations)
- Measure impact (test): Proficient! or Not Proficient
- If not proficient: content need or basic skill need? Use a diagnostic test to differentiate.
- Content need: intervention & progress monitoring
- Basic skill need: intervention & progress monitoring with CBM
- If CBM is the appropriate measure: Rate of Improvement
12So
- Rate of Improvement (RoI) is how we understand student growth (learning).
- RoI is reliable and valid (psychometrically speaking) for use with CBM data.
- RoI is best used when we have CBM data, most often when dealing with basic skills in reading/writing/math.
- RoI can be applied to other data (like behavior) with confidence, too!
- RoI is not yet tested on typical Tier I formative classroom data.
13RoI is usually applied to
- Tier One students in the early grades at risk for academic failure ("low green" kids)
- Tier Two & Three intervention groups
- Special education students (and IEP goals)
- Students with behavior plans
14RoI Foundations
- Deno, 1985
- Curriculum-based measurement
- General outcome measures
- Short
- Standardized
- Repeatable
- Sensitive to change
15RoI Foundations
- Fuchs & Fuchs, 1998
- Hallmark components of Response to Intervention
- Ongoing formative assessment
- Identifying non-responding students
- Treatment fidelity of instruction
- Dual discrepancy model
- One standard deviation from typically performing
peers in level and rate
16RoI Foundations
- Ardoin & Christ, 2008
- Slope for benchmarks (3x per year)
- More growth from fall to winter than winter to spring
- Might be helpful to use one RoI for fall to winter
- And a separate RoI for winter to spring
17RoI Foundations
- Fuchs, Fuchs, Hamlett, Walz, & Germann, 1993
- Typical weekly growth rates
- Needed growth
- 1.5 to 2.0 times the typical slope to close the gap in a reasonable amount of time (e.g., if typical 3rd-grade growth is 1.0 wcpm per week, a gap-closing goal is 1.5 to 2.0 wcpm per week)
18RoI Foundations
- Deno, Fuchs, Marston, & Shin, 2001
- The slope of frequently non-responsive children approximated the slope of children already identified as having a specific learning disability
19RoI Statistics
- Gall & Gall, 2007
- 10 data points are a minimum requirement for a reliable trendline
- How does that affect the frequency of administering progress monitoring probes?
20Importance of Graphs
- Vogel, Dickson, & Lehman, 1990
- Speeches that included visuals, especially in color, improved:
- Immediate recall by 8.5%
- Delayed recall (3 days) by 10.1%
21Importance of Graphs
- Seeing is believing.
- Useful for communicating large amounts of information quickly
- A picture is worth a thousand words.
- Transcends language barriers (Karwowski, 2006)
- Responsibility for accurate graphical representations of data
22Skills Typically Graphed
- Reading
- Oral Reading Fluency
- Word Use Fluency
- Reading Comprehension
- MAZE
- Retell Fluency
- Early Literacy Skills
- Initial Sound Fluency
- Letter Naming Fluency
- Letter Sound Fluency
- Phoneme Segmentation Fluency
- Nonsense Word Fluency
- Spelling
- Written Expression
- Behavior
- Math
- Math Computation
- Math Facts
- Early Numeracy
- Oral Counting
- Missing Number
- Number Identification
- Quantity Discrimination
23Importance of RoI
- Visual inspection of slope
- Multiple interpretations
- Instructional services
- Need for explicit guidelines
24Ongoing Research
- Using RoI for instructional decisions is not a perfect process.
- Research is currently addressing sources of error:
- Christ, 2006: standard error of measurement for slope
- Ardoin & Christ, 2009: passage difficulty and variability
- Jenkins, Graff, & Miglioretti, 2009: frequency of progress monitoring
25Future Considerations
- Questions yet to be empirically answered:
- What parameters of RoI indicate a lack of RtI?
- How does standard error of measurement play into using RoI for instructional decision making?
- How does RoI vary between standard protocol interventions?
- How does this apply to non-English-speaking populations?
26How is RoI Calculated? Which way is best?
27Multiple Methods for Calculating Growth
- Visual Inspection Approaches
- Eye Ball Approach
- Split Middle Approach
- Tukey Method
- Quantitative Approaches
- Last point minus First point Approach
- Split Middle & Tukey plus
- Linear Regression Approach
28The Visual Inspection Approaches
29Eye Ball Approach
30Split Middle Approach
- Drawing a line through the two points obtained from the median data values and the median days when the data are divided into two sections (Shinn, Good, & Stein, 1989).
31Split Middle
[Graph: split-middle medians marked X(14) and X(9)]
32Tukey Method
- Divide the scores into 3 equal groups.
- Divide the groups with vertical lines.
- In the 1st and 3rd groups, find the median data point and the median week, and mark each with an X.
- Draw a line between the two Xs.
- (Fuchs et al., 2005. Summer Institute: Student progress monitoring for math. http://www.studentprogress.org/library/training.asp)
33Tukey Method
[Graph: Tukey medians marked X(14) and X(8)]
34The Quantitative Approaches
35Last minus First
- Iris Center: last probe score minus first probe score, over last administration period minus first administration period.
- (Y2 - Y1) / (X2 - X1) = RoI
- http://iris.peabody.vanderbilt.edu/resources.html
36Last minus First
37Split Middle Plus
[Graph: split-middle medians X(14) and X(9); slope = (14 - 9) / 8 = 0.63]
38Tukey Method Plus
[Graph: Tukey medians X(14) and X(8); slope = (14 - 8) / 8 = 0.75]
39Linear Regression
40RoI Consistency?
Method | RoI
Any visual inspection method | ???
Last minus First | 0.75
Split Middle Plus | 0.63
Tukey Plus | 0.75
Linear Regression | 1.10
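A sketch of the four quantitative approaches side by side, using a made-up nine-week data set (not the data behind the numbers above), to show how the methods can disagree on the same student. The function names are ours; each implements the procedure as described on the earlier slides.

```python
from statistics import median

# Hypothetical (week, wcpm) probes; NOT the slide's data set.
data = [(1, 6), (2, 9), (3, 8), (4, 10), (5, 9),
        (6, 12), (7, 13), (8, 12), (9, 14)]

def last_minus_first(pts):
    (x1, y1), (x2, y2) = pts[0], pts[-1]
    return (y2 - y1) / (x2 - x1)

def medians_slope(first, last):
    """Slope through the (median week, median score) of two groups."""
    x1, y1 = median(p[0] for p in first), median(p[1] for p in first)
    x2, y2 = median(p[0] for p in last), median(p[1] for p in last)
    return (y2 - y1) / (x2 - x1)

def split_middle_plus(pts):
    half = len(pts) // 2
    return medians_slope(pts[:half], pts[-half:])

def tukey_plus(pts):
    third = len(pts) // 3
    return medians_slope(pts[:third], pts[-third:])

def ols(pts):
    n = len(pts)
    mx = sum(p[0] for p in pts) / n
    my = sum(p[1] for p in pts) / n
    return (sum((x - mx) * (y - my) for x, y in pts)
            / sum((x - mx) ** 2 for x, _ in pts))

for name, fn in [("Last minus First", last_minus_first),
                 ("Split Middle Plus", split_middle_plus),
                 ("Tukey Plus", tukey_plus),
                 ("Linear Regression", ols)]:
    print(f"{name:18s} {fn(data):.2f}")
# -> 1.00, 0.80, 0.83, 0.88: four methods, four answers for one student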
41RoI Consistency?
- If we are not all using the same model to compute RoI, we continue to have the same problems as past models: under one approach a student meets SLD criteria, but under a different approach the student does not.
- Hypothetically, if the RoI cut-off were 0.65 or 0.95, different approaches would come to different conclusions about the same student.
42RoI Consistency?
- Last minus First (Iris Center) and Linear Regression (Shinn, etc.) are the only quantitative methods discussed in the CBM literature.
- Study of 37 at-risk 2nd graders:

Difference in RoI between LmF and LR methods:
Whole Year | 0.26 WCPM
Fall | 0.31 WCPM
Spring | 0.24 WCPM
(McCrea, 2010, unpublished data)
43Technical Adequacy
- Without a consensus on how to compute RoI, we
risk falling short of having technical adequacy
within our model.
44So, Which RoI Method is Best?
45Literature shows that Linear Regression is Best
Practice
- "Students' daily test scores...were entered into a computer program...The data analysis program generated slopes of improvement for each level using an Ordinary-Least-Squares procedure (Hayes, 1973) and the line of best fit."
- "This procedure has been demonstrated to represent CBM achievement data validly within individual treatment phases (Marston, 1988; Shinn, Good, & Stein, in press; Stein, 1987)."
- (Shinn, Gleason, & Tindal, 1989)
46Growth (RoI) Research using Linear Regression
- Christ, T. J. (2006). Short-term estimates of growth using curriculum-based measurement of oral reading fluency: Estimating standard error of the slope to construct confidence intervals. School Psychology Review, 35, 128-133.
- Deno, S. L., Fuchs, L. S., Marston, D., & Shin, J. (2001). Using curriculum-based measurement to establish growth standards for students with learning disabilities. School Psychology Review, 30, 507-524.
- Good, R. H. (1990). Forecasting accuracy of slope estimates for reading curriculum-based measurement: Empirical evidence. Behavioral Assessment, 12, 179-193.
- Fuchs, L. S., Fuchs, D., Hamlett, C. L., Walz, L., & Germann, G. (1993). Formative evaluation of academic progress: How much growth can we expect? School Psychology Review, 22, 27-48.
47Growth (RoI) Research using Linear Regression
- Jenkins, J. R., Graff, J. J., & Miglioretti, D. L. (2009). Estimating reading growth using intermittent CBM progress monitoring. Exceptional Children, 75, 151-163.
- Shinn, M. R., Gleason, M. M., & Tindal, G. (1989). Varying the difficulty of testing materials: Implications for curriculum-based measurement. The Journal of Special Education, 23, 223-233.
- Shinn, M. R., Good, R. H., & Stein, S. (1989). Summarizing trend in student achievement: A comparison of methods. School Psychology Review, 18, 356-370.
48So, Why Are There So Many Other RoI Models?
- Ease of application
- Focus on a Yes/No answer to goal acquisition, not the degree of growth
- How many of us want to calculate OLS linear regression formulas by hand (or even remember how)?
49Pros and Cons of Each Approach
Method | Pros | Cons
Eye Ball | Easy, understandable | Subjective
Split Middle & Tukey | No software needed; compare to aim/goal line | Yes/No to goal acquisition; no statistic provided, no idea of the degree of growth
50Pros and Cons of Each Approach
Method | Pros | Cons
Last minus First | Provides a growth statistic; easy to compute | Does not consider all data points, only two
Split Middle & Tukey Plus | Considers all data points; easy to compute | No support for the "plus" part of the methodology
Linear Regression | All data points; best practice | Calculating the statistic
51An Easy and Applicable Solution
52Get Out Your Laptops!
I love ROI
53Graphing RoI for Individual Students
- Programming Microsoft Excel to graph rate of improvement
- Fall to winter
54Setting Up Your Spreadsheet
- In cell A1, type "3rd Grade ORF"
- In cell A2, type "First Semester"
- In cell A3, type "School Week"
- In cell A4, type "Benchmark"
- In cell A5, type the student's name ("Swiper Example")
55Labeling School Weeks
- Starting with cell B3, type the numbers 1 through 18 going across row 3 (horizontal).
- Numbers 1 through 18 represent the number of the school week.
- You will end with week 18 in cell S3.
56Labeling Dates
- Note: You may choose to enter the date of each school week across row 2 to easily identify the school week.
57Entering Benchmarks(3rd Grade ORF)
- In cell B4, type 77. This is your fall benchmark.
- In cell S4, type 92. This is your winter benchmark.
58Entering Student Data (Sample)
- Enter the following numbers, going across row 5, under the corresponding week numbers.
- Week 1: 41
- Week 8: 62
- Week 9: 63
- Week 10: 75
- Week 11: 64
- Week 12: 80
- Week 13: 83
- Week 14: 83
- Week 15: 56
- Week 17: 104
- Week 18: 74
59CAUTION
- If a student was not assessed during a certain week, leave that cell blank.
- Do not enter a score of zero (0); it will be calculated into the trendline and interpreted as the student having read zero words correct per minute during that week. (See the sketch below.)
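A sketch of why this caution matters, in plain Python with hypothetical scores: missed weeks entered as None (the equivalent of a blank cell) are dropped before fitting, while zeros are treated as real scores and can even flip the trend. The ols_slope() helper is ours, not a library function.

```python
# Why blanks, not zeros: missed weeks entered as None (a blank cell) are
# dropped before fitting; zeros are treated as real scores of 0 wcpm.

def ols_slope(pairs):
    pts = [(x, y) for x, y in pairs if y is not None]  # skip missed weeks
    n = len(pts)
    mx = sum(x for x, _ in pts) / n
    my = sum(y for _, y in pts) / n
    return (sum((x - mx) * (y - my) for x, y in pts)
            / sum((x - mx) ** 2 for x, _ in pts))

weeks = range(1, 8)
scores = [41, 44, 48, None, 52, None, 58]    # two missed weeks

print(ols_slope(zip(weeks, scores)))         # ~2.77 wcpm/week
zeroed = [0 if s is None else s for s in scores]
print(ols_slope(zip(weeks, zeroed)))         # ~-1.18: growth now looks negative
```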
60Graphing the Data
- Highlight cells A4 and A5 through S4 and S5
- Follow Excel 2003 or Excel 2007 directions from
here
61Graphing the Data
- Excel 2003
- Across the top of your worksheet, click on Insert
- In that drop-down menu, click on Chart
- Excel 2007
- Click Insert
- Find the icon for Line
- Click the arrow below Line
62Graphing the Data
- Excel 2003
- A Chart Wizard window will appear
- Excel 2007
- 6 graphics appear
63Graphing the Data
- Excel 2003
- Choose Line
- Choose Line with markers
- Excel 2007
- Choose Line with markers
64Graphing the Data
- Excel 2003
- Data Range tab
- Columns
- Excel 2007
- Your graph appears
65Graphing the Data
- Excel 2003
- Chart title
- X axis: School Week
- Y axis: WPM
- Excel 2007
- Change your labels by right clicking on the graph
66Graphing the Data
- Excel 2003
- Choose where you want your graph
- Excel 2007
- Your graph was automatically put into your data
spreadsheet
67Graphing the Trendline
- Excel 2003
- Right click on any of the student data points
68Graphing the Trendline
69Graphing the Trendline
- Excel 2003
- Choose Custom and check the box next to "Display equation on chart"
70Graphing the Trendline
- Clicking on the equation highlights a box around it.
- Clicking on the box allows you to move it to a place where you can see it better.
71Graphing the Trendline
- You can repeat the same procedure to add a trendline for the benchmark data points.
- Suggestion: label that trendline "Expected RoI".
- Move this equation under the first.
72Individual Student Graph
73Individual Student Graph
- The equation indicates the slope, or rate of improvement.
- The number (coefficient) before "x" is the average improvement: in this case, the average number of words per minute per week gained by the student.
74Individual Student Graph
- The rate of improvement, or trendline, is calculated using linear regression, a simple least-squares equation.
- To add more progress monitoring/benchmark scores once you've already created a graph, enter the additional scores in row 5 under the corresponding school week.
75Individual Student Graph
- The slope can change depending on which week (where) you put the benchmark scores on your chart.
- Enter benchmark scores based on when your school administers its benchmark assessments for the most accurate depiction of expected student progress.
76Assuming Linear Growth
Why Graph only 18 Weeks at a Time?
- Finding curvilinear growth
77Non-Educational Example of Curvilinear Growth
78Academic Example of Curvilinear Growth
79McCrea, 2010
- Looked at rate of improvement in a small 2nd grade sample
- Found differences in RoI when computed separately for fall and spring
- Average RoI for fall: 1.47 WCPM
- Average RoI for spring: 1.21 WCPM
80Ardoin & Christ, 2008
- Slope for benchmarks (3x per year)
- More growth from fall to winter than winter to
spring
81Christ, Yeo, & Silberglitt, in press
- Growth across benchmarks (3x per year)
- More growth from fall to winter than winter to spring
- Disaggregated the special education population
82Graney, Missall, & Martinez, 2009
- Growth across benchmarks (3x per year)
- More growth from winter to spring than fall to winter with R-CBM
83Fien, Park, Smith, & Baker, 2010
- Investigated the relationship between NWF gains and ORF/comprehension
- Found greater NWF gains in fall than in spring
84DIBELS (6th ed.) ORF: Change in Benchmark Criteria
Grade | Fall to Winter | Winter to Spring
2nd | 24 | 22
3rd | 15 | 18
4th | 13 | 13
5th | 11 | 9
6th | 11 | 5
85AIMSweb Norms
Grade (50th percentile) | Fall to Winter | Winter to Spring
1st | 18 | 31
2nd | 25 | 17
3rd | 22 | 15
4th | 16 | 13
5th | 17 | 15
6th | 13 | 12
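One hedged way to use tables like these: divide a season's change in the benchmark criterion by the instructional weeks in that season to get an expected weekly RoI. The 18-weeks-per-semester figure below is our assumption, not something these norms specify.

```python
# Convert each season's change in benchmark criterion into an expected
# weekly RoI (WEEKS_PER_SEMESTER = 18 is an assumption).
WEEKS_PER_SEMESTER = 18

aimsweb_gains = {  # (fall-to-winter, winter-to-spring) gains from the table
    "1st": (18, 31), "2nd": (25, 17), "3rd": (22, 15),
    "4th": (16, 13), "5th": (17, 15), "6th": (13, 12),
}

for grade, (fw, ws) in aimsweb_gains.items():
    print(f"{grade}: fall-winter {fw / WEEKS_PER_SEMESTER:.2f} wcpm/week, "
          f"winter-spring {ws / WEEKS_PER_SEMESTER:.2f} wcpm/week")
```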
86Speculation as to Why RoI Differs Within the Year
- Relaxed instruction after high-stakes testing in March/April: a "PSSA effect."
- Depressed BOY benchmark scores due to summer break: a "rebound effect" (Clemens).
- Instructional variables could explain the differences between the Graney (2009) and the Ardoin (2008) & Christ (in press) results (Silberglitt).
- Variability within progress monitoring probes (Ardoin & Christ, 2008) (Lent).
87Programming Excel
- Calculating needed RoI
- Calculating actual (expected) RoI: benchmark
- Calculating actual RoI: student
88Calculating Needed RoI
- In cell T3, type "Needed RoI"
- Click on cell T5
- In the fx line (at the top of the sheet), type this formula: =(S4-B5)/18
- Then hit Enter
- With the sample data, your result should be about 2.83
- This formula subtracts the student's first (beginning-of-year) score from the expected middle-of-year (MOY) benchmark, then divides by 18, the number of weeks in the first semester.
89Calculating Actual (Expected) RoI - Benchmark
- In cell U3, type "Actual RoI"
- Click on cell U4
- In the fx line (at the top of the sheet), type this formula: =SLOPE(B4:S4,B3:S3)
- Then hit Enter
- With the sample benchmarks (77 in week 1, 92 in week 18), your result should be about 0.88
- This formula considers the 18 weeks of benchmark data and provides an average growth, or change, per week.
90Calculating Actual RoI - Student
- Click on cell U5
- In the fx line (at the top of the sheet), type this formula: =SLOPE(B5:S5,B3:S3)
- Then hit Enter
- With the sample student data, your result should be about 2.51 (SLOPE ignores the blank weeks)
- This formula considers the 18 weeks of student data and provides an average growth, or change, per week. (See the sketch below.)
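For readers who prefer code to cell references, a rough Python equivalent of the three spreadsheet formulas above, using the sample data from slides 54-58. The slope() helper is ours; like Excel's SLOPE, it skips blank (None) weeks.

```python
# Python equivalents of =(S4-B5)/18, =SLOPE(B4:S4,B3:S3), =SLOPE(B5:S5,B3:S3).

weeks = list(range(1, 19))
benchmark = [77] + [None] * 16 + [92]                      # cells B4:S4
student = [41, None, None, None, None, None, None,
           62, 63, 75, 64, 80, 83, 83, 56, None, 104, 74]  # cells B5:S5

def slope(ys, xs):
    pts = [(x, y) for x, y in zip(xs, ys) if y is not None]
    n = len(pts)
    mx = sum(x for x, _ in pts) / n
    my = sum(y for _, y in pts) / n
    return (sum((x - mx) * (y - my) for x, y in pts)
            / sum((x - mx) ** 2 for x, _ in pts))

needed = (benchmark[-1] - student[0]) / 18                 # =(S4-B5)/18
print(f"Needed RoI:   {needed:.2f}")                       # ~2.83
print(f"Expected RoI: {slope(benchmark, weeks):.2f}")      # ~0.88
print(f"Actual RoI:   {slope(student, weeks):.2f}")        # ~2.51
```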
91RoI as a Decision Tool
- within a Problem-Solving Model
92Steps
- Gather the data
- Ground the data & set goals
- Interpret the data
- Figure out how to fit Best Practice into Public
Education
93Step 1: Gather Data
- Universal Screening
- Progress Monitoring
94Common Screenings in PA
- DIBELS
- AIMSweb
- MBSP
- 4Sight
- PSSA
95Validated Progress Monitoring Tools
- DIBELS
- AIMSweb
- MBSP
- www.studentprogress.org
96Step 2: Ground the Data
- 1) To what will we compare our student growth data?
- 2) How will we set goals?
97Multiple Ways to Look at Growth
- Needed growth
- Expected growth & percent of expected growth
- Fuchs et al. (1993) table of realistic and ambitious growth
- Growth toward an individual goal
- Best Practices in Setting Progress Monitoring Goals for Academic Skill Improvement (Shapiro, 2008)
98Needed Growth
- Difference between the student's BOY (or MOY) score and the benchmark score at MOY (or EOY).
- Example: MOY ORF = 10, EOY benchmark = 40, 18 weeks of instruction: (40 - 10) / 18 = 1.67. The student must gain 1.67 wcpm per week to make the EOY benchmark.
99Expected Growth
- Difference between two benchmarks.
- Example: MOY benchmark = 20, EOY benchmark = 40, 18 weeks of instruction: expected growth = (40 - 20) / 18 = 1.11 wcpm per week.
100Looking at Percent of Expected Growth
Percent of Expected Growth | Tier I | Tier II | Tier III
Greater than 150% | | |
Between 110% & 150% | | | Possible LD
Between 95% & 110% | | | Likely LD
Between 80% & 95% | May Need More | May Need More | Likely LD
Below 80% | Needs More | Needs More | Likely LD
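A sketch of the decision table above as code. The cut points and Tier III labels come from the table; percent_of_expected(), the labels for the table's blank Tier I/II cells, and the function names are our additions.

```python
# Bucket a student's percent of expected growth per the table above.

def percent_of_expected(actual_roi, expected_roi):
    """Student RoI as a percentage of the expected (benchmark) RoI."""
    return 100 * actual_roi / expected_roi

def interpret(pct, tier):
    if pct > 150:
        return "responding well"
    if pct > 110:
        return "possible LD" if tier == 3 else "responding"
    if pct > 95:
        return "likely LD" if tier == 3 else "responding"
    if pct > 80:
        return "likely LD" if tier == 3 else "may need more"
    return "likely LD" if tier == 3 else "needs more"

pct = percent_of_expected(actual_roi=0.90, expected_roi=1.11)  # ~81%
print(interpret(pct, tier=2))  # -> "may need more"
```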
101Oral Reading Fluency Adequate Response Table
Grade | Realistic Growth | Ambitious Growth
1st 2.0 3.0
2nd 1.5 2.0
3rd 1.0 1.5
4th 0.9 1.1
5th 0.5 0.8
102Digit Fluency Adequate Response Table
Grade | Realistic Growth | Ambitious Growth
1st 0.3 0.5
2nd 0.3 0.5
3rd 0.3 0.5
4th 0.75 1.2
5th 0.75 1.2
103From Where Should Benchmarks/Criteria Come?
- Appears to be a theoretical convergence on use of
local criteria (what scores do our students need
to have a high probability of proficiency?) when
possible.
104Test Globally, Benchmark Locally
105Objectives
- Rationale for developing Local Benchmarks
- Fun with Excel!
- Fun with Algebra!
- Local Benchmarks in Action
106Rationale for Developing Local Benchmarks
- Stage & Jacobsen (2001)
- Slope in oral reading fluency reliably predicted performance on the Washington Assessment of Student Learning.
- McGlinchey & Hixson (2004)
- Results support the use of CBM for determining which students are at risk for reading failure and who will fail state tests.
- Hintze & Silberglitt (2005)
- Oral reading fluency is highly connected to state test performance and is accurate at predicting those students who are unlikely to meet proficiency.
- Shapiro et al. (2006)
- Results of this study show that CBM can be a valuable source for identifying which students are likely to succeed or fail on state tests.
- Ask Jason Pedersen!
107Rationale for Developing Local Benchmarks
- Identifying and validating problems
- Creating ideas for instructional grouping, focus, or intensity
- Goal setting
- Determining the focus and frequency of progress monitoring
- Exiting students or moving students to different levels or tiers of intervention
- Systems-level resource allocation and evaluation
108Rationale for Developing Local Benchmarks
- Silberglitt (2008):
- "Districts should refrain from simply adopting a set of national target scores, as these scores may or may not be relevant to the high-stakes outcomes for which their students must be adequately prepared." (p. 1871)
- "By linking local assessments to high-stakes tests, users are able to establish target scores on these local assessments, scores that divide students between those who are likely and those who are unlikely to achieve success on the high-stakes test." (p. 1870)
109Rationale for Developing Local Benchmarks
- Discrepancy across states in the percentile ranks on a nationally administered assessment necessary to predict successful state test performance (Kingsbury et al., 2004)
- "Using cut scores based on the probability of success on an upcoming state-mandated assessment might be a useful alternative to normative data for making these decisions." (Silberglitt & Hintze, 2005)
- Can be used to separate students into groups in an RtII framework (Silberglitt, 2008)
110Rationale for Developing Local Benchmarks
- Useful in calculating discrepancy in level (Burns, 2008)
- Represent the school population where the students are getting their education (Stewart & Silberglitt, 2008)
- Teachers often use comparisons between students in their classroom; local benchmarks help to objectify those decisions (Stewart & Silberglitt, 2008)
111Rationale for Developing Local Benchmarks
- How accurately does it predict proficiency level
in Third Grade?
112Rationale for Developing Local Benchmarks
- Percentage of students in third grade predicted to be successful on the PSSA who were actually successful
113Rationale for Developing Local Benchmarks
- Percentage of third grade students predicted to be unsuccessful who actually failed to meet proficiency on the PSSA
114Getting Started
- Collect 3 or more years of student CBM and PSSA data.
- Match the data for each student.
- Use the data extract and data farming features offered through the PSSA / DIBELS / AIMSweb websites.
- Download with student ID numbers.
- If you have a data warehouse...then use your special magic...lucky!
115Getting Started
- Reliable and valid data
- Linear / highly correlated data
- Gather data with integrity
- Do not teach to the test
- All students should be included in the norm group
- Be cautious of cohort effects
116Getting Started
- PSSA Cut Scores
- http://www.portal.state.pa.us/portal/server.pt/community/cut_scores/7441
- Use the lower-end score for Proficient.
- Download the data set from
- http://sites.google.com/site/rateofimprovement/
117Wisdom from Teachers (especially from our reading specialists, Tina and Kristin!)
- Children do not equal dots!
- They are not numbers or data points!
- Having said that...
118Fun with Excel!
119Fun with Algebra!
- Matt Burns, University of Minnesota
- X = (Y - a) / b
- Y = proficiency score on the PSSA
- a = intercept
- b = slope
- X = local benchmark score
122More Fun with Algebra!
- Predict a student's proficiency score
- Re-solve the equation:
- X = (Y - a) / b
- Y = (X × b) + a
- Y = predicted PSSA score
- Use with caution!
- Student: 93 wcpm in the fall
- Data sample: slope = 2.56, intercept = 1108
- Y = (93 × 2.56) + 1108 ≈ 1346
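A worked sketch of this algebra with made-up matched ORF/PSSA pairs, so the fitted slope and intercept will not equal the 2.56 and 1108 above: fit the line, solve X = (Y - a)/b for the local benchmark, then re-solve for a predicted score.

```python
# Fit PSSA score on fall ORF, then apply Burns' algebra both directions.
# The data pairs are hypothetical; real use needs years of matched scores.

orf = [45, 60, 72, 85, 93, 110, 125, 140]                 # fall wcpm
pssa = [1120, 1190, 1230, 1280, 1330, 1390, 1450, 1500]   # matched scale scores

n = len(orf)
mx, my = sum(orf) / n, sum(pssa) / n
b = (sum((x - mx) * (y - my) for x, y in zip(orf, pssa))
     / sum((x - mx) ** 2 for x in orf))                   # slope
a = my - b * mx                                           # intercept

PROFICIENT = 1235  # lower bound of Proficient (see slide 116)
benchmark = (PROFICIENT - a) / b   # X = (Y - a) / b      -> ~72 wcpm here
predicted = b * 93 + a             # Y = (X * b) + a      -> ~1318 here

print(f"slope = {b:.2f}, intercept = {a:.0f}")
print(f"local benchmark: {benchmark:.0f} wcpm")
print(f"predicted PSSA score at 93 wcpm: {predicted:.0f}")
```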
123Local Benchmark Applications
- Northern Lebanon School District Local Benchmarks
124Local Benchmark Applications
- For those who like the DIBELS graphs
125Local Benchmark Applications
126Diagnostic Accuracy
- Sensitivity
- Of all the students who failed the PSSA, what percentage were accurately predicted to fail based on their ORF score?
- Specificity
- Of all the students who passed the PSSA, what percentage were accurately predicted to pass based on their ORF score?
- Negative Predictive Power
- Percentage of students predicted to be successful on the PSSA who were actually successful
- Positive Predictive Power
- Percentage of students predicted to be unsuccessful who actually failed to meet proficiency on the PSSA
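The same four indices as code, computed from a hypothetical 2x2 table of predictions (ORF below the cut score = predicted to fail) against actual PSSA outcomes. The counts are made up for illustration.

```python
# Diagnostic accuracy of a local benchmark cut score (hypothetical counts).

predicted_fail_did_fail = 40   # true positives
predicted_fail_did_pass = 15   # false positives
predicted_pass_did_fail = 10   # false negatives
predicted_pass_did_pass = 135  # true negatives

sensitivity = predicted_fail_did_fail / (predicted_fail_did_fail + predicted_pass_did_fail)
specificity = predicted_pass_did_pass / (predicted_pass_did_pass + predicted_fail_did_pass)
ppp = predicted_fail_did_fail / (predicted_fail_did_fail + predicted_fail_did_pass)
npp = predicted_pass_did_pass / (predicted_pass_did_pass + predicted_pass_did_fail)

print(f"sensitivity {sensitivity:.0%}, specificity {specificity:.0%}")
print(f"positive predictive power {ppp:.0%}, negative predictive power {npp:.0%}")
# -> sensitivity 80%, specificity 90%, PPP 73%, NPP 93%
```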
129Local Benchmarks - Method 2
- Fun with SPSS!
- Logistic regression & ROC curves
- More accurate
- Helps to balance sensitivity, specificity, and negative & positive predictive power
- For more information, see Best Practices in Using Technology for Data-Based Decision Making (Silberglitt, 2008)
130If Local Criteria are Not an Option
- Use the norms that accompany the measure (DIBELS, AIMSweb, etc.).
- Use national norms.
131Making Decisions Best Practice
- Research has yet to establish a blueprint for grounding student RoI data.
- At this point, teams should consider multiple comparisons when planning and making decisions.
132Making Decisions Lessons From the Field
- When tracking on grade level, consider an RoI that is 100% of expected growth as a minimum requirement; consider an RoI at or above the needed RoI as optimal.
- So, 100% of expected and on par with needed become the limits of the range within which a student should be achieving. (See the sketch below.)
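A tiny sketch of this rule of thumb; the function name and labels are ours.

```python
# Classify a grade-level RoI against the expected (minimum) and needed
# (optimal) rates, per the range described above.

def growth_status(actual, expected, needed):
    if actual >= needed:
        return "optimal: at or above the needed RoI"
    if actual >= expected:
        return "acceptable: at least 100% of expected RoI"
    return "inadequate: below 100% of expected RoI"

# e.g., expected 1.11 wcpm/week, needed 1.67, student growing at 1.30
print(growth_status(actual=1.30, expected=1.11, needed=1.67))
```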
133Is there an easy way to do all of this?
136Access to Spreadsheet Templates
- http://sites.google.com/site/rateofimprovement/home
- Click on "Charts and Graphs."
- Update dates and benchmarks.
- Enter names and benchmark/progress monitoring data.
137What about Students not on Grade Level?
138Determining Instructional Level
- Independent / Instructional / Frustrational
- Instructional level: often between the 40th or 50th percentile and the 25th percentile.
- Frustrational level: below the 25th percentile.
- AIMSweb Survey Level Assessment (SLA).
139Setting Goals off of Grade Level
- 100% of expected growth is not enough.
- Needed growth only gets the student to the instructional-level benchmark, not to grade level.
- Risk of not being ambitious enough.
- Plenty of ideas, but limited research regarding best practice in goal setting off of grade level.
140Possible Solution (A)
- Weekly probe at the instructional level; compare to expected and needed growth rates at the instructional level.
- Ambitious goal: 200% of expected RoI.
142Possible Solution (B)
- Weekly probe at the instructional level for a sensitive indicator of growth.
- Monthly probes (give 3, not just 1) at grade level to compute RoI.
- Goal based on grade-level growth (more than 100% of expected).
143Step 3: Interpreting Growth
144What do we do when we do not get the growth we
want?
- When to make a change in instruction and intervention?
- When to consider SLD?
145When to make a change in instruction and
intervention?
- Enough data points (6 to 10)?
- Less than 100% of expected growth.
- Not on track to make the benchmark (needed growth).
- Not on track to reach the individual goal.
146When to consider SLD?
- Continued inadequate response despite:
- Fidelity with Tier I instruction and Tier II/III intervention
- Multiple attempts at intervention
- An individualized problem-solving approach
- Evidence of dual discrepancy
148Three Levels of Examples
- Whole Class
- Small Group
- Individual Student
- Academic Data
- Behavior Data
149Whole Class Example
1503rd Grade Math Whole Class
- Who's responding?
- Effective math instruction?
- Who needs more?
- N = 19
- 4 students > 100% growth
- 15 students < 100% growth
- 9 with negative growth
151Small Group Example
152Intervention Group
- Is the intervention working, and for how many?
- Can we assume fidelity of intervention based on the results?
- Who needs more?
153Individual Kid Example
154Individual Kid
- Making growth?
- How much? (65% of expected growth)
- Atypical growth across the year (last 3 data points)
- Continue? Make a change? Need more data?
155RoI and Behavior?
157Step 4: Figure Out How to Fit Best Practice into Public Education
158Things to Consider
- Who is at risk and needs progress monitoring?
- Who will collect, score, and enter the data?
- Who will monitor student growth, when, and how often?
- What changes should be made to instruction & intervention?
- What about monitoring off of grade level?
159Who is At-Risk and needs progress monitoring?
- Below level on universal screening

Entering 4th Grade Example
Student | DORF (110) | ISIP TRWM (55) | 4Sight (1235) | PSSA (1235)
Student A | 115 | 58 | 1255 | 1232
Student B | 85 | 48 | 1216 | 1126
Student C | 72 | 35 | 1056 | 1048
160Who will collect, score, and enter the data?
- Using MBSP for math, teachers can administer probes to the whole class.
- DORF probes must be administered one-on-one, and creativity pays off (train and use art, music, library, etc. specialists).
- Schedule progress monitoring for math and reading every other week.
161Sample Biweekly Schedule
Grade | Week 1: Reading | Week 1: Math | Week 2: Reading | Week 2: Math
1st X X
2nd X X
3rd X X
4th X X
5th X X
162Who will monitor student growth, when, and how
often?
- Best Practices in Data-Analysis Teaming (Kovaleski & Pedersen, 2008)
- Chambersburg Area School District Elementary Response to Intervention Manual (McCrea et al., 2008)
- Derry Township School District Response to Intervention Model (http://www.hershey.k12.pa.us/56039310111408/lib/56039310111408/_files/Microsoft_Word_-_Response_to_Intervention_Overview_of_Hershey_Elementary_Model.pdf)
163What changes should be made to instruction & intervention?
- Ensure treatment fidelity!
- Increase instructional time (active and engaged)
- Decrease group size
- Gather additional, diagnostic information
- Change the intervention
164Final Exam
- Student data: 27, 29, 26, 34, 27, 32, 39, 45, 43, 49, 51, --, --, 56, 51, 52, --, 57.
- Benchmark data: BOY = 40, MOY = 68.
- What is the student's RoI?
- How does the RoI compare to the expected and needed RoIs?
- What steps would your team take next?
- What if the benchmarks were 68 and 90 instead?
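One way to check your work (a sketch, not an official answer key): fit OLS to the scores above, treating the dashes as missed weeks, and compare against the expected and needed rates computed the way the earlier slides describe.

```python
# Work the exam: OLS slope of the 18-week data (None = missed week),
# plus expected and needed RoI from the BOY/MOY benchmarks.

scores = [27, 29, 26, 34, 27, 32, 39, 45, 43, 49, 51,
          None, None, 56, 51, 52, None, 57]
weeks = range(1, 19)
pts = [(x, y) for x, y in zip(weeks, scores) if y is not None]

n = len(pts)
mx = sum(x for x, _ in pts) / n
my = sum(y for _, y in pts) / n
actual = (sum((x - mx) * (y - my) for x, y in pts)
          / sum((x - mx) ** 2 for x, _ in pts))

expected = (68 - 40) / 18        # benchmark-to-benchmark growth
needed = (68 - scores[0]) / 18   # MOY benchmark minus first score

print(f"actual {actual:.2f}, expected {expected:.2f}, needed {needed:.2f}")
# roughly: actual 1.98, expected 1.56, needed 2.28
```

With these sample scores, the student's growth lands above the expected rate but below the needed rate, which is exactly the kind of gray area the team questions above are meant to resolve.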
165Questions? Comments!
166The RoI Web Site
- http://sites.google.com/site/rateofimprovement/
- Download PowerPoints, handouts, Excel graphs, charts, articles, etc.
- Caitlin Flinn: CaitlinFlinn@hotmail.com
- Andy McCrea: andymccrea70@gmail.com
- Matt Ferchalk: mferchalk@norleb.k12.pa.us
167Resources
- www.interventioncentral.com
- www.aimsweb.com
- http://dibels.uoregon.edu
- www.nasponline.org
168Resources
- www.fcrr.org
- Florida Center for Reading Research
- http://ies.ed.gov/ncee/wwc/
- What Works Clearinghouse
- http://www.rti4success.org
- National Center on RtI
169References
- Ardoin, S. P., & Christ, T. J. (2009). Curriculum-based measurement of oral reading: Standard errors associated with progress monitoring outcomes from DIBELS, AIMSweb, and an experimental passage set. School Psychology Review, 38(2), 266-283.
- Ardoin, S. P., & Christ, T. J. (2008). Evaluating curriculum-based measurement slope estimates using triannual universal screenings. School Psychology Review, 37(1), 109-125.
170References
- Christ, T. J. (2006). Short-term estimates of growth using curriculum-based measurement of oral reading fluency: Estimating standard error of the slope to construct confidence intervals. School Psychology Review, 35(1), 128-133.
- Deno, S. L. (1985). Curriculum-based measurement: The emerging alternative. Exceptional Children, 52, 219-232.
171References
- Deno, S. L., Fuchs, L. S., Marston, D., & Shin, J. (2001). Using curriculum-based measurement to establish growth standards for students with learning disabilities. School Psychology Review, 30, 507-524.
- Flinn, C. S. (2008). Graphing rate of improvement for individual students. InSight, 28(3), 10-12.
172References
- Fuchs, L. S., & Fuchs, D. (1998). Treatment validity: A unifying concept for reconceptualizing the identification of learning disabilities. Learning Disabilities Research and Practice, 13, 204-219.
- Fuchs, L. S., Fuchs, D., Hamlett, C. L., Walz, L., & Germann, G. (1993). Formative evaluation of academic progress: How much growth can we expect? School Psychology Review, 22, 27-48.
173References
- Gall, M. D., & Gall, J. P. (2007). Educational research: An introduction (8th ed.). New York: Pearson.
- Jenkins, J. R., Graff, J. J., & Miglioretti, D. L. (2009). Estimating reading growth using intermittent CBM progress monitoring. Exceptional Children, 75, 151-163.
174References
- Karwowski, W. (2006). International encyclopedia of ergonomics and human factors. Boca Raton, FL: Taylor & Francis Group, LLC.
- Shapiro, E. S. (2008). Best practices in setting progress monitoring goals for academic skill improvement. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (Vol. 2, pp. 141-157). Bethesda, MD: National Association of School Psychologists.
175References
- Vogel, D. R., Dickson, G. W., & Lehman, J. A. (1990). Persuasion and the role of visual presentation support: The UM/3M study. In M. Antonoff (Ed.), Presentations that persuade. Personal Computing, 14.
176References
- Burns, M. (2008, October). Data-based problem analysis and interventions within RTI: Isn't that what school psychology is all about? Paper presented at the Association of School Psychologists of Pennsylvania Annual Conference, State College, PA.
- Ferchalk, M. R., Richardson, F., & Cogan-Ferchalk, J. R. (2010, October). Using oral reading fluency data to create an accurate prediction model for PSSA performance. Poster session presented at the Association of School Psychologists of Pennsylvania Annual Conference, State College, PA.
- Hintze, J., & Silberglitt, B. (2005). A longitudinal examination of the diagnostic accuracy and predictive validity of R-CBM and high-stakes testing. School Psychology Review, 34(3), 372-386.
- McGlinchey, M., & Hixson, M. (2004). Using curriculum-based measurement to predict performance on state assessments in reading. School Psychology Review, 33(2), 193-203.
- Shapiro, E., Keller, M., Lutz, J., Santoro, L., & Hintze, J. (2006). Curriculum-based measures and performance on state assessment and standardized tests: Reading and math performance in Pennsylvania. Journal of Psychoeducational Assessment, 24(1), 19-35.
177References
- Silberglitt, B. (2008). Best practices in using technology for data-based decision making. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V. Bethesda, MD: National Association of School Psychologists.
- Silberglitt, B., Burns, M., Madyun, N., & Lail, K. (2006). Relationship of reading fluency assessment data with state accountability test scores: A longitudinal comparison of grade levels. Psychology in the Schools, 43(5), 527-535.
- Stage, S., & Jacobsen, M. (2001). Predicting student success on a state-mandated performance-based assessment using oral reading fluency. School Psychology Review, 30(3), 407.
- Stewart, L. H., & Silberglitt, B. (2008). Best practices in developing academic local norms. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V. Bethesda, MD: National Association of School Psychologists.