Title: RTI: How to Collect Data to Understand and Fix Student Academic and Behavioral Problems (Jim Wright, www.interventioncentral.org)
1: RTI: How to Collect Data to Understand and Fix Student Academic and Behavioral Problems (Jim Wright, www.interventioncentral.org)
2: Data Collection: Defining Terms
Evaluation: "the process of using information collected through assessment to make decisions or reach conclusions" (Hosp, 2008, p. 364). Example: A student can be evaluated for problems in fluency with text by collecting information from various sources (e.g., CBM ORF, teacher interview, direct observations of the student reading across settings), comparing those results to peer norms or curriculum expectations, and making a decision about whether the student's current performance is acceptable.
Assessment: "the process of collecting information about the characteristics of persons or objects by measuring them" (Hosp, 2008, p. 364). Example: The construct "fluency with text" can be assessed using various measurements, including CBM ORF, teacher interview, and direct observations of the student reading in different settings and in different material.
Measurement: "the process of applying numbers to the characteristics of objects or people in a systematic way" (Hosp, 2008, p. 364). Example: Curriculum-Based Measurement Oral Reading Fluency (CBM ORF) is one method to measure the construct "fluency with text."
3: Use Time & Resources Efficiently by Collecting Information Only on Things That Are Alterable
- "Time should be spent thinking about things that the intervention team can influence through instruction, consultation, related services, or adjustments to the student's program. These are things that are alterable... Beware of statements about cognitive processes that shift the focus from the curriculum and may even encourage questionable educational practice. They can also promote writing off a student because of the rationale that the student's insufficient performance is due to a limited and fixed potential." (p. 359)
Source: Howell, K. W., Hosp, J. L., & Kurns, S. (2008). Best practices in curriculum-based evaluation. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 349-362). Bethesda, MD: National Association of School Psychologists.
4: Formal Tests: Only One Source of Student Assessment Information
- "Tests are often overused and misunderstood in and out of the field of school psychology. When necessary, analog (i.e., test) observations can be used to test relevant hypotheses within controlled conditions. Testing is a highly standardized form of observation... The only reason to administer a test is to answer well-specified questions and examine well-specified hypotheses. It is best practice to identify and make explicit the most relevant questions before assessment begins. The process of assessment should follow these questions. The questions should not follow assessment." (p. 170)
Source: Christ, T. (2008). Best practices in problem analysis. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 159-176). Bethesda, MD: National Association of School Psychologists.
5: Relevant Academic Information: Sources and Purpose
- Tier 1: Instructional information. Teachers conduct classroom assessments (both formal and informal). Results are used to make day-to-day decisions about pacing of instruction, to identify students who need additional support, etc.
- Tier 1/Tier 2: Schoolwide screenings. Brief universal screenings are administered to all students at a grade level to measure academic skills that predict future school success. Results reflect on the quality of core instruction and drive recruitment for Tier 2 programs.
- Tier 3: Analytic/diagnostic instructional assessment. Struggling students with more severe needs picked up in screenings may be administered a more detailed assessment (using qualitative and/or quantitative measures) to map out patterns of deficits in basic academic skills. Results are used to create a customized intervention plan that meets that student's unique needs.
6: Making Use of Existing (Extant) Data
7: Universal Screening at Secondary Schools: Using Existing Data Proactively to Flag Signs of Disengagement
- "Across interventions, a key component to promoting school completion is the systematic monitoring of all students for signs of disengagement, such as attendance and behavior problems, failing courses, off track in terms of credits earned toward graduation, problematic or few close relationships with peers and/or teachers, and then following up with those who are at risk."
Source: Jimerson, S., Reschly, A. L., & Hess, R. (2008). Best practices in increasing the likelihood of school completion. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 1085-1097). Bethesda, MD: National Association of School Psychologists. (p. 1090)
8: Extant (Existing) Data (Chafouleas et al., 2007)
- Definition: Information that is collected by schools as a matter of course.
- Extant data comes in two forms:
- Performance summaries (e.g., class grades, teacher summary comments on report cards, state test scores).
- Student work products (e.g., research papers, math homework, PowerPoint presentations).
Source: Chafouleas, S., Riley-Tillman, T. C., & Sugai, G. (2007). School-based behavioral assessment: Informing intervention and instruction. New York: Guilford Press.
9: Advantages of Using Extant Data (Chafouleas et al., 2007)
- The information already exists and is easy to access.
- Students will not show reactive effects during data collection, as the information collected is part of the normal routine of schools.
- Extant data is relevant to school data consumers (such as classroom teachers, administrators, and members of problem-solving teams).
Source: Chafouleas, S., Riley-Tillman, T. C., & Sugai, G. (2007). School-based behavioral assessment: Informing intervention and instruction. New York: Guilford Press.
10: Drawbacks of Using Extant Data (Chafouleas et al., 2007)
- Time is required to collate and summarize the data (e.g., summarizing a week's worth of disciplinary office referrals).
- The data may be limited and not reveal the full dimension of the student's presenting problem(s).
- There is no guarantee that school staff are consistent and accurate in how they collect the data (e.g., grading policies can vary across classrooms; instructors may have differing expectations regarding what types of assignments are given a formal grade; standards may fluctuate across teachers for filling out disciplinary referrals).
- Little research has been done on the psychometric adequacy of extant data sources.
Source: Chafouleas, S., Riley-Tillman, T. C., & Sugai, G. (2007). School-based behavioral assessment: Informing intervention and instruction. New York: Guilford Press.
11: Universal Screening at Secondary Schools: Using Existing Data Proactively to Flag Signs of Disengagement
- "Across interventions, a key component to promoting school completion is the systematic monitoring of all students for signs of disengagement, such as attendance and behavior problems, failing courses, off track in terms of credits earned toward graduation, problematic or few close relationships with peers and/or teachers, and then following up with those who are at risk."
Source: Jimerson, S., Reschly, A. L., & Hess, R. (2008). Best practices in increasing the likelihood of school completion. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 1085-1097). Bethesda, MD: National Association of School Psychologists. (p. 1090)
12: Mining Archival Data: What Are the Early Warning Flags of Student Drop-Out?
- A sample of 13,000 students in Philadelphia was tracked for 8 years. These early warning indicators, observed in the sixth-grade year, were found to predict student drop-out:
- Failure in English
- Failure in math
- Missing at least 20% of school days
- Receiving an unsatisfactory behavior rating from at least one teacher
Source: Balfanz, R., Herzog, L., & MacIver, D. J. (2007). Preventing student disengagement and keeping students on the graduation path in urban middle grades schools: Early identification and effective interventions. Educational Psychologist, 42, 223-235.
13: What Is the Predictive Power of These Early Warning Flags?

Number of Early Warning Flags in Student Record | Probability That Student Would Graduate
None | 56%
1 | 36%
2 | 21%
3 | 13%
4 | 7%
Source: Balfanz, R., Herzog, L., & MacIver, D. J. (2007). Preventing student disengagement and keeping students on the graduation path in urban middle grades schools: Early identification and effective interventions. Educational Psychologist, 42, 223-235.
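The flag counts and graduation probabilities in the table above lend themselves to a simple lookup over existing student records. A minimal sketch, assuming a hypothetical student-record dictionary whose field names are invented for illustration (they are not from the source):

```python
# Hedged sketch: count Balfanz-style early warning flags in a (hypothetical)
# sixth-grade student record and look up the approximate graduation
# probability reported by Balfanz, Herzog, & MacIver (2007).

# Approximate probabilities from the table (number of flags -> % graduating).
GRADUATION_PROB = {0: 56, 1: 36, 2: 21, 3: 13, 4: 7}

def count_flags(record):
    """Count early warning flags; record keys are illustrative assumptions."""
    flags = 0
    if record.get("failed_english"):
        flags += 1
    if record.get("failed_math"):
        flags += 1
    if record.get("pct_days_absent", 0) >= 20:  # missing >= 20% of school days
        flags += 1
    if record.get("unsatisfactory_behavior_rating"):
        flags += 1
    return flags

def graduation_probability(record):
    """Approximate percent probability of graduating, per the table above."""
    return GRADUATION_PROB[count_flags(record)]

student = {"failed_math": True, "pct_days_absent": 25}
print(count_flags(student))             # 2
print(graduation_probability(student))  # 21
```

A school data team could run such a count each marking period against existing grade, attendance, and behavior records to generate a follow-up list, in the spirit of the screening approach described above.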
14: Grades & Other Teacher Performance Summary Data (Chafouleas et al., 2007)
- Teacher test and quiz grades can be useful as a supplemental method for monitoring the impact of student behavioral interventions.
- Other data about student academic performance (e.g., homework completion, homework grades, etc.) can also be tracked and graphed to judge intervention effectiveness.
Source: Chafouleas, S., Riley-Tillman, T. C., & Sugai, G. (2007). School-based behavioral assessment: Informing intervention and instruction. New York: Guilford Press.
15: Marc Ripley (from Chafouleas et al., 2007)
Source: Chafouleas, S., Riley-Tillman, T. C., & Sugai, G. (2007). School-based behavioral assessment: Informing intervention and instruction. New York: Guilford Press.
16: Elbow Group Activity: What Extant/Archival Data Should Your RTI Team Review Regularly?
- Discuss the essential extant/archival data that your RTI Team should review as early warning indicators of students who are struggling (see p. 20 of packet).
- What process should your school adopt to ensure that these data are reviewed regularly (e.g., every five weeks) to guarantee timely identification of students who need intervention assistance?
17: RIOT/ICEL Framework: Organizing Information to Better Identify Student Behavioral & Academic Problems
18: Assessment Data: Reaching the Saturation Point
- "During the process of assessment, a point of saturation is always reached; that is, the point when enough information has been collected to make a good decision, but adding additional information will not improve the decision making. It sounds simple enough, but the tricky part is determining when that point has been reached. Unfortunately, information cannot be measured in pounds, decibels, degrees, or feet, so there is no absolute amount of information or specific criterion for 'enough information.'" (p. 373)
Source: Hosp, J. L. (2008). Best practices in aligning academic assessment with instruction. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 363-376). Bethesda, MD: National Association of School Psychologists.
19: pp. 25-28
20: RIOT/ICEL Framework
- Sources of Information
- Review (of records)
- Interview
- Observation
- Test
- Focus of Assessment
- Instruction
- Curriculum
- Environment
- Learner
21: RIOT/ICEL: Definition
- The RIOT/ICEL matrix is an assessment guide to help schools decide efficiently what relevant information to collect on student academic performance and behavior, and also how to organize that information to identify probable reasons why the student is not experiencing academic or behavioral success.
- The RIOT/ICEL matrix is not itself a data collection instrument. Instead, it is an organizing framework, or heuristic, that increases schools' confidence both in the quality of the data that they collect and in the findings that emerge from those data.
22: RIOT: Sources of Information
- Select Multiple Sources of Information: RIOT (Review, Interview, Observation, Test). The top horizontal row of the RIOT/ICEL table includes four potential sources of student information: Review, Interview, Observation, and Test (RIOT). Schools should attempt to collect information from a range of sources to control for potential bias from any one source.
23: Select Multiple Sources of Information: RIOT (Review, Interview, Observation, Test)
- Review. This category consists of past or present records collected on the student. Obvious examples include report cards, office disciplinary referral data, state test results, and attendance records. Less obvious examples include student work samples, physical products of teacher interventions (e.g., a sticker chart used to reward positive student behaviors), and emails sent by a teacher to a parent detailing concerns about a student's study and organizational skills.
24: Select Multiple Sources of Information: RIOT (Review, Interview, Observation, Test)
- Interview. Interviews can be conducted face-to-face, via telephone, or even through email correspondence. They can be structured (that is, using a predetermined series of questions) or follow an open-ended format, with questions guided by information supplied by the respondent. Interview targets can include the teachers, paraprofessionals, administrators, and support staff in the school setting who have worked with or had interactions with the student in the present or past. Prospective interview candidates can also include parents and other relatives of the student, as well as the student himself or herself.
25: Select Multiple Sources of Information: RIOT (Review, Interview, Observation, Test)
- Observation. Direct observation of the student's academic skills, study and organizational strategies, degree of attentional focus, and general conduct can be a useful channel of information. Observations can be more structured (e.g., tallying the frequency of call-outs or calculating the percentage of on-task intervals during a class period) or less structured (e.g., observing a student and writing a running narrative of the observed events).
26: Select Multiple Sources of Information: RIOT (Review, Interview, Observation, Test)
- Test. Testing can be thought of as a structured and standardized observation of the student that is intended to test certain hypotheses about why the student might be struggling and what school supports would logically benefit the student (Christ, 2008). An example of testing would be administering a math computation CBM probe or an Early Math Fluency probe to a student.
27: Formal Tests: Only One Source of Student Assessment Information
- "Tests are often overused and misunderstood in and out of the field of school psychology. When necessary, analog (i.e., test) observations can be used to test relevant hypotheses within controlled conditions. Testing is a highly standardized form of observation... The only reason to administer a test is to answer well-specified questions and examine well-specified hypotheses. It is best practice to identify and make explicit the most relevant questions before assessment begins. The process of assessment should follow these questions. The questions should not follow assessment." (p. 170)
Source: Christ, T. (2008). Best practices in problem analysis. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 159-176). Bethesda, MD: National Association of School Psychologists.
28: ICEL: Factors Impacting Student Learning
- Investigate Multiple Factors Affecting Student Learning: ICEL (Instruction, Curriculum, Environment, Learner). The leftmost vertical column of the RIOT/ICEL table includes four key domains of learning to be assessed: Instruction, Curriculum, Environment, and Learner (ICEL). A common mistake that schools make is to assume that student learning problems exist primarily in the learner and to underestimate the degree to which teacher instructional strategies, curriculum demands, and environmental influences impact the learner's academic performance. The ICEL elements ensure that a full range of relevant explanations for student problems is examined.
29: Investigate Multiple Factors Affecting Student Learning: ICEL (Instruction, Curriculum, Environment, Learner)
- Instruction. The purpose of investigating the instruction domain is to uncover any instructional practices that either help the student to learn more effectively or interfere with that student's learning. More obvious instructional questions to investigate would be whether specific teaching strategies for activating prior knowledge better prepare the student to master new information, or whether a student benefits optimally from the large-group lecture format that is often used in a classroom. A less obvious example of an instructional question would be whether a particular student learns better through teacher-delivered or self-directed, computer-administered instruction.
30: Investigate Multiple Factors Affecting Student Learning: ICEL (Instruction, Curriculum, Environment, Learner)
- Curriculum. Curriculum represents the full set of academic skills that a student is expected to have mastered in a specific academic area at a given point in time. To adequately evaluate a student's acquisition of academic skills, of course, the educator must (1) know the school's curriculum (and related state academic performance standards), (2) be able to inventory the specific academic skills that the student currently possesses, and then (3) identify gaps between curriculum expectations and actual student skills. (This process of uncovering student academic skill gaps is sometimes referred to as instructional or analytic assessment.)
31: Investigate Multiple Factors Affecting Student Learning: ICEL (Instruction, Curriculum, Environment, Learner)
- Environment. The environment includes any factors in the student's school, community, or home surroundings that can directly enable academic success or hinder it. Obvious questions about environmental factors that impact learning include whether a student's educational performance is better or worse in the presence of certain peers, and whether having additional adult supervision during a study hall results in higher student work productivity. Less obvious questions about the learning environment include whether a student has a setting at home that is conducive to completing homework, or whether chaotic hallway conditions are delaying that student's transitions between classes and therefore reducing available learning time.
32: Investigate Multiple Factors Affecting Student Learning: ICEL (Instruction, Curriculum, Environment, Learner)
- Learner. While the student is at the center of any questions of instruction, curriculum, and learning environment, the learner domain includes those qualities of the student that represent his or her unique capacities and traits. More obvious examples of questions that relate to the learner include investigating whether a student shows stable and high rates of inattention across different classrooms, or evaluating the efficiency of a student's study habits and test-taking skills. A less obvious example of a question that relates to the learner is whether a student harbors a low sense of self-efficacy in mathematics that is interfering with that learner's willingness to put appropriate effort into math courses.
33: (No transcript)
34: The teacher collects several student math computation worksheet samples to document work completion and accuracy.
35: The student's parent tells the teacher that her son's reading grades and attitude toward reading dropped suddenly in grade 4.
36: An observer monitors the student's attention on an independent writing assignment, and later analyzes the work's quality and completeness.
37: A student is given a timed math worksheet to complete. She is then given another timed worksheet and offered a reward if she improves.
38: Comments from several past report cards describe the student as preferring to socialize rather than work during small-group activities.
39: The teacher tallies the number of redirects for an off-task student during discussion. She then designs a high-interest lesson and continues to track off-task behavior.
40: Uses of RIOT/ICEL
- The RIOT/ICEL framework is adaptable and can be used flexibly. For example:
- The teacher can be given the framework to encourage fuller use of available classroom data and examination of the environmental and curriculum variables impacting learning.
- The RTI Team case manager can use the framework when pre-meeting with the teacher to better define the student problem and select data to bring to the initial RTI Team meeting.
- Any RTI consultant working at any Tier can internalize the framework as a mental guide to prompt fuller consideration of available data, efficiency in collecting data, and stronger formulation of student problems.
41: Activity: Use the RIOT/ICEL Framework
- Review the RIOT/ICEL matrix.
- Discuss how you might use the framework to ensure that the information you collect on a student is broad-based, comes from multiple sources, and answers the right questions about the identified student problem(s).
- Be prepared to report out.
42: Breaking Down Complex Academic Goals into Simpler Sub-Tasks: Discrete Categorization
43: Identifying and Measuring Complex Academic Problems at the Middle and High School Level
- Students at the secondary level can present with a range of concerns that interfere with academic success.
- One frequent challenge for these students is the need to reduce complex, global academic goals into discrete sub-skills that can be individually measured and tracked over time.
44: Discrete Categorization: A Strategy for Assessing Complex, Multi-Step Student Academic Tasks
- Definition of Discrete Categorization: "Listing a number of behaviors and checking off whether they were performed" (Kazdin, 1989, p. 59).
- The approach allows educators to define a larger behavioral goal for a student and to break that goal down into sub-tasks. (Each sub-task should be defined in such a way that it can be scored as "successfully accomplished" or "not accomplished.")
- The constituent behaviors that make up the larger behavioral goal need not be directly related to each other. For example, "completed homework" may include as sub-tasks "wrote down homework assignment correctly" and "created a work plan before starting homework."
Source: Kazdin, A. E. (1989). Behavior modification in applied settings (4th ed.). Pacific Grove, CA: Brooks/Cole.
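Because each sub-task is scored simply as accomplished or not accomplished, discrete categorization data can be summarized as a daily completion percentage and tracked over time. A minimal sketch; the sub-task names below echo the Kazdin homework example, and the function name is an invention for illustration:

```python
# Hedged sketch of discrete categorization scoring: each sub-task of a larger
# behavioral goal is checked off as done (True) or not done (False), and the
# day's performance is summarized as the percent of sub-tasks accomplished.

def percent_accomplished(checklist):
    """checklist: dict mapping sub-task description -> True/False."""
    if not checklist:
        return 0.0
    done = sum(1 for accomplished in checklist.values() if accomplished)
    return 100.0 * done / len(checklist)

day1 = {
    "wrote down homework assignment correctly": True,
    "created a work plan before starting homework": False,
}
print(percent_accomplished(day1))  # 50.0
```

Graphing this daily percentage gives a simple progress-monitoring series for the larger goal, even though the sub-tasks themselves are unrelated behaviors.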
45: Discrete Categorization Example: Math Study Skills
- General Academic Goal: Improve Tina's Math Study Skills
- Tina was struggling in her mathematics course because of poor study skills. The RTI Team and math teacher analyzed Tina's math study skills and decided that, to study effectively, she needed to:
- Check her math notes daily for completeness.
- Review her math notes daily.
- Start her math homework in a structured school setting.
- Use a highlighter and margin notes to mark questions or areas of confusion in her notes or on the daily assignment.
- Spend sufficient seat time at home each day completing homework.
- Regularly ask math questions of her teacher.
46: Discrete Categorization Example: Math Study Skills
- General Academic Goal: Improve Tina's Math Study Skills
- The RTI Team, with teacher and student input, created the following intervention plan. The student, Tina, will:
- Approach the teacher at the end of class for a copy of the class notes.
- Check her daily math notes for completeness against the set of teacher notes in 5th-period study hall.
- Review her math notes in 5th-period study hall.
- Start her math homework in 5th-period study hall.
- Use a highlighter and margin notes to mark questions or areas of confusion in her notes or on the daily assignment.
- Enter into her homework log the amount of time spent that evening doing homework and note any questions or areas of confusion.
- Stop by the math teacher's classroom during help periods (T & Th only) to ask highlighted questions (or to verify that she understood that week's instructional content) and to review the homework log.
47: Discrete Categorization Example: Math Study Skills
- Academic Goal: Improve Tina's Math Study Skills
- General measures of the success of this intervention include (1) rate of homework completion and (2) quiz and test grades.
- To measure treatment fidelity (Tina's follow-through with sub-tasks of the checklist), the following strategies are used:
- Approached the teacher for a copy of class notes: teacher observation.
- Checked her daily math notes for completeness, reviewed math notes, and started math homework in 5th-period study hall: student work products; random spot check by study hall supervisor.
- Used a highlighter and margin notes to mark questions or areas of confusion in her notes or on the daily assignment: review of notes by teacher during the T/Th drop-in period.
- Entered into her homework log the amount of time spent that evening doing homework and noted any questions or areas of confusion: log reviewed by teacher during the T/Th drop-in period.
- Stopped by the math teacher's classroom during help periods (T & Th only) to ask highlighted questions (or to verify that Tina understood that week's instructional content): teacher observation; student sign-in.
48: CBM: Developing a Process to Collect Local Norms/Screening Data (Jim Wright, www.interventioncentral.org)
49: RTI: Literacy Assessment & Progress-Monitoring
- To measure student response to instruction/intervention effectively, the RTI model measures students' academic performance and progress on schedules matched to each student's risk profile and intervention Tier membership.
- Benchmarking/Universal Screening. All children in a grade level are assessed at least 3 times per year on a common collection of academic assessments.
- Strategic Monitoring. Students placed in Tier 2 (supplemental) reading groups are assessed 1-2 times per month to gauge their progress with this intervention.
- Intensive Monitoring. Students who participate in an intensive, individualized Tier 3 intervention are assessed at least once per week.
Source: Burns, M. K., & Gibbons, K. A. (2008). Implementing response-to-intervention in elementary and secondary schools: Procedures to assure scientific-based practices. New York: Routledge.
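The three monitoring schedules above can be summarized as a tier-to-frequency lookup. This is only a minimal sketch: the per-month conversion for Tier 3 is an assumption derived from the "at least once per week" guideline quoted above, not a figure from the source:

```python
# Hedged sketch: minimum monitoring frequency by RTI tier, following the
# schedules described by Burns & Gibbons (2008). Tier 1 students are
# benchmarked at least 3x per school year rather than monitored monthly.

MIN_ASSESSMENTS_PER_MONTH = {
    1: None,  # Tier 1: universal screening at least 3x per year, not monthly
    2: 1,     # Tier 2: strategic monitoring, 1-2 times per month
    3: 4,     # Tier 3: at least once per week (~4 school weeks/month, assumed)
}

def min_assessments_per_month(tier):
    """Return the minimum monthly assessment count for a tier, or None."""
    return MIN_ASSESSMENTS_PER_MONTH[tier]

print(min_assessments_per_month(3))  # 4
```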
50: Local Norms: Screening All Students (Stewart & Silberglit, 2008)
- Local norm data in basic academic skills are collected at least 3 times per year (fall, winter, spring).
- Schools should consider using curriculum-linked measures, such as Curriculum-Based Measurement, that will show generalized student growth in response to learning.
- If possible, schools should consider avoiding curriculum-locked measures that are tied to a single commercial instructional program.
Source: Stewart, L. H., & Silberglit, B. (2008). Best practices in developing academic local norms. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 225-242). Bethesda, MD: National Association of School Psychologists.
51: Local Norms: Using a Wide Variety of Data (Stewart & Silberglit, 2008)
- Local norms can be compiled using:
- Fluency measures such as Curriculum-Based Measurement.
- Existing data, such as office disciplinary referrals.
- Computer-delivered assessments, e.g., Measures of Academic Progress (MAP) from www.nwea.org.
Source: Stewart, L. H., & Silberglit, B. (2008). Best practices in developing academic local norms. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 225-242). Bethesda, MD: National Association of School Psychologists.
52: Measures of Academic Progress (MAP): www.nwea.org
53: Applications of Local Norm Data (Stewart & Silberglit, 2008)
- Local norm data can be used to:
- Evaluate and improve the current core instructional program.
- Allocate resources to classrooms, grades, and buildings where student academic needs are greatest.
- Guide the creation of targeted Tier 2 (supplemental intervention) groups.
- Set academic goals for improvement for students on Tier 2 and Tier 3 interventions.
- Move students across levels of intervention, based on performance relative to that of peers (local norms).
Source: Stewart, L. H., & Silberglit, B. (2008). Best practices in developing academic local norms. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 225-242). Bethesda, MD: National Association of School Psychologists.
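One common way to apply local norm data to Tier 2 placement decisions is to convert each student's screening score to a percentile rank within the grade-level norms and flag students who fall below a cutoff. A minimal sketch; the 25th-percentile cutoff and the scores below are illustrative assumptions, not values from the source:

```python
# Hedged sketch: percentile rank within local (grade-level) norms, used to
# flag students below an illustrative cutoff for possible Tier 2 review.

def percentile_rank(score, norms):
    """Percent of norm-group scores at or below the given score."""
    at_or_below = sum(1 for s in norms if s <= score)
    return 100.0 * at_or_below / len(norms)

def flag_for_tier2(scores_by_student, norms, cutoff=25.0):
    """Return students whose percentile rank falls below the cutoff."""
    return [name for name, score in scores_by_student.items()
            if percentile_rank(score, norms) < cutoff]

# Illustrative fall screening data: words read correctly per minute (CBM ORF).
grade_norms = [35, 42, 48, 55, 60, 63, 70, 74, 81, 95]
students = {"A": 38, "B": 72, "C": 95}
print(flag_for_tier2(students, grade_norms))  # ['A']
```

In practice, a school would run this against the full grade-level data set collected at each benchmarking window, and, as the next slide cautions, would treat a low local-norm score as a first step that prompts further assessment rather than as a diagnosis.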
54: Local Norms: Supplement With Additional Academic Testing as Needed (Stewart & Silberglit, 2008)
- "At the individual student level, local norm data are just the first step toward determining why a student may be experiencing academic difficulty. Because local norms are collected on brief indicators of core academic skills, other sources of information and additional testing using the local norm measures or other tests are needed to validate the problem and determine why the student is having difficulty. Percentage correct and rate information provide clues regarding automaticity and accuracy of skills. Error types, error patterns, and qualitative data provide clues about how a student approached the task. Patterns of strengths and weaknesses on subtests of an assessment can provide information about the concepts in which a student or group of students may need greater instructional support, provided these subtests are equated and reliable for these purposes." (p. 237)
Source: Stewart, L. H., & Silberglit, B. (2008). Best practices in developing academic local norms. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 225-242). Bethesda, MD: National Association of School Psychologists.
55: pp. 2-5
56: pp. 17-24
57: (No transcript)
58: (No transcript)
59: (No transcript)
60: (No transcript)
61: (No transcript)
62: (No transcript)
63: (No transcript)
64: (No transcript)
65: Steps in Creating a Process for Local Norming Using CBM Measures
- Identify personnel to assist in collecting data. A range of staff and school stakeholders can assist in the school norming, including:
- Administrators
- Support staff (e.g., school psychologist, school social worker, specials teachers, paraprofessionals)
- Parents and adult volunteers
- Field placement students from graduate programs
Source: Harn, B. (2000). Approaches and considerations of collecting schoolwide early literacy and reading performance data. University of Oregon. Retrieved from https://dibels.uoregon.edu/logistics/data_collection.pdf
66: Steps in Creating a Process for Local Norming Using CBM Measures
- Determine the method for screening data collection. The school can have teachers collect data in the classroom or designate a team to conduct the screening:
- In-Class: Teaching staff in the classroom collect the data over a calendar week.
- Schoolwide/Single Day: A trained team of 6-10 sets up a testing area, cycles students through, and collects all data in one school day.
- Schoolwide/Multiple Days: A trained team of 4-8 either goes to classrooms or creates a central testing location, completing the assessment over multiple days.
- Within-Grade: Data collectors at a grade level norm the entire grade, with students kept busy with another activity (e.g., a video) when not being screened.
Source: Harn, B. (2000). Approaches and considerations of collecting schoolwide early literacy and reading performance data. University of Oregon. Retrieved from https://dibels.uoregon.edu/logistics/data_collection.pdf
67: Steps in Creating a Process for Local Norming Using CBM Measures
- Select dates for screening data collection. Data collection should occur at minimum three times per year, in fall, winter, and spring. Consider:
- Avoiding screening dates within two weeks of a major student break (e.g., summer or winter break).
- Coordinating the screenings to avoid state testing periods and other major scheduling conflicts.
Source: Harn, B. (2000). Approaches and considerations of collecting schoolwide early literacy and reading performance data. University of Oregon. Retrieved from https://dibels.uoregon.edu/logistics/data_collection.pdf
68: Steps in Creating a Process for Local Norming Using CBM Measures
- Create a preparation checklist. Important preparation steps are carried out, including:
- Selecting the location of the screening
- Recruiting screening personnel
- Ensuring that training occurs for all data collectors
- Lining up data-entry personnel (e.g., for rapid computer data entry)
Source: Harn, B. (2000). Approaches and considerations of collecting schoolwide early literacy and reading performance data. University of Oregon. Retrieved from https://dibels.uoregon.edu/logistics/data_collection.pdf
69Local Norms Set a Realistic Timeline for
Phase-In (Stewart & Silberglit, 2008)
- "If local norms are not already being collected, it may be helpful to develop a 3-5 year planned rollout of local norm data collection, reporting, and use in line with other professional development and assessment goals for the school. This phased-in process of developing local norms could start with certain grade levels and expand to others" (p. 229).
Source: Stewart, L. H., & Silberglit, B. (2008). Best practices in developing academic local norms. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 225-242). Bethesda, MD: National Association of School Psychologists.
70Team Activity Discuss a Plan to Conduct an
Academic Screening in Your School or District
- Directions
- Review the relevant materials in your handout that relate to school-wide screening tools:
- Elementary literacy screening: pp. 2-5
- Middle and high school screening: pp. 17-34
- Discuss how you might create a building-wide academic and/or behavioral screening process for your school, or expand/improve the one you already have.
- Be prepared to report out to the larger group.
71Monitoring Student Academic or General Behaviors: Daily Behavior Report Cards
72Daily Behavior Report Cards (DBRCs) Are
- brief forms containing student behavior-rating
items. The teacher typically rates the student
daily (or even more frequently) on the DBRC. The
results can be graphed to document student
response to an intervention.
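To illustrate how DBRC ratings can be graphed to document intervention response, the sketch below (a minimal example with hypothetical ratings, not part of the original materials) averages each week of daily ratings so the weekly means can be plotted over time:

```python
def weekly_summary(weeks_of_ratings):
    """Mean DBRC rating for each week, rounded for charting.

    weeks_of_ratings: list of weeks, each week being a list of daily
    0-100 behavior ratings taken from the teacher's report card.
    """
    return [round(sum(week) / len(week), 1) for week in weeks_of_ratings]

# Hypothetical data: one baseline week, one intervention week
baseline_week = [40, 0, 60, 60, 50]
intervention_week = [60, 70, 65, 80, 75]
print(weekly_summary([baseline_week, intervention_week]))  # [42.0, 70.0]
```

Plotting these weekly means against a goal line gives a quick visual check of whether behavior is improving under the intervention.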
73http://www.directbehaviorratings.com/
74Daily Behavior Report Cards Can Monitor
- Hyperactivity
- On-Task Behavior (Attention)
- Work Completion
- Organization Skills
- Compliance With Adult Requests
- Ability to Interact Appropriately With Peers
75Daily Behavior Report Card (Daily Version): sample card for student Jim Blalock, May 5, teacher Mrs. Williams, Rm 108
76Daily Behavior Report Card (Weekly Version): sample card for student Jim Blalock, teacher Mrs. Williams, Rm 108

Date:   05/05/07  05/06/07  05/07/07  05/08/07  05/09/07
Rating:    40        0         60        60        50
77Daily Behavior Report Card Chart
78Establishing RTI Guidelines to Diagnose Learning Disabilities: What Schools Should Know. Jim Wright, www.interventioncentral.org
80Using RTI to Determine Special Education
Eligibility Building the Foundation
- Ensure Tier 1 (Classroom) Capacity to Carry Out
Quality Interventions. The classroom teacher is
the first responder available to address
emerging student academic concerns. Therefore,
general-education teachers should have the
capacity to define student academic concerns in
specific terms, independently choose and carry
out appropriate evidence-based Tier 1 (classroom)
interventions, and document student response to
those interventions.
81Tier 1 (Classroom) Interventions: Building Your School's Capacity
- Train Teachers to Write Specific, Measurable, Observable Problem-Identification Statements.
- Inventory Tier 1 Interventions Already in Use.
- Create a Standard Menu of Evidence-Based Tier 1 Intervention Ideas for Teachers.
- Establish Tier 1 Coaching and Support Resources.
- Provide Classroom (Tier 1) Problem-Solving Support to Teachers.
- Set Up a System to Locate Additional Evidence-Based Tier 1 Intervention Ideas.
- Create Formal Guidelines for Teachers to Document Tier 1 Strategies.
- Develop Decision Rules for Referring Students from Tier 1 to Higher Levels of Intervention.
82Using RTI to Determine Special Education
Eligibility Building the Foundation
- Collect Benchmarking/Universal Screening Data on
Key Reading and Math (and Perhaps Other) Academic
Skills for Each Grade Level. Benchmarking data is
collected on all students at least three times
per year (fall, winter, spring). Measures
selected for benchmarking should track student
fluency and accuracy in basic academic skills
that are key to success at each grade level.
83Using RTI to Determine Special Education
Eligibility Building the Foundation
- Hold Data Meetings With Each Grade Level. After each benchmarking period (fall, winter, spring), the school organizes data meetings by grade level. The building administrator, classroom teachers, and perhaps other staff (e.g., reading specialist, school psychologist) meet to:
- review student benchmark data.
- discuss how classroom (Tier 1) instruction should be changed to accommodate the student needs revealed in the benchmarking data.
- select students for Tier 2 (supplemental group) instruction/intervention.
84Tier 2 Supplemental (Group-Based) Interventions
- Tier 2 interventions are typically delivered in small-group format. About 15% of students in the typical school will require Tier 2/supplemental intervention support.
- Group size for Tier 2 interventions is limited to 4-6 students. Students placed in Tier 2 interventions should have a shared profile of intervention need.
- The reading progress of students in Tier 2 interventions is monitored at least 1-2 times per month.
Source: Burns, M. K., & Gibbons, K. A. (2008). Implementing response-to-intervention in elementary and secondary schools. New York: Routledge.
85Using RTI to Determine Special Education
Eligibility Creating Decision Rules
- Establish the Minimum Number of Intervention
Trials Required Prior to a Special Education
Referral. Your district should require a
sufficient number of intervention trials to
definitively rule out instructional variables as
possible reasons for student academic delays.
Many districts require that at least three Tier 2
(small-group supplemental) / Tier 3 (intensive,
highly individualized) intervention trials be
attempted before moving forward with a special
education evaluation.
86Using RTI to Determine Special Education
Eligibility Creating Decision Rules
- Determine the Minimum Timespan for Each Tier 2 or
Tier 3 Intervention Trial. An intervention trial
should last long enough to show definitively
whether it was effective. One expert
recommendation (Burns & Gibbons, 2008) is that
each academic intervention trial should last at
least 8 instructional weeks to allow enough time
for the school to collect sufficient data to
generate a reliable trend line.
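The trend line referenced above can be computed with a simple ordinary-least-squares fit over the weekly progress-monitoring scores. A minimal sketch, using hypothetical oral-reading-fluency data (words correct per minute) over an 8-week trial:

```python
def trend_slope(scores):
    """Ordinary-least-squares slope of weekly CBM scores.

    Returns the average gain per week implied by the data, which is
    the slope of the trend line drawn through the progress-monitoring
    chart.
    """
    n = len(scores)
    weeks = range(n)
    mean_x = sum(weeks) / n
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(weeks, scores))
    den = sum((x - mean_x) ** 2 for x in weeks)
    return num / den

# Hypothetical 8 weeks of ORF scores (words correct/minute)
orf = [42, 44, 43, 47, 49, 48, 52, 54]
print(round(trend_slope(orf), 2))  # 1.68 wcpm gained per week
```

A longer trial yields more data points, which is why the slope (and thus the judgment about intervention effectiveness) becomes more reliable after 8 or more weeks.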
87Using RTI to Determine Special Education
Eligibility Creating Decision Rules
- Define the Level of Student Academic Delay That Will Qualify as a Significant Skill Discrepancy. Not all students with academic delays require special education services; those with more modest deficits may benefit from general-education supplemental interventions alone. Your district should develop guidelines for determining whether a student's academic skills should be judged as significantly delayed when compared to those of peers:
- If using local Curriculum-Based Measurement norms, set an appropriate cutpoint score (e.g., at the 10th percentile). Any student performing below that cutpoint would be identified as having a significant gap in skills.
- If using reliable national or research norms (e.g., reading fluency norms from Hasbrouck & Tindal, 2004), set an appropriate cutpoint score (e.g., at the 10th percentile). Any student performing below that cutpoint would be identified as having a significant gap in skills.
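The cutpoint logic in both bullets can be sketched as follows. This is an illustration, not district policy: the interpolation method and the norm sample are assumptions made for the example.

```python
def percentile_cutpoint(scores, pct):
    """Score at the given percentile of a norm sample.

    Uses simple rank-based linear interpolation over the sorted
    sample; a screening score below the returned cutpoint flags a
    significant skill gap under the district's decision rule.
    """
    ordered = sorted(scores)
    rank = (pct / 100) * (len(ordered) - 1)
    lo = int(rank)
    frac = rank - lo
    if lo + 1 < len(ordered):
        return ordered[lo] + frac * (ordered[lo + 1] - ordered[lo])
    return ordered[lo]

# Hypothetical grade-level local-norm sample (words correct/minute)
norms = [18, 25, 31, 40, 44, 52, 55, 61, 68, 75, 88]
cut = percentile_cutpoint(norms, 10)  # 10th-percentile cutpoint
flagged = [s for s in norms if s < cut]  # students with a significant gap
```

The same function applies whether the sample comes from local screening or from published research norms; only the source of the scores changes.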
88Using RTI to Determine Special Education
Eligibility Creating Decision Rules
- Define the Rate of Student Progress That Will Qualify as a Significant Discrepancy in Rate of Learning. The question of whether a student has made adequate progress when on intervention is complex. While each student case must be considered on its own merits, your district can bring consistency to the process of judging the efficacy of interventions by discussing the following factors:
89Using RTI to Determine Special Education
Eligibility Creating Decision Rules
- Define the Rate of Student Progress That Will Qualify as a Significant Discrepancy in Rate of Learning (Cont.).
- Define grade-level performance. The goal of academic intervention is to bring student skills to grade level. However, your district may want to specify what is meant by "grade-level performance." Local CBM norms or reliable national or research norms can be helpful here. The district can set a cutpoint that establishes a minimum threshold for typical student performance (e.g., 25th percentile or above on local or research norms). Students whose performance is above the cutpoint would fall within the "reachable, teachable" range and could be adequately instructed by the classroom teacher.
90Estimate the academic skill gap between the
target student and typically-performing peers
- There are three general methods for estimating the typical level of academic performance at a grade level:
- Local Norms: A sample of students at a school is screened in an academic skill to create grade norms (Shinn, 1989).
- Research Norms: Norms for typical growth are derived from a research sample, published, and applied by schools to their own student populations (e.g., Shapiro, 1996).
- Criterion-Referenced Benchmarks: A minimum level, or threshold, of competence is determined for a skill. The benchmark is usually defined as a level of proficiency needed for later school success (Fuchs, 2003).
91Using RTI to Determine Special Education
Eligibility Creating Decision Rules
- Define the Rate of Student Progress That Will Qualify as a Significant Discrepancy in Rate of Learning (Cont.).
- Set ambitious but realistic goals for student improvement. When an intervention plan is put into place, the school should predict a rate of student academic improvement that is ambitious but realistic. During a typical intervention series, a student usually works toward intermediate goals for improvement, and an intermediate goal is reset at a higher level each time the student attains it. The school should be able to supply a rationale for how it set goals for rate of student improvement:
- When available, research guidelines (e.g., in oral reading fluency) can be used.
- Or the school may use local norms to compute improvement goals.
- Sometimes the school must rely on expert opinion if research or local norms are not available.
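The goal-line arithmetic implied here is simple: projected score = baseline + (weekly growth rate x weeks). In this sketch the baseline and growth rate are hypothetical placeholders, not research-derived values; in practice they would come from the research guidelines, local norms, or expert opinion listed above.

```python
def improvement_goal(baseline, weekly_growth, weeks):
    """Predicted score if the student gains weekly_growth units per week.

    This is the value the goal line reaches at the end of the trial.
    """
    return baseline + weekly_growth * weeks

# Hypothetical: 45 wcpm baseline, ambitious goal of 1.5 wcpm/week,
# checked at the end of an 8-week intervention trial
goal = improvement_goal(45, 1.5, 8)
print(goal)  # 57.0
```

Comparing the student's actual trend line against this projected goal at each review point shows whether the intermediate goal should be reset higher or the intervention changed.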
92Using RTI to Determine Special Education
Eligibility Creating Decision Rules
- Define the Rate of Student Progress That Will Qualify as a Significant Discrepancy in Rate of Learning (Cont.).
- Decide on a reasonable time horizon to "catch the student up" with his or her peers. Interventions for students with serious academic delays cannot be successfully completed overnight. It is equally true, though, that interventions cannot stretch on without end if the student fails to make adequate progress. Your district should decide on a reasonable span of time in which a student on intervention should be expected to close the gap and reach grade-level performance (e.g., 12 months). Failure to close that gap within the expected timespan may be partial evidence that the student requires special education support.
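One way to reason about a "reasonable time horizon" is to project when the student's trend line would overtake the peer level. The sketch below assumes both trends stay linear, which is a simplification, and uses hypothetical numbers throughout:

```python
import math

def weeks_to_close_gap(student_level, peer_level, student_slope, peer_slope):
    """Weeks for the student to reach the (moving) peer level.

    Returns 0 if there is no gap, and None if the gap is flat or
    widening at the observed rates.
    """
    gap = peer_level - student_level
    if gap <= 0:
        return 0
    closing_rate = student_slope - peer_slope
    if closing_rate <= 0:
        return None  # gap is not closing; intervention needs revisiting
    return math.ceil(gap / closing_rate)

# Hypothetical: student at 40 wcpm gaining 2.0/week; peers at 70 wcpm
# gaining 1.0/week, so a 30-point gap closes at 1.0 wcpm per week
print(weeks_to_close_gap(40, 70, 2.0, 1.0))  # 30
```

A projection far beyond the district's chosen horizon (e.g., 12 months) would be one piece of evidence, alongside the other decision rules, that more intensive support is needed.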
93Using RTI to Determine Special Education
Eligibility Creating Decision Rules
- Define the Rate of Student Progress That Will Qualify as a Significant Discrepancy in Rate of Learning (Cont.).
- View student progress-monitoring data in relation to peer norms. When viewed in isolation, student progress-monitoring data tell only part of the story. Even if a student shows modest progress, he or she may still be falling farther and farther behind peers in the academic skill of concern. Your district should evaluate student progress relative to peers. If the skill gap between the student and peers (as determined through repeated school-wide benchmarking) continues to widen despite the school's most intensive intervention efforts, this may be partial evidence that the student requires special education support.
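The widening-gap check described above can be sketched by comparing the student's benchmark scores to the peer median at each screening period. All data below are hypothetical:

```python
def gap_trend(student_benchmarks, peer_medians):
    """Gap between the peer median and the student at each benchmark period."""
    return [peer - student
            for student, peer in zip(student_benchmarks, peer_medians)]

def gap_is_widening(gaps):
    """True if every successive benchmarking period shows a larger gap."""
    return all(later > earlier for earlier, later in zip(gaps, gaps[1:]))

# Hypothetical fall/winter/spring benchmarks (words correct/minute)
student = [30, 36, 41]
peers = [60, 72, 85]
gaps = gap_trend(student, peers)  # [30, 36, 44]
print(gap_is_widening(gaps))  # True: student improves but falls further behind
```

This captures the point in the bullet: absolute growth alone is not enough; the comparison to peers at each benchmark period is what reveals a widening discrepancy.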
94Using RTI to Determine Special Education
Eligibility Creating Decision Rules
- Define the Rate of Student Progress That Will Qualify as a Significant Discrepancy in Rate of Learning (Cont.).
- Set uniform expectations for how progress-monitoring data are presented at special education eligibility meetings. Your district should adopt guidelines for schools in collecting and presenting student progress-monitoring information at special education eligibility meetings. For example, it is recommended that curriculum-based measurement or similar data be presented as time-series charts. These charts should include trend lines to summarize visually the student's rate of academic growth, as well as a goal line indicating the intermediate or final performance goal toward which the student is working.
95Confidence in Eligibility Decision
96Curriculum-Based Measurement Lab
97"One way I have used the Maze in the past at the secondary level is as a targeted screener to determine an instructional match between the student and the text materials. By screening all students on one to three Maze samples from the text and/or books that were planned for the course, we could find the students who could not handle the materials without support (study guides, highlighted texts, alternative reading material). This assessment is efficient, and it seems quite reliable in identifying the potential underachievers, achievers, and overachievers. The real payback is that success can be built into the courses from the beginning, by providing learning materials and supports at the students' instructional levels." Lynn Pennington, Executive Director, SSTAGE (Student Support Team Association for Georgia Educators)
98Team Activity Exploring Data Tools on the
Internet
- Directions
- Consider the free CBM and other data tools demonstrated during this workshop.
- Discuss how your school might experiment with or pilot the use of some of these measures to discover whether they might be useful universal screening tools or assessment options for Tier 1 (classroom) practice.