Monitoring Student Progress to Develop Standards-Based IEPs
Transcript and Presenter's Notes


1
Monitoring Student Progress to Develop
Standards-Based IEPs
  • OSEP GSEG Project Managers' Meeting
  • Gerald Tindal, Ph.D.
  • University of Oregon
  • Martin Ikeda, Ph.D.
  • Iowa Department of Education

2
Overview
  • Intent is to provide an overview of some ideas we
    have been thinking about
  • Jerry has researched CBM for 25 years
  • Marty has supported implementation for 12 years
  • Conversational presentation: we will take comments
    and questions as we go
  • Representative of our best thinking to date

3
Big Ideas
  • CBM is a viable tool for decision making about
    participation in the alternate assessment against
    modified academic achievement standards
  • CBM is a viable tool for decision making about
    progress against grade level standards

4
Issues Around These Two Ideas
  • IEP Participation Decision
  • Objective evidence demonstrating that the
    student's disability has precluded the student
    from achieving grade-level proficiency in the
    content area assessed
  • The IEP team is reasonably certain that, even if
    significant growth occurs, the student is not
    likely to achieve grade level proficiency within
    the year covered by the IEP
  • IEP Development Issues
  • IEP goals based on State grade level academic
    content standards
  • Means for an annual determination of progress

5
Our interpretation
  • Need a way to operationalize proficient
    performance on grade-level content standards
  • State test: a once-a-year (1X/year) depiction
  • Need a way to predict whether the child can
    realistically achieve grade-level proficiency
    within one year
  • 1-2 years behind?
  • Need a way to monitor performance toward that
    operationalization

6
What we want
  • Align decisions about participation, present
    levels of academic achievement and functional
    performance, and IEP goals that reference
    grade-level proficiency
  • Assess progress more frequently than annually, so
    that instructional effects can be evaluated and
    changes made to programs if needed

7
Consider Golf
  • Many Components of a Good Golf Game
  • Grip
  • Choosing the Correct Club
  • Backswing and Follow-Through
  • Putting Skill
  • General Outcome Measure for Golf
  • Number of Strokes

8
Curriculum-Based Measures: A Potential Solution
  • CBM is a validated technique for a variety of
    decisions, but particularly for monitoring
    performance over time
  • General Outcomes
  • Brief
  • Repeatable
  • Sensitive to Changes in Performance over Time
  • Operationalize content standards at grade level
    (ambitious)
  • Support instructional decision-making

9
Jacob, Grade 5
  • Grade-level proficiency standard
  • 75 wpm (local norm)
  • 100 wpm (published performance level)
  • Jacob: 25 wpm in grade-level material
  • Problem?

10
Illustration: Jacob
  • Examination of performance against other 5th
    graders in the district (local norm)
  • Data generated during Spring for fifth graders on
    Grade 5 material
  • In the Fall, Jacob would be given probes drawn
    from Grade 5 year-end material

[Chart: Oral Reading Fluency, Jacob's Performance Compared to Peers (y-axis: 20-150 words correct per minute); Jacob's score falls well below the peer distribution.]
11
What are realistic growth rates in reading?
  • Grade 1: 2 words correct/week
  • Grade 2: 1.5 words correct/week
  • Grade 3: 1 word correct/week
  • Grade 4: .85 words correct/week
  • Grade 5: .50 words correct/week
  • Grade 6: .30 words correct/week
  • It may be difficult for Jacob to catch up by
    year's end. The IEP team might decide he is a
    candidate for the Alternate Assessment against
    Modified Academic Achievement Standards (a small
    projection sketch follows this list).
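To make the "can the student catch up within the year" judgment concrete, here is a minimal Python sketch that projects a year-end score from a fall score and one of the weekly growth rates above. The function name, the 36-week school year, and the printed comparison are illustrative assumptions, not part of the original presentation.

```python
def projected_year_end_wpm(fall_wpm: float, growth_per_week: float,
                           weeks_remaining: int = 36) -> float:
    """Project end-of-year oral reading fluency (words correct per minute)
    from a fall score and an expected weekly growth rate."""
    return fall_wpm + growth_per_week * weeks_remaining

# Jacob: 25 wpm in Grade 5 material, typical Grade 5 growth of 0.5 words/week
print(projected_year_end_wpm(25, 0.5))   # 43.0, well below a 75-100 wpm standard
```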

12
Illustration: Juarez
  • In the Fall, on year-end material, Juarez is
    reading 80 wpm
  • Students reading at this rate are getting meaning
    from text
  • They may not be reading fluently enough to earn a
    proficient score on the AYP test
  • However, Juarez is performing within the
    grade-level range and is likely to catch up by
    year's end
  • It would be defensible for the IEP team to
    conclude that Juarez is not a candidate for the
    alternate assessment against modified academic
    achievement standards and should instead
    participate in the general assessment with
    accommodations

[Chart: Oral Reading Fluency, Juarez's Performance Compared to Peers and a Performance Standard (y-axis: 20-150 words correct per minute).]
13
CBM and Participation Decisions
  • Potentially useful framework
  • Grade Level Proficiency
  • Projected Growth
  • Establishing alignment between the CBM metric,
    Grade Level Content Standards, and Grade Level
    Proficiency

14
Connecting CBM in Reading with Grade Level
Standards
Alternate Assessments based on Modified Academic
Achievement Standards
  • Gerald Tindal
  • University of Oregon

15
Alternate Forms
  • Progress monitoring requires alternate forms to
    allow meaningful interpretation of student data
    across time. Without such cross-form equivalence,
    changes in scores from one testing session to the
    next are difficult to attribute to changes in
    student skill or knowledge.
  • As students' reading skills progress through the
    different skill areas within the broad construct
    of reading, different reading measures are needed
    to continue tracking the progress students are
    making as developing readers

16
Technical Reports
  • Alonzo, J., & Tindal, G. (2007). The Development
    of Early Literacy Measures for Use in a Progress
    Monitoring Assessment System: Letter Names,
    Letter Sounds, and Phoneme Segmenting (Technical
    Report 39). Eugene, OR: University of Oregon,
    Behavioral Research and Teaching.
  • Alonzo, J., & Tindal, G. (2007). The Development
    of Word and Passage Reading Fluency Measures for
    Use in a Progress Monitoring Assessment System
    (Technical Report 40). Eugene, OR: University of
    Oregon, Behavioral Research and Teaching.
  • Alonzo, J., Liu, K., & Tindal, G. (2007).
    Examining the Technical Adequacy of Reading
    Comprehension Measures in a Progress Monitoring
    Assessment System (Technical Report 41). Eugene,
    OR: University of Oregon, Behavioral Research
    and Teaching.

17
Design of Alternate Measures
  • Defined a universe of items in a pilot
  • Used a common-item, nonequivalent-groups design
  • Scored tests at the item level
  • Reassembled items into equivalent forms

18
Distribution of the Measures Across the Grades
Grade | Ltr Names | Ltr Sounds | Phon. Seg | Word Fluency | Passage Fluency | MC Comp
K     |     X     |     X      |     X     |      X       |                 |
1     |     X     |     X      |     X     |      X       |        X        |
2     |           |            |           |              |        X        |
3     |           |            |           |              |        X        |    X
4     |           |            |           |              |        X        |    X
19
Data Analyses
  • One-parameter Rasch model
  • Estimates the difficulty of individual test items
    and the ability level of each individual test
    taker
  • Standard error of measure
  • Mean square outfit to evaluate goodness of fit
    (acceptable values in the range of 0.50 to 1.50;
    a minimal illustration follows this list)
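As a concrete illustration of the two statistics named above, here is a minimal Python sketch of the one-parameter (Rasch) response probability and the outfit mean square. The simulated abilities, the item difficulty of 0.5 logits, and the sample of 95 examinees are hypothetical; operational calibration would use dedicated IRT software rather than this toy computation.

```python
import numpy as np

def rasch_prob(theta, b):
    """Probability of a correct response under the one-parameter (Rasch) model."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def outfit_mean_square(responses, theta, b):
    """Outfit mean square for one item: the mean squared standardized
    residual across all examinees who took the item."""
    p = rasch_prob(theta, b)          # expected score for each examinee
    variance = p * (1.0 - p)          # Bernoulli variance for each examinee
    return np.mean((responses - p) ** 2 / variance)

# Hypothetical data: 95 examinees, one item of difficulty 0.5 logits
rng = np.random.default_rng(0)
theta = rng.normal(0.0, 1.0, size=95)                       # person abilities (logits)
responses = (rng.random(95) < rasch_prob(theta, 0.5)).astype(float)
print(round(outfit_mean_square(responses, theta, 0.5), 2))  # near 1.0 when the model fits
```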

20
Letter Names, Sounds, Segmenting
  • 16 letter names exceeded a mean square outfit of
    1.5 but were retained given their low SEM; 3
    letters were found not to fit (g, H, and Y)
  • 16 letter sounds exceeded a mean square outfit of
    1.5 but were retained given their low SEM; 6
    letter sounds were found not to fit (B, C, d, j,
    p, and Qu)
  • A total of 181 words used in segmenting remained
    in the item bank

21
Word Reading Fluency
  • Tests students' ability to read both sight words
    and words following regular patterns of
    letter/sound correspondence in the English
    language
  • Students are shown a series of words organized in
    a chart on one side of a single sheet of paper
    and given a set amount of time (30-60 seconds)
  • The words we used during the pilot study came
    from a variety of sources: Dolch word lists,
    online grade-level word lists, and a list of the
    first 1000 words found in Fry's book of lists
    (1998).

22
Word List Design
  • Between 144 and 2654 students provided pilot test
    data on each word
  • We kept each of the pilot forms short (68 words
    in Kindergarten, 80 in grades 1-3)
  • We administered 5 different forms of the Word
    Reading Fluency test to students in Kindergarten,
    4 forms to students in first grade, and 3 forms
    to students in third and fourth grade.
  • Each form contained 5 words that served as anchor
    items, common across all 15 forms of the test and
    appearing in the same location (a linking sketch
    follows this list)
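The anchor items make it possible to place item difficulties from different forms on a single scale. The presentation does not spell out the equating procedure, so the sketch below shows one common approach under a Rasch, common-item design (mean/mean linking); the difficulty values are invented for illustration.

```python
import numpy as np

def mean_link(anchor_b_base, anchor_b_new, new_form_b):
    """Shift a new form's item difficulties onto the base scale using the mean
    difference of the shared anchor items (mean/mean linking under the Rasch model)."""
    shift = np.mean(anchor_b_base) - np.mean(anchor_b_new)
    return np.asarray(new_form_b, dtype=float) + shift

# Hypothetical difficulties (logits) for the 5 anchor words as calibrated on two forms
anchor_base = [-1.2, -0.4, 0.1, 0.8, 1.5]   # base-form calibration
anchor_new  = [-1.0, -0.2, 0.3, 1.0, 1.7]   # new-form calibration of the same words
unique_new  = [-2.0, -0.5, 0.6, 2.1]        # items that appear only on the new form
print(mean_link(anchor_base, anchor_new, unique_new))   # difficulties on the base scale
```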

23
Passage Reading Fluency
  • Tests students' ability to read connected
    narrative text accurately. In this
    individually administered measure, students are
    shown a short narrative passage (approximately
    250 words)
  • Omissions, hesitations, and misidentifications
    were counted as errors

24
Passage Fluency Design
  • Measures were all written specifically for use in
    this progress monitoring assessment system
  • All 80 passages were written by graduate students
    enrolled in College of Education courses in the
    winter of 2006
  • Passage writers followed written test
    specifications, and passages were systematically
    reviewed by the lead coordinator and then by
    teachers in the field
  • Each passage was divided into three paragraphs of
    approximately even length, and the readability of
    each paragraph was checked using the
    Flesch-Kincaid readability index (target levels of
    1.5, 2.5, 3.5, and 4.5; the formula is sketched
    below)
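For reference, the Flesch-Kincaid grade-level index mentioned above can be computed directly from word, sentence, and syllable counts. The sketch below applies the published formula to made-up counts; it is not the passage-writing tool used in the project.

```python
def flesch_kincaid_grade(words: int, sentences: int, syllables: int) -> float:
    """Flesch-Kincaid grade-level estimate from simple text counts."""
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

# Hypothetical paragraph: 83 words, 9 sentences, 104 syllables
print(round(flesch_kincaid_grade(83, 9, 104), 1))   # about 2.8, i.e., late grade 2
```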

25
Analysis
  • For the word list, we used Rasch analysis to
    scale words by difficulty and examinees by
    ability
  • For passages, we analyzed correlations and mean
    differences between the different forms of the
    measures using a repeated measures analysis (a
    simplified equivalence check is sketched below)
  • Variations in passage outcomes were reduced by
    rewriting passages
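A minimal sketch of the kind of form-equivalence check described above: pairwise correlations and mean differences among alternate forms taken by the same students. It omits the repeated-measures significance test, and the form names and scores are hypothetical.

```python
import numpy as np
from itertools import combinations

def form_equivalence(scores: dict) -> None:
    """Print pairwise correlations and mean differences between alternate forms.
    `scores` maps a form name to the scores the same students earned on it."""
    for a, b in combinations(scores, 2):
        x = np.asarray(scores[a], dtype=float)
        y = np.asarray(scores[b], dtype=float)
        r = np.corrcoef(x, y)[0, 1]
        print(f"{a} vs {b}: r = {r:.2f}, mean difference = {np.mean(x - y):.1f} wpm")

# Hypothetical passage-fluency scores from five students on two Grade 3 passages
form_equivalence({
    "Gr3PR_1": [110, 95, 128, 140, 102],
    "Gr3PR_2": [114, 92, 131, 145, 99],
})
```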

26
Results of Word List
  • Initial analyses revealed 283 words outside the
    acceptable Mean Square Outfit range of 0.50 to
    1.50. These items were dropped from the item
    bank, leaving 465 words
  • The list was created with the easiest words
    appearing first and subsequent words increasing
    in difficulty

27
Word List: Easiest 10 Words
Word | Count | Measure | Mean Square Outfit
I    |  238  |  -7.33  | 1.36
is   |  195  |  -6.31  | 1.29
the  | 1960  |  -6.21  | 1.10
it   |  195  |  -6.01  | 1.21
ten  |  243  |  -5.65  | 1.00
top  |  195  |  -5.37  | 0.93
and  | 2654  |  -5.20  | 1.15
an   |  195  |  -4.90  | 0.95
sun  |  195  |  -4.84  | 0.71
man  |  245  |  -4.32  | 1.37
28
Word List: Most Difficult 10 Words
Word       | Count | Measure | Mean Square Outfit
produce    |  208  |  4.09   | 1.11
cultivate  |  243  |  4.11   | 1.30
period     |  193  |  4.24   | 0.69
irrigate   |  243  |  4.41   | 1.00
divided    |  254  |  4.65   | 0.66
deception  |  210  |  4.70   | 1.14
thousands  |  254  |  4.76   | 0.76
commercial |  243  |  4.78   | 1.31
though     |  254  |  5.33   | 1.37
compromise |  210  |  5.36   | 1.19
29
Grade 3 Passages
Passage    | Title                  | n   | M      | SD
Gr3PR_1_C  | Susan's New School     | 239 | 128.79 | 39.00
Gr3PR_2_C  | Sara's Fun Visit       | 240 | 131.38 | 44.29
Gr3PR_3_C  | Horses at the Fair     | 241 | 125.15 | 39.18
Gr3PR_4_C  | Ben's Truck            | 240 | 127.55 | 43.61
Gr3PR_5_C  | Surprise Sandwiches    | 242 | 128.55 | 36.85
Gr3PR_6_C  | Swimming               | 243 | 131.77 | 40.99
Gr3PR_7_C  | Boring Weekends        | 243 | 121.67 | 43.07
Gr3PR_8_C  | Birthday Wishes        | 243 | 124.92 | 40.79
Gr3PR_9_C  | A Special Bike         | 239 | 126.45 | 39.93
Gr3PR_10_C | The New Puppy          | 240 | 118.22 | 38.49
Gr3PR_11_C | Childhood Dreams       | 239 | 103.28 | 41.57
Gr3PR_12_C | The Perfect Instrument | 240 | 121.64 | 40.47
Gr3PR_13_C | The Breaking Story     | 237 | 118.58 | 39.99
Gr3PR_14_C | The Dream House        | 237 | 124.06 | 41.83
Gr3PR_15_C | American Sports        | 237 | 110.19 | 37.67
Gr3PR_16_C | The Backpacking Trip   | 236 | 119.29 | 40.80
Gr3PR_17_C | The Garden             | 231 | 116.26 | 37.23
Gr3PR_18_C | Abby's Birthday        | 230 | 126.10 | 39.16
Gr3PR_19_C | Sammy the Shark        | 231 | 143.02 | 45.36
Gr3PR_20_C | Mike's Red Sneakers    | 231 | 119.28 | 44.62
30
Grade 4 Passages
Passage    | Title                        | n   | M      | SD
Gr4PR_1_C  | Birthday Surprise            | 207 | 134.82 | 35.00
Gr4PR_2_C  | Amusement Park               | 208 | 139.96 | 37.74
Gr4PR_3_C  | Farm Dog Goes to Town        | 208 | 135.29 | 36.77
Gr4PR_4_C  | A Day of Celebration         | 208 | 137.56 | 38.45
Gr4PR_5_C  | Billy's Garden with Grandpa  | 204 | 143.63 | 38.65
Gr4PR_6_C  | Maria's Secret Friend        | 204 | 130.35 | 34.83
Gr4PR_7_C  | Lisa Gets to Drive           | 204 | 139.11 | 42.22
Gr4PR_8_C  | Toni the Shark               | 203 | 132.88 | 39.62
Gr4PR_9_C  | Marta's New Sweater          | 203 | 139.84 | 41.27
Gr4PR_10_C | Back to School               | 203 | 132.83 | 38.68
Gr4PR_11_C | The Perfect Present          | 200 | 131.39 | 36.65
Gr4PR_12_C | The Perfect Assignment       | 200 | 136.51 | 40.32
Gr4PR_13_C | President David              | 198 | 141.40 | 38.44
Gr4PR_14_C | Above the Clouds             | 199 | 138.70 | 37.68
Gr4PR_15_C | Super Powers                 | 198 | 131.42 | 38.79
Gr4PR_16_C | A Friend for Jared           | 199 | 131.19 | 42.27
Gr4PR_17_C | Fieldtrip to the Zoo         | 196 | 139.05 | 42.69
Gr4PR_18_C | Hurt Feelings                | 195 | 136.56 | 39.41
Gr4PR_19_C | Billy and Spike              | 195 | 135.96 | 44.92
Gr4PR_20_C | The Rainy Day Jar            | 195 | 136.76 | 43.55
31
MC Reading Comprehension
  • We developed the MC Comprehension Tests in a
    two-step process
  • First, we wrote the stories that were used as the
    basis for each test
  • Then, we wrote the test items associated with
    each story
  • We embedded quality control and content review
    processes in both of these steps throughout
    instrument development
  • Stories were narrative fiction of approximately
    1500 words, with three types of items written
    from them: literal, inferential, and evaluative
  • 20 items per story were developed, with 6-7 items
    of each type noted above; 3 answer options were
    provided per item

32
Authors of MC Test
  • The lead author, who oversaw the creation and
    revision of the stories and test items earned her
    Bachelor of Arts degree in Literature from
    Carleton College in 1990, worked for twelve years
    as an English teacher in California public
    schools, was awarded National Board for
    Professional Teaching Standards certification in
    Adolescent and Young Adulthood English Language
    Arts in 2002, and was a Ph.D. candidate in the
    area of Learning Assessments / System Performance
    at the University of Oregon at the time the
    measures were created.
  • The item writer earned his Ph.D. in educational
    psychology, measurement, and methodology from the
    University of Arizona. He has worked in education
    at the elementary and middle school levels, as
    well as in higher education and at the state
    level. He held a position as associate professor
    in the distance learning program for Northern
    Arizona University and served as director of
    assessment for a large metropolitan school
    district in Phoenix, Arizona. In addition, he
    served as state Director of Assessment and Deputy
    Associate Superintendent for Standards and
    Assessment at the Arizona Department of
    Education. He was a test development manager for
    Harcourt Assessment and has broad experience in
    assessment and test development

33
Design of MC Test
  • We used a common-person / common-item piloting
    design
  • The 20 different forms of each grade level
    measure were clustered into 5 groups, with 5
    forms in each group
  • Each test grouping contained two overlapping
    forms, enabling concurrent analysis of all
    measures across the different student samples

34
Sample Analysis
Item Number | Raw Score | Count | Measure | Standard Error | Outfit Mean Square
1  | 88 | 95 | -1.78 | 0.42 | 0.37
2  | 86 | 95 | -1.47 | 0.37 | 0.50
3  | 90 | 95 | -2.18 | 0.48 | 0.41
4  | 62 | 95 |  0.52 | 0.24 | 1.12
5  | 71 | 95 | -0.05 | 0.26 | 1.03
6  | 25 | 95 |  2.53 | 0.25 | 2.32
7  | 72 | 95 | -0.13 | 0.27 | 0.97
8  | 75 | 95 | -0.35 | 0.28 | 0.94
9  | 74 | 95 | -0.27 | 0.28 | 0.61
10 | 48 | 95 |  1.29 | 0.23 | 1.20
11 | 64 | 95 |  0.40 | 0.25 | 1.06
12 | 58 | 95 |  0.75 | 0.24 | 1.04
13 | 74 | 95 | -0.27 | 0.28 | 0.84
14 | 77 | 95 | -0.51 | 0.29 | 0.80
15 | 66 | 95 |  0.28 | 0.25 | 0.92
16 | 42 | 95 |  1.60 | 0.23 | 1.19
17 | 80 | 95 | -0.78 | 0.31 | 0.75
18 | 67 | 95 |  0.21 | 0.25 | 0.91
19 | 76 | 95 | -0.43 | 0.28 | 0.99
20 | 60 | 95 |  0.64 | 0.24 | 0.99
35
Distractor Analysis
Entry | Data Code | Score Value | Count | %  | Average Measure | S.E. Mean
1     | A         | 0           | 2     | 2  | -0.77           | .27
      | C         | 0           | 5     | 5  | -0.37           | .26
      | B         | 1           | 88    | 93 |  1.50           | .13
      | Missing   |             |       |    |                 |
2     | C         | 0           | 4     | 4  | -0.39           | .32
      | B         | 0           | 4     | 4  | -0.27           | .51
      | A         | 1           | 86    | 91 |  1.53           | .13
      | Missing   |             | 1     | 1  |  0.24           |
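The distractor table above reports, for each response option, how many examinees chose it and the average ability (measure) of those examinees. A minimal sketch of that computation, with invented choices and abilities:

```python
import numpy as np

def distractor_averages(choices, abilities, options=("A", "B", "C")):
    """Average estimated ability (in logits) of the examinees who chose each option."""
    choices = np.asarray(choices)
    abilities = np.asarray(abilities, dtype=float)
    return {opt: (round(abilities[choices == opt].mean(), 2)
                  if np.any(choices == opt) else None)
            for opt in options}

# Hypothetical item keyed "B": most examinees choose the key, while a few
# lower-ability examinees choose the distractors
choices = ["B"] * 88 + ["A", "A"] + ["C"] * 5
abilities = [1.5] * 88 + [-0.8, -0.7] + [-0.4] * 5
print(distractor_averages(choices, abilities))   # {'A': -0.75, 'B': 1.5, 'C': -0.4}
```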
36
Getting Started
Menu
PM or BYOA
37
More on Getting Started
Grade Group to CBM   Difficulty
Measure
38
Fluency and Comprehension
39
Administering a Measure
40
Reporting Outcomes
41
Reporting Outcomes - Diagnostic
42
Instructional Records
43
Reporting Outcomes - Formative
44
Standards-Based IEPs
Alternate Assessments based on Modified Academic
Achievement Standards
  • Gerald Tindal
  • University of Oregon

45
Menu and Options
46
Overview
47
Flow Chart
48
Participation Options
49
Perceptions
50
Curriculum-Based Measures
51
Curriculum-Based Measures
52
IEP Goal Wizard
53
IEP Goal Wizard
54
IEP Goal
In Math, the content of the IEP goal in Data
Analysis and Probability is to propose and
justify conclusions and predictions that are
based on data and to design studies to further
investigate the conclusions or predictions.
The student will classify numbers, shapes, or
objects, 10 tasks, with a competency level of
90. The goal should be fulfilled by 2008-05-23.
The following contingencies apply: read-aloud
problems.
55
IEP Goal Analysis
56
CBM Sophistication
  • National Center on Student Progress Monitoring
    (studentprogress.org)
  • Intervention Central (interventioncentral.com)
  • Research Institute on Progress Monitoring
    (progressmonitoring.net)
  • National Research Center on Learning Disabilities
    (nrcld.org)
  • National Center on Response to Intervention
    (rti4success.org)
  • Microsoft Excel
  • Graph paper and pencil (a simple trend-line
    sketch follows this list)
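Whether charted in Excel or on graph paper, the core computation behind a CBM progress graph is the slope of the trend line through the weekly scores. A minimal least-squares sketch with hypothetical data (this is an illustration, not a tool referenced in the presentation):

```python
import numpy as np

def weekly_trend(scores) -> float:
    """Least-squares slope of weekly CBM scores (words correct per minute per week)."""
    weeks = np.arange(len(scores))
    slope, _intercept = np.polyfit(weeks, np.asarray(scores, dtype=float), 1)
    return slope

# Hypothetical eight weeks of oral reading fluency data
print(round(weekly_trend([25, 26, 28, 27, 30, 31, 33, 34]), 2))   # about 1.31 wpm/week
```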

57
(No Transcript)
58
(No Transcript)
59
(No Transcript)
60
(No Transcript)
61
(No Transcript)
62
Data Collection and Charting
[Chart: Ruso, East Elementary, South Iowa. Baseline data (roughly 4-13 correct) charted against a goal line toward the expected level of performance; y-axis 10-100, x-axis marked in months.]
63
Data Collection and Charting
[Chart: Ruso, East Elementary, South Iowa. Baseline data followed by intervention phases 1 and 2; scores rise to roughly 15-25 correct, charted against the goal line; x-axis marked in months.]
64
Data Collection and Charting
[Chart: Ruso. Baseline data followed by intervention phases 1, 2, and 3; scores rise to roughly 35-44 correct, charted against the goal line; x-axis marked in months.]
65
Instructional Decision Making
Instructional Intervention Plan
Student: Jacob
Goal Area: Reading
Intervention Designer / Advisor: M. Ikeda / G. Tindal; Tammy Tyler
Decision Making Plan: Data will be collected at least once per week and
charted. If three consecutive data points fall below the goal line, the
problem-solving team will reconvene and an instructional change will be
made. (A sketch of this decision rule follows the plan.)

Phase 1
  • Instructional Procedure: Vocabulary preteaching
  • Materials: Grade-level vocabulary
  • Arrangements: During small-group reading in the classroom; time added
    to Jacob's group each day for this instruction
  • Time: 15 minutes daily
  • Motivational Strategies: Verbal praise

Phase 2
  • Instructional Procedure: Instruction provided by general and special
    education teacher; phonemic awareness training; begin rereadings of
    passages
  • Materials: Word Walk curriculum; 5.0 reading passages
  • Arrangements: Special education teacher will co-teach; small groups
    will rotate between teachers, increasing teacher contact time
  • Time: 30 minutes daily
  • Motivational Strategies: Verbal praise, classroom motivators

Phase 3
  • Instructional Procedure: Same instructional procedures; add oral
    reading time each day
  • Materials: Same, plus trade books
  • Arrangements: At the end of each day, Jacob will read orally to the
    resource teacher
  • Time: 15 minutes daily
  • Motivational Strategies: Verbal praise, classroom motivators
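A minimal sketch of the "three consecutive data points below the goal line" rule from the decision-making plan; the 1-word-per-week goal line and the weekly scores are invented for illustration.

```python
def needs_instructional_change(scores, goal_line, run_length=3):
    """Return True once `run_length` consecutive data points fall below the
    goal line. `scores` and `goal_line` are parallel lists of weekly values."""
    below = 0
    for observed, expected in zip(scores, goal_line):
        below = below + 1 if observed < expected else 0
        if below >= run_length:
            return True
    return False

# Hypothetical: goal line rising 1 word per week from 25; observed weekly scores
goal = [25 + week for week in range(8)]          # 25, 26, ..., 32
scores = [26, 27, 26, 27, 27, 26, 27, 28]        # falls below the line from week 3 on
print(needs_instructional_change(scores, goal))  # True: reconvene the team
```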
66
IEP Goals: Example 2
  • Juarez attends a 5-6-7 building. He has a natural
    transition between Grades 7 and 8.
  • Juarez is currently reading 80 words per minute
    in Grade 5 material. In order to increase the
    likelihood that he scores proficient on the State
    test, Juarez needs to increase his proficiency to
    120 words per minute in grade-level material. By
    June 2010, given passages from 7th grade reading
    curriculum material, Juarez will read 120 words
    correct in one minute with five or fewer errors.
    (The required growth rate is sketched below.)
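The goal above implies a required rate of growth. A small sketch of that arithmetic; the assumption of two instructional years of roughly 36 weeks each is mine, not stated in the slide.

```python
def required_weekly_growth(current_wpm: float, goal_wpm: float, weeks: int) -> float:
    """Words correct per minute of growth needed each week to reach the goal."""
    return (goal_wpm - current_wpm) / weeks

# Juarez: 80 wpm now, 120 wpm goal, spread over about two 36-week school years
print(round(required_weekly_growth(80, 120, 72), 2))   # about 0.56 words/week
```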

67
Data Collection and Charting
[Chart: Ruso. Multi-year progress monitoring (Years 1-3) toward an expected level of performance of about 120 words correct per minute, with the goal line shown; y-axis 60-140, x-axis marked in months.]
68
Data Collection and Charting
[Chart: Ruso. The multi-year chart continued (Years 1-3 toward a goal of about 120 words correct per minute).]
69
CBM in a Standards Based IEP
  • Grade level reference
  • Content Standard Aligned
  • Performance benchmarks
  • Multiple measures
  • Sensitive to growth
  • Established research base
  • High expectations
  • Focus on Instruction

70
CBM Cautions
  • Requires thoughtful decisions
  • Present level
  • Projected level
  • A metric may not be diagnostic
  • May not align with all relevant standards
  • Balanced instruction, not teaching to the test

71
Iowa's Accountability System for Students with
Disabilities
  • Core Content Standards and Benchmarks
  • Fair measurement of student knowledge
  • Alternate Assessment
  • Alternate Achievement Standards
  • Without Accommodations
  • With Accommodations