1
PROGRESS MONITORING with the WRAT4-PMV
  • Gale H. Roid, PhD and Mark F. Ledbetter, PsyD

2
Outline of Workshop
  • Why progress monitoring?
  • Review of newest IDEA and RTI criteria
  • CBM/DIBELS versus improved models
  • WRAT4-PMV: Design, administration, scoring,
    research, uses
  • Case studies
  • Recommended applications

3
Why Progress Monitoring?
  • Early failure in reading ripples through upper
    grades and other curriculum areas
  • New Individuals with Disabilities Education Act
    (IDEA) and No Child Left Behind Act (NCLB)
    guidelines suggest progress monitoring within the
    response to intervention (RTI) model
  • National Assessment of Educational Progress
    (NAEP) shows 37% of fourth graders are below the
    basic level in reading skills

4
Benefits of Intervention with Progress Monitoring
  • Two types of problem readers1
  • Good oral language; poor phonic skills
  • Lower socioeconomic status (SES) with broad
    weaknesses
  • Two third graders from the northwest given
    intensive tutoring with frequent brief tests
  • Daron: Primary to Grade 3 oral reading in 14
    months
  • Mia: Grade 1 to Grade 3 in 13 months
  • 1 Torgesen, J. K. (2004, Fall). Preventing early
    reading failure and its devastating downward
    spiral. American Educator, 28.

5
Progress Monitoring in NCLB, RTI, and IDEA
  • Adequate yearly progress (AYP) in special
    education
  • Monitoring changes in classroom instruction (Tier
    2 of RTI)
  • Intensive assessment in Tier 3 for possible
    special education

6
History of the RTI Model
  • According to Heller, Holtzman, and Messick
    (1982),2 there are three criteria for judging the
    validity of special education placements3
  • General education classroom OK?
  • Special education more effective?
  • Is the assessment method accurate?
  • 2 Heller, K. A., Holtzman, W. H., & Messick, S.
    (Eds.). (1982). Placing children in special
    education: A strategy for equity. Washington, DC:
    National Academy Press.
  • 3 Fuchs, L. S., & Vaughn, S. R. (2006, March).
    Response to intervention as a framework for the
    identification of learning disabilities. NASP
    Communiqué, 34, 1-6.

7
History of the RTI Model (cont.)
  • Three-phase adaptation of Heller et al.'s plan4
  • Students' rate of growth in general education
  • Low-performing students' response to better
    instruction
  • Intensive assessment and further response to
    evidence-based instruction
  • 4 Fuchs, L. S., & Fuchs, D. (1998). Treatment
    validity: A unifying concept for
    reconceptualizing the identification of learning
    disabilities. Learning Disabilities Research and
    Practice, 13, 204-219.

8
History of the RTI Model (cont.)
  • Three-tiered prevention model5,6,7
  • Tier 1: Screening in general education
  • Tier 2: Fixed-duration remediation with progress
    monitoring
  • Tier 3: Assessment for special education using
    progress monitoring
  • 5 Individuals with Disabilities Education
    Improvement Act of 2004 (IDEA) (2004). Public Law
    No. 108-446, § 632, 118 Stat. 2744.
  • 6 Vaughn, S., Linan-Thompson, S., & Hickman, P.
    (2003). Response to instruction as a means of
    identifying students with reading/learning
    disabilities. Exceptional Children, 69, 391-409.
  • 7 Gresham, F. M. (2002). Responsiveness to
    intervention: An alternative approach to the
    identification of learning disabilities. In R.
    Bradley, L. Danielson, & D. P. Hallahan (Eds.),
    Identification of learning disabilities: Research
    to practice (pp. 467-519). Mahwah, NJ: Erlbaum.

9
CBM and DIBELS
  • 1975: Stanley Deno (University of Minnesota)
    develops easy-to-use basic skills assessments for
    teachers
  • 1976 to 2005: Deno's graduate students Lynn Fuchs
    (Vanderbilt), Gerald Tindal (Univ. of Oregon),
    Mark Shinn, and others continue development of
    curriculum-based measurement (CBM); major federal
    grant support
  • 1998: Roland Good's Dynamic Indicators of Basic
    Early Literacy Skills (DIBELS)
  • 2004: IDEA reauthorization recommends CBM (see
    http://IDEA.ed.gov)

10
Attributes of the Best CBM4
  • Easy-to-use individual or small group tests that
    teachers understand
  • Measures improvement over time
  • Brief tests given frequently
  • Assesses program effectiveness
  • No progress → changes in instruction

11
Attributes of the Best CBM (cont.)8,9
  • Word reading performance is highly related to
    other CBM measures (e.g., fluency,
    comprehension), especially in Grades 1-3
  • Feedback to teachers and students is not enough;
    guidance and follow-up on methods of reading
    instruction are also necessary.
  • 8 Hosp, M. K., & Fuchs, L. S. (2005). Using
    CBM as an indicator of decoding, word reading,
    and comprehension: Do the relations change with
    grade? School Psychology Review, 34, 9-26.
  • 9 Graney, S. B., & Shinn, M. R. (2005). Effects
    of reading curriculum-based measurement (R-CBM)
    teacher feedback in general education
    classrooms. School Psychology Review, 34, 184-201.

12
Limitations of Some CBM Applications
  • Criterion-referenced CBM may lack grade-based
    expectations (norms)
  • CBM test forms are not always statistically
    equivalent (variation in difficulty)
  • Scores are not always suitable for judging program
    effectiveness or for across-grade comparisons
  • Available CBM tests do not extend to the upper
    grades

13
WRAT4-PMV: Features and Benefits
  • Simple and easy to use
  • Long tradition in special education
  • Four subtests: Word Reading, Sentence
    Comprehension, Spelling, and Math Computation
  • Allows dual comparisons:
  • Rate of growth of the student
  • National norms for grade-level expectations

14
WRAT4-PMV: Features and Benefits (cont.)
  • Four equivalent test forms containing 15 items at
    each level (six levels)
  • Covers Grades K-12 and college
  • Across-grade Level Equivalent (LE) scores are
    available
  • Computer scoring program is available

15
Design of WRAT4-PMV
  • Four forms for each level
  • Four subtests: Word Reading, Sentence
    Comprehension, Spelling, and Math Computation
  • Six levels (mapped in the sketch below):
  • - Level 1: Grades K-1
  • - Level 2: Grades 2-3
  • - Level 3: Grades 4-5
  • - Level 4: Grades 6-8
  • - Level 5: Grades 9-12
  • - Level 6: Grades 13-16 (i.e., college)
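The level structure above is essentially a lookup from grade to starting test level. A minimal sketch of that lookup (a hypothetical helper, not part of the WRAT4-PMV materials; K is coded as grade 0) might look like this:

```python
def wrat4_pmv_level(grade: int) -> int:
    """Map a grade (K coded as 0, college as 13-16) to a test level.

    Hypothetical helper based on the level table above; the examiner
    may still adjust the start point, since out-of-level testing is OK.
    """
    bands = [
        (0, 1, 1),    # Level 1: Grades K-1
        (2, 3, 2),    # Level 2: Grades 2-3
        (4, 5, 3),    # Level 3: Grades 4-5
        (6, 8, 4),    # Level 4: Grades 6-8
        (9, 12, 5),   # Level 5: Grades 9-12
        (13, 16, 6),  # Level 6: Grades 13-16 (college)
    ]
    for low, high, level in bands:
        if low <= grade <= high:
            return level
    raise ValueError(f"Grade {grade} is outside Grades K-16")

print(wrat4_pmv_level(2))  # -> 2 (a second grader starts at Level 2)
```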

16
Test Administration: Word Reading
  • Start at the grade level, then adjust
    (out-of-level testing is OK)
  • Present the card with letters and words
  • Say, "Look... read across."
  • If a response is not clear, say, "Please say the
    word again."

17
Sample Test Form: Word Reading, Level 3 (Grades
4-5)
18
Test Administration: Sentence Comprehension
  • Find the missing word.
  • Present the sample card and see if the student
    finds the missing word
  • Read the other sample sentences
  • Student silently reads the remaining sentences in
    the subtest

19
Test Administration: Sentence Comprehension (cont.)
  • Mark and score responses

20
Test Administration: Spelling
  • Spell the word in context
  • Write (or print) letters or words
  • You read the word by itself, then read the word
    in a sentence
  • Student uses Response Booklet to write responses

21
Sample Response Booklet: Spelling, Level 2 (Grades
2-3)
22
Test Administration: Math Computation
  • Oral math for Grades K-5 (Levels 1-3): "Show me 3
    fingers."
  • Math calculation problems:
  • Level 1: 7 or 8 items
  • Level 2: 10 or 11 items
  • Level 3: 13 items
  • Levels 4-6: 15 items
  • Student uses Response Booklet
  • No calculators

23
Sample Oral Math Card: Levels 1-3 (Grades K-5)
24
Sample Examiner Instructions: Math Computation
Card, Level 2 (Grades 2-3)
25
Scoring: Plot Raw Scores on the Profile to
Monitor Progress
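To illustrate the idea behind the profile (an invented example, not the PMV profile form itself), a short matplotlib sketch can chart a student's raw scores across equivalent forms against a grade-norm line:

```python
import matplotlib.pyplot as plt

# Invented data: Word Reading raw scores (15 items per form) across
# four equivalent test forms given over a school year.
occasions = [1, 2, 3, 4]
student_raw = [5, 7, 8, 10]   # hypothetical student scores
grade_norm = [8, 9, 10, 11]   # hypothetical grade-level expectation

plt.plot(occasions, student_raw, marker="o", label="Student")
plt.plot(occasions, grade_norm, linestyle="--", label="Grade norm")
plt.xlabel("Testing occasion (equivalent forms)")
plt.ylabel("Raw score (0-15)")
plt.title("Progress-monitoring profile (illustrative)")
plt.legend()
plt.show()
```

A rising student line that converges on the norm line suggests the intervention is working; a flat line that stays below the norms is the pattern examined in the LD criteria later in the workshop.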
26
Score Difference Tables
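The difference tables indicate how large a raw-score change must be before it exceeds measurement error. One standard way such thresholds are derived, assumed here for illustration rather than taken from the PMV manual, uses the standard error of measurement, SEM = SD * sqrt(1 - r):

```python
import math

def reliable_difference(sd: float, reliability: float, z: float = 1.96) -> float:
    """Smallest difference between two scores exceeding measurement error.

    Standard SEM-based sketch, assumed for illustration; use the
    WRAT4-PMV score difference tables in practice.
    """
    sem = sd * math.sqrt(1.0 - reliability)  # standard error of measurement
    return z * sem * math.sqrt(2.0)          # SE of a difference of two scores

# Invented values: SD = 3 raw-score points, test-retest r = .90
print(round(reliable_difference(3.0, 0.90), 2))  # -> 2.63 points
```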
27
Technical Aspects: Reliability
  • High level of reliability in Grades K-12
  • Test-retest: 30-day practice effect less than .5
    point

28
Technical Aspects: Test Form Equivalence
  • Nearly perfect equivalence among the four test
    forms at all levels
  • Gulliksen method10 with Wilks' Lambda11

10 Gulliksen, H. (1950). Theory of mental tests.
New York: Wiley.
11 Wilks, S. S. (1932). Certain generalizations in
the analysis of variance. Biometrika, 24, 471-494.
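The published analysis used the Gulliksen method with Wilks' Lambda. As a rough stand-in (not the published procedure), a one-way ANOVA across the four forms can flag mean differences in a pilot sample; the data below are invented:

```python
from scipy.stats import f_oneway

# Invented raw scores from the same pilot group on the four forms.
form_a = [8, 10, 9, 11, 7, 10]
form_b = [9, 10, 8, 11, 8, 9]
form_c = [8, 11, 9, 10, 7, 10]
form_d = [9, 9, 8, 12, 8, 10]

stat, p = f_oneway(form_a, form_b, form_c, form_d)
# A large p-value is consistent with, but does not prove, equivalence.
print(f"F = {stat:.2f}, p = {p:.3f}")
```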
29
Technical Aspects: Validity
30
Technical Aspects: Word Reading and LD
  • Study of 30 students with reading learning
    disability (LD)
  • SD difference in scores of LD versus controls:
    .5-1.00 (usually 2 raw score points)
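The .5-1.00 SD gap is an effect size. A minimal Cohen's d computation with invented scores (not the study's data) shows how such a figure is obtained:

```python
import statistics

def cohens_d(group1, group2):
    """Cohen's d using a pooled sample standard deviation."""
    n1, n2 = len(group1), len(group2)
    s1, s2 = statistics.stdev(group1), statistics.stdev(group2)
    pooled = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(group1) - statistics.mean(group2)) / pooled

# Invented Word Reading raw scores: controls vs. students with LD,
# about 2 raw-score points apart, as on the slide.
controls = [10, 11, 9, 12, 10, 11]
ld_group = [8, 9, 8, 10, 7, 9]
print(round(cohens_d(controls, ld_group), 2))  # -> 1.91 with these values
```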

31
Developmental Trends in Level Equivalent Scores
32
Case Example 1: Ananta, Grade 2, Catching Up
33
Dual Criteria for LDs
  • Look for two trends4 (both are checked in the
    sketch below)
  • Shows no improvement: a flat profile, based on the
    slope of the graph line
  • Performs below grade level despite classroom
    interventions: the graph line stays below the
    grade norms
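Both criteria can be checked numerically from a progress-monitoring record: fit a slope to the scores over time and compare each score with the grade norm. The sketch below uses invented data and an arbitrary slope cutoff, not the workshop's or IDEA's criteria:

```python
import numpy as np

def dual_discrepancy(weeks, scores, norms, slope_cutoff=0.1):
    """Flag the dual-discrepancy pattern: flat growth AND below norms.

    Illustrative only; the cutoff is invented, not an official criterion.
    """
    slope = np.polyfit(weeks, scores, 1)[0]           # raw-score gain per week
    flat = slope < slope_cutoff                       # trend 1: no improvement
    below = all(s < n for s, n in zip(scores, norms)) # trend 2: below grade norms
    return flat and below

weeks = [1, 4, 8, 12]
scores = [5, 5, 6, 5]    # invented student raw scores
norms = [8, 9, 10, 11]   # invented grade-level expectations
print(dual_discrepancy(weeks, scores, norms))  # -> True: consider Tier 3
```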

34
Case Example 2: Grade 3, Flat Profile, Dual
Discrepancy
35
Case Example 3: Julio, Grade 4, Progress Across
Grades
36
Applications of the WRAT4-PMV
  • Monitoring students identified by NCLB
  • Measuring RTI in Tier 2 (fixed duration
    remediation)
  • Verification of qualification for special
    education (Tier 3)
  • Long-term progress monitoring in special
    education (AYP)

37
Applications of the WRAT4-PMV (cont.)
  • See the reference list handout for examples of
    empirically based instructional interventions
  • Five methods of reading intervention12
  • - Repeated reading: Read the passage twice
  • - Listening passage preview: You read it; have the
    student follow with a finger
  • - Phrase drill: Read error words; the student
    repeats them three times
  • - Syllable segmentation: Read each syllable
  • - Reward contingency: Reward if the score is
    improved
  • 12 Daly, E. J., Persampieri, M., McCurdy, M., &
    Gortmaker, V. (2005). Generating reading
    interventions through experimental analysis of
    academic skills: Demonstration and empirical
    evaluation. School Psychology Review, 34, 395-414.

38
Sample Report From the WRAT4-PMV Scoring Program
39
Sample Report From the WRAT4-PMV Scoring Program
(cont.)
40
Sample Report From the WRAT4-PMV Scoring Program
(cont.)
41
Sample Report From the WRAT4-PMV Scoring Program
(cont.)
42
Sample Report From the WRAT4-PMV Scoring Program
(cont.)
43
Sample Report From the WRAT4-PMV Scoring Program
(cont.)
44
For More Information
  • See sample materials after the workshop.
  • Visit www.parinc.com and click on Assessment
    Consultants to contact a sales representative or
    to arrange a workshop in your school district.