1
Putting the R in RtI: Assessing Student
Responsiveness through Norming, Screening, and
Progress Monitoring
  • Summer RtI Institute
  • July 30-31, 2007
  • Amanda Albertson, M. A.
  • Courtney LeClair, M. A.
  • Stephanie Schmitz, Ed.S.

2
Agenda
  • Assessment
  • Curriculum-Based Measurement
  • Norming
  • Uses
  • Strengths & Limitations
  • Procedures and Tips
  • Screening
  • Choosing a measure
  • Procedures and Tips
  • Decisions
  • Progress Monitoring
  • Procedures
  • Data examples
  • Decisions
  • RtI and Special Education Placement

3
Direct Assessment of Academic Skills
  • Curriculum-Based Measurement (CBM)
  • Contents of the assessment are based on the
    instructional curriculum.
  • Measures are presented in a standardized format.
  • Material for assessment is controlled for
    difficulty by grade levels.
  • Measures are generally brief.
  • Shapiro, E. S. (2004). Academic skills problems: Direct assessment and intervention (3rd ed.). New York: The Guilford Press.

4
Curriculum-Based Measurement (cont.)
  • Advantages
  • Can be used efficiently by teachers
  • Produces accurate, meaningful information to
    index growth
  • Answers questions about the effectiveness of
    programs in producing academic growth
  • Provides information to help teachers plan better
    instructional programs
  • Fuchs, L. S., & Fuchs, D. (1997). Use of curriculum-based measurement in identifying students with disabilities. Focus on Exceptional Children, 30(3), 1-15.

5
Norming (a.k.a. Obtaining Normative Data)
6
Normative Data
  • Provide information on student levels and range
    of performance at different grades, by indexing
    achievement cross-sectionally
  • Provide appropriate standards for weekly rates
    of academic growth
  • Fuchs, L., & Fuchs, D. (1993). Formative evaluation of academic progress: How much growth can we expect? School Psychology Review, 22(1), 1-30.

7
Uses of Local Normative Data
  • Make decisions about referred students
  • Report individual and/or group scores to
    teachers, parents, or other agencies
  • Proactively identify students who aren't keeping up with peers or benchmarks
  • Detect academic and behavioral trends over time
  • Bollman, K., & Johnson, C. Used with permission from FSDS.org; based on Stewart & Kaminski (2002). Full citation in the references.

8
Strengths of Local Normative Data
  • Decrease the likelihood of bias in decision
    making
  • Provide meaningful comparison group
  • Promote identification of educational needs in a
    systematic problem-solving orientation
  • Follow changing patterns of local performance
  • Make performance expectations and ranges clear
  • Bollman, K., & Johnson, C. Used with permission from FSDS.org; based on Stewart & Kaminski (2002).

9
Limitations of Local Normative Data
  • Threat of Misinterpretation
  • Sample measurement tasks must be defined
  • Small sample can cause the norms to be unstable
  • Local performance is not necessarily acceptable
  • May use empirically derived benchmark rates to determine if a student's performance is acceptable
  • Local norms may not necessarily advocate the use
    of certain curricula
  • Norms show level of performance and rate of
    growth in curricula
  • Bollman, K., & Johnson, C. Used with permission from FSDS.org; based on Stewart & Kaminski (2002).

10
Steps in Developing Local Norms
  • 1. Identify norm sample
  • 2. Choose materials
  • 3. Decide who and how many students will be
    assessed
  • 4. Collect the data
  • 5. Organize the data for use
  • Bollman, K., & Johnson, C. Used with permission from FSDS.org; based on Stewart & Kaminski (2002).

11
1. Identify norm sample
  • 3 Basic Levels
  • Classroom
  • School-Building
  • School-District
  • Consider
  • The decisions for which the data will be used
  • Amount of curriculum chaos in the district
  • Political and economic structure of the area
  • Characteristics of the population
  • Economic and other resources available
  • Bollman, K., & Johnson, C. Used with permission from FSDS.org; based on Stewart & Kaminski (2002).

12
2. Choosing Norming Measurement Tools
  • Tools should
  • Be reliable
  • Be accurate
  • Have relatively normal distributions
  • Be sensitive to change
  • Provide enough opportunities to respond (limit
    ceiling effects)
  • Have standardized administration and scoring
  • Reliably differentiate student level of skill
  • Be time efficient
  • Be affordable
  • Provide data important to general education
    expectations
  • Bollman, K., & Johnson, C. Used with permission from FSDS.org; based on Stewart & Kaminski (2002).

13
Examples of Norming Measurement Tools
  • Dynamic Indicators of Basic Early Literacy Skills (DIBELS; http://dibels.uoregon.edu/)
  • Reading
  • K-6
  • Spanish and English
  • AIMSweb (www.aimsweb.com)
  • Reading
  • Spanish and English
  • Math
  • Written Expression
  • K-8

14
3. Implement a Sampling Plan
  • Balance the resources available,
    representativeness of the sample, and the
    information desired
  • Some questions can be answered without testing
    every child every year
  • Your questions should drive the sampling plan!
  • Bollman, K., & Johnson, C. Used with permission from FSDS.org; based on Stewart & Kaminski (2002).

15
Implement a Sampling Plan
  • Classroom Norms
  • Minimum of 7-10 students
  • Selected randomly (every nth student on the list; see the sketch at the end of this slide)
  • Selected randomly from a pool of "typical" students
  • Building Norms
  • Minimum of 15-20% of students in each grade
  • Minimum of 20 students per grade
  • Selected randomly
  • To compute percentile ranks, a minimum of 100
    students per grade is needed
  • District Norms
  • Random sample of 100 students per grade
  • Bollman, K., & Johnson, C. Used with permission from FSDS.org; based on Stewart & Kaminski (2002).
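
As an illustration of the "every nth student" selection above, here is a minimal sketch of systematic random sampling in Python. The roster size, sample size, and function name are all hypothetical, not part of the original materials.

    from random import randrange

    def systematic_sample(roster, n_needed):
        # Take every nth name from the list, starting at a random offset,
        # until the requested number of students is reached.
        step = max(1, len(roster) // n_needed)
        start = randrange(step)
        return roster[start::step][:n_needed]

    roster = ["Student %02d" % i for i in range(1, 29)]  # hypothetical 28-student class
    print(systematic_sample(roster, 8))                  # 8 of the 28 students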

16
National vs. Local Norms
  • National norms require less time and effort
  • Don't have to collect normative data
  • National norms are readily accessible
  • Local norms are more representative of your
    population
  • Local norms are more sensitive
  • Local norms allow you to choose the materials
    that are most appropriate to your
    building/district

17
4. Collect the Data
  • Trimester norming (Fall, Winter, Spring)
  • Use equivalent but not identical materials each
    time
  • Prepare student and examiner materials ahead of
    time
  • Examiners should be trained to administer and
    score
  • Determine suitable locations for testing
  • Determine appropriate dates for testing
  • Bollman, K., & Johnson, C. Used with permission from FSDS.org; based on Stewart & Kaminski (2002).

18
5. Organize Data for Use
  • Data can be summarized at four levels:
  • Individual student: raw scores
  • Classroom: ranges of scores, medians, and rank orderings
  • Building: ranges of scores, medians, rank orderings, and percentile ranks
  • District: ranges of scores, descriptive statistics, within-grade frequency distributions, percentile ranks, and across-grade comparisons
  • Bollman, K., & Johnson, C. Used with permission from FSDS.org; based on Stewart & Kaminski (2002).

19
Computing Percentile Ranks
  • 1. Construct a frequency distribution of the raw
    scores
  • 2. For a given raw score, determine the
    cumulative frequency for all scores lower than
    the score of interest
  • 3. Add half the frequency for the score of
    interest to the cumulative frequency value
    determined in Step 2
  • 4. Divide the total by N, the number of examinees in the norm group, and multiply by 100 (see the sketch below)
  • Crocker, L., & Algina, J. (1986). Introduction to classical and modern test theory. New York: Holt, Rinehart and Winston.
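
The four steps above map directly onto code. Below is a minimal sketch in Python; the norm-group scores and the function name are illustrative, not taken from the presentation.

    from collections import Counter

    def percentile_rank(scores, score):
        # Step 1: frequency distribution of the raw scores
        freq = Counter(scores)
        # Step 2: cumulative frequency of all scores below the score of interest
        below = sum(f for s, f in freq.items() if s < score)
        # Steps 3-4: add half the frequency at the score of interest,
        # divide by N, and multiply by 100
        return (below + freq.get(score, 0) / 2) / len(scores) * 100

    # Hypothetical norm group of 20 oral reading fluency scores
    norm_group = [3, 5, 6, 8, 8, 9, 11, 11, 13, 14,
                  15, 16, 17, 18, 19, 21, 23, 25, 28, 31]
    print(percentile_rank(norm_group, 14))  # 47.5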

20
Organizing Data for Use
21
Universal Screening
22
Universal Screening
  • A classroom-wide, school-wide, or district-wide assessment of all students, used to identify those who are at risk for academic failure or behavioral difficulties and who could potentially benefit from specific instruction or intervention.
  • National Association of State Directors of Special Education, Inc. (2005). Response to Intervention: Policy considerations and implementation. New York, NY: The Guilford Press.

23
Choosing a Screening Measure
  • Compatibility with local service delivery needs
  • Alignment with constructs of interest
  • Theoretical and empirical support
  • Population fit
  • Practical to administer
  • Glover, T. A., & Albers, C. A. (2007). Considerations for evaluating universal screening assessments. Journal of School Psychology, 45, 117-135.

24
Choosing a Screening Measure
  • Appropriately standardized for use with the
    target population
  • Consistent in measurement
  • Accurate in its identification of individuals at
    risk

25
Examples of Screening Measures
  • CBM
  • Dynamic Indicators of Basic Early Literacy Skills (DIBELS; http://dibels.uoregon.edu/)
  • AIMSweb (www.aimsweb.com)
  • Teacher recommendations
  • Classroom assessments
  • National assessments (e.g., MAT)
  • Report card rubrics

26
Pre-Screening Procedures with CBM
  • 1. Decide who will conduct the screening.
  • 2. Ensure that the individuals who are
    administering the screening have been trained in
    using the chosen CBM materials.
  • 3. Organize CBM materials (e.g., make sure there
    are enough, write student names on them, etc.).
  • 4. Decide whether to use local or national
    (published) norms to determine which students
    need additional academic assistance.
  • 5. Ensure that you give the type of probe recommended for that specific grade level and time of year.

27
Possible DIBELS probes
  • [Example DIBELS chart: recommended probe types by grade level and time of year]

28
CBM Screening Tips
  • Reading measures need to be administered
    individually. It is best to have several
    administrators and to bring entire classrooms
    into a central location at one time.
  • Math and writing can be administered to students
    as a group, so administer these probes to entire
    classrooms.
  • It is also helpful to prepare materials in advance so that each student has their own copy with their name on it.

29
Post-Screening Procedures
  • 1. Enter student scores into a computer program
    (e.g., Excel) that can easily sort the data.
  • 2. Sort the data so that students are
    rank-ordered.
  • 3. Determine which students fell below the previously specified cut-off (a sketch of steps 2-3 follows).
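
A minimal sketch of steps 2 and 3 in Python, using scores like those in the example spreadsheet on the next slide. The cutoff of 10 words correct per minute is an illustrative assumption, not a published benchmark.

    # Rank-order median ORF scores and flag students below the cut-off.
    screening = {"Student A": 3, "Student B": 5, "Student C": 6, "Student D": 8,
                 "Student E": 8, "Student F": 9, "Student G": 11, "Student H": 11}
    CUTOFF = 10  # previously specified cut-off (from local or national norms)

    for student, score in sorted(screening.items(), key=lambda kv: kv[1]):
        flag = "  <-- below cutoff" if score < CUTOFF else ""
        print("%s: %d%s" % (student, score, flag))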

30
Example Spreadsheet
  Student      Median ORF (words correct per minute)
  Student A     3
  Student B     5
  Student C     6
  Student D     8
  Student E     8
  Student F     9
  Student G    11
  Student H    11
  Student I    13
  Student J    14
  Student K    15
  Student L    16
  Student M    17
  Student N    18
  Student O    19
31
Screening Results Example
32
Screening Decisions
  • For students who fall below the pre-specified cutoff:
  • Based on scores, supporting documentation, and
    prior knowledge of student abilities, determine
    the necessary educational intervention.
  • Decide who is going to implement the
    intervention(s).
  • Decide who is going to monitor student progress
    over time.

33
Progress Monitoring
34
Progress Monitoring
  • The practice of assessing students to determine
    if academic or behavioral interventions are
    producing desired effects.
  • Provides critical information about student
    progress that is used to ensure the use of
    effective educational practices and to verify
    that students are progressing at an adequate
    rate.
  • National Association of State Directors of Special Education, Inc. (2005). Response to Intervention: Policy considerations and implementation. New York, NY: The Guilford Press.

35
Progress Monitoring
  • Those students who did not make the screening
    cutoff will be monitored on a frequent (generally
    once per week) basis.
  • It is recommended that the same form of CBM be
    used for screening and progress monitoring.
  • Use the recommended form for the student's grade and time of year.

36
Progress Monitoring
  • Typically occurs at least once per week
  • Provides ongoing information regarding student
    progress
  • Can be used to determine whether interventions
    need to be strengthened or modified

37
Progress Monitoring Procedures
  • 1. Based upon the norms you have decided to use and each student's screening results, set a goal for each student.
  • This goal should reflect an average gain per week as determined by the norms that you are using (see the goal-setting sketch below).
  • 2. Once the student's intervention has begun, monitor the student's progress once per week.
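
A minimal sketch of step 1, assuming the chosen norms supply an expected weekly gain. Every number here (the baseline scores, the 1.5 words-per-week gain, the 18-week period) is an illustrative assumption, not a value from a norm table.

    from statistics import median

    baseline_probes = [8, 10, 9]  # three baseline/screening ORF scores (hypothetical)
    weekly_gain = 1.5             # expected gain per week from the chosen norms (assumed)
    weeks = 18                    # length of the intervention period (assumed)

    goal = median(baseline_probes) + weekly_gain * weeks
    print("Goal: %.0f words correct per minute" % goal)  # Goal: 36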

38
Progress Monitoring Procedures (cont.)
  • 3. Graph the student's scores (e.g., words read correctly per minute, correct writing sequences, digits correct) on a chart.
  • 4. Periodically review the chart to determine whether progress is being made.
  • 5. After the student has been in an intervention for a specified amount of time, hold a meeting with your decision-making team.
  • Look at the level and the rate of progress (a sketch for estimating the rate follows)
  • Determine whether the goal was attained and/or exit criteria were met
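
One common way to quantify the rate of progress in step 5 is the slope of a straight line fit to the weekly scores. The sketch below uses an ordinary least-squares slope; the eight weekly scores are illustrative. Comparing the observed rate against the weekly gain built into the goal tells the team whether the student is on track.

    from statistics import mean

    def weekly_slope(scores):
        # Least-squares slope of score vs. week number:
        # the student's observed gain per week.
        weeks = range(len(scores))
        xbar, ybar = mean(weeks), mean(scores)
        num = sum((x - xbar) * (y - ybar) for x, y in zip(weeks, scores))
        den = sum((x - xbar) ** 2 for x in weeks)
        return num / den

    scores = [9, 11, 10, 13, 14, 16, 15, 18]  # eight weekly ORF scores
    print("Observed rate: %.2f per week" % weekly_slope(scores))  # 1.21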

39
Progress Monitoring Example 1
[Graph: baseline and intervention phases; scores rise at an adequate rate and level during intervention]
40
Progress Monitoring Decisions (Example 1)
  • What you can do in this situation:
  • Continue with the intervention and monitoring.
  • Continue with the intervention and monitor less frequently.
  • Discontinue the intervention but monitor to ensure that progress doesn't cease or reverse.

41
Progress Monitoring Example 2
[Graph: baseline and intervention phases; scores do not improve at an adequate rate or level during intervention]
42
Progress Monitoring Decisions Example 2
  • A decision needs to be made in this situation:
  • 1. Modify the current intervention, or
  • 2. Implement a different intervention in place of the current intervention.

43
Progress Monitoring Examples
  • In example 1, adequate rate and level were being achieved.
  • The team will decide whether or not to continue
    to monitor student progress.
  • The student will still be involved in universal
    screenings.

44
Progress Monitoring Examples
  • In example 2, neither adequate rate nor level was being achieved.
  • It is necessary to modify the current
    intervention or introduce a new intervention.
  • Progress monitoring is still necessary.

45
Progress Monitoring Example 2
  • Establish a new goal based on the last three data points obtained by the student (see the sketch below).
  • After the intervention is modified or a new intervention is implemented, progress monitoring continues until the next evaluation period.
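
A minimal sketch of resetting the goal, here using the median of the last three data points as the new baseline. The 1.5-per-week gain and the 12 remaining weeks are illustrative assumptions.

    from statistics import median

    progress_scores = [9, 10, 9, 11, 10, 12]     # weekly scores under intervention 1
    new_baseline = median(progress_scores[-3:])  # median of the last three data points
    new_goal = new_baseline + 1.5 * 12           # assumed gain/week * remaining weeks
    print(new_baseline, new_goal)                # 11 29.0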

46
Progress Monitoring Example 2a
[Graph: baseline, intervention 1, and intervention 2 phases]
47
Progress Monitoring Example 2a
  • What you can do in this situation:
  • Continue with the intervention and monitoring
  • Continue with the intervention and monitor less frequently
  • Discontinue the intervention but monitor to ensure that progress doesn't decrease

48
Progress Monitoring Example 2b
49
Progress Monitoring Example 2b
  • After two periods of intensive, empirically based
    intervention in which the student has not
    achieved the level and rate goal established from
    baseline data, the team should consider special
    education placement.

50
RtI and Special Education Placement
51
RtI Is Not a Special Education Initiative!
  • Assessment is conducted within a RtI framework
    first and foremost to improve instruction and
    enhance student growth.
  • RtI is NOT a stand-alone special education initiative, a means for increasing or decreasing special education numbers, or a process focused primarily on disability determination and documented through a checklist.
  • RtI is about determining the intensity of support
    needed to help students succeed!
  • Nebraska Department of Education. (2006). Technical Assistance Document.

52
Special Education Placement
  • Before placing a student in special education
    using the RtI model, several factors need to be
    considered
  • 1. Was the measurement of progress accurate?
  • 2. Was the intervention appropriate for the
    child?
  • 3. Were high rates of treatment integrity
    observed?
  • 4. Did the student attend sessions regularly?
  • 5. Do the student's ELL status or other cultural/language factors need to be considered?
  • 6. Is there evidence that the student could
    benefit from special education?

53
What About IQ Tests?
  • The Individuals with Disabilities Education Act (IDEA) 2004 regulations became effective October 13, 2006.
  • They state that the severe discrepancy approach shall not be required to identify students with specific learning disabilities:
  • "When determining whether a child has a specific learning disability as defined under this Act, the local education agency shall not be required to take into consideration whether a child has a severe discrepancy between achievement and intellectual ability"

54
IDEA 2004 Continues
  • In determining whether a child has a specific
    learning disability, a local educational agency
    may use a process which determines if a child
    responds to scientific, research-based
    intervention.
  • Thus, IQ tests are an option, but not necessary,
    for LD verification
  • IQ tests are still necessary for MH verification
  • For more information, see http://idea.ed.gov/explore/home

55
Conclusions
  • Norming, universal screening, and progress monitoring are important components of the RtI process.
  • Each process is used to ensure that students
    receive the services that they need to increase
    performance.

56
Additional Resources/References
  • Bollman, K., & Johnson, C. Used with permission from FSDS.org. Based on Stewart, L. H., & Kaminski, R. (2002). Best practices in developing local norms for academic problem-solving. In A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology IV (pp. 737-752). Bethesda, MD: NASP.
  • Crocker, L., & Algina, J. (1986). Introduction to classical and modern test theory. New York: Holt, Rinehart and Winston.
  • Edformation. (2004). AIMSweb. Retrieved from www.edformation.com/.
  • Glover, T. A., & Albers, C. A. (2007). Considerations for evaluating universal screening assessments. Journal of School Psychology, 45, 117-135.
  • Good, R. H., & Kaminski, R. A. (Eds.). (2002). Dynamic Indicators of Basic Early Literacy Skills (6th ed.). Eugene, OR: Institute for the Development of Educational Achievement. Retrieved from dibels.uoregon.edu/.
  • Fuchs, L., & Fuchs, D. (1993). Formative evaluation of academic progress: How much growth can we expect? School Psychology Review, 22(1), 1-30.
  • Fuchs, L. S., & Fuchs, D. (1997). Use of curriculum-based measurement in identifying students with disabilities. Focus on Exceptional Children, 30(3), 1-15.
  • National Association of State Directors of Special Education, Inc. (2005). Response to Intervention: Policy considerations and implementation. New York, NY: The Guilford Press.
  • Nebraska Department of Education. (2006). Technical Assistance Document.
  • Shapiro, E. S. (2004). Academic skills problems: Direct assessment and intervention (3rd ed.). New York: The Guilford Press.