1
Issues in Selecting Assessments for Measuring
Outcomes for Young Children
  • Dale Walker and Kristie Pretti-Frontczak
  • ECO Center and Kent State University
  • Presentation at OSEP Early Childhood Conference
  • Washington, DC, December 2005

2
Why Assessment?
  • Gather information about skills and capabilities
    to make decisions about practice
  • To determine eligibility for services
  • To determine if a child is benefiting from
    services or if changes need to be made
  • To measure development over time
  • To document outcomes

3
Purpose of Assessments: It's all about the
question(s) you want to answer
  • Screening: Is there a suspected delay? Does the
    child need further assessment?
  • Eligibility determination: Is the child eligible
    for specialized services?
  • Program planning: What content should be taught?
    How should content be taught?
  • Progress monitoring: Are children making desired
    progress?
  • Program evaluation/Accountability: Is the
    program achieving its intended outcomes and/or
    meeting required outcomes?

4
Assessment Options
  • Norm-Referenced
  • Criterion-Referenced
  • Curriculum-Based
  • Direct Observation
  • Progress Monitoring
  • Parent or Professional Report
  • Any combination of assessments

5
Norm-Referenced Pros/Cons
  Pros:
  • Provides information on development in relation
    to others
  • Already used for eligibility in many states
  • Diagnosis of developmental delay
  • Standardized procedures
  Cons:
  • Do not inform intervention
  • Information removed from context of child's
    routines
  • Usually not developed or validated with children
    with disabilities
  • Do not meet many recommended practice standards
  • May be difficult to administer or require
    specialized training

6
Norm-Referenced Assessment Table
  • Table consists of a review of 18 norm-referenced
    assessments
  • Information provided for each assessment
    includes:
  • Publisher information
  • Areas of development assessed
  • Test norms provided
  • Scores produced
  • Age range covered

http://fpsrv.dl.kent.edu/ecis/Web/Research/OSEP/NRT.pdf
7
Criterion-Referenced Pros/Cons
  Pros:
  • Measures child's performance of specific
    objectives
  • Direct link between assessment and intervention
  • Provides information on children's strengths and
    emerging skills
  • Helps teams plan and meet individual children's
    needs
  • Meets recommended assessment practice standards
  • Measures intra-child progress
  • May be used to measure program effectiveness
  Cons:
  • Requires agreement on criteria and standards
  • Criteria must be clear and appropriate
  • Usually does not show performance compared to
    other children
  • Does not have standard administration procedures
  • May not move child toward important goals
  • Scores may not reflect increasing proficiency
    toward outcomes

8
Curriculum-Based Pros/Cons
  Pros:
  • Provides link between assessment and curriculum
  • Expectations based upon the curriculum and
    instruction
  • Can be used to plan intervention
  • Measures child's current status on curriculum
  • Evaluates program effects
  • Often team based
  • Meets DEC and NAEYC recommended standards
  • Represents picture of the child's performance
  Cons:
  • May not have established reliability and validity
  • May not have procedures for comparing child to a
    normal distribution
  • Generally linked to a specific curriculum
  • Often composed of milestones that may not be in
    order of importance

9
Curriculum-Based Assessment Rating Rubric
  • Evaluates the quality of CBAs for use with young
    children
  • Composed of 17 quality elements
  • Used to guide teams in selecting appropriate CBAs

http://fpsrv.dl.kent.edu/ecis/Web/Research/OSEP/CBArubric.pdf
10
Sample of CBA Rubric
Element: Adaptable for Special Needs
  • Unsatisfactory (0): No consideration of special needs
  • Basic (1): Limited consideration of special needs through the
    assessment process; instrument does not allow for additional
    accommodations or modifications for special needs
  • Satisfactory (2): Upfront considerations for special needs are not
    comprehensive, but assessment allows for some accommodations and/or
    modifications for special needs
  • Excellent (3): Considers and provides specific strategies and
    procedures for accommodating and/or modifying the assessment for
    special needs

Element: Aligns with Federal/State/Agency Standards and/or Outcomes
  • Unsatisfactory (0): Does not align with Federal/State/Agency
    Standards and/or Outcomes
  • Basic (1): Aligns with less than half of the big ideas or concepts
    from Federal/State/Agency Standards and/or Outcomes
  • Satisfactory (2): Aligns with more than half of the big ideas or
    concepts
  • Excellent (3): Aligns with a clear majority or all of the big ideas
    or concepts
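To make the rubric's use concrete, here is a minimal sketch (not from the presentation) of how a team might total its 0-3 ratings across the 17 quality elements; the two element names come from the sample rows above, and the "needs attention" threshold is an assumption.

```python
# Hypothetical ratings a team might assign a CBA on the rubric's 0-3
# scale; only two of the 17 quality elements are shown, matching the
# sample rows above.
ratings = {
    "Adaptable for Special Needs": 2,
    "Aligns with Federal/State/Agency Standards and/or Outcomes": 3,
    # ... ratings for the remaining 15 elements would follow
}

MAX_PER_ELEMENT = 3  # "Excellent" rating

def summarize(ratings):
    """Total the ratings and flag elements rated Basic (1) or below."""
    total = sum(ratings.values())
    possible = MAX_PER_ELEMENT * len(ratings)
    weak = [name for name, score in ratings.items() if score <= 1]
    return total, possible, weak

total, possible, weak = summarize(ratings)
print(f"{total}/{possible} points; elements needing attention: {weak}")
```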
11
Progress Monitoring Pros/Cons
  Pros:
  • Used to monitor ongoing progress toward important
    outcomes over time
  • Compares to children of similar ages over time
  • Repeatable measures for monitoring progress
  • Standardized administration
  • Standards for technical adequacy
  • Efficient to administer
  • May also be used as a screening tool
  Cons:
  • Indicators of progress may be viewed as not being
    comprehensive
  • Not used for eligibility determination
  • May not provide specific skills to teach, but
    indicators of important skills

12
Parent/Professional Report Pros/Cons
  Pros:
  • High social validity
  • Provides diverse perspectives
  • Important for informing intervention, program,
    and IFSP/IEP
  • Parents and professionals know the child and the
    environments in which the child interacts
  Cons:
  • Collaboration requires time and effort to
    establish
  • May not be reliable across time
  • Does not permit comparison across children
  • May include personal bias

13
Using Multiple Sources of Data or a Single Source
to Measure Outcomes?
  • Pros and Cons
  • Recommended practices
  • Need to summarize information generated
  • Ways data can be used beyond reporting OSEP
    outcomes

14
Using Data Beyond OSEP Reporting
  • Good assessment data can be used to:
  • Reveal patterns regarding children's strengths
    and emerging skills
  • Develop functional and meaningful IFSPs/IEPs
  • Inform program staff and families about strengths
    and weaknesses
  • Guide the development of intervention
  • Monitor children's progress to inform
    intervention efforts
  • Enhance collaboration
  • Inform providers, programs, districts/parishes,
    regions, and states regarding important trends

15
Ongoing Work and Challenges
  • Existing assessment tools were not developed to
    measure the three outcomes
  • ECO's response: Cross-walking or mapping
    frequently used assessments to the outcomes
  • Work with publishers and state staff to develop
    guidance for how to use assessment results to
    generate OSEP-requested data

16
Work with Publishers and Developers
  • Finalizing crosswalks
  • Alignment with OSEP outcomes
  • How to determine what is typical performance
  • Age-anchored benchmarks to measures
  • How scores can be summarized using the ECO
    Summary Form
  • Possible recalibration of scores in a way that
    maintains the integrity of different assessments
  • Pilot studies with GSEG and interested states
  • Data summary report forms that assist users with
    alignment of information from assessment to OSEP
    outcomes

17
Example of Developing a Validated Crosswalk
  • First, align:
  • On the face of it, which items appear to
    align/match which outcomes?
  • Second, validate:
  • Do experts agree? (a sketch of this check follows
    the link below)
  • Check for internal consistency
  • Third, examine the sensitivity of the assessment
    in measuring child change

http://fpsrv.dl.kent.edu/ecis/Web/Research/OSEP/Steps.pdf
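As a concrete illustration of the "Do experts agree?" step, here is a minimal sketch, not from the presentation, in which several experts independently assign each assessment item to one of the three OSEP outcomes; percent agreement and the 0.75 review threshold are illustrative choices, not the ECO procedure.

```python
from collections import Counter

# Hypothetical expert ratings: for each assessment item, the OSEP
# outcome (1, 2, or 3) each of four experts says the item measures.
ratings = {
    "item_01": [1, 1, 1, 2],
    "item_02": [3, 3, 3, 3],
    "item_03": [2, 2, 1, 2],
}

def percent_agreement(votes):
    """Share of experts who chose the most common outcome for an item."""
    most_common_count = Counter(votes).most_common(1)[0][1]
    return most_common_count / len(votes)

# Flag items whose agreement falls below an (assumed) 0.75 threshold
# for review before the crosswalk is finalized.
for item, votes in ratings.items():
    agreement = percent_agreement(votes)
    status = "keep" if agreement >= 0.75 else "review"
    print(f"{item}: agreement={agreement:.2f} -> {status}")
```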
18
Examples of Interpreting the Evidence
  • Standard scores
  • Residual Change Scores
  • Goal Attainment Scaling
  • Number of objectives achieved/Percent of
    objectives achieved
  • Rate of Growth
  • Item Response Theory
  • Proportional Change Index (formula sketched
    below)
  • Stoplight model
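As one worked illustration from this list (not part of the original slides), the Proportional Change Index compares a child's rate of development during intervention with the rate shown at entry. A common formulation, where DA is developmental age and CA is chronological age in months:

$$\text{PCI} \;=\; \frac{\big(DA_{\text{exit}} - DA_{\text{entry}}\big)\,/\,\big(CA_{\text{exit}} - CA_{\text{entry}}\big)}{DA_{\text{entry}}\,/\,CA_{\text{entry}}}$$

A PCI above 1.0 suggests the child developed faster during intervention than before entry. For example, a child who enters with DA 18 at CA 24 and gains 12 months of developmental age over 12 calendar months has PCI = (12/12) / (18/24) ≈ 1.33.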

19
Interpreting the AEPS for Accountability
  • First administration (near entry)
  • Is the child above or below a cutoff score?
  • If above: considered to be developing typically
  • If below: development is suspect
  • Which level of the AEPS was administered?
  • Child is less than three and Level I is used
  • Child is less than three and Level II is used
  • Child is older than three and Level I is used
  • Child is older than three and Level II is used

20
Interpreting the AEPS for Accountability
  • Second administration (near exit)
  • Use cutoff scores again
  • Examine which level was used
  • Look for:
  • changes in area percent scores
  • changes in scoring notes
  • changes in which level was administered

21
Sample Cutoff Scores

Level            Age Interval (months)   Cutoff Score
Birth to three   25-30                   50
Birth to three   31-36                   60
Three to six     37-42                   20
Three to six     43-48                   30
Three to six     49-54                   40
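A minimal sketch of the entry-classification logic from the last three slides, using the sample cutoffs above. The pairing of Level I with birth to three and Level II with three to six is taken from slide 19; treating a score exactly at the cutoff as "above," and the function and key names, are assumptions for illustration.

```python
# Sample cutoffs from the table above, keyed by AEPS level (Level I is
# assumed to cover birth to three, Level II three to six) and the
# child's age interval in months.
CUTOFFS = {
    ("Level I", 25, 30): 50,
    ("Level I", 31, 36): 60,
    ("Level II", 37, 42): 20,
    ("Level II", 43, 48): 30,
    ("Level II", 49, 54): 40,
}

def classify_entry(level, age_months, score):
    """First administration (near entry): a score at or above the
    cutoff suggests typical development; below it, development is
    suspect (ties treated as above -- an assumption)."""
    for (lvl, low, high), cutoff in CUTOFFS.items():
        if lvl == level and low <= age_months <= high:
            return "developing typically" if score >= cutoff else "suspect"
    raise ValueError("No cutoff defined for this level/age combination")

# Example: a 32-month-old given Level I who scores 55 falls below the
# 60-point cutoff for the 31-36 month interval.
print(classify_entry("Level I", 32, 55))  # -> suspect
```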
22
Questions?
23
For More Information see http://www.the-ECO-center.org