Interim Assessments - PowerPoint PPT Presentation

1
Interim Assessments
  • Marianne Perie
  • (co-authors Scott Marion Brian Gong)

Presentation for the Mega SCASS New Orleans,
LA February 3, 2007
Center for Assessment
2
Our Goal
  • Distinguish classroom formative assessment from
    larger-scale interim assessments often marketed
    as formative
  • Focus our thinking on interim assessments
  • Can these assessments serve formative uses? If
    so, how?
  • What other uses can these assessments serve well?
  • What makes sense?
  • How does what should be done compare to what is
    being done?
  • Develop a framework for evaluating an interim
    assessment system

3
Interim Assessment
  • The term "interim" encompasses benchmark,
    predictive, diagnostic, and even some commercial
    "formative" assessments
  • One reaction we have heard is that interim
    assessment is the anti-formative assessment
  • More than that, it is an important part of a
    comprehensive assessment system

4
Tiers of a Comprehensive Assessment System
5
Definitions
  • Formative Assessment
  • An assessment is formative to the extent that
    information from the assessment is used, during
    the instructional segment in which the assessment
    occurred, to adjust instruction with the intent
    of better meeting the needs of the students
    assessed. (Popham, Wiliam, Shepard, Stiggins,
    NCIEA, etc.)
  • Formative assessment is a process used by
    teachers and students during instruction that
    provides feedback to adjust ongoing teaching and
    learning to improve students' achievement of
    intended instructional outcomes. (FAST SCASS)

6
Definitions
  • Interim Assessment
  • Assessments administered during the time of
    instruction to evaluate students' knowledge and
    skills relative to a specific set of academic
    goals, in order to inform a policymaker or
    educator at the classroom, school, or district
    level. The purpose is the key to defining the
    assessment, and while the purpose may include
    providing results useful to instruction, all
    results can be aggregated across students,
    occasions, or concepts for other uses as well.

7
Focus Definition on Uses
  • Some interim assessments may help inform
    instruction without meeting all the criteria for
    formative assessment
  • Meet some requirements by
  • Providing qualitative insights about
    understandings and misconceptions in addition to
    a numeric score
  • Giving timely guidance on what to do to improve
    student learning besides re-teaching every missed
    item

8
Consider these other uses
  • Predict student achievement on summative test
    (e.g., early warning)
  • Provide information on how best to target
    curriculum to meet student needs
  • Provide aggregate information on how students in
    a school/district/state are doing and where
    areas of weakness are
  • Determine students' knowledge/skills levels to
    group them for instruction
  • Encourage students to evaluate their own
    knowledge and discover the areas in which they
    need to learn more.
  • Evaluate the effectiveness of various curricular
    and/or instructional practices
  • Reinforce curricular pacing
  • Practice for summative test
  • Increase teacher knowledge of assessment, content
    domain, and student learning

9
Varied Uses and Purposes
  • All of these purposes may be worthwhile even if
    they are not formative
  • We grouped the uses into three types
  • Instructional
  • Predictive
  • Evaluative
  • There should be a clear link between the
    questions the policymakers and educational
    leaders want to answer and the tools they use to
    do so

10
Consider these questions
  • What do I want to learn from this assessment?
  • Who will use the information gathered from this
    assessment?
  • What action steps will be taken as a result of
    this assessment?
  • What professional development or support
    structures should be in place to ensure the
    appropriate action steps are taken?

11
Characteristics of a Good Interim Assessment
System
  • Provides valid and reliable results that are easy
    to interpret and provide information on next
    steps
  • Includes a rich representation of content with
    items linked directly to the content standards
    and specific teaching units.
  • Allows data to be aggregated to inform
    policymakers at the classroom, school, district,
    and even state level
  • Three main elements essential to creating a good
    system
  • Reporting Elements
  • Assessment Design
  • Administration Guidelines

12
Reporting
  • Policymakers should consider carefully the
    reporting component
  • Thinking about the end result helps to
    conceptualize the design
  • Reporting supports translating data into action
  • Consider all pieces of reporting
  • Qualitative as well as quantitative information

13
Reporting Elements
  • Type of data summary
  • Compare against criterion reference
  • Include normative reference
  • Aggregate across occasions/students/classrooms/
    schools/districts
  • Type of qualitative feedback
  • Information on inferences from correct/incorrect
    responses by content area
  • Information on what an incorrect answer or score
    level implies
  • Suggestions for next steps

14
Assessment Design
  • Need high quality items and tasks
  • Match item type to purposes
  • Instructional: Inform and model desired teaching
    and learning, including more open-ended items,
    probes, and performance tasks
  • Predictive: Match the item type to what you are
    predicting, perhaps with additional probes
  • Evaluative: Inform evaluation questions through a
    focus on key program components, using a
    combination of multiple-choice and short-answer
    items with "why" probes
  • Item type needs to take learning progression into
    consideration
  • Number and length of items will also influence
    fit into instruction

15
Administration Considerations
  • Flexibility in creating forms
  • Administered within instruction or separate from
    instruction
  • Adaptive or not
  • Flexibility in when/where the assessment is given
  • Computer-based
  • Web-based
  • Paper-and-pencil
  • Turnaround time for results

16
Administration Needs x Purpose
17
States' and Districts' Current Role
  • Increasingly, states and districts purchase
    commercially available products with labels such
    as formative, diagnostic, predictive, or
    benchmark assessments, which better fit our
    definition of interim
  • How does what we want match what already exists?
  • What other options are available?
  • Customized assessment
  • Locally-designed assessment

18
States and Districts can
  • Provide policy support for developing local
    interim assessments
  • Help create item banks and interpretive tools and
    foster consortia-type relationships across
    districts and even with other states
  • Support and structure professional learning
    opportunities to foster successful implementation

19
Features of Many Current Systems
  • What these systems can do
  • Provide an item bank linked directly to state
    content standards
  • Assess students on a flexible time schedule
    wherever a computer and internet connection are
    available
  • Provide immediate results
  • Highlight content standards in which more items
    were answered incorrectly
  • Link scores of these assessments to the scores of
    the end-of-year assessments to predict results on
    the end-of-year assessments
  • Questions these systems can answer
  • Is this student on track to score Proficient on
    the end-of-year NCLB tests? (Aggregate across
    students/classrooms/schools)
  • Is this student improving over time?
  • How does this student's performance compare to
    the performance of other students in the class?
  • Which content standards are the students
    performing best on and which content standards
    show the weakest student performance?
  • (all of the above under the best of circumstances)

20
What These Systems Lack
  • What these systems cannot do
  • Provide rich detail about the curriculum assessed
  • Provide a qualitative understanding of a
    student's misconception(s)
  • Provide full information on the student's depth
    of knowledge on a particular topic
  • Further a student's understanding through the
    type of assessment task
  • Give teachers the information on how to implement
    an instructional remedy
  • Questions these systems cannot answer
  • Why did a student answer an item incorrectly?
  • What are possible strategies for improving
    performance in this content area?
  • What did the student learn from this assessment?
  • What type of thinking process is this student
    using to complete this task?

21
Evaluative Criteria
  • Tasks should be carefully validated regarding the
    standards and cognitive processes assessed. An
    alignment study should be done to verify the
    assessment is appropriately aligned with state
    standards.
  • The collection of tasks administered through the
    year should represent a technically sound range
    of difficulty and appropriate breadth. Again,
    this should be examined during the alignment
    study.
  • The types of items used in each task should
    appropriately reflect the purpose of the
    assessment. If the test is used for predictive
    purposes, the proportion of multiple-choice and
    open-ended items should be similar to that of the
    end-of-year test. If the test is meant to provide
    instructional or evaluative information, the
    items should allow a student to express their
    rationale or understanding in a way that is
    useful to the instructor.

22
More Evaluative Criteria
  • When the tasks are administered, they should
    cover only content that has been taught; the user
    should be able to evaluate the alignment by
    examining the items (or alignment documents) and
    ensuring the items selected cover only the
    standards that have been taught to date.
  • For interim assessment systems that require a
    break from instruction in order to test,
    educational leaders should consider the time
    required for assessment, which should be as short
    as possible to provide the desired information.
    For performance tasks that are less
    distinguishable from instruction than more formal
    tests, testing time is less of a concern, but it
    must still be considered.

23
More Evaluative Criteria
  • Reports should provide information useful to the
    instructors. Depending on the purpose of the
    assessment, the instructor must be able to
    determine which students need extra support in
    which content areas or which lesson plans may
    need to be modified for future classes. They
    should go beyond simple number-correct scores by
    content area. The degree of aggregation desired
    should also be considered.
  • Scores and subscores should be acceptably
    reliable for their use: moderately reliable if
    the test is used for grading, but considerably
    more so if it is used for student accountability
    or other high-stakes decisions. As noted earlier,
    particular attention needs to be paid to the
    meaningfulness and reliability of subscores.
  • The quality and scope of professional development
    needs to be carefully evaluated to ensure that
    teachers can develop the knowledge and skills to
    use and learn from these tasks. Do this both in
    the evaluation before purchasing the assessment
    and in the first year of its use.
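The subscore-reliability criterion above can be checked directly from item-level results. A minimal sketch, using Cronbach's alpha as the reliability estimate; the function name and the student-by-item score matrix are illustrative, not part of any particular vendor's system:

```python
# Minimal sketch: estimate subscore reliability with Cronbach's alpha.
# The item scores below are hypothetical; real data would come from
# the interim assessment's item-level results.

def cronbach_alpha(items):
    """items: one list of scores per item, aligned by student."""
    k = len(items)        # number of items contributing to the subscore
    n = len(items[0])     # number of students

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var = sum(variance(it) for it in items)
    totals = [sum(it[s] for it in items) for s in range(n)]
    return (k / (k - 1)) * (1 - item_var / variance(totals))

# Hypothetical 3-item subscore for 5 students (0/1 item scores)
items = [[1, 1, 0, 1, 0],
         [1, 0, 0, 1, 0],
         [1, 1, 0, 1, 1]]
alpha = cronbach_alpha(items)
```

A subscore whose alpha falls well below the level conventionally expected for the intended stakes would argue against reporting that subscore at all.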

24
Validity Recommendations
  • Gathering validity evidence will be important
    over the next couple of years
  • If the test is used for instructional purposes,
    follow up with teachers to determine how the data
    were used and whether there was evidence of
    improved student learning for current students or
    if there were any unintended negative
    consequences
  • If the test is used for predictive purposes, do a
    follow up study to determine that the predictive
    link is reasonably accurate and that the use of
    the test contributes to improving criterion
    (e.g., end of year) scores
  • If the test is used for evaluative purposes,
    gather data from other sources to triangulate
    results of interim assessment and follow up to
    monitor if evaluation decisions are supported
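The predictive follow-up study above can start from something as simple as the correlation between interim and end-of-year scores. A minimal sketch; the function name and the score pairs are hypothetical:

```python
# Minimal sketch: check the strength of the predictive link between
# interim scores and end-of-year scores. All scores are hypothetical.

def pearson_r(x, y):
    """Pearson correlation between two aligned score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

interim = [210, 225, 240, 255, 270]        # fall interim scale scores
end_of_year = [298, 321, 333, 352, 371]    # summative scale scores
r = pearson_r(interim, end_of_year)
```

A weak correlation would undercut the test's use as an early-warning measure; a fuller study would also check how accurately the interim cut points predict end-of-year proficiency classifications, not just the overall correlation.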

25
Other Evaluation Points
  • Interim assessment systems should be evaluated
    for the effects on important aspects of the
    teaching and learning process, such as
  • Student learning, especially in terms of
    generalizability and transfer
  • Student motivation as a result of engaging with
    these tasks
  • Curricular quality as a result of incorporating
    tasks
  • Increases in teacher knowledge of content,
    pedagogy, and student learning
  • Manageability, including the quality of
    implementation

26
Further Research
  • Create a validity argument for how interim
    assessments lead to improved student learning
  • Examine differential effects of interim
    assessments on students' intrinsic motivation to
    learn
  • Determine requirements for building a system that
    provides teachers the information they need but
    can still be scaled to compare results across
    students, teachers, schools
  • Analyze the types of professional development
    linked to effective use of interim assessments
    and important elements of the delivery system

27
Conclusion
  • There are valid purposes for giving interim
    assessments beyond informing instruction at that
    point
  • Determine the purpose(s) for using the assessment
    and compare that to what it can and cannot do
  • Match the features of the assessment to the
    purpose of using it
  • Further research is needed linking the use of
    interim assessments with improved student
    performance

28
For more information
  • Center for Assessment
  • www.nciea.org
  • Marianne Perie
  • mperie@nciea.org
  • Scott Marion
  • smarion@nciea.org
  • Brian Gong
  • bgong@nciea.org