Transcript and Presenter's Notes

Title: How to think like an assessor: Using data to influence your teaching


1
How to think like an assessor: Using data to influence your teaching
  • Karen Hall
  • Christine Young

2
What are your beliefs about data?
  • Results data are truth
  • Data can increase student achievement
  • Data is used to punish educators
  • Data are dishonest
  • Results are not relevant to real life

3
Data is our Best Friend!
  • What gets measured gets done.
  • - Peters, 1987

4
Essential Questions
  • What is an assessor?
  • How do I know the assessment measures the desired
    learning outcome?
  • What do you want to measure?

5
Word Association
  • Standards
  • Assessor
  • Performance
  • Feedback
  • Outcomes
  • Understanding

6
Thinking Like An Assessor
  • We recognize understanding through a flexible performance…
  • Understanding shows its face when people can think and act flexibly around what they know. In contrast, when a learner cannot go beyond rote and routine thought and action, this signals lack of understanding… To understand means to be able to perform flexibly.
  • - David Perkins, "What Is Understanding?" in Martha Stone Wiske (Ed.), Teaching for Understanding, 1998, p. 42

7
Dynamic Data
  • Data are to goals what signposts are to travelers: data are not end points, but are essential to reaching them…
  • Thus, data and feedback are interchangeable and should be an essential feature of how schools do business.
  • - Mike Schmoker

8
Assessment Paradigm Shift
Effective assessment is more like a movie than a
snapshot.
  • No longer use a single test of one type at the
    end of teaching.
  • Effective teachers gather lots of evidence along
    the way.
  • Use a variety of methods and formats when
    planning to collect evidence of understanding.

9
Reasons for Assessment
  • FOR LEARNING
  • To collect data to design next steps in
    instruction (reteach, move on, etc.) and to
    provide students specific feedback on their
    progress
  • OF LEARNING
  • To collect feedback at a specific point in time for the purpose of reporting to others on the students' progress, including grading

10
Audience for Assessment
  • FOR LEARNING
  • Students about themselves
  • OF LEARNING
  • Others about students

11
Focus of Assessment
  • FOR LEARNING
  • Specific achievement targets selected by teachers
    that enable students to build towards standards
  • OF LEARNING
  • Achievement standards for which schools,
    teachers, and students are held accountable

12
Place In Time
  • FOR LEARNING
  • A process before or during learning
  • OF LEARNING
  • An event after learning

13
  • How do I know the assessment measures the desired learning outcomes?

14
What do you mean, I have to think like an
assessor?
  • Backward design tells us:
  • Consider assessment evidence implied by the
    outcome sought.
  • Discontinue thinking that assessment is primarily
    a means to generate grades.
  • The performance evidence should signify goals
    have been met.
  • Evidence must be present that the learner deeply
    considered the essential questions.
  • Given the understandings, the learner must show that they "got it."

15
Two Approaches to Thinking About Assessment
  • When thinking like an activity designer (only), we ask…
  • What would be fun and interesting activities on
    this topic?
  • What project might students wish to do on this
    topic?
  • What tests should I give, based on the content I
    taught?
  • How will I give students a grade (and justify it
    to their parents)?
  • When thinking like an assessor, we ask…
  • What would be sufficient and revealing evidence
    of understanding?
  • Given the goals, what performance tasks must
    anchor the unit and focus the instructional work?
  • What are the different types of evidence required
    by Stage 1 desired results?
  • Against what criteria will we appropriately
    consider work and assess levels of quality?

16
How do I begin to think like an assessor?
  • Thinking like an assessor boils down to 3 basic
    concepts…
  • Acceptable Evidence - Before you design a particular test or task, it's important to consider the general types of performances that are implied.
  • Specific characteristics in student responses,
    products, or performances you will examine -
  • This is where criteria, rubrics, and exemplars
    come into play.
  • Level of Assessment - The proposed evidence enables us to infer a student's knowledge, skill, or understanding. The evidence aligns with our goals, and the results are sufficiently clear.

17
  • What Do You Want To Measure?

18
Stage One: Thinking like an assessor
  • Determine overall and specific expectations
  • What is worth being familiar with?
  • What is important to know and do?
  • Enduring Understandings
  • Essential Questions
  • How do I determine if students have attained the
    Enduring Understandings?

19
Stage Two: Thinking like an assessor
  • The key to valid results is the match between the
    specific learning outcomes and the selected
    assessment strategy.
  • What do the students already know?
  • What misconceptions need to be addressed?
  • How will I know students have the knowledge and
    skills to achieve learning outcomes?
  • Do I have evidence to validate that desired
    learning has been achieved?

20
Stage Three: Thinking like an assessor
  • What needs to be uncovered to achieve desired understanding?
  • How will I address misconceptions?
  • How do I make big ideas less abstract and more obvious?

21
A Continuum of Assessments
  • informal checks for understanding → observations and dialogues → tests and quizzes → academic prompts → performance tasks

22
Diagnostic (not a part of the grade)
  • Used to determine level of achievement prior to learning experiences
  • Helps teachers to determine and plan for experiences the students will need to achieve learning outcomes
  • Examples: IRI, Running Record, DRA, Pretests

Formative (not a part of the grade)
  • Measures progress over time during the course of learning
  • Examples: Student self-reflection, Student-led conference, Portfolio

Summative (reported as part of the grade)
  • Checks for transfer of knowledge and skills taught throughout the unit at the end of learning activities
  • Examples: Performance Event, Post test

23
Reflection
  • Has your journey prepared you for your
    destination?

24
Trip's Over!
  • What went well?
  • What went wrong?
  • What did we do or not do to influence the outcome?

25
You Have Great Data… Now What?
  • What are the limits of the data?
  • What are your indicators of growth?

26
Believe It or Not… Data Isn't Perfect!
  • Data does have its limitations:
  • Measurement that is inaccurate.
  • Measurement that is accurate but is measuring the
    wrong things.
  • Measurement that is accurate, but inconsistent.
  • Measurement that is accurate and consistent, but
    the results are late.
  • Measurement that is accurate, consistent, and
    timely but incomplete.
  • Measurement that is accurate, consistent, timely,
    and complete - but related to things we cannot
    control.

27
Beware the Search for Perfection!
  • Data analysis is not a choice between perfect
    truth and utter falsehood.
  • Data analysis is a CHOICE OF ERRORS

28
The Search for Perfection… NOT!
  • Error 1: I make a decision that is uninformed by data.
  • Error 2: I make a decision that is informed by data with errors in it, but have confidence because it is data-driven.
  • Error 3: I make a decision that is informed by imperfect data, I admit the potential errors, and I continuously gather additional data to guide and improve my decisions.

29
Two Types of Data
  • In the context of schools, the essence of holistic accountability is that we must consider not only the effect variable (test scores) but also the cause variables: the indicators in teaching, curriculum, parental involvement, leadership decisions, and a host of other factors that influence student achievement.
  • (D. Reeves, Accountability for Learning, 2004)

30
Organizing Our Data, Organizing Our Thoughts
  • Effect Data: Student achievement results from various measurements
  • Cause Data: Information based on actions of the adults in the system
31
Classroom, Building, and District Data
  • District data - almost always effect data
  • Building and classroom data (the essence of data-driven decision making) - almost always cause data

32
Using Data To Influence Your Teaching Journey
33
  • There are countless ways of attaining greatness, but any road to reaching one's maximum potential must be built on a bedrock of respect for the individual, a commitment to excellence, and a rejection of mediocrity.
  • - Buck Rogers