Data Interpretation II Workshop

Transcript and Presenter's Notes

1
Data Interpretation II Workshop
2008 Writing Assessment for Learning
2
Purposes for the Day, p. 2
  • Deepen understanding about the writing assessment
    project results
  • Initiate reflection and discussion among
    division-level staff members related to the
    writing assessment results
  • Provide a range of tools and processes to support
    division-level staff in their work throughout the
    system related to school improvement; and
  • Provide opportunity to discuss and plan around
    the data in the context of school improvement.

3
Agenda
  • Opening
  • Assessment for Learning Writing Assessment
  • Conceptual Framework
  • Comparators
  • The Reports
  • Here's What
  • The Data
  • Processes to Support School/Division School
    Improvement
  • Changing Contexts
  • Building Capacity
  • So What?
  • Analysis of Data and Support Structures
  • Role of Central Office in Supporting School
    Improvement
  • Sustainability
  • Now What?
  • Using Goals to Inform Planning
  • Monitoring & Assessing Progress
  • Linking Goals and Assessment Data
  • Identifying Interrelationships
  • Evidence of Implementation
  • Closure

4
Magnetic Quotes
  • In various locations around the room are
    statements regarding data use and school
    improvement.
  • Take a moment to read each and then go to the
    sign with the statement that resonates most for
    you.
  • Create a pair or trio with your colleagues and
    discuss why you connect with the statement and
    what it means to you.

5
(No Transcript)
6
Assessment for Learning is a Snapshot
  • Results from a large-scale assessment are a
    snapshot of student performance.
  • The results are not definitive. They do not tell
    the whole story. They need to be considered
    along with other sources of information available
    at the school.
  • The results are more reliable when larger numbers
    of students participate and when aggregated at
    the provincial and division level, and should be
    considered cautiously at the school level.
    Individual student mastery of learning is best
    determined through effective and ongoing
    classroom-based assessment. (Saskatchewan
    Learning, 2008)

7
Depth and Specificity of Knowledge
[Diagram: a continuum of assessments, ranging from
in-depth knowledge of specific students to
in-depth knowledge of systems.]
From Saskatchewan Learning. (2006). Understanding
the numbers.
8
Provincial Writing Assessment Conceptual
Framework p. 3
  • Colourful Thoughts
  • As you read through the information on the
    Provincial Writing Assessment, use highlighters
    or sticky notes to track your thinking as you
    read:
Wow! I agree with this.
Hmm! I wonder. . .
Yikes!
Adapted from Harvey, S., & Goudvis, A. (2007).
Strategies that work.
9
Comparators: Types of Referencing
  • Criterion-referenced: Comparing how students
    perform relative to curriculum objectives, level
    attribution criteria (rubrics) and the level of
    difficulty inherent in the assessment tasks. If
    low percentages of students are succeeding with
    respect to specific criteria identified in
    rubrics, this may be an area for further
    investigation, and for planning intervention to
    improve student writing.
  • (Detailed rubrics, OTL rubrics and test items
    can be sourced at www.education.gov.sk.ca)
  • Standards-referenced: Comparing how students
    performed relative to a set of professionally or
    socially constructed standards. Results can be
    compared to these standards to help identify key
    areas for investigation and intervention.
  • (Figure .2b, .3c, .4a, .6b, .7b and .8b.)

10
Comparators: Types of Referencing
  • Experience- or self-referenced: Comparing how
    students perform relative to the assessment data
    gathered by teachers during the school year.
    Where discrepancies occur, further investigation
    or intervention might be considered. It is
    recommended that several sources of data be
    considered in planning.
  • (e.g., comparing these results to current school
    data, or to the standards set by the panel.)
  • Norm-referenced: Comparing how students in a
    school performed relative to the performance of
    students in the division, region or project.
    Note cautions around small groups of students.
    Norm-reference comparisons contribute very little
    to determining how to use the assessment
    information to make improvements.
  • (e.g., tables comparing the school, division,
    and province.)

11
Comparators: Types of Referencing
  • Longitudinal-referenced: Comparing how students
    perform relative to earlier years' performance.
    Viewed across several years,
    assessment results and other evidence can
    identify trends and improvements. (This data
    will not appear until the next administration of
    this assessment.)

12
Opportunity-to-Learn Elements as Reported by
Students
  • Propensity to Learn
  • using resources to explore models, generate ideas
    and assist the writing process
  • Motivation, attitude and confidence
  • Participation, perseverance and completion
  • Reflection
  • Knowledge and Use of Before, During and After
    Writing Strategies
  • Home Support for Writing and Learning
  • Encouragement and interaction
  • Access to resources and assistance

13
Opportunity-to-Learn Elements as Reported by
Teachers
  • Availability and Use of Resources
  • Teacher as key resource
  • Teacher as writer
  • Use of curriculum
  • Educational qualifications
  • Professional development
  • Time
  • Student resources
  • Classroom Instruction and Learning
  • Planning focuses on outcomes
  • Expectations and criteria are clearly outlined
  • Variety of assessment techniques
  • Writing strategies explicitly taught and
    emphasized
  • Adaptation

14
Student Performance Outcome Results
  • Demonstration of the writing process
  • Pre-writing
  • Drafting
  • Revision
  • Quality of writing product
  • Messaging and content
  • Focus
  • Understanding and support
  • Genre
  • Organization and coherence
  • Introduction, conclusion, coherence
  • Language use
  • Language and word choices
  • Syntax and mechanics

15
Standards
  • To help make meaningful longitudinal comparisons
    in future years, three main processes will be
    implemented.
  • Assessment items will be developed for each
    assessment cycle using a consistent table of
    specifications.
  • The assessment items will undergo field-testing -
    one purpose of which is intended to inform the
    comparability of the two assessments.
  • A process for setting standards for each of the
    assessment items will ensure that any differences
    in difficulty between two assessments are
    accounted for by varying the standards.

16
Opportunity-to-Learn and Performance Standards
  • In order to establish Opportunity-to-Learn and
    Performance standards for the 2008 Writing
    Assessment, three panels were convened (one from
    each assessed grade), consisting of teachers from
    a variety of settings and post-secondary
    academics including Education faculty.
  • The panelists studied each genre from the 2008
    assessment in significant detail and established
    expectations for writing process, narrative
    products and expository products as well as
    opportunity to learn.

17
Thresholds of Adequacy and Proficiency
18
Thresholds of Adequacy and Proficiency
19
Cut Scores
  • On page 4 of the detailed reports you will find
    the cut scores detailing the percentage correct
    required for students to be classified at one of
    two levels (a sketch of applying such cut scores
    follows below).
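
As an illustration only, here is a minimal Python sketch of classifying a percentage-correct score against two cut scores. The thresholds used (50% and 70%) are invented for the example; the actual cut scores are the ones published in the detailed reports.

```python
# Minimal sketch of applying cut scores; the 50%/70% thresholds are
# hypothetical, not the published 2008 cut scores.

def classify(percent_correct: float, adequacy_cut: float, proficiency_cut: float) -> str:
    """Place a percentage-correct score into a band relative to two cut scores."""
    if percent_correct >= proficiency_cut:
        return "proficient"
    if percent_correct >= adequacy_cut:
        return "adequate"
    return "below adequate"

# Example: a score of 64% with invented cut scores of 50% and 70%.
print(classify(64.0, adequacy_cut=50.0, proficiency_cut=70.0))  # -> adequate
```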

20
(No Transcript)
21
(No Transcript)
22
Four Major Categories of Data: Demographics
  • Local Data
  • Descriptive information such as enrollment,
    attendance, gender, ethnicity, grade level, etc.
  • Can disaggregate other data by demographic
    variables (see the sketch below).
  • AFL
  • Opportunity-to-Learn Data
  • Family/Home support for student writing
  • encouragement and interaction
  • access to resources

Bernhardt, V. L. (2004). Data analysis for
continuous school improvement (2nd ed.).
Larchmont, NY: Eye on Education.
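
As a brief illustration of disaggregating by demographic variables, the sketch below groups hypothetical writing scores with pandas. All column names and values are invented, not drawn from the AFL data set.

```python
# A minimal sketch of disaggregation with pandas; data are hypothetical.
import pandas as pd

records = pd.DataFrame({
    "grade":   [5, 5, 8, 8, 11, 11],
    "gender":  ["F", "M", "F", "M", "F", "M"],
    "writing": [72, 65, 68, 61, 75, 70],   # percentage scores
})

# Mean writing score broken out by grade and gender; the same pattern
# works for attendance, ethnicity, or any other demographic column.
print(records.groupby(["grade", "gender"])["writing"].mean())
```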
23
Four Major Categories of Data: Student Learning
  • Local Data
  • Describes outcomes in terms of standardized test
    results, grade averages, etc.
  • AFL
  • Readiness-Related Opportunity-to-Learn Data
  • Using resources to explore writing
  • Student knowledge and use of writing strategies
    (before, during, after)
  • Student performance outcomes
  • Writing 5, 8, 11: Narrative and Expository
  • Writing process and writing product categories
    within

Bernhardt, V. L. (2004). Data analysis for
continuous school improvement (2nd ed.).
Larchmont, NY: Eye on Education.
24
Four Major Categories of Data: Perceptions
  • Local Data
  • Provides information regarding what students,
    parents, staff and community think about school
    programs and processes.
  • This data is important because people act in
    congruence with what they believe.
  • AFL
  • Readiness-Related Opportunity-to-Learn Data
  • Commitment to learn
  • Using resources
  • Motivation & attitude
  • Confidence
  • Participation
  • Perseverance & completion
  • Reflection
  • Knowledge and use of writing strategies

Bernhardt, V. L. (2004). Data analysis for
continuous school improvement (2nd ed.).
Larchmont, NY: Eye on Education.
25
Four Major Categories of Data: School Processes
  • Local Data
  • What the system and teachers are doing to get the
    results they are getting.
  • Includes programs, assessments, instructional
    strategies and classroom practices.
  • AFL
  • Classroom-Related Opportunity-to-Learn Data
  • Instruction and learning
  • Planning and reflection
  • Expectations and assessment
  • Focus on writing strategies
  • Adaptations
  • Availability and use of resources
  • Teacher
  • Time
  • Resources for students and teachers

Bernhardt, V. L. (2004). Data analysis for
continuous school improvement (2nd ed.).
Larchmont, NY: Eye on Education.
26
Examining the Report
  • Take a few minutes to look through the entire AFL
    report.
  • Use the section on data in the chart in your
    handout package to guide your thinking and
    conversation.
  • Note three or four areas of strength and areas
    for improvement.

DATA: Here's What
School Improvement: Here's What
So What?
Now What?
27
DATA: Here's What
School Improvement: Here's What
So What?
Now What?
28
A Changing Context: Hargreaves & Fink (2006)
  • Old Basics
  • Literacy
  • Numeracy
  • Obedience
  • Punctuality
  • New Basics
  • Multiliteracy
  • Creativity
  • Communication
  • IT
  • Teamwork
  • Lifelong Learning
  • Adaptation & Change
  • Environmental Responsibility

29
Saskatchewan's Changing Context
  • Old Paradigm
  • Content-based curricula
  • Teaching from activities
  • Assessment sorts and selects
  • Streaming
  • Those who can, learn
  • Evolving Paradigm
  • Balance between content and process
  • Outcome-based learning
  • Assessment supports learning
  • Adaptive Dimension
  • Everyone learns

30
Judith Warren Little
  • "In large numbers of schools, and for long
    periods of time, teachers are colleagues in name
    only. They work out of sight and hearing of one
    another, plan and prepare their lessons and
    materials alone, and struggle on their own to
    solve most of their instructional, curricular and
    management problems. Against this almost uniform
    backdrop of isolated work, some schools stand out
    for the professional relationship they foster
    among teachers. These schools, more than others,
    are organized to permit the sort of reflection
    . . . that has been largely absent from
    professional preparation and professional work in
    schools. For teachers in such schools, work
    involves colleagueship of a more substantial sort."

31
  • Professional Learning Communities
  • Transform knowledge
  • Shared inquiry
  • Evidence informed
  • Situated certainty
  • Local solutions
  • Joint responsibility
  • Continuous learning
  • Communities of practice
  • Performance-Training Sects
  • Transfer knowledge
  • Imposed requirements
  • Results driven
  • False certainty
  • Standardized scripts
  • Deference to authority
  • Intensive training
  • Sects of performance

Hargreaves, A. (2003). Teaching in the knowledge
society: Education in the age of insecurity. New
York, NY: Teachers College Press.
32
Building Capacity for Success: Learning by Trial
and Evidence
  • In the article, note that there are many examples
    of schools that have engaged in school improvement
    through a process of trial and evidence.
  • In all cases the focus was on these questions:
  • What will students learn?
  • How will teachers best support student learning?
  • What are indicators of success?
  • How will we measure success of the practices?
  • What can we do to support students not meeting
    expectations?

33
Building Capacity for Success: Learning by Trial
and Evidence
  • Characteristics of improving schools:
  • A focus on achievement.
  • Built-in monitoring and measuring.
  • Leadership.
  • Involvement of all partners.
  • Consideration of all students' needs.

34
Read an Example: Building Capacity for Success:
Learning by Trial and Evidence
  • Find a partner, decide who is A and who is B.
  • Both partners read to the end of the questions on
    p. 7 of the text, then stop.
  • A summarizes the reading
  • Pairs craft examples or non-examples from their
    experiences
  • Both partners continue reading up to the end of
    p. 8.
  • B summarizes the reading
  • Pairs craft examples (or non-examples)
  • Read to the end of p. 9.
  • A summarizes the reading
  • Pairs craft examples or non-examples from their
    experiences

35
Reflection Questions
  • What key lessons can be taken from this reading
    and applied to your context?
  • In what ways are you gathering evidence of
    promising practices within your schools and
    school divisions?
  • In what ways do goals reflect your school's or
    division's core values and beliefs?
  • In what ways are you building community with your
    schools and school divisions?

DATA: Here's What
School Improvement: Here's What
So What?
Now What?
36
DATA: Here's What
School Improvement: Here's What
So What?
Now What?
37
Double-Loop Exercise, p. 10
  • As a table group, use the provided double-loop to
    clarify the connections between the professional
    structures supporting school improvement and the
    information you are getting from the AFL data.
  • In the top circle write out the current
    structures (PLCs, catalyst teachers, PD) and
    initiatives (literacy) already in place in your
    division.
  • In the bottom circle write down 3-5 significant
    indicators (strengths & areas for improvement)
    from the AFL data.
  • Draw arrows from the items in the bottom circle
    that are connected to or could be supported by
    items in the top circle (see the sketch after
    the example below).

[Example double-loop: top circle: PLCs, Catalyst
Teachers, Reading Group, Writing Strategies.
Bottom circle: majority of students scored at
proficient in narrative writing; students aren't
reporting use of writing processes.]
Lezotte, L. W., & McKee, K. M. (2006). Stepping
up: Leading the charge to improve our schools.
Okemos, MI: Effective Schools Products, Ltd.
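
One way to see the double-loop exercise is as a mapping from bottom-circle indicators to top-circle structures. The Python sketch below is illustrative only; the structures and indicators are hypothetical stand-ins for what a division would actually list.

```python
# A minimal sketch of the double-loop as a mapping. Top-circle
# structures and bottom-circle indicators are hypothetical examples.
structures = {"PLCs", "catalyst teachers", "PD", "literacy initiative"}

# Each arrow in the diagram becomes an entry: indicator -> supporting
# structures. An empty set marks an indicator with no current support.
links = {
    "majority proficient in narrative writing": {"PLCs", "literacy initiative"},
    "students not reporting use of writing processes": set(),
}

for indicator, supports in links.items():
    if supports & structures:
        print(f"supported: {indicator} <- {sorted(supports)}")
    else:
        print(f"gap (new structure needed?): {indicator}")
```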
38
Double-Loop Exercise
  • Once you have completed the diagrams, use the
    following questions to guide discussion at your
    table:
  • What current structures support areas where
    student performance was strong?
  • What current structures could meet the needs
    identified for improvement?
  • What new structures/initiatives may need to be
    considered?

39
Please return at 12:50
[Cartoon caption: "I'd trade, but peanut butter
sticks to my tongue, stud."]
40
The Role of Central Office
  • Focus on Alignment
  • Equipping staff with the knowledge and skills for
    aligning school improvement processes.
  • Provide multiple opportunities for staff to
    collaborate around current literature and best
    practices.
  • Model and encourage reflective practice; aligning
    improvement efforts requires time for reflection.

Adapted from Mooney, N. J., & Mausbach, A. T.
(2008). Align the design: A blueprint for school
improvement. Alexandria, VA: ASCD.
41
The Role of Central Office
  • Supporting Alignment Initiatives
  • Link data to the goals and strategies already in
    place.
  • Keep improvement goals front and center.
  • Engage staffs in discussion about improvement
    initiatives.

Adapted from Mooney, N. J., & Mausbach, A. T.
(2008). Align the design: A blueprint for school
improvement. Alexandria, VA: ASCD.
42
The Role of Central Office
  • Getting to Goal
  • Recognize and address alignment problems.
  • Support all schools in aligning improvement
    plans.
  • Sustain the plan until . . .

Adapted from Mooney, N. J., & Mausbach, A. T.
(2008). Align the design: A blueprint for school
improvement. Alexandria, VA: ASCD.
43
Sustainability
  • "Sustainability is the capacity of a system to
    engage in the complexities of continuous
    improvement consistent with deep values of human
    purpose." (Fullan, 2004)
  • "Sustainability does not simply mean whether
    something can last. It addresses how particular
    initiatives can be developed without compromising
    the development of others in the surrounding
    environment, now and in the future." (Hargreaves
    & Fink, 2000)

44
Challenges to Sustainability, p. 11
  • In your handout package is a template with five
    common challenges to sustainable collaborative
    work.
  • In groups of 3-6 brainstorm possible solutions
    for each challenge.

45
Supporting Improvement
  • So what meaning are you making about this
    information about data and school improvement?

DATA: Here's What
School Improvement: Here's What
So What?
Now What?
46
Goals to Inform Planning, p. 12
  • What are your school or division goals?
  • What structures and supports do you have in place
    to support sustainable improvement towards that
    goal?
  • What kinds of data are you gathering to inform
    decision making and progress?

DATA: Here's What
School Improvement: Here's What
So What?
Now What?
47
Progress Measure Areas, p. 13
  • Goal Types: Improvement Goals, Proficiency Goals
  • Assessing Progress (Student Data): Short-Term,
    Medium-Term, Long-Term
  • Evidence of Implementation
From Boudette, City, & Murnane (2005) and Holcomb
(2004).
48
Improvement and Proficiency
  • GROWTH
  • Improvement refers to students' growth on a given
    assessment within a specified period of time.
  • A student or group of students may experience
    great growth but still fall short of set
    proficiency goals.
  • COMPETENCE
  • Proficiency refers to how many students will
    achieve a certain level of performance within a
    specified period of time.
  • Proficiency goals don't measure student growth;
    they measure how many have reached a set standard
    or benchmark (see the sketch below).

Boudette, K., City, E. A., & Murnane, R. J.
(2005). Data wise: A step-by-step guide to using
assessment results to improve teaching and
learning. Cambridge, MA: Harvard Education Press.
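
The distinction can be made concrete with a small sketch: the same scores yield different conclusions depending on whether they are read as growth or as attainment of a benchmark. All names, scores, and the benchmark below are invented for illustration.

```python
# A minimal sketch contrasting growth (improvement) with attainment
# (proficiency). Names, scores, and the benchmark are all hypothetical.
pre  = {"ava": 58, "ben": 70, "cal": 45}   # earlier assessment
post = {"ava": 66, "ben": 74, "cal": 60}   # later assessment
benchmark = 70                             # set proficiency standard

growth = {s: post[s] - pre[s] for s in pre}             # improvement view
proficient = [s for s in post if post[s] >= benchmark]  # proficiency view

print(growth)      # cal grew 15 points yet still falls short of 70
print(proficient)  # only students at or above the set standard
```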
49
Improvement and Proficiency
  • Attending to both improvement and proficiency
    ensures that students grow academically and
    achieve degrees of competence in their studies.
  • Thinking of growth and competence compels us to
    consider in what ways all students will grow
    (weak, average, and gifted) and what levels of
    competence are desired for all.

Boudette, K., City, E. A., & Murnane, R. J.
(2005). Data wise: A step-by-step guide to using
assessment results to improve teaching and
learning. Cambridge, MA: Harvard Education Press.
50
Goal Types
  • In what ways do your school division goals
    reflect improvement?
  • In what ways do your school division goals
    reflect proficiency?

51
Monitoring and Assessing Progress
  • An integral part of every action plan is the
    detailed plan to ensure that progress is being
    made.
  • It is important to gather data from a variety of
    sources that clearly demonstrate the plan is
    being implemented, change in instruction is
    occurring, and student learning is improving.

52
Short-, Medium-, and Long-Term Data
  • Short-Term Data
  • Gathered daily or weekly via classroom
    assessments and/or observations.
  • Medium-Term Data
  • Gathered at periodic intervals via common
    department, school, or division assessments.
    These are usually referred to as benchmark
    assessments.
  • Long-Term Data
  • Gathered annually via standardized provincial,
    national, or international assessments.

Boudette, K., City, E. A., & Murnane, R. J.
(2005). Data wise: A step-by-step guide to using
assessment results to improve teaching and
learning. Cambridge, MA: Harvard Education Press.
53
Short-, Medium-, and Long-Term Assessments, p. 14
  • Your school or school division has set a goal to
    improve students' quality of writing,
    particularly as it relates to organization and
    coherence.
  • Teachers' in-class assessment strategies provide
    formative feedback to students in these areas:
    writing effective introductions and conclusions,
    as well as transitions.
  • Writing prompts are developed for each grade
    level in the school and administered at the end
    of each reporting period. Teachers
    collaboratively grade the papers using the
    rubrics from the Assessment for Learning program
    and analyze the results together.
  • Following the common assessment, students who
    have not achieved the set benchmark receive
    additional instruction and formative assessment
    as they work towards the goal.
  • In 2010 students are again assessed on their
    writing with the provincial AFL program.

54
Short- and Medium-Term Assessments
  • Referring to your identified goals, indicate what
    types of short- and medium-term assessments would
    best measure the progress of students as they
    work toward the goal.
  • It may be useful to plan the medium-term
    assessments first to provide a framework within
    which short-term assessments would fit.
  • Use the provided Short- and Medium-Term
    Assessment Planning template to plan when these
    might be administered.
  • You will also want to consider Long-Term
    Assessments at some point.

55
Short- and Medium-Term Assessments
  • In what ways might you support staff as they
    design their learning improvement plans?
    Consider sustainability as you identify the types
    of assessments needed to provide data to support
    decision making.

56
To Consider . . .
  • Using the frame of short-, medium-, and long-
    term data, where do improvement and proficiency
    goals fall?
  • What would be the nature of the assessments used
    to address improvement goals?
  • What would be the nature of the assessments used
    to address proficiency goals?

57
Identifying Starting Points: Interrelationship
Diagramming (Wellman & Lipton, 2003)
  • Interrelationship diagrams reveal critical
    relationships among the elements in a system.
    The intent is to coordinate decision-making to
    determine choices and starting points for
    improvement plans.
  • This tool is most effective when implemented with
    a group with a variety of roles and experiences
    in order to examine multiple perspectives.

Wellman, B., & Lipton, L. (2004). Data-driven
dialogue. Mira Via, LLC.
58
Interrelationship Diagramming
  • First, the group identifies all of the issues
    related to a process, area or goal they are
    working on.
  • For example, improving student writing: what are
    the related issues and how might they be
    categorized?
  • Teacher Knowledge About Writing
  • Teacher Instructional Practices
  • Student Interest/Motivation
  • Resources to Support Writing
  • Etc.
  • Brainstorm a list of issues related to a goal you
    are working on in your school division.
  • Categorize the issues into 6-8 broad categories.
    Place the titles of categories on sticky notes.

Wellman, B., & Lipton, L. (2004). Data-driven
dialogue. Mira Via, LLC.
59
Creating an Interrelationship Diagram
  • Take the sticky notes and place them in a circle,
    with the issue as a header above.

[Diagram: category sticky notes arranged in a
circle around the goal.]
Wellman, B., & Lipton, L. (2004). Data-driven
dialogue. Mira Via, LLC.
60
Identifying Effects and Drivers:
Interrelationship Diagramming
  • Then the group needs to separate drivers (causes)
    from effects.
  • For example, in the area of student writing:

[Example: Teacher Instructional Practices
(driver) → Student Interest (effect)]

Arrows moving away from boxes indicate
drivers, while arrows moving into boxes indicate
effects.
Wellman, B., & Lipton, L. (2004). Data-driven
dialogue. Mira Via, LLC.
61
Creating an Interrelationship Diagram
  • Select one category as a starting point.
  • Ask two-way questions to determine whether this
    category is a driver or effect of each of the
    other categories. For example, "Does teacher
    content knowledge influence (or drive) teacher
    instructional practices (effect)?" Or, "Do
    teacher instructional practices influence (or
    drive) teacher content knowledge?" Draw arrows
    from the drivers to the effects.
  • Continue in this way through each of the
    categories.
  • NO two-headed arrows! Decide which category
    dominates the other.

Arrows moving away from boxes indicate
drivers, while arrows moving into boxes indicate
effects.
Wellman, B., & Lipton, L. (2004). Data-driven
dialogue. Mira Via, LLC.
62
Creating an Interrelationship Diagram
  • DRIVERS
  • Drivers are indicated by the number of arrows
    going away from a category.
  • Count the number of arrows going away from each
    category.
  • Rank the drivers from highest to lowest.
  • EFFECTS
  • Effects are indicated by the number of arrows
    pointing towards a category.
  • Count the number of arrows pointing towards each
    category.
  • Rank the effects from highest to lowest (see the
    sketch below).

Wellman, B., & Lipton, L. (2004). Data-driven
dialogue. Mira Via, LLC.
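
Tallying and ranking the arrows can be expressed compactly: the sketch below treats each arrow as a directed edge and counts outgoing and incoming arrows per category. The categories and arrows are hypothetical examples, not prescribed ones.

```python
# A minimal sketch of tallying an interrelationship diagram. Each
# arrow (driver -> effect) is a directed edge; categories here are
# hypothetical examples.
from collections import Counter

arrows = [
    ("teacher knowledge", "instructional practices"),
    ("teacher knowledge", "student writing"),
    ("instructional practices", "student motivation"),
    ("instructional practices", "student writing"),
    ("student motivation", "student writing"),
]

out_degree = Counter(d for d, _ in arrows)  # arrows going away = drivers
in_degree  = Counter(e for _, e in arrows)  # arrows pointing in = effects

print("drivers ranked:", out_degree.most_common())
print("effects ranked:", in_degree.most_common())
```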
63
Creating Action Plans Based on Drivers
  • Select the top 3-5 drivers.
  • The drivers are the effective intervention points
    and thus provide the best opportunities to align
    all the elements of school/division improvement:
  • goals,
  • professional development,
  • assessment,
  • data collection.
  • You will want to consider the role of research
    and literature in providing background
    information for this activity, or its role as
    follow-up to the activity.

64
Evidence of Implementation, pp. 15 & 16
  • Once teachers have decided on teaching/ learning
    strategies to use to improve student learning, it
    is important to identify implementation
    indicators.
  • On the following slides are two examples of ways
    to gather evidence of implementation.

Boudette, K., City, E. A., & Murnane, R. J.
(2005). Data wise: A step-by-step guide to using
assessment results to improve teaching and
learning. Cambridge, MA: Harvard Education Press.
65
Boudette, K., City, E. A., & Murnane, R. J.
(2005). Data wise: A step-by-step guide to using
assessment results to improve teaching and
learning. Cambridge, MA: Harvard Education Press.
66
Indicators of School-Based Application of
Assessment for Learning
  • How are teachers checking to see what has been
    learned and what needs to be learned next?
  • How are teachers ensuring that students have
    access to specific and descriptive feedback, in
    relation to criteria that is focused on
    improvement?
  • How are teachers finding ways to reduce
    evaluative feedback?
  • How are teachers involving students (the people
    most able to improve the learning) deeply in the
    assessment process?

Davies, A., Herbst-Luedtke, S., & Parrot Reynolds,
B. (2008). Leading the way to make classroom
assessment work. Courtenay, BC: Connections
Publishing.
67
Indicators of Classroom Application using
Assessment for Learning
  • Consider a division goal within a curricular area
    (e.g., literacy).
  • Descriptions of success (learning destinations)
    within the course of study are posted in the
    classroom or handed out to students and parents.
    These descriptions reflect the standards or
    learning outcomes and express them in simple
    terms that everyone can understand.

Davies, A., Herbst-Luedtke, S., & Parrot Reynolds,
B. (2008). Leading the way to make classroom
assessment work. Courtenay, BC: Connections
Publishing.
68
Indicators of Classroom Application of Assessment
for Learning
  • Look for
  • Students are able to answer the question "What
    do you need to know to be successful?" by
    articulating the important ideas (or referring to
    a handout which does so) and describing how this
    knowledge or set of skills will be useful outside
    of school.

Davies, A., Herbst-Luedtke, S., & Parrot Reynolds,
B. (2008). Leading the way to make classroom
assessment work. Courtenay, BC: Connections
Publishing.
69
Indicators of Classroom Application of Assessment
for Learning
  • Look for
  • Teachers are able to summarize the learning
    destination and explicitly describe how the
    activity, assignment, or range of activities and
    assignments help all students learn.
    Furthermore, teachers can show plans for how
    student evidence or proof of learning will
    account for all the standards or outcomes.

Davies, A., Herbst-Luedtke, S., & Parrot Reynolds,
B. (2008). Leading the way to make classroom
assessment work. Courtenay, BC: Connections
Publishing.
70
Indicators of Classroom Application of Assessment
for Learning
  • Look for
  • In response to the question "What does quality
    look like?" students will refer to models,
    exemplars, or criteria.

Davies, A., Herbst-Luedtke, S., & Parrot Reynolds,
B. (2008). Leading the way to make classroom
assessment work. Courtenay, BC: Connections
Publishing.
71
Evidence of Implementation
  • Using either template provided, discuss and write
    down what data would need to be gathered as
    evidence that each of the indicators is being
    actualized in an effective manner, i.e., the
    strategies are being used as designed, as opposed
    to interpretations of the strategy.
  • Who will collect the data?
  • How will it be collected?
  • When will it be collected?

72
Advancing Assessment Literacy Modules
  • 17 modules designed to facilitate conversations
    and work with data for improvement of
    instruction.
  • www.spdu.ca → Publications → Advancing Assessment
    Literacy Modules
  • Download a PDF of a PowerPoint and accompanying
    lesson plan for use by education professionals in
    schools.

73
(No Transcript)
74
Reflection
  • Partner Interviews
  • Select a one-word summary for today.
  • Why did you choose that word?
  • What commitments are you making to yourself?

75
Evaluation
  • Deepen understanding about the writing assessment
    project results
  • Initiate reflection and discussion among
    division-level staff members related to the
    writing assessment results
  • Provide a range of tools and processes to support
    division-level staff in their work throughout the
    system related to school improvement; and
  • Provide opportunity to discuss and plan around
    the data in the context of school improvement.