Data Interpretation I Workshop
Transcript and Presenter's Notes
1
Data Interpretation I Workshop
2008 Writing Assessment for Learning
2
Purposes for the Day
  • Bring context and meaning to the writing
    assessment project results
  • Initiate reflection and discussion among school
    staff members related to the writing assessment
    results
  • Encourage school personnel to judiciously review
    and utilize different comparators when judging
    writing assessment results
  • Model processes that can be used at the
    school- and division-level for building
    understanding of the data among school staff and
    the broader community; and
  • Provide an opportunity to discuss and plan around
    the data.

3
Agenda
  • Understanding data: sources, categories and uses
  • Provincial Writing Assessment
  • Conceptual Framework
  • Comparators
  • Student Performance Data
  • Opportunity to Learn Data
  • Standards and Cut Scores
  • Predicting
  • Categories of Data
  • Action Planning
  • Linking Data, Goals and Intervention
  • Closure

4
Synectics
  • Please complete the following statement:
  • Data use in schools is like . . . because . . .
  • Data use in schools is like molasses because it
    is slow and gets slower as it gets colder.
  • Data use in schools is like molasses because it
    is sticky and can make a big mess!

5
A Data-Rich Environment
  • Wellman and Lipton (2004) state:
  • Schools and school districts are rich in data.
    It is important that the data a group explores
    are broad enough to offer a rich and deep view of
    the present state, but not so complex that the
    process becomes overwhelming and unmanageable.

Wellman, B., & Lipton, L. (2004). Data-driven
dialogue. MiraVia, LLC.
6
International Data Sources
  • Programme for International Student Assessment
    (PISA)

http://snes.eas.cornell.edu/Graphics/earth%20white%20background.JPG
7
National Data Sources
  • Pan-Canadian Achievement Program (PCAP)
  • Canadian Test of Basic Skills (CTBS)
  • Canadian Achievement Tests (CAT3)

http://www.recyclage.rncan.gc.ca/images/canada_map.jpg
8
Provincial Data Sources
  • Assessment for Learning (AFL)
  • Opportunity to Learn Measures
  • Performance Measures
  • Departmentals

http://regina.foundlocally.com/Images/Saskatchewan.jpg
9
Division Data Sources
  • Division level rubrics
  • Division benchmark assessments

http://www.sasked.gov.sk.ca/branches/ed_finance/north_east_sd200.shtml
10
Local Data Sources
  • Cumulative (cum) folders
  • Teacher designed evaluations
  • Portfolios
  • Routine assessment data

11
Nature of Assessment Data
From Saskatchewan Learning. (2006). Understanding
the numbers.
12
Depth and Specificity of Knowledge
(Diagram: assessments arrayed along a continuum
from in-depth knowledge of specific students to
in-depth knowledge of systems.)
From Saskatchewan Learning. (2006).
Understanding the numbers.
13
Using a Variety of Data Sources
  • Thinking about the data sources available, their
    nature and the depth of knowledge they provide,
    how might the information in each impact the
    decisions you make?
  • What can you do with this data?
  • What is its impact on classrooms?

14
Using a Variety of Data Sources
15
Using a Variety of Data Sources
16
Using a Variety of Data Sources
Please refer to the Using a Variety of Data
Sources template on p. 3 in your handout package
as a guide for your discussion.
17
Assessment for Learning is a Snapshot
  • Results from a large-scale assessment are a
    snapshot of student performance.
  • The results are not definitive. They do not tell
    the whole story. They need to be considered
    along with other sources of information available
    at the school.
  • The results are more reliable when larger numbers
    of students participate and when aggregated at
    the provincial and division level, and should be
    considered cautiously at the school level.
    Individual student mastery of learning is best
    determined through effective and ongoing
    classroom-based assessment. (Saskatchewan
    Learning, 2008)
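To see why this small-group caution matters, the sketch below (plain Python, using invented numbers rather than actual assessment results) shows how the uncertainty around a "percent meeting the standard" figure shrinks as more students participate:

    import math

    def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
        """Approximate 95% margin of error for an observed proportion p in a group of n students."""
        return z * math.sqrt(p * (1 - p) / n)

    observed = 0.70  # hypothetical: 70% of participating students met the adequate standard
    for n in (15, 60, 500, 5000):  # roughly: one class, one school, a division, the province
        moe = margin_of_error(observed, n) * 100
        print(f"n = {n:>4}: 70% plus or minus {moe:.1f} percentage points")

With these invented numbers, the same 70% result carries a margin of roughly plus or minus 23 points for a single class of 15, but only about 1 point for a provincial group of 5,000, which is why school-level percentages deserve the cautious reading described above.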

18
Provincial Writing Assessment Conceptual
Framework (pp. 4-5)
  • Colourful Thoughts
  • As you read through the information on the
    Provincial Writing Assessment, use highlighters
    or sticky notes to record your thinking:

Wow! I agree with this.
Hmm! I wonder. . .
Yikes!
Adapted from Harvey, S., & Goudvis, A. (2007).
Strategies that work.
19
Comparators: Types of Referencing (p. 6)
  • Criterion-referenced: Comparing how students
    perform relative to curriculum objectives, level
    attribution criteria (rubrics) and the level of
    difficulty inherent in the assessment tasks. If
    low percentages of students are succeeding with
    respect to specific criteria identified in
    rubrics, this may be an area for further
    investigation, and for planning intervention to
    improve student writing.
  • (Detailed rubrics, OTL rubrics and test items
    can be sourced at www.education.gov.sk.ca)
  • Standards-referenced: Comparing how students
    performed relative to a set of professionally or
    socially constructed standards. Results can be
    compared to these standards to help identify key
    areas for investigation and intervention.
  • (Figure .2b, .3c, .4a, .6b, .7b and .8b.)

20
Comparators: Types of Referencing
  • Experience- or self-referenced: Comparing how
    students perform relative to the assessment data
    gathered by teachers during the school year.
    Where discrepancies occur, further investigation
    or intervention might be considered. It is
    recommended that several sources of data be
    considered in planning.
  • (E.g., comparing these results to current school
    data or to the standards set by the panel.)
  • Norm-referenced: Comparing how students in a
    school performed relative to the performance of
    students in the division, region or project.
    Note cautions around small groups of students.
    Norm-reference comparisons contribute very little
    to determining how to use the assessment
    information to make improvements.
  • (E.g., tables comparing the school, division and
    province.)

21
Comparators: Types of Referencing
  • Longitudinal-referenced: Comparing how students
    perform relative to earlier years' performance of
    students. Viewed across several years,
    assessment results and other evidence can
    identify trends and improvements. (This data
    will not appear until the next administration of
    this assessment.)

22
Opportunity-to-Learn Elements as Reported by
Students
  • Propensity to Learn
  • using resources to explore models, generate ideas
    and assist the writing process
  • Motivation, attitude and confidence
  • Participation, perseverance and completion
  • Reflection
  • Knowledge and Use of Before, During and After
    Writing Strategies
  • Home Support for Writing and Learning
  • Encouragement and interaction
  • Access to resources and assistance

23
Opportunity-to-Learn Elements as Reported by
Teachers
  • Availability and Use of Resources
  • Teacher as key resource
  • Teacher as writer
  • Use of curriculum
  • Educational qualifications
  • Professional development
  • Time
  • Student resources
  • Classroom Instruction and Learning
  • Planning focuses on outcomes
  • Expectations and criteria are clearly outlined
  • Variety of assessment techniques
  • Writing strategies explicitly taught and
    emphasized
  • Adaptation

24
Student Performance Outcome Results
  • Demonstration of the writing process
  • Pre-writing
  • Drafting
  • Revision
  • Quality of writing product
  • Messaging and content
  • Focus
  • Understanding and support
  • Genre
  • Organization and coherence
  • Introduction, conclusion, coherence
  • Language use
  • Language and word choices
  • Syntax and mechanics

25
Standards
  • To help make meaningful longitudinal comparisons
    in future years, three main processes will be
    implemented.
  • Assessment items will be developed for each
    assessment cycle using a consistent table of
    specifications.
  • The assessment items will undergo field-testing,
    one purpose of which is to inform the
    comparability of the two assessments.
  • Standards will be set for each of the assessment
    items, so that any differences in difficulty
    between the two assessments are accounted for by
    varying the standards for the two assessments.

26
Opportunity-to-Learn and Performance Standards
  • In order to establish Opportunity-to-Learn and
    Performance standards for the 2008 Writing
    Assessment, three panels were convened (one from
    each assessed grade), consisting of teachers from
    a variety of settings and post-secondary
    academics including Education faculty.
  • The panelists studied each genre from the 2008
    assessment in significant detail and established
    expectations for writing process, narrative
    products and expository products as well as
    opportunity to learn.

27
Thresholds of Adequacy and Proficiency
28
Thresholds of Adequacy and Proficiency
29
Cut Scores
  • On page 4 of the detailed reports you will find
    the cut scores detailing the percentage correct
    required for students to be classified at one of
    two levels
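As a minimal illustration of how cut scores work (the values below are hypothetical placeholders, not the cut scores printed on page 4), a percent-correct score can be placed into a performance level like this:

    # Hypothetical cut scores expressed as percent correct; the actual values
    # appear on page 4 of the detailed reports.
    ADEQUATE_CUT = 50.0
    PROFICIENT_CUT = 75.0

    def classify(percent_correct: float) -> str:
        """Assign a student's percent-correct score to a performance band."""
        if percent_correct >= PROFICIENT_CUT:
            return "proficient or higher"
        if percent_correct >= ADEQUATE_CUT:
            return "adequate"
        return "below adequate"

    for score in (42.0, 63.5, 88.0):
        print(f"{score}% correct -> {classify(score)}")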

30
(No Transcript)
31
(No Transcript)
32
(No Transcript)
33
Predicting: Card Stack and Shuffle
  • Individually: As you refer to the cut scores on
    page 4, create a stack of cards with some of your
    predictions about student outcomes in Narrative
    and Expository writing; consider each
    separately.
  • Writing Process (Prewriting, drafting,
    revising)
  • Writing Product (Message, organization and
    language choices)
  • E.g., I predict 85% of our Gr. 8s will meet the
    adequate standard or higher in Propensity to
    Learn and, of those, 20% will be proficient or
    higher because our students are very comfortable
    with writers' workshop processes, which we have
    emphasized for the last three years.
  • E.g., I predict 90% of our Gr. 5s will score
    adequate or higher on demonstration of writing
    process in narrative writing because of our
    whole-school emphasis on writing, especially with
    respect to narrative writing.

34
Predicting: Card Stack and Shuffle
  • As you complete each card, place it in the center
    of the table.
  • As a group, shuffle the cards.
  • In turn, each group member picks a card to read
    aloud to the table group. The group engages in
    dialogue or discussion about the items.
  • Guiding questions
  • With what parts of this prediction do you agree?
    Why?
  • With what parts of this prediction do you
    disagree? Why?
  • To what extent is this prediction generalizable
    to all the classrooms in your school?

35
Predictions
  • Considering all of the predictions, are there any
    themes or patterns emerging upon which you can
    all agree?
  • Why might this be?

36
Comparisons
  • The completed tables are on page 7.
  • What are you noticing about the data?
  • What surprised you?
  • Which of your predictions were confirmed?
  • Which of your predictions were not confirmed?
  • Consider your assumptions as you discuss the
    results.

Wellman, B., & Lipton, L. (2004). Data-driven
dialogue. MiraVia, LLC.
37
Examining the Report
  • Take a few minutes to look through the entire AFL
    report. Use the chart below to guide your
    thinking and conversation.

38
Please return at 12:40.
(Cartoon caption: "I'd trade, but peanut butter
sticks to my tongue.")
39
Local Level Sources of Data
  • While international, national and provincial
    sources of data can provide direction for school
    initiatives, the data collected at the local
    level is what provides the most detailed
    information regarding the students in classrooms.

40
Four Major Categories of Data: Demographics (p. 7)
  • Local Data
  • Descriptive information such as enrollment,
    attendance, gender, ethnicity, grade level, etc.
  • Can disaggregate other data by demographic
    variables (a brief sketch follows after this
    slide).
  • AFL
  • Opportunity-to-Learn Data
  • Family/Home support for student writing
  • encouragement and interaction
  • access to resources

Bernhardt, V. L. (2004). Data analysis for
continuous school improvement (2nd ed.).
Larchmont, NY: Eye on Education.
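The "disaggregate other data by demographic variables" point can be pictured with a minimal pandas sketch; the column names and records below are invented for illustration and are not drawn from the AFL data files:

    import pandas as pd

    # Hypothetical local records: one row per student.
    records = pd.DataFrame({
        "grade":        [5, 5, 5, 8, 8, 8, 11, 11],
        "gender":       ["F", "M", "F", "F", "M", "F", "M", "F"],
        "met_adequate": [1, 0, 1, 1, 1, 0, 1, 1],  # 1 = met the adequate writing standard
    })

    # Disaggregate the writing outcome by demographic variables.
    summary = (
        records.groupby(["grade", "gender"])["met_adequate"]
               .agg(students="count", proportion_adequate="mean")
    )
    summary["percent_adequate"] = (summary["proportion_adequate"] * 100).round(1)
    print(summary[["students", "percent_adequate"]])

Small counts in the "students" column flag percentages that deserve the same caution noted earlier for small groups.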
41
Four Major Categories of Data: Student Learning
  • Local Data
  • Describes outcomes in terms of standardized test
    results, grade averages, etc.
  • AFL
  • Readiness Related Opportunity-to-Learn Data
  • Using resources to explore writing
  • Student knowledge and use of writing strategies
    (before, during, after)
  • Student performance outcomes
  • Writing 5, 8, 11: Narrative and Expository
  • Writing process
  • Writing product

Bernhardt, V. L. (2004). Data analysis for
continuous school improvement (2nd ed.).
Larchmont, NY: Eye on Education.
42
Four Major Categories of Data: Perceptions
  • Local Data
  • Provides information regarding what students,
    parents, staff and community think about school
    programs and processes.
  • This data is important because people act in
    congruence with what they believe.
  • AFL
  • Readiness Related Opportunity-to-Learn Data
  • Commitment to learn
  • Using resources
  • Motivation and attitude
  • Confidence
  • Participation
  • Perseverance and completion
  • Reflection
  • Knowledge and use of writing strategies

Bernhardt, V. L. (2004). Data analysis for
continuous school improvement (2nd ed.).
Larchmont, NY: Eye on Education.
43
Four Major Categories of Data: School Processes
  • Local Data
  • What the system and teachers are doing to get the
    results they are getting.
  • Includes programs, assessments, instructional
    strategies and classroom practices.
  • AFL
  • Classroom Related Opportunity-to-Learn Data
  • Instruction and learning
  • Planning and reflection
  • Expectations and assessment
  • Focus on writing strategies
  • Adaptations
  • Availability and use of resources
  • Teacher
  • Time
  • Resources for students and teachers

Bernhardt, V. L. (2004). Data analysis for
continuous school improvement (2nd ed.).
Larchmont, NY: Eye on Education.
44
What Data are Useful and Available? (p. 8)
  • Think about the goals/priorities set within your
    school or school division regarding student
    writing.
  • Using the supplied template, begin to catalogue
    the data you already have and the data you need
    in order to better address the goals that have
    been set.
  • An example follows on the next slide.

45
Goal: Students will consciously use writing
strategies for all genres.
Bernhardt, V. L. (2004). Data analysis for
continuous school improvement (2nd ed.).
Larchmont, NY: Eye on Education.
46
Designing Interventions
  • Assumptions must be examined because our
    interventions will be based on them.
  • We must strive to correctly identify the causal
    factors.
  • Don't fall in love with any theory until you have
    other data.
  • Use a strength-based approach to interventions.

47
Team Action Plan
  • Please turn to page 9 in your handout package.
  • What are some areas of strength indicated within
    your data?
  • What are some areas for improvement indicated
    within your data?
  • Please consider all aspects of the report
    including the Opportunity to Learn Measures.

48
Fishbone Analysis: Strengths (p. 10)
  • At your table, analyze one strength and consider
    all contributing factors that led to that
    strength.

(Example fishbone. Strength: Writing Process.
Contributing factors: all classrooms using
Writers' Workshop; majority of PD focused on
writing; PLC read Strategies that Work; teachers
explicitly teaching pre-writing strategies in all
subjects.)
49
Fishbone Analysis: Area for Improvement (p. 11)
  • Identify one area for improvement.
  • What elements from your area of strength could
    contribute to improvement in this area?
  • E.g., We did well in the process of writing
    because all teachers are explicitly teaching
    pre-writing across the curriculum with every
    writing activity.
  • So, we need to explicitly teach how to write
    introductions, conclusions, and transitions in
    all subject areas.

50
Setting a Goal (p. 12)
  • Based on your previous discussions regarding
    strengths and areas for improvement, write a goal
    statement your team will work on over the coming
    year.
  • E.g., For the 2010 AFL in Writing, all students
    will score at level 4 and above with respect to
    their use of before, during and after writing
    strategies.
  • Write your goal on the provided bubble map. This
    is a template; add more bubbles if you need
    them! You do not have to fill in all the
    bubbles.
  • Brainstorm possible strategies for meeting that
    goal. You may need to use different strategies
    at different grade levels.

51
Research Instructional Strategies (p. 13)
  • Once you have completed brainstorming strategies,
    you will want to conduct some research on the
    effectiveness of those strategies.
  • Available resources could include a variety of
    websites, the professional collection at the
    Stewart Resources Centre and the McDowell
    Foundation (www.stf.sk.ca).

52
Impact/Feasibility (p. 14)
  • Once you have completed your research, conduct an
    impact/feasibility analysis of the strategies you
    have identified.
  • Impact refers to the degree to which a strategy
    will make a difference in the learning of
    students. A high impact strategy will make the
    greatest difference in learning for the broadest
    population of students.
  • Feasibility refers to the practical supports that
    need to be in place such as time, funding,
    scheduling, etc.

When done, choose the strategy that will have the
greatest impact and is most feasible to implement
(a minimal ranking sketch follows below).
Boudett, K. P., City, E. A., & Murnane, R. J.
(2005). Data wise: A step-by-step guide to using
assessment results to improve teaching and
learning. Cambridge, MA: Harvard Education Press.
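One lightweight way to record an impact/feasibility analysis is sketched below; the strategies and the 1 (low) to 5 (high) ratings are invented examples, not recommendations from the workshop materials:

    # Hypothetical team ratings on a 1 (low) to 5 (high) scale.
    strategies = [
        {"name": "Explicitly teach introductions, conclusions and transitions",
         "impact": 5, "feasibility": 4},
        {"name": "After-school writing club",
         "impact": 2, "feasibility": 3},
        {"name": "Common benchmark writing prompts each reporting period",
         "impact": 4, "feasibility": 5},
    ]

    # Favour strategies that are both high impact and practical to implement.
    ranked = sorted(strategies, key=lambda s: (s["impact"], s["feasibility"]), reverse=True)
    for s in ranked:
        print(f'{s["name"]}: impact={s["impact"]}, feasibility={s["feasibility"]}')

The highest-ranked entry corresponds to the "greatest impact and most feasible" choice described above; teams may of course weight impact and feasibility differently.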
53
Data-Driven Decision Making Improvement Cycle
(p. 16)
1. Find the data (Treasure Hunt)
2. Data Analysis and Strength Finder
3. Needs Analysis
4. Goal Setting and Revision
5. Identify Specific Strategies to Achieve Goals
6. Determine Results Indicators
7. Action Plan, Schedule, REVIEW
(White, 2005)
54
Four Tasks of Action Planning (p. 17)
  • Decide on strategies for improvement.
  • Agree on what your plan will look like in
    classrooms.
  • Put the plan down on paper.
  • Plan how you will know if the plan is working.

Boudett, K. P., City, E. A., & Murnane, R. J.
(2005). Data wise: A step-by-step guide to using
assessment results to improve teaching and
learning. Cambridge, MA: Harvard Education Press.
55
Put the Plan Down on Paper
  • By documenting team members' roles and
    responsibilities and specifying the concrete
    steps that need to occur, you build internal
    accountability for making the plan work.
  • Identifying the professional development time and
    instruction your team will need and including it
    in your action plan lets teachers know they will
    be supported through the process of instructional
    improvement.

Boudett, K. P., City, E. A., & Murnane, R. J.
(2005). Data wise: A step-by-step guide to using
assessment results to improve teaching and
learning. Cambridge, MA: Harvard Education Press.
56
Writing Out the Plan (p. 18)
  • Using the supplied Action Plan template, begin
    to draft the details of the plan as you work
    toward achieving your goal.
  • The supplied template is only a suggestion; you
    may create your own or use another of your own
    design.

Boudett, K. P., City, E. A., & Murnane, R. J.
(2005). Data wise: A step-by-step guide to using
assessment results to improve teaching and
learning. Cambridge, MA: Harvard Education Press.
57
Plan How You Will Know if the Plan is Working
  • Before implementing your plan, it is important to
    determine what type of data you will need to
    collect in order to understand whether students
    are moving towards the goal.

Boudett, K. P., City, E. A., & Murnane, R. J.
(2005). Data wise: A step-by-step guide to using
assessment results to improve teaching and
learning. Cambridge, MA: Harvard Education Press.
58
Different Lenses (p. 20)
  • What types of data might be required to gain a
    clearer picture of how specific groups of
    students are doing?
  • Consider the four categories of data available
    (demographics, perceptions, student learning and
    school processes) as you explore what types of
    data you need.

59
Short-, Medium-, and Long-Term Data
  • Short-Term Data
  • Gathered daily or weekly via classroom
    assessments and/or observations.
  • Medium-Term Data
  • Gathered at periodic intervals via common
    department, school, or division assessments.
    These are usually referred to as benchmark
    assessments.
  • Long-Term Data
  • Gathered annually via standardized provincial,
    national, or international assessments.

Boudett, K. P., City, E. A., & Murnane, R. J.
(2005). Data wise: A step-by-step guide to using
assessment results to improve teaching and
learning. Cambridge, MA: Harvard Education Press.
60
Short- and Medium-Term Assessments
  • Referring to your action plan, identify what
    types of short- and medium-term assessments would
    best measure the progress of students as they
    work toward the goal.
  • It may be useful to plan the medium-term
    assessments first to provide a framework within
    which short-term assessments would fit.
  • Use the provided Short- and Medium-Term
    Assessment Planning template to plan when these
    might be administered.

61
Short-, Medium-, and Long-Term Assessments
  • Your school or school division has set a goal to
    improve students' quality of writing,
    particularly as it relates to organization and
    coherence.
  • Teachers' in-class assessment strategies provide
    formative feedback to students in these areas:
    writing effective introductions and conclusions,
    as well as transitions.
  • Writing benchmark prompts are developed for each
    grade level in the school and administered at the
    end of each reporting period. Teachers
    collaboratively grade the papers using the
    rubrics from the Assessment for Learning program
    and analyze the results together.
  • Following the common assessment, students who
    have not achieved the set benchmark receive
    additional instruction and formative assessment
    as they work towards the goal.
  • In 2010 students are again assessed on their
    writing with the provincial AFL program.

62
Advancing Assessment Literacy Modules (p. 21)
  • 17 Modules designed to facilitate conversations
    and work with data for improvement of
    instruction.
  • www.spdu.ca
  • Publications
  • Advancing Assessment Literacy Modules
  • Download a PDF of a PowerPoint and accompanying
    Lesson Plan for use by education professionals in
    schools.
  • The PPT of this workshop will also be available
    on the same site.

63
(No Transcript)
64
Reflection
  • What did you discover today that surprised you?
  • What will you take with you from today?

65
Evaluation
  • Bring context and meaning to the writing
    assessment project results
  • Initiate reflection and discussion among school
    staff members related to the writing assessment
    results
  • Encourage school personnel to judiciously review
    and utilize different comparators when judging
    writing assessment results
  • Model processes that can be used at the school
    and division level for building understanding of
    the data among school staff and the broader
    community; and
  • Provide an opportunity to discuss and plan around
    the data.