Simple Stats Say It All - PowerPoint PPT Presentation

Transcript and Presenter's Notes
1
Simple Stats Say It All
  • Eileen E. Schroeder
  • schroede@uww.edu
  • E. Anne Zarinnia
  • zarinnie@uww.edu

2
One of the proposed strategies is to drastically
reduce the number of school librarians in the
area claiming that school libraries can be
effectively run by aides to ensure services are
provided and the library remains open. This is
despite the fact that I have hundreds of students
in the library each day, and teach in the
classroom regularly. I have voiced my objection,
but I am told that such reductions will not
impact on student learning in any way.
Todd, R. Evidence-Based Practice: Findings of
Australian Study, 2002-2003.
3
Data driven decision making
  • Need to convince decision makers that library
    media program enhances school mission
  • Students competent to enter the information age
  • Students achieve at higher levels due to LMS
  • Students achieve at higher levels because of
    quality library media program

Berkowitz, R. From Indicators of Quantity to
Measures of Effectiveness.
4
Purposes of Data Collection
  • Baseline data (describe program)
  • Document need for a program or idea
  • Raise awareness about a problem, condition or
    solution
  • Evaluate progress
  • Identify strengths and weaknesses

A Planning Guide to Information Power: Building
Partnerships for Learning. AASL, 1999.
5
Communicate with decision makers
  • Recognize financial concerns
  • Acknowledge concerns about student performance
  • Speak to concerns in ways decision makers
    understand and value
  • Educate about LMS role and responsibilities
  • Provide visible evidence of positive impact of
    program

Berkowitz, R. From Indicators of Quantity to
Measures of Effectiveness.
6
Evidence-based practice
  • Demonstrate
  • Outcomes of making and implementing sound
    decisions in daily work
  • Impact of decisions on organization goals and
    objectives

Todd, R. Evidence-Based Practice: Findings of
Australian Study, 2002-2003.
7
What data is most valuable?
  • Demonstrate difference LM program makes in
  • Content learning
  • Information literacy skills
  • Technology skills
  • Reading
  • Collaborative planning and teaching
  • Demonstrate impact of resources

Loertscher, D. California Project Achievement.
8
Data Gathering Techniques
Student Learning
  • Tallies and counts
  • Schedules / calendars
  • Logs and anecdotal records
  • Observations
  • Interviews
  • Performance assessment
  • Products

A Planning Guide to Information Power: Building
Partnerships for Learning. AASL, 1999.
9
Student Learning
Student Learning
  • Content learning
  • Achievement in coursework
  • Standardized test achievement
  • Critical thinking
  • Independent thinking
  • Interaction with others
  • Personal responsibility for learning
  • Motivation
  • Teacher tests
  • Test item analysis
  • Rubrics, checklists
  • Performance assessments
  • Student or teacher interviews or focus groups
  • Observations

10
Standardized Tests: WKCE
Student Learning
  • LMS reinforces skills taught in the classroom
  • (Mis-)alignment to standards (1998 match to
    Terra Nova, Form A items)
  • Some standards not appropriate for a paper and
    pencil test (e.g., media and research standards)
  • Grade 10: 1 item matches each of LA Standards E
    and F
  • Grade 8: 1 item matches LA Standard F
  • Grade 4: 1 item matches each of LA Standards E
    and F

Wisconsin Knowledge and Concepts Examinations
Alignment to Wisconsin Model Academic Standards,
1998 (http://www.dpi.state.wi.us/oea/alignmnt.html)
11
WKCE Sample Question
Student Learning
WKCE 8th Grade Reading Sample Question
(http://www.dpi.state.wi.us/oea/read8itm.html)
12
WKCE Sample Question
Student Learning
WKCE 10th Grade Social Studies Sample Question
(http://www.dpi.state.wi.us/oea/ss8items.html)
13
Standardized Tests: Reading
Student Learning
  • Assessment Framework for Reading
  • Objectives supported by library program
  • Analyzing literary and informational text
  • Evaluating and extending literary and
    informational text
  • Use of framework
  • Match local curriculum to framework
  • Engage in discussions on where skills are taught
    and reinforced
  • Examine problem areas

Assessment Framework for Reading in Grades 3
through 8 and 10, 2005
(www.dpi.state.wi.us/dpi/oea/wkce-crt.html)
14
Student Learning
Provide resources
Story hours
Research projects
Assessment Framework for Reading in Grades 3
through 8 and 10, 2005
(www.dpi.state.wi.us/dpi/oea/wkce-crt.html)
15
Information and Technology Literacy
I & TL
  • Student mastery of grade level benchmarks
  • Information literacy skills
  • Technology skills
  • Independent use of skills
  • Ability to transfer skills to new problems
  • Student spontaneous and persistent use of
    information literacy skills and inquiry

16
Collecting Information and Technology Literacy
Data
I & TL
  • Mastery
  • Teachers: Assess skills in curricular projects
  • Teachers: Discuss or interview on student skills
  • Teachers: Review sample products or portfolios
  • LMS: Information literacy skills in lessons
  • Upcoming 8th grade technology assessment
  • Standardized tests: Analyze items linked to
    information literacy skills
  • Students: Self-assess skills
  • Students: Research logs
  • Students: Conferences (reflect on work, skills,
    and benefits)
  • Students: Interviews, surveys or focus groups

Interpret - Teacher
Interpret - Student
Assess - Teacher
Assess - Student
Enough - Student
Need / Strategies
Thinking - Teacher
Information Power (1988) pp. 176-181 and
Loertscher & Todd (2003) We Boost Achievement!,
pp. 115-118
17
Collecting Information and Technology Literacy
Data
I & TL
  • Independent use
  • Spontaneous and persistent use
  • Teacher discussions or interviews on student
    skills
  • Feedback from teachers and/or LMS at next level
  • Student interviews, surveys or focus groups
  • LMS or teacher observations

Information Power (1988) pp. 176-181 and
Loertscher & Todd (2003) We Boost Achievement!,
pp. 115-118
18
More Rubric Sites
I & TL
  • Oak Harbor, Washington, Information Skills Rating
    Scale
  • DigiTales Digital Media Scoring Guides
  • NCRTEC Scoring Guide for Student Products
  • Research Report Rubric Generator
  • Rubric generator sites
  • http://school.discovery.com/schrockguide/assess.html

19
Mastery: Online Self-Assessment
I & TL
  • Student checklist of skills used during research
  • Locally developed web form
  • Tie to database (FileMaker, Access)
  • Email submission from form using CGI script
  • AASL Power Learner Survey

File
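However the self-assessment checklist is gathered (web form, FileMaker/Access database, or emailed submissions), the responses usually end up as a table. A minimal sketch that tallies a CSV export with Python's standard library (the file layout and column names are hypothetical, not from the slides):

```python
import csv
import io
from collections import Counter

# Hypothetical export: one row per student, one column per checklist
# skill; a cell is 'y' if the student reported using that skill.
export = io.StringIO("""\
student,used_catalog,used_database,cited_sources
s1,y,n,y
s2,y,y,y
s3,n,y,n
""")

tally = Counter()
total = 0
for row in csv.DictReader(export):
    total += 1
    for skill, answer in row.items():
        if skill != "student" and answer == "y":
            tally[skill] += 1

# Report how many students claim each skill.
for skill, count in tally.most_common():
    print(f"{skill}: {count}/{total} students")
```

The same tally works on an export from any of the tools above, as long as the columns are one skill each.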
20
Mastery: Online Assessments
I & TL
  • Locally created test
  • Discovery School example (http://school.discovery.com/quizzes31/eileenschroeder/Research.html)
  • College online assessments
  • ETS's ICT Literacy Assessment
  • being tested: http://www.ets.org/ictliteracy/index.html
  • Cal Poly - Pomona
  • http://www.csupomona.edu/library/InfoComp/instrument.htm
  • Raritan Valley Community College
  • http://library.raritanval.edu/InfolitTest/infoLitTest.html
  • Cabrillo College
  • http://www.topsy.org/InfoLitAssess.html

21
Mastery: Technology Skills Assessments
I & TL
  • 8th grade tech assessment
  • NETS Online Technology Assessment
  • NCREL / NETS for Students Extended Rubric
  • Bellingham Technology Self-Assessments

22
NETS Online Assessment
I & TL
23
I & TL
24
Observations
I & TL
  • Focus and limit the scope
  • What can't be measured in a product
  • Observe small groups
  • Checklists and rubrics for consistency
  • Different subject areas and groups
  • Options
  • Deep: Several observations over a short period
  • Broad: Single observations on a regular basis
  • Partner with others to do observations
  • Plan ahead and prepare

Improve Your Library: A Self-Evaluation Process
for Secondary School Libraries and Learning
Resource Centres. Department for Education and
Skills.
25
I & TL
26
Spreadsheet or Database?
I & TL
  • Organize
  • Fields
  • Subfields
  • Scores (rubrics for scoring)
  • Gather
  • Laptop?
  • PDA?
  • Paper?
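If the choice falls on a database, the fields / subfields / scores layout above maps directly onto a table. A minimal sketch using Python's built-in sqlite3 in place of FileMaker or Access (the table and field names are made up for illustration):

```python
import sqlite3

# In-memory database for illustration; use a file path for real data.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE observations (
        student  TEXT,     -- who was observed
        skill    TEXT,     -- field / subfield, e.g. 'note taking'
        score    INTEGER,  -- rubric score (1-4)
        observed TEXT      -- date of the observation
    )
""")

rows = [
    ("Student A", "note taking", 3, "2005-10-03"),
    ("Student A", "note taking", 4, "2005-11-14"),
    ("Student B", "note taking", 2, "2005-10-03"),
]
con.executemany("INSERT INTO observations VALUES (?, ?, ?, ?)", rows)

# A database makes summaries easy: average rubric score per student.
for student, avg in con.execute(
        "SELECT student, AVG(score) FROM observations "
        "GROUP BY student ORDER BY student"):
    print(student, avg)
```

The same GROUP BY query can summarize by skill, grade level, or date range, which is harder to do consistently in a flat spreadsheet.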

27
I & TL
28
Creating Surveys
I & TL
  • What do you really want to know?
  • Who will have the answer? Who will you survey?
    How many?
  • Are questions and instrument as brief as
    possible?
  • Will answers be selected from options (yes/no,
    ranking, rating) or open-ended?
  • Are questions and instructions clear, not open to
    multiple interpretations?
  • Do questions ask for personal opinion? Do you
    want that?
  • Are embarrassing or leading questions excluded?

Survey on Information Literacy
Technology Self-Assessment
Independent Use (Survey Monkey)
A Planning Guide to Information Power: Building
Partnerships for Learning. AASL, 1999, and
Improve Your Library: A Self-Evaluation Process
for Secondary School Libraries and Learning
Resource Centres. Department for Education and
Skills.
29
Survey: Independent Use
I & TL
30
Online Survey Tools
I & TL
  • Survey Monkey
  • $200 annually for 1000 responses per month
  • http://www.surveymonkey.com/
  • Zoomerang
  • $350 annually
  • http://info.zoomerang.com
  • WebSurveyor
  • More sophisticated (help on survey design and
    analysis, use of passwords)
  • $250 for single surveys, $1,500 annually
  • FileMaker Pro
  • VivED
  • Limited version free for K-12 educators
  • http://www.vived.com/

31
Interviews / Focus Groups
I & TL
  • Make purpose clear
  • Script questions, but adapt language
  • Have follow-up questions ready
  • Record answers (on tape or by hand), but don't let
    this interfere with your attention to the respondent
  • Select an interview location free from interruptions
  • Get a range of students / teachers (users and
    non-users, grade levels, abilities, genders,
    ethnicities, subject areas)
  • May be more useful to interview students in
    groups
  • May get more honesty if the LMS does not do the
    interviews

Improve Your Library: A Self-Evaluation Process
for Secondary School Libraries and Learning
Resource Centres. Department for Education and
Skills.
32
Teacher Interview Questions: Use
I & TL
  • Do students appear confident in working in the
    library?
  • Are the students self-motivated and able to work
    independently, or do they need assistance to find
    information?
  • Do students choose methods of working best suited
    to the information-seeking task?

33
Interview versus Survey
I & TL
  • Interview
  • Extended, open-ended answers
  • Adaptive questions
  • Reach small number but in more depth
  • Survey
  • Closed questions - range of possible answers
    known
  • Can use branching
  • Can reach larger number of people
  • Easier to conduct and tabulate

34
Reading
Reading
  • Choice to read voluntarily
  • Enjoyment of reading
  • Amount read
  • Voluntarily
  • As part of curriculum
  • Access to reading materials
  • Suitably challenging and varied selections
  • Impact on reading comprehension
  • Choice to read
  • Student surveys, interviews or focus groups
  • Reader self-assessments
  • Snapshot of reader advisory
  • Amount read
  • Reading inventories (pre/post)
  • Student reading logs
  • Circulation statistics, ILL requests
  • Track involvement in reading incentive activities
  • Access
  • Collection mapping
  • Comprehension
  • Analysis of library involvement in teacher unit
    plans for reading
  • Teacher surveys, interviews or focus groups
  • Standardized or local reading test score
  • Accelerated Reader / Reading Counts points

Loertscher, D. California Project Achievement.
35
Reading Habits
Reading
  • KMMS Reading Inventory Online
  • (http://ms.kmsd.edu/~msimc/reader_survey.html)
  • Independent Reading Rubric (Cornwell)
  • Print version (http://www.indianalearns.org/readersindependent.asp)
  • Online survey based on Cornwell (in Zoomerang)
  • Reading Log
  • Power Reader Survey (AASL)

36
Reading Survey
Reading
37
Collaboration
Collaboration
  • Collaboration with teachers
  • Time and frequency of collaboration
  • Number and range of teachers collaborating
  • Level of collaborative activity and LMS support
  • Gather resources for unit
  • Provide lesson ideas
  • Integrate info and tech literacy skills in
    curriculum
  • Teach information or technology skills
  • Quality of learning experience
  • Types of assignments - Higher level thinking
  • Teachers use information problem solving model
  • Impact on content learning and information skills
  • Integration of info and tech literacy skills
  • Greater use of resources
  • Level of student engagement
  • Schedules
  • Collaborative planning records
  • Prepared bibliographies
  • Unit plans
  • Unit / lesson plans
  • Curriculum maps
  • Post-unit reflections
  • Interviews, focus groups, surveys
  • Assessment - student
  • Content knowledge
  • Information skills
  • Motivation

38
Planning Sheets: Wisconsin DPI
Collaboration
39
Planning Sheets
Collaboration
Stacy Fisher and Jane Johns. Milton Middle
School
40
Post-Unit Review
Collaboration
Unit title:
Timeframe for unit:
Teacher of students:
What worked well?
Suggestions for improvement:
Time spent on teaching information literacy / technology:
Information technology skills / standards learned:
From both the LMS's and the teacher's point of view, was the unit
enhanced by collaboration?  Yes / No  Why?
Was the unit successful enough to warrant doing it again?
Yes / No  Why?
How well was the unit supported by
(5=excellent, 4=above average, 3=average, 2=below average, 1=poor):

                        The collection   The web resources
Diversity of formats    5 4 3 2 1        5 4 3 2 1
Recency                 5 4 3 2 1        5 4 3 2 1
Number of items         5 4 3 2 1        5 4 3 2 1
Reading level           5 4 3 2 1        5 4 3 2 1
Technology              5 4 3 2 1        5 4 3 2 1

What materials / technology will we need if we are planning the unit
again?
Attach a list of resources used and/or found useful.
Adapted from Loertscher and Achterman (2003).
Increasing Academic Achievement through the
Library Media Center, p. 17.
41
Tracking Collaborative Units
Collaboration
Input form 1
Skills Report
  • Impact!
  • Collaboration profile
  • Activities
  • Hours spent
  • Learning venues
  • Difficulty level of units
  • Content area profile
  • Resource profile
  • Research skills profile (3-9 skills)
  • Collaboration timeline

Input form 2
Collaboration Stats
Input form 3
Collaboration Goals
Input form 4
Activities
Coverage
Hours and Places
Timeline
42
Log sheets
Collaboration
Stacy Fisher and Jane Johns. Milton Middle School
43
Collaboration
44
Filemaker Database
Collaboration
  • Reservations
  • Planning
  • Collaboration
  • Evaluation

45
Resources: Actual and Perceived
Resources
  • Range, appropriateness, level, and amount of
    resources for curricular needs and student
    interests
  • Organization, accessibility and use of resources,
    space, and technology by staff and students
  • In LMC, classroom, over network, from home
  • During and outside school hours
  • Circulation of resources
  • Use of online resources
  • Staff expertise and availability
  • Collection mapping tied to curriculum
  • Post-unit assessment of resources
  • Post-unit student assessment
  • Library and lab sign-ups
  • Circulation statistics
  • Logs of online resource use
  • Interviews or focus groups
  • Satisfaction surveys

46
Circulation Statistics
Resources
Circulation Statistics from Winnebago Spectrum
47
Use Tracking: Day Sample
Resources
Val Edwards. Monona Grove High School.
48
Use Tracking: Quarterly Sample
Resources
Val Edwards. Monona Grove High School.
49
Use Tracking: Week Sample
Resources
50
Room Scheduling
51
Calendars: Curriculum, Professional, Other
52
Calendars: Curriculum Events
53
Calendars: Documenting Collaboration and
Integration
54
Data Collection Plan
  • What is the most important data in your school?
  • Input and output
  • Triangulation
  • Where can you get it? From whom?
  • How are you going to collect it?
  • Events
  • Instruments
  • What is the timeline?
  • Who will be responsible?
  • What resources do you need to collect data?
  • Who will test data collection instruments?
  • How will you analyze the results?
  • How will you use the results?
  • To whom, and how, will you communicate results?

55
Tips for Gathering Data
  • Keep it SIMPLE
  • Minimum amount of information to show impact
  • Merge in daily routines
  • Identify where to best spend time to be effective
  • Be systematic
  • Use different types of evidence
  • Use both objective and subjective data
  • Consider samples of data
  • Collect data at opportune events
  • Plan for analysis right from the start

56
Data Analysis
  • Statistics
  • Frequency distribution
  • Mean, median, mode
  • Standard deviation
  • Change over time (pre/post)
  • Subgroup analysis (what is same / different)
  • Trends
  • Ratios
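Most of the statistics listed above need nothing beyond the Python standard library. A minimal sketch on made-up weekly circulation counts (the numbers are illustrative, not real data):

```python
import statistics
from collections import Counter

# Hypothetical weekly checkout counts, before and after a program change.
pre = [12, 15, 11, 14, 12, 16, 12]
post = [18, 21, 17, 20, 19, 22, 18]

print("frequency:", Counter(pre))                 # frequency distribution
print("mean:", statistics.mean(pre))              # average
print("median:", statistics.median(pre))          # middle value
print("mode:", statistics.mode(pre))              # most common value
print("stdev:", round(statistics.stdev(pre), 2))  # spread
# Change over time (pre/post) expressed as a ratio.
print("ratio:", round(statistics.mean(post) / statistics.mean(pre), 2))
```

Subgroup analysis is the same computation repeated per group (grade level, subject area, user vs. non-user) and compared.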

57
Consider when Analyzing Data
  • Opportunity, reality, perception
  • Is analysis in-depth and comprehensive?
  • Did you identify strengths, weaknesses, and
    trends?
  • Did you analyze for appropriate sub-groups?
  • Does data provide big picture?
  • Do you have comparisons to similar schools or
    benchmarking studies?
  • Did you provide graphic overviews?

Fitzpatrick (1998). Program Evaluation: Library
Media Services
58
Presenting Results
  • Audience, Audience, Audience!
  • Principal
  • District administration
  • Board
  • Parents / community
  • Frequency of presentation
  • Annual report
  • Quarterly report
  • Special events (elevator interactions, faculty
    meetings)
  • Format of presentation
  • Oral presentation (with or without media)
  • Formal report
  • Brochure
  • Mass media (letter to the editor, mailing,
    webpage)
  • Memo

59
When presenting, check
  • Highlights factors important to the audience?
  • Well organized, written, and illustrated?
  • Language appropriate to audience and avoids
    jargon?
  • Ties clearly to mission and goals of school and
    library program?
  • Emphasizes outputs, especially student learning?
  • Graphic depictions show relationships?
  • Plans for future and builds on previous years'
    reports and activities?
  • Executive summary is clear, covers key points?

Fitzpatrick (1998). Program Evaluation: Library
Media Services
60
Resources
  • Loertscher, David and Todd, Ross (2003). We
    Boost Achievement! Evidence-Based Practice for
    School Library Media Specialists. Hi Willow
    Research and Publishing.
  • Evidence-Based Practice and School Library Media
    Programs. (2003). Treasure Mountain Research
    Retreat 11, Oct. 22-23, 2003.
  • Loertscher, David (2003). Project Achievement:
    A National Initiative to Collect and Present
    Evidence that Links Library Media Programs to
    Student Achievement, 2003-5. (http://www.davidvl.org)
  • Todd, Ross (2003). Irrefutable Evidence. School
    Library Journal, 49 (4), p. 52 (3 p.)

61
Where to go from here
  • Who do you want to inform?
  • What do you want to measure?
  • What data do you need?
  • What data do you have now?
  • What can you do to collect this data in your
    daily work?

62
More Information?
  • academics.uww.edu/libmedia/AASL05

63
Sample Rubrics
64
Mastery: Information Skills Rubric
I & TL
  • Effectively interprets and synthesizes information

Marzano, Pickering, and McTighe. Assessing
Student Outcomes. MCREL, 1993, p. 96.
Back
65
Mastery: Student Information Skills Rubric
I & TL
  • I find meaning in information and then combine
    and organize information to make it useful for my
    task.

Back
Marzano, Pickering, and McTighe. Assessing
Student Outcomes. MCREL, 1993, p. 122.
66
Mastery: Information Skills Rubric
I & TL
  • Accurately assess the value of information

Marzano, Pickering, and McTighe. Assessing
Student Outcomes. MCREL, 1993, p. 97.
Back
67
Mastery: Student Information Skills Rubric
I & TL
  • I accurately determine how valuable specific
    information may be to my task.

Back
Marzano, Pickering, and McTighe. Assessing
Student Outcomes. MCREL, 1993, p. 123.
68
Mastery: Student Information Skills Rubric
I & TL
  • I provide enough information to support the
    statement.

Marzano, Pickering, and McTighe. Assessing
Student Outcomes. MCREL, 1993, p. 112.
Back
69
Mastery: Information Skills Rubric
I & TL
Back
Rubrics for the Assessment of Information
Literacy based on the Information Literacy
Guidelines for Colorado Students and School
Library Media Specialists, 1996 (DRAFT)
70
Collaborative Rubric: Critical Thinking
I & TL
Back
California Assessment Program 1990,
History-Social Science Grade 11, Scoring Guide
Group Performance Task, in Herman et al., 1992.
71
IMPACT Examples
  • From Val Edwards, Monona Grove High School
    (Wisconsin)

72
Back
73
Research Skills
Back
74
Collaboration Type and Resources
Back
75
Collaboration Evaluation
Back
76
Skills Reporting
Back
77
Collaboration Statistics
Back
78
Collaboration Goals
Back
79
Collaboration Activities
Back
80
Collaboration Coverage
Back
81
Hours and Places
Back
82
Timeline
Back