Fort Recovery Local Schools: How the Use of Data Can Drive Student Instruction


Transcript and Presenter's Notes



1
Fort Recovery Local Schools: How the Use of Data Can Drive Student Instruction
  • Northwest Ohio Educational Resource Center
  • Drive-In Workshop
  • December 15, 2005

2
Successes and Challenges of Data Driven Decisions
  • Jeffrey Tuneberg, Coordinator
  • Mercer County ESC

3

Fort Recovery Local Schools
400 East Butler Street, Box 604
Fort Recovery, Ohio
419-375-4139
http://www.noacsc.org/mercer/fr/index.asp
David R. Riel, Superintendent - riel@fr.noacsc.org
Ed Snyder, HS Principal - snyder@fr.noacsc.org
Curtis Hamrick, Technology Coordinator - hamrick@fr.noacsc.org

Mercer County ESC
441 East Market Street, Celina, Ohio 45822
Telephone (419) 586-6628
http://www.noacsc.org/mercer/mc/
Jeffrey Tuneberg, Coordinator - tunebej@mc.noacsc.org
4
Fort Recovery Profile and Culture
  • David R. Riel, Superintendent

5
Fort Recovery Local Schools
  • 979 students, 3 buildings
  • Low-wealth school district: 20-mill floor, border/fort town
  • Excellent Rating all 6 years of the State Report Card
  • Graduation Rate: 100%
  • Racial Minorities: 1.7%
  • Economically Disadvantaged: 9.1%
  • Students With Disabilities: 15.4%
  • 73% of teachers have a Master's degree

6
Culture
  • We have used data for decision making since long before DSL/DASL; now data is more extensive and more accessible
  • Chronology of Improvement Efforts:
  • Late 1980s: Effective Schools Process
  • Early 1990s: Venture Capital
  • Mid 1990s: Network for Systemic Improvement Grant

7
Culture
  • A culture for effective change is a prerequisite for continuous improvement
  • Professional Development
  • Pressure and Support
  • Culture of Trust: FR administration has always trusted staff with student data, an absolute must for any data-driven decision-making tool to be successful
  • Effective Leadership

8
Data - Where to start?
  • Train people in the use and understanding of your data
  • Look for easy-to-use data tools, e.g., DASL, WebSurveyor
  • Make an on-line home for your data, e.g., Intranet, Internet
  • Let staff be creative with uses of data
  • Always support decisions with data

9
Trust - Where to start?
  • Model: Let them see you using data for decision making
  • Allow people to make mistakes; don't automatically resort to blame
  • Trust is mutual; you can't get it without giving it
  • We can't dictate commitment; it has to come from the individual. That is why we need a culture of trust

10
CIP/Report Card Analysis
  • David R. Riel, Superintendent

11
Continuous Improvement Plan Data Analysis
(Originally Network for Systemic Improvement - Progress)
  • Fort Recovery Local Schools
  • As of August 18, 2005
  • STATE REPORT CARD: Implications for State Report Card

12
Continuous Improvement PlanCIP Goals
  • All students, regardless of ability or handicapping condition, will:
  • Goal I - Learn to the best of their abilities
  • Goal II - Learn in a safe, supportive, caring environment
  • Goal III - Graduate with the skills / knowledge to compete successfully in work and school

13
Goal I Performance Indicators: Students will Learn to the Best of their Abilities
  • A. All students will pass each area of state or off-year tests
  • B. Each year, an increase in students reaching the advanced level of proficiency at each level
  • C. There will be an increase each year in seniors graduating with honor diplomas (College Prep and Vocational)

14
Goal I Performance Indicators: Students will Learn to the Best of their Abilities
  • D. Show group progress on locally administered tests, e.g., KDI, PLAN, Iowa
  • E. Show group progress on AP tests, PSAT, ACT

15
Progress - Goal I, Indicator A: All students will pass each area of state or off-year tests
  • Average passing % of all Proficiency and Off-Year Tests for Grades 1-9
  • New cut scores for off-year tests start in 1999
  • 2004: 4th, 6th, 9th Proficiency Tests only
  • STATE REPORT CARD

16
Progress - Goal I, Indicator A: All students will pass each area of state or off-year tests
  • Average passing % of all Achievement Tests for Grades 3-8
  • 2004: Reading 3
  • 2005: Reading 3, 4, 5, 8; Math 3, 7, 8; Writing 4
  • 2006: Reading 3-8; Math 3-8; Writing 4
  • 2007: Reading 3-8; Math 3-8; Writing 4, 7; Social Studies 5, 8; Science 5, 8
  • STATE REPORT CARD

17
Progress - Goal I, Indicator A: All students will pass each area of state or off-year tests
  • Average passing % of all OGT areas for Grade 10
  • 2003 and 2004: Reading, Math only
  • 2005 forward: Reading, Math, Writing, Science, Social Studies
  • STATE REPORT CARD

18
Progress - Goal I, Indicator A: All students will pass each area of state or off-year tests
  • Average passing % of students with disabilities, Proficiency and Off-Year Tests for Grades 1-9
  • New cut scores for off-year tests start in 1999
  • 2004: 4th, 6th, 9th Proficiency Tests only
  • 2005: 4th SS, Sc, M; 6th Proficiency
  • STATE REPORT CARD

19
Progress - Goal I, Indicator A: All students will pass each area of state or off-year tests
  • Average passing % of students with disabilities, Achievement Tests for Grades 3-8
  • 2004: Reading 3
  • 2005: Reading 3, 4, 5, 8; Math 3, 7, 8; Writing 4
  • 2006: Reading 3-8; Math 3-8; Writing 4
  • 2007: Reading 3-8; Math 3-8; Writing 4, 7; Social Studies 5, 8; Science 5, 8
  • STATE REPORT CARD

20
Progress - Goal I, Indicator A: All students will pass each area of state or off-year tests
  • Average passing % of students with disabilities, OGT areas for Grade 10
  • 2003 and 2004: Reading, Math only
  • 2005 forward: Reading, Math, Writing, Science, Social Studies
  • STATE REPORT CARD

21
Progress - Goal I, Indicator A: All students will pass each area of state or off-year tests
  • District Adequate Yearly Progress (AYP)
  • Based upon Building AYP
  • STATE REPORT CARD

22
Progress - Goal I, Indicator A: All students will pass each area of state or off-year tests
  • Performance Index Score (as reported on State Report Card)
  • Advanced - 1.2
  • Accelerated - 1.1
  • Proficient - 1.0
  • Basic - 0.6
  • Below Basic - 0.3
  • Untested - 0
  • STATE REPORT CARD
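The weights above combine into a single score by simple weighted averaging. A minimal Python sketch of that arithmetic (the weights are from this slide; the student counts are invented for illustration, not Fort Recovery data):

```python
# Weights from the Performance Index slide; counts are made-up illustration data.
WEIGHTS = {
    "Advanced": 1.2,
    "Accelerated": 1.1,
    "Proficient": 1.0,
    "Basic": 0.6,
    "Below Basic": 0.3,
    "Untested": 0.0,
}

def performance_index(counts):
    """Weighted average of achievement levels, scaled to a 0-120 range."""
    total = sum(counts.values())
    return sum(WEIGHTS[level] * n for level, n in counts.items()) / total * 100

# Example: 100 students spread across the levels.
print(performance_index({
    "Advanced": 20, "Accelerated": 25, "Proficient": 35,
    "Basic": 10, "Below Basic": 5, "Untested": 5,
}))  # -> 94.0
```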

23
Progress - Goal I, Indicator B: Each year an increase in students reaching the advanced level of proficiency at each level
  • Percentage at the Advanced Level for Proficiency and Off-Year Tests for Grades 1-9
  • New cut scores for off-year tests start in 1999
  • 2004: 4th, 6th, 9th Proficiency Tests only
  • 2005: 4th SS, Sc, M; 6th Proficiency
  • STATE REPORT CARD

24
Progress - Goal I, Indicator B: Each year an increase in students reaching the advanced level of proficiency (Achievement) at each level
  • Percentage at the Advanced Level for Achievement Tests for Grades 3-8
  • 2004: Reading 3
  • 2005: Reading 3, 4, 5, 8; Math 3, 7, 8; Writing 4
  • 2006: Reading 3-8; Math 3-8; Writing 4
  • 2007: Reading 3-8; Math 3-8; Writing 4, 7; Social Studies 5, 8; Science 5, 8
  • STATE REPORT CARD

25
Progress - Goal I, Indicator B: Each year an increase in students reaching the advanced level of proficiency (Achievement) at each level
  • Percentage at the Advanced Level for OGT for Grade 10
  • 2003: Advanced Level not reported
  • 2003 and 2004: Reading, Math only
  • 2005: On all 5 areas
  • STATE REPORT CARD

26
Progress - Goal I, Indicator C: There will be an increase each year in seniors graduating with honor diplomas (College Prep and Vocational)
  • Percentage of seniors with Honor Diplomas (as determined by State criteria)

27
Progress - Goal I, Indicator D: Show group progress on locally administered tests, e.g., KDI, PLAN, Iowa
  • KDI - Kindergarten Diagnostic Instrument
  • 2003: 60 students, average 124.9
  • 2004: 62 students, average 134.6
  • 2005: 76 students, average 137.9
  • Includes students who have had no school experience

28
Progress - Goal I, Indicator D: Show group progress on locally administered tests, e.g., KDI, PLAN, Iowa
  • PLAN - 10th Grade pre-ACT test
  • National Average: 16.5
  • 2002: 68 students, average 18.1
  • 2003: 57 students, average 19.4
  • 2004: 71 students, average 18.0
  • 2005: 76 students, average 18.5

29
Progress - Goal I, Indicator E: Show group progress on AP tests, PSAT, ACT
  • Advanced Placement: Chemistry, English, Calculus
  • 2000: 38 tests - 1.42
  • 2001: 30 tests - 1.60
  • 2002: 24 tests - 1.50
  • 2003: 24 tests - 2.13
  • 2004: 33 tests - 2.33
  • 2005: 26 tests - 1.96

30
Progress - Goal I, Indicator E: Show group progress on AP tests, PSAT, ACT
  • PSAT (Pre-SAT)
  • 2003, 10th grade: 12 students, Selection Index 51
  • 2003, 11th grade: 6 students, Selection Index 69
  • 2004, 10th grade: 6 students, Selection Index 43
  • 2004, 11th grade: 4 students, Selection Index 88
  • 2005: Not enough students took the test for reporting purposes

31
Progress - Goal I, Indicator E: Show group progress on AP tests, PSAT, ACT
  • ACT
  • 1999: 47 students - 21.2
  • 2000: 51 students - 22.1
  • 2001: 39 students - 21.8
  • 2002: 47 students - 20.9
  • 2003: 64 students - 21.7
  • 2004: 58 students - 21.3
  • 2005: 60 students - 21.6
  • National Average:
  • 1999-2001: 21.0
  • 2002-2003: 20.8
  • 2004: 20.9
  • 2005: 20.9

32
Goal II Performance Indicators: Students will Learn in a Safe, Supportive, Caring Environment
  • Yearly student surveys will increase positive ratings on "students treat each other with respect"
  • Increase rating on "our teachers care about the students"
  • Staff survey will show increase in rating of "our staff is collegial and works well together"
  • Student surveys will show increase in rating of climate section

33
Progress - Goal II, Indicator A: Yearly student surveys will increase positive ratings on "students treat each other with respect"
  • Average rating (1-5) of annual student survey question "students treat each other with respect"

34
Progress - Goal II, Indicator B: Increase rating on "our teachers care about the students"
  • Average rating (1-5) of annual student survey question "our teachers care about the students"

35
Progress - Goal II, Indicator C: Staff survey will show increase in rating of "our staff is collegial and works well together"
  • Average rating (1-10) of annual teacher survey question "our staff is collegial and works well together"

36
Progress - Goal II, Indicator D: Student surveys will show increase in rating of climate section
  • Average rating (1-5) of the Positive Learning Climate section of annual student surveys

37
Goal III Performance Indicators: All students will graduate with the skills / knowledge to compete successfully in work and school
  • On BAC surveys, employers' average rating of new FR graduate employees will increase
  • All students will reach age/grade-appropriate Technology Benchmarks set by the District Technology Committee
  • Senior surveys will show an increase in exposure to career choices
  • Post-graduate surveys will show an increase in positive results (including SCAN Skills)

38
Progress - Goal III, Indicator A: Priorities as indicated by area employer surveys
  • Need to focus more upon (4-point scale)
  • 2005: 28 BAC surveys returned

39
Progress - Goal III, Indicator A: Average Rating of new FR Graduate Employees from Employer Surveys
  • Employee Rating (4-point scale)
  • 2005: 28 BAC surveys returned

40
Progress - Goal III, Indicator B: All students will reach age/grade-appropriate Technology Benchmarks set by the District Technology Committee
  • Percentage of students reaching grade-level Tech Benchmarks
  • In 2002-03, raised the difficulty level and differentiation of the Tech Benchmarks
  • In 2004-05, once again raised the difficulty level and differentiation of the Tech Benchmarks

41
Progress - Goal III, Indicator C: Senior surveys will show an increase in exposure to career choices
  • Percentage of students who answered the question "Adequacy of programs in career education and planning" as "Satisfied, No Change Necessary"
  • (ACT HS Profile Report question 188)

42
Progress - Goal III, Indicator D: Post-graduate surveys will show an increase in positive results (including SCAN Skills)
  • Exit Survey - % who feel they are well prepared

43
Progress - State Report Card Rating
Goal I, Indicators A-E; Goal II, Indicators A-D; Goal III, Indicators A-D
Excellent
Effective
Continuous Improvement
Academic Watch
Academic Emergency
44
Fort Recovery Local Schools Vision: The Fort Recovery Schools will become a community of learners where all students learn to the best of their abilities, become responsible citizens, and acquire the knowledge and attributes needed to successfully compete in a global society.
45
Instructional Conclusions/Implications
  • Need more accurate and consistent data on CIP Goal 3, particularly employee and post-graduate data
  • Academically, our students are doing very well when compared to state standards (AYP)
  • Strategies may need to be developed and implemented to improve the percentage of students who score at the Advanced level

46
Instructional Conclusions/Implications
  • We need to learn all we can from our
    participation in the Battelle For Kids Value
    Added Pilot
  • May want to look at strategies to review data on
    a more regular basis
  • Strategic planning process will likely result in
    CIP revisions

47
Instructional Conclusions/Implications
  • We need to provide on-going Professional
    Development opportunities for staff to assist
    them in providing instruction that leads to
    sustained and incremental growth in the
    indicators
  • We need to find ways to encourage more
    collaboration among our teachers

48
Instructional Conclusions/Implications
  • Increased attention will need to be paid to our
    subgroups (students with disabilities)
  • Apply Plan-Do-Study-Act Model to CIP Development
    and Implementation
  • May need to find ways to assist teachers in
    applying research techniques in their classrooms
    (DLT)

49
Understanding Building/District Report Cards
  • Ohio's Accountability System

50
Classifications
  • Excellent
  • Effective
  • Continuous Improvement
  • Academic Watch
  • Academic Emergency

51
(No Transcript)
52
(No Transcript)
53
(No Transcript)
54
(No Transcript)
55
Two Avenues for Classifications
Indicator Points -or- Performance Index
56
INDICATOR POINTS
57
2004-2005 INDICATORS
  • Reading (75%): 3rd, 4th, 5th, 6th, 8th, and 10th
  • Math (75%): 3rd, 4th, 6th, 7th, 8th, and 10th
  • Science (75%): 4th, 6th, and 10th
  • Writing (75%): 4th, 6th, and 10th
  • Social Studies (75%): 4th, 6th, and 10th
  • Attendance (93%)
  • Graduation Rate (90%)

23 AVAILABLE INDICATORS
58
NUMBER OF POINTS RECEIVED ÷ NUMBER OF POINTS POSSIBLE
59
Elementary Card: 9 / 9 = 100%
60
MS CARD: 8 / 9 = 88%
61
HS Card: 7 / 7 = 100%
62
FR DISTRICT: 22 / 23 = 96%
63
What Percent of the Indicators Were Met?
  • Excellent: 94-100%
  • Effective: 75-93.9%
  • Continuous Improvement: 50-74.9%
  • Academic Watch: 31-49.9%
  • Academic Emergency: 0-30.9%

64
ELEMENTARY - 100%, Excellent
MIDDLE SCHOOL - 88%, Effective
HIGH SCHOOL - 100%, Excellent
DISTRICT - 96%, Excellent
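The ratings above follow directly from the percent of indicators met. A minimal Python sketch of that calculation, using the cutoffs from the "What Percent of the Indicators Were Met?" slide (the function name and structure are illustrative, not an official tool):

```python
def indicator_rating(points_received, points_possible):
    """Percent of indicators met and the corresponding report card rating."""
    pct = points_received / points_possible * 100
    if pct >= 94:
        rating = "Excellent"
    elif pct >= 75:
        rating = "Effective"
    elif pct >= 50:
        rating = "Continuous Improvement"
    elif pct >= 31:
        rating = "Academic Watch"
    else:
        rating = "Academic Emergency"
    return pct, rating

print(indicator_rating(22, 23))  # FR District: (95.65..., 'Excellent')
print(indicator_rating(8, 9))    # Middle School: (88.88..., 'Effective')
```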
65
PERFORMANCE INDEX
  • Every Child Counts

66
THE PROBLEM WITH THE INDICATOR ASSESSMENT IS THAT STUDENTS' TESTS ARE ONLY GRADED AS PASS OR FAIL AND SCHOOLS ARE ONLY GRADED AS PASS OR FAIL. THIS GIVES THE SAME WEIGHT TO THOSE STUDENTS/SCHOOLS WHO BARELY PASSED AS TO THOSE WHO BLEW THE DOORS OFF, AND THE SAME WEIGHT TO THOSE WHO JUST BARELY FAILED AS TO THOSE WHO LEFT THE ENTIRE TEST BLANK.
67
(No Transcript)
68
(No Transcript)
69
Performance Index Levels and Weights
70
2004-2005 TEST INDICATORS
Test indicators include: 3rd grade Reading and Math; 4th grade Reading, Math, Writing, Citizenship, Science; 5th grade Reading; 6th grade Reading, Writing, Math, Citizenship, and Science; 7th grade Math; 8th grade Reading and Math; 10th grade Reading, Writing, Math, Social Studies, and Science
71
(No Transcript)
72
Rating on PI Only
73
(No Transcript)
74
(No Transcript)
75
(No Transcript)
76
(No Transcript)
77
AYP - Adequate Yearly Progress
  • No Child Left Behind

78
AYP GOAL
  • TO SET STANDARDS TO ENSURE THAT ALL STUDENTS WILL BE PROFICIENT IN MATH AND LANGUAGE ARTS BY THE SCHOOL YEAR 2013-2014

79
AYP SETS A CEILING AND A FLOOR FOR THE REPORT CARD RATING
  • MET AYP: Cannot be labeled Academic Watch or Academic Emergency
  • -or-
  • DIDN'T MEET AYP: Cannot be labeled Effective or Excellent if missing AYP for three years

80
4 CRITERIA TO MEET AYP:
MATH GOAL
READING GOAL
95% PARTICIPATION
93% ATTENDANCE
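A minimal sketch of how those four checks combine; the field names and goal values are illustrative (the state's actual determination also involves subgroups and safe harbor provisions):

```python
def meets_ayp(math_pct, reading_pct, participation_pct, attendance_pct,
              math_goal, reading_goal):
    """All four criteria must be satisfied to meet AYP."""
    return (math_pct >= math_goal and
            reading_pct >= reading_goal and
            participation_pct >= 95 and
            attendance_pct >= 93)

# Hypothetical district numbers, for illustration only.
print(meets_ayp(math_pct=68.0, reading_pct=72.5,
                participation_pct=99.0, attendance_pct=95.4,
                math_goal=62.8, reading_goal=63.2))  # True
```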
81
Subgroups (30 or more):
  • African American
  • American Indian
  • Asian
  • Hispanic
  • Multi-racial
  • White
  • Economically Disadvantaged
  • Limited English Proficient
  • Students With Disabilities (requires 45 or more)

82
Multiple Measures - NCLB Adequate Yearly Progress
95% Eligible Students Tested
93% Attendance
83
FORMULA FOR DETERMINING THE GOAL:
[(3rd Grade Reading Goal) x (# of Third Grade Students) + (4th Grade Reading Goal) x (# of Fourth Grade Students) + (6th Grade Reading Goal) x (# of Sixth Grade Students) + (OGT Reading Goal) x (# of Students Taking OGT)] DIVIDED BY (# OF STUDENTS IN ALL AYP GRADES)
84

IF THERE WERE 60 STUDENTS IN EACH GRADE TESTED, THE READING GOAL WOULD BE:
(60 x 71.2) + (60 x 46.7) + (60 x 71.8) = 4272 + 2802 + 4308 = 11,382
11,382 / 180 = 63.23
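A minimal Python sketch reproducing that arithmetic; the grade-level goal percentages are the figures from the slide, everything else is illustrative:

```python
# Grade-level reading goals (percent proficient) from the worked example above.
grade_goals = {"3rd": 71.2, "4th": 46.7, "6th": 71.8}
students_per_grade = {grade: 60 for grade in grade_goals}  # 60 per tested grade

weighted_sum = sum(grade_goals[g] * students_per_grade[g] for g in grade_goals)
total_students = sum(students_per_grade.values())

print(round(weighted_sum / total_students, 2))  # -> 63.23
```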
85
2004-2005 AYP Results
86
Consequences - Schools
87
SPECIAL CONSIDERATION FOR PROGRESS
88
Three Ways to Meet AYP
89
DISTRICTS/BUILDINGS IN ACADEMIC WATCH OR ACADEMIC EMERGENCY THAT IMPROVE THEIR PERFORMANCE INDEX AT LEAST 10 POINTS IN TWO YEARS, WITH AT LEAST THREE POINTS IN THE MOST RECENT YEAR, CAN MOVE UP ONE RATING, BUT NO HIGHER THAN CONTINUOUS IMPROVEMENT
90
  • Value Added Is Coming!
  • The only measure not correlated to socioeconomics
  • Measures the performance of students compared to their predicted performance
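A minimal sketch of the value-added idea described above: compare each student's actual score to a predicted score and average the difference. The numbers are invented illustration data, not pilot results:

```python
# Invented scores for illustration; value-added looks at actual minus predicted.
actual    = [212, 198, 225, 240]
predicted = [205, 200, 220, 232]

gains = [a - p for a, p in zip(actual, predicted)]
print(sum(gains) / len(gains))  # average gain: 4.5
```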

91
Fort Recovery Data Profile Book
  • David R. Riel, Superintendent

92
Professional Development Program Review
  • Jeffrey Tuneberg, Coordinator
  • Mercer County ESC

93
Today's Outline
  • Ask the Question
  • Why Here and Why Now?
  • Best Practice vs. Current Practice
  • Elements of the Review
  • Recommendations to the District

94
Ask The Question
  • Why Review the Professional Development Program?

95
Ask The Question
  • Why Review the Professional Development Program?
  • Because: A well-conceived, well-planned, and well-implemented professional development program leads to...

96
A well-conceived, well-planned, and well-implemented professional development program leads to...
  • An increase in staff understanding of the district's purpose.
  • An increase in dedication of staff to district goals.
  • An increase in teacher retention.
  • An increase in student achievement.

97
Why Here and Why Now? A meta-analysis of over 200 research studies shows that:
  • Effective staff development programs are intentional, ongoing, and systemic.
  • Ineffective staff development programs focus on documentation rather than results, are too shallow/do not have meaningful indicators of success, and are too brief/do not extend over a period of time.
  • What matters less is where and when the training is held. What matters more is the training design.
  • Individual teaching styles and value orientation do not often affect teachers' abilities to learn from staff development. The order of change in teachers' practice goes as follows: 1. Teaching practices change, 2. Student learning improves, 3. Teacher attitudes and beliefs change.

98
More Reasons Why
  • Change is highly contextual. What works with teachers in one school may not work with teachers in another.
  • The average corporation today allocates 5% of employees' time toward staff development. In some cases, like Motorola, the figure is more like 10%. In teaching, the average is much less.
  • Most staff development problems do not reside with individual teachers.
  • 80-94% of all barriers to teacher improvement reside in the organization's structure and processes and not in the performance of individuals (Deming, 1986). It is imperative that both individual and organizational changes are addressed simultaneously.

99
What We Plan to Investigate
  • Best Practice vs. Current Practice
  • National Standards - CIP - Reality
  • Are the elements present?
  • Are the elements good?
  • Are the elements used?
  • Seeking connectivity

100
National Standards - Recommendations from:
  • Curriculum Management Systems, inc. (CMSi) - Phi
    Delta Kappa
  • National Staff Development Council
  • Ohio LPDC Advisory Council

101
Current Practice in your District: What will we investigate?
  • Governance function
  • Administrative function
  • Design and Implementation function

102
Current Practice in your District: How will we investigate?
  • Governance function
  • Curriculum Management Systems, inc. (CMSi)
  • Review of District Governance Documents (board
    policy, continuous improvement plan, board
    meeting minutes, budget items related to staff
    development, grants, etc.)

103
Current Practice in your District: How will we investigate?
  • Administrative function
  • Curriculum Management Systems, inc. (CMSi)
  • Review of District Administrative Documents
    (committee meeting minutes, staff development
    documentation, staff development planning items,
    etc.)

104
Current Practice in your District: How will we investigate?
  • Design and Implementation Data
  • National Staff Development Council
  • Ohio LPDC Advisory Council
  • A staff development survey completed by selected staff members.

105
Best Practice - Curriculum Management Systems, inc.
  • Has policy that directs staff development
    activities and actions to be aligned to and an
    integral part of the district long-range planning
    and implementation.
  • Requires an evaluation process that includes
    multiple sources of information, focuses on all
    levels of the organization and is based on actual
    changed behavior and increased student
    achievement.
  • Fosters a norm of improvement and development of
    a community characterized by professional and
    personal growth.
  • Provides for organizational, unit and individual
    development in a systematic manner.
  • Is provided for all employees. Requires each
    principal/supervisor to be a staff developer of
    those supervised.
  • Is based on a thorough analysis of data and is
    data driven. Uses disaggregated student
    achievement data to determine adult learning
    priorities, monitors progress and helps sustain
    improvement of each person carrying out his/her
    work.

106
Best Practice - Curriculum Management Systems, inc.
  • Focuses on approaches that have been shown to
    increase productivity.
  • Provides for the following phases: awareness, initiation, implementation, institutionalization, and renewal/support.
  • Is based on adult human learning and development
    theory and directs staff development efforts and
    uses a variety of staff development approaches.
  • Provides for follow-up and requires on-the-job
    application necessary to ensure improvement.
  • Provides for system-wide management oversight of
    staff development efforts.
  • Provides support and resources to deliver staff
    development called for in the district long-range
    planning and is reflected in the district budget
    allocation.

107
Best Practice - NSDC and Ohio LPDC Advisory Council: Quality Professional Development...
  • increases the capacity of educators to improve
    student achievement.
  • addresses educators' varied experience and learning needs.
  • applies knowledge from research, as well as what
    has been learned from sound educational practice.
  • is based on student data, aligned with building
    and district goals, and focused on a specific set
    of targeted improvements in student learning.
  • is relevant to and embedded in each educator's principal work.
  • is a process that occurs over time with system
    support for acquiring new skills and
    incorporating them into practice.
  • creates communities of educators that support
    continuous inquiry, collaboration, and growth.

108
What Will We Review?
  • District Governance Documents (board policy,
    continuous improvement plan, board meeting
    minutes, budget items related to staff
    development, grants, etc.)
  • District Administrative Documents (committee
    meeting minutes, staff development documentation,
    staff development planning items, etc.)
  • Design and Implementation Data (survey of staff)

109
What Will You Receive?
  • A written report
  • A list of findings: where your district stands compared to the standards.
  • A list of governance and administrative
    recommendations.
  • An exceptions report

110
Questions?
  • Where will it take place?
  • When will it take place?
  • Who will do the work?
  • How long does it take?
  • When will you get the report?
  • Who receives the report? Confidentiality.
  • Who completes the survey?
  • Etc.

111
School View
  • Jeffrey Tuneberg, Coordinator
  • Mercer County ESC

112
Fort Recovery Data Tool Kit
  • A brief description of some of the most widely
    used data tools in Fort Recovery Local Schools

113
DASL
  • Curtis Hamrick, Technology Coordinator

114
Use of DASL in the Classroom (Teachers)
  • Testing Results
  • Classroom Management
  • Student Management

115
DASL Testing Results
  • Test Results for a Specific District
  • Test Results for a Specific Teacher
  • Assessment Strand Display

116
(No Transcript)
117
Classroom Management
  • Student Pictures
  • Class List with or without Pictures
  • Demographics: address, parent info
  • Contacts
  • Medical
  • Information for Sub Folder
  • Seating Chart
  • Student Schedule
  • Daily Attendance and Lunch Count

118
Example of Student Profile
119
Student Management
  • SIS Student Search by grade, activity, gender,
    etc.
  • Student Proficiency Scores
  • Membership Lists
  • Transcripts
  • Course History
  • Attendance and Attendance History

120
District Secretaries
  • Student Registration
  • Locker Assignment
  • EMIS Entry
  • Find Students: name, ID, birth date, SSN
  • Download for Student Emergency Medical Forms

121
District Central Office
  • Download to Transportation Database
  • Civil Rights Reporting
  • Download to Parent/Student Directory
  • SIS Student Search

122
District Tech Coordinator
  • Schedules
  • Grade Quick interface
  • Downloads: Transportation Database, Student Emergency Medical Forms, Parent/Student Book, Cafe Terminal

123
Value Added
  • David R. Riel, Superintendent

124
Curriculum Alignment Tool (CAT)
  • Curtis Hamrick, Technology Coordinator

125
How the CAT was Born
  • The Curriculum Alignment Tool (CAT) was developed by Fort Recovery Local Schools in response to teachers' requests for help with aligning their curriculum to the State standards and grade-level indicators.

126
Contents of the CAT
  • The Standard, Benchmark, and Grade Level
    Indicators were all preloaded into the database
    and cannot be changed by the teacher.

127
Contents of the CAT
  • The teacher can then input data in different areas (tabs) that document their curriculum alignment with the specified grade-level indicator.

128
Input Areas
  • Input Tabs Include
  • Assessment
  • Instructional Strategies
  • Unit Themes
  • Enrichment
  • Adaptations/Modifications

129
Input Areas
  • Resources
  • Assessment Bank
  • Companion Indicators
  • Technology
  • Reflections
  • Vocabulary

130
Assessment Tab
131
Instructional Implications of the CAT
  • Alignment to State Standards
  • Common Instructional Materials and Assessment
    Items
  • Avenue for Teachers to Realize Cross-Curricular
    Connections
  • Wealth of Sources and Strategies from Veteran
    Teachers Documented

132
Web Surveyor
  • Curtis Hamrick, Technology Coordinator

133
  • Every person wants to feel like a valued
    contributor

134
Surveys: Gathering the Data
  • Invaluable sources of data
  • Gives people a voice
  • Online surveys are often "truth serums"
  • Teacher, Student, Parent Surveys
  • Technology Benchmark Surveys
  • Surveys on various programs: IATs, Gifted, etc.
  • Surveys after every in-service session

135
Surveys: Sharing and Using the Data
  • Shared with various committees
  • Posted on Intranet and Internet
  • Analyzed and developed into Action Plans
  • Shared with parents and students
  • Make decisions based on this data
  • Staff trusts that data is used to improve the
    district

136
Example of Results from High School Parent Survey
137
Staff Selection Process
  • David R. Riel, Superintendent

138
D3A2
  • Curtis Hamrick, Technology Coordinator

139
D3A2 - Data Driven Decision-Making for Academic Achievement
  • Linking Student Information Systems
    (DASL/eSIS/etc.) with existing on-line content
    (InfOhio, ORC, IMS, etc.)
  • D3A2 is infrastructure that connects the analysis
    with the resources
  • D3A2 Committee is made up of over 100 members in
    4 User Groups
  • Long-term project in pilot phase right now

140
(No Transcript)
141
DASL Item Analysis
142
DASL Item Analysis View Graph
143
DASL Item Analysis View Item
144
DASL Item Analysis Annotated Item
145
DASL Item Analysis View Resources
146
DASL Item Analysis View Resources
147
Performance Index Calculator
  • Curtis Hamrick, Technology Coordinator

148
Why Focus on Performance Index Data?
  • Report Card Rating is Based on Higher of
    Indicator Points or Performance Index
  • Performance Index Gives Credit for High
    Performing Students
  • Differentiated Instruction and Intervention
    Strategies can be Developed Based on Results

149
Information on Report Card
150
Information from PI Calculator
151
Additional Features of PI Calculator
  • Functionality to adjust reported scores to see the effect on the score
  • Ability to enter scores to predict future PI
    scores
  • Detailed reports that include student names and
    scores for each test area and achievement level
    (coming soon)
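A minimal what-if sketch of the kind of adjustment the calculator supports: move one student from Basic to Proficient and compare the district Performance Index before and after. The weights are from the Performance Index slide earlier; the counts are invented:

```python
WEIGHTS = {"Advanced": 1.2, "Accelerated": 1.1, "Proficient": 1.0,
           "Basic": 0.6, "Below Basic": 0.3, "Untested": 0.0}

def pi(counts):
    """Performance Index: weighted average of levels scaled to a 0-120 range."""
    total = sum(counts.values())
    return sum(WEIGHTS[k] * v for k, v in counts.items()) / total * 100

before = {"Advanced": 20, "Accelerated": 25, "Proficient": 35,
          "Basic": 10, "Below Basic": 5, "Untested": 5}
after = dict(before, Proficient=36, Basic=9)  # one student moves up a level

print(round(pi(before), 1), "->", round(pi(after), 1))  # 94.0 -> 94.4
```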

152
Question and Answer/Round Table Discussion
  • Ed Snyder, High School Principal

153
Action Planning
  • Ed Snyder, High School Principal

154
This Presentation is Available Online
  • http://www.noacsc.org/mercer/fr/CENOFF/administ.htm
  • Link titled "Using Data to Drive Instruction" on the left side of the page