Title: Fort Recovery Local Schools: How the Use of Data Can Drive Student Instruction
1 Fort Recovery Local Schools: How the Use of Data Can Drive Student Instruction
- Northwest Ohio Educational Resource Center
- Drive-In Workshop
- December 15, 2005
2 Successes and Challenges of Data-Driven Decisions
- Jeffrey Tuneberg, Coordinator
- Mercer County ESC
3 Fort Recovery Local Schools
400 East Butler Street, Box 604, Fort Recovery, Ohio
419-375-4139
http://www.noacsc.org/mercer/fr/index.asp
David R. Riel, Superintendent - riel@fr.noacsc.org
Ed Snyder, HS Principal - snyder@fr.noacsc.org
Curtis Hamrick, Technology Coordinator - hamrick@fr.noacsc.org
Mercer County ESC
441 East Market Street, Celina, Ohio 45822
Telephone: (419) 586-6628
http://www.noacsc.org/mercer/mc/
Jeffrey Tuneberg, Coordinator - tunebej@mc.noacsc.org
4 Fort Recovery Profile and Culture
- David R. Riel, Superintendent
5 Fort Recovery Local Schools
- 979 students, 3 buildings
- Low-wealth school district (20-mill floor), border/fort town
- Excellent rating all 6 years of the State Report Card
- Graduation Rate: 100%
- Racial Minorities: 1.7%
- Economically Disadvantaged: 9.1%
- Students With Disabilities: 15.4%
- 73% of teachers have a Master's degree
6 Culture
- We have used data for decision making since long before DSL/DASL; now data is more extensive and more accessible
- Chronology of Improvement Efforts
- Late 1980s: Effective Schools Process
- Early 1990s: Venture Capital
- Mid 1990s: Network for Systemic Improvement Grant
7 Culture
- A culture for effective change is a prerequisite for continuous improvement
- Professional Development
- Pressure and Support
- Culture of Trust: FR administration has always trusted staff with student data, an absolute must for any data-driven decision-making tool to be successful
- Effective Leadership
8 Data - Where to start?
- Train people in the use and understanding of your data
- Look for easy-to-use data tools, e.g., DASL, WebSurveyor
- Make an online home for your data, e.g., an Intranet or Internet site
- Let staff be creative with uses of data
- Always support decisions with data
9 Trust - Where to start?
- Model: let them see you using data for decision making
- Allow people to make mistakes; don't automatically resort to blame
- Trust is mutual; you can't get it without giving it
- We can't dictate commitment; it has to come from the individual. That is why we need a culture of trust
10 CIP/Report Card Analysis
- David R. Riel, Superintendent
11 Continuous Improvement Plan Data Analysis (Originally Network for Systemic Improvement - Progress)
- Fort Recovery Local Schools
- As of August 18, 2005
- STATE REPORT CARD: implications for the State Report Card
12 Continuous Improvement Plan: CIP Goals
- All students, regardless of ability or handicapping condition, will
- Goal I - Learn to the best of their abilities
- Goal II - Learn in a safe, supportive, caring environment
- Goal III - Graduate with the skills / knowledge to compete successfully in work and school
13 Goal I Performance Indicators: Students will Learn to the Best of their Abilities
- A. All students will pass each area of state or off-year tests
- B. Each year, an increase in students reaching the advanced level of proficiency at each level
- C. There will be an increase each year in seniors graduating with honors diplomas (College Prep and Vocational)
14 Goal I Performance Indicators: Students will Learn to the Best of their Abilities
- D. Show group progress on locally administered tests, i.e., KDI, PLAN, Iowa
- E. Show group progress on AP tests, PSAT, ACT
15 Progress - Goal I, Indicator A: All students will pass each area of state or off-year tests
- Average passing of all Proficiency and Off-Year Tests for Grades 1-9
- New cut scores for off-year tests starting 1999
- 2004: 4th, 6th, 9th Proficiency Tests only
- STATE REPORT CARD
16 Progress - Goal I, Indicator A: All students will pass each area of state or off-year tests
- Average passing of all Achievement Tests for Grades 3-8
- 2004: Reading 3
- 2005: Reading 3,4,5,8; Math 3,7,8; Writing 4
- 2006: Reading 3-8; Math 3-8; Writing 4
- 2007: Reading 3-8; Math 3-8; Writing 4,7; Social Studies 5,8; Science 5,8
- STATE REPORT CARD
17 Progress - Goal I, Indicator A: All students will pass each area of state or off-year tests
- Average passing of all OGT areas for Grade 10
- 2003 and 2004: Reading, Math only
- 2005 forward: Reading, Math, Writing, Science, Social Studies
- STATE REPORT CARD
18 Progress - Goal I, Indicator A: All students will pass each area of state or off-year tests
- Average passing of students with disabilities: Proficiency and Off-Year Tests for Grades 1-9
- New cut scores for off-year tests starting 1999
- 2004: 4th, 6th, 9th Proficiency Tests only
- 2005: 4th SS, Sc, M; 6th Proficiency
- STATE REPORT CARD
19 Progress - Goal I, Indicator A: All students will pass each area of state or off-year tests
- Average passing of students with disabilities: Achievement Tests for Grades 3-8
- 2004: Reading 3
- 2005: Reading 3,4,5,8; Math 3,7,8; Writing 4
- 2006: Reading 3-8; Math 3-8; Writing 4
- 2007: Reading 3-8; Math 3-8; Writing 4,7; Social Studies 5,8; Science 5,8
- STATE REPORT CARD
20 Progress - Goal I, Indicator A: All students will pass each area of state or off-year tests
- Average passing of students with disabilities: OGT areas for Grade 10
- 2003 and 2004: Reading, Math only
- 2005 forward: Reading, Math, Writing, Science, Social Studies
- STATE REPORT CARD
21 Progress - Goal I, Indicator A: All students will pass each area of state or off-year tests
- District Adequate Yearly Progress (AYP)
- Based upon Building AYP
- STATE REPORT CARD
22 Progress - Goal I, Indicator A: All students will pass each area of state or off-year tests
- Performance Index Score (as reported on the State Report Card)
- Advanced - 1.2
- Accelerated - 1.1
- Proficient - 1.0
- Basic - 0.6
- Below Basic - 0.3
- Untested - 0
- STATE REPORT CARD
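A minimal sketch of how these weights combine into a Performance Index score (the weights are from the slide; the student counts and function name are hypothetical):

```python
# Performance Index: each student's achievement level earns a weight;
# the index is the weighted sum over all students, times 100.
# Weights are from the slide; the counts below are hypothetical.
WEIGHTS = {"Advanced": 1.2, "Accelerated": 1.1, "Proficient": 1.0,
           "Basic": 0.6, "Below Basic": 0.3, "Untested": 0.0}

def performance_index(counts):
    total = sum(counts.values())
    weighted = sum(WEIGHTS[level] * n for level, n in counts.items())
    return round(100 * weighted / total, 1)

counts = {"Advanced": 20, "Accelerated": 25, "Proficient": 35,
          "Basic": 10, "Below Basic": 5, "Untested": 5}
print(performance_index(counts))  # 94.0
```

A school where every tested student scored Advanced would reach the scale's maximum of 120.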
23 Progress - Goal I, Indicator B: Each year an increase in students reaching the advanced level of proficiency at each level
- Percentage at Advanced Level for Proficiency and Off-Year Tests for Grades 1-9
- New cut scores for off-year tests starting 1999
- 2004: 4th, 6th, 9th Proficiency Tests only
- 2005: 4th SS, Sc, M; 6th Proficiency
- STATE REPORT CARD
24 Progress - Goal I, Indicator B: Each year an increase in students reaching the advanced level of proficiency (Achievement) at each level
- Percentage at Advanced Level for Achievement Tests for Grades 3-8
- 2004: Reading 3
- 2005: Reading 3,4,5,8; Math 3,7,8; Writing 4
- 2006: Reading 3-8; Math 3-8; Writing 4
- 2007: Reading 3-8; Math 3-8; Writing 4,7; Social Studies 5,8; Science 5,8
- STATE REPORT CARD
25 Progress - Goal I, Indicator B: Each year an increase in students reaching the advanced level of proficiency (Achievement) at each level
- Percentage at Advanced Level for OGT for Grade 10
- 2003: Advanced Level not reported
- 2003 and 2004: Reading, Math only
- 2005: all 5 areas
- STATE REPORT CARD
26 Progress - Goal I, Indicator C: There will be an increase each year in seniors graduating with honors diplomas (College Prep and Vocational)
- Percentage of seniors with Honors Diplomas (as determined by State criteria)
27 Progress - Goal I, Indicator D: Show group progress on locally administered tests, i.e., KDI, PLAN, Iowa
- KDI: Kindergarten Diagnostic Instrument
- 2003: 60 students, average 124.9
- 2004: 62 students, average 134.6
- 2005: 76 students, average 137.9
- Includes students who have had no school experience
28 Progress - Goal I, Indicator D: Show group progress on locally administered tests, i.e., KDI, PLAN, Iowa
- PLAN: 10th Grade pre-ACT test
- National average: 16.5
- 2002: 68 students, average 18.1
- 2003: 57 students, average 19.4
- 2004: 71 students, average 18.0
- 2005: 76 students, average 18.5
29 Progress - Goal I, Indicator E: Show group progress on AP tests, PSAT, ACT
- Advanced Placement: Chemistry, English, Calculus
- 2000: 38 tests, 1.42
- 2001: 30 tests, 1.60
- 2002: 24 tests, 1.50
- 2003: 24 tests, 2.13
- 2004: 33 tests, 2.33
- 2005: 26 tests, 1.96
30 Progress - Goal I, Indicator E: Show group progress on AP tests, PSAT, ACT
- PSAT (Pre-SAT)
- 2003 10th: 12 students, Selection Index 51
- 2003 11th: 6 students, Selection Index 69
- 2004 10th: 6 students, Selection Index 43
- 2004 11th: 4 students, Selection Index 88
- 2005: Not enough students took the test for reporting purposes
31 Progress - Goal I, Indicator E: Show group progress on AP tests, PSAT, ACT
- ACT
- 1999: 47 students, 21.2
- 2000: 51 students, 22.1
- 2001: 39 students, 21.8
- 2002: 47 students, 20.9
- 2003: 64 students, 21.7
- 2004: 58 students, 21.3
- 2005: 60 students, 21.6
- National Average
- 1999-2001: 21.0
- 2002-2003: 20.8
- 2004: 20.9
- 2005: 20.9
32 Goal II Performance Indicators: Students will Learn in a Safe, Supportive, Caring Environment
- Yearly student surveys will show increased positive ratings on "students treat each other with respect"
- Increased rating on "our teachers care about the students"
- Staff survey will show increased rating of "our staff is collegial and works well together"
- Student surveys will show increased rating of the climate section
33 Progress - Goal II, Indicator A: Yearly student surveys will increase positive ratings on "students treat each other with respect"
- Average rating (1-5) of the annual student survey question "students treat each other with respect"
34 Progress - Goal II, Indicator B: Increase rating on "our teachers care about the students"
- Average rating (1-5) of the annual student survey question "our teachers care about the students"
35 Progress - Goal II, Indicator C: Staff survey will show increase in rating of "our staff is collegial and works well together"
- Average rating (1-10) of the annual teacher survey question "our staff is collegial and works well together"
36 Progress - Goal II, Indicator D: Student surveys will show increase in rating of climate section
- Average rating (1-5) of the Positive Learning Climate section of annual student surveys
37 Goal III Performance Indicators: All students will graduate with the skills / knowledge to compete successfully in work and school
- On BAC surveys, employers' average rating of new FR graduate employees will increase
- All students will reach age/grade-appropriate Technology Benchmarks set by the District Technology Committee
- Senior surveys will show an increase in exposure to career choices
- Post-graduate surveys will show an increase in positive results (including SCAN Skills)
38 Progress - Goal III, Indicator A: Priorities as indicated by area employer surveys
- Need to focus more upon (4-point scale)
- 2005: 28 BAC surveys returned
39 Progress - Goal III, Indicator A: Average Rating of new FR Graduate Employees from Employer Surveys
- Employee rating (4-point scale)
- 2005: 28 BAC surveys returned
40 Progress - Goal III, Indicator B: All students will reach age/grade-appropriate Technology Benchmarks set by the District Technology Committee
- Percentage of students reaching grade-level Tech Benchmarks
- In 2002-03, raised the difficulty level and differentiation of Tech Benchmarks
- In 2004-05, once again raised the difficulty level and differentiation of Tech Benchmarks
41 Progress - Goal III, Indicator C: Senior surveys will show an increase in exposure to career choices
- Percentage of students who answered the question "Adequacy of programs in career education and planning" as "Satisfied, No Change Necessary"
- (ACT HS Profile Report, question 188)
42 Progress - Goal III, Indicator D: Post-graduate surveys will show an increase in positive results (including SCAN Skills)
- Exit Survey - percentage who feel they are well prepared
43 Progress - State Report Card Rating: Goal I, Indicators A-E; Goal II, Indicators A-D; Goal III, Indicators A-D
Excellent
Effective
Continuous Improvement
Academic Watch
Academic Emergency
44 Fort Recovery Local Schools Vision: The Fort Recovery Schools will become a community of learners where all students learn to the best of their abilities, become responsible citizens, and acquire the knowledge and attributes needed to successfully compete in a global society.
Excellent
Effective
Continuous Improvement
Academic Watch
Academic Emergency
45 Instructional Conclusions/Implications
- Need more accurate and consistent data on CIP Goal 3, particularly employee and post-graduate data
- Academically, our students are doing very well when compared to state standards (AYP)
- Strategies may need to be developed and implemented to improve the percentage of students who score at the Advanced levels
46 Instructional Conclusions/Implications
- We need to learn all we can from our participation in the Battelle for Kids Value-Added Pilot
- May want to look at strategies to review data on a more regular basis
- The strategic planning process will likely result in CIP revisions
47 Instructional Conclusions/Implications
- We need to provide ongoing Professional Development opportunities for staff to assist them in providing instruction that leads to sustained and incremental growth in the indicators
- We need to find ways to encourage more collaboration among our teachers
48 Instructional Conclusions/Implications
- Increased attention will need to be paid to our subgroups (students with disabilities)
- Apply the Plan-Do-Study-Act Model to CIP Development and Implementation
- May need to find ways to assist teachers in applying research techniques in their classrooms (DLT)
49 Understanding Building/District Report Cards
- Ohio's Accountability System
50 Classifications
- Excellent
- Effective
- Continuous Improvement
- Academic Watch
- Academic Emergency
55 Two Avenues for Classifications
Indicator Points -or- Performance Index
56 INDICATOR POINTS
57 2004-2005 INDICATORS
- Reading (75%): 3rd, 4th, 5th, 6th, 8th, and 10th
- Math (75%): 3rd, 4th, 6th, 7th, 8th, and 10th
- Science (75%): 4th, 6th, and 10th
- Writing (75%): 4th, 6th, and 10th
- Social Studies (75%): 4th, 6th, and 10th
- Attendance (93%)
- Graduation Rate (90%)
23 AVAILABLE INDICATORS
58 NUMBER OF POINTS RECEIVED / NUMBER OF POINTS POSSIBLE
59 Elementary Card: 9 of 9 indicators met = 100%
60 MS Card: 8 of 9 indicators met = 88%
61 HS Card: 7 of 7 indicators met = 100%
62 FR District: 22 of 23 indicators met = 96%
63 What Percent of the Indicators Were Met?
- Excellent: 94-100%
- Effective: 75-93.9%
- Continuous Improvement: 50-74.9%
- Academic Watch: 31-49.9%
- Academic Emergency: 0-30.9%
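Read as a lookup table, the bands above can be sketched in code (the thresholds are from the slide; the function name and the loop at the end are ours):

```python
# Map the percent of indicators met to an Ohio report-card rating,
# using the threshold bands from the slide.
def indicator_rating(percent_met: float) -> str:
    if percent_met >= 94:
        return "Excellent"
    if percent_met >= 75:
        return "Effective"
    if percent_met >= 50:
        return "Continuous Improvement"
    if percent_met >= 31:
        return "Academic Watch"
    return "Academic Emergency"

# The four Fort Recovery cards reported on the surrounding slides:
cards = {"Elementary": (9, 9), "Middle School": (8, 9),
         "High School": (7, 7), "District": (22, 23)}
for name, (met, possible) in cards.items():
    print(name, indicator_rating(100 * met / possible))
```

Run against the district's own numbers, this reproduces the ratings shown on the slides: the Middle School (8 of 9) lands in Effective, and the other three cards in Excellent.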
64 ELEMENTARY: 100% Excellent; MIDDLE SCHOOL: 88% Effective; HIGH SCHOOL: 100% Excellent; DISTRICT: 96% Excellent
65 PERFORMANCE INDEX
66 THE PROBLEM WITH THE INDICATOR ASSESSMENT IS THAT STUDENTS' TESTS ARE ONLY GRADED AS PASS OR FAIL, AND SCHOOLS ARE ONLY GRADED AS PASS OR FAIL. THIS GIVES THE SAME WEIGHT TO STUDENTS/SCHOOLS WHO BARELY PASSED AS TO THOSE WHO BLEW THE DOORS OFF, AND THE SAME WEIGHT TO THOSE WHO JUST BARELY FAILED AS TO THOSE WHO LEFT THE ENTIRE TEST BLANK.
69 Performance Index Levels and Weights
70 2004-2005 TEST INDICATORS
Test indicators include: 3rd grade Reading and Math; 4th grade Reading, Math, Writing, Citizenship, Science; 5th grade Reading; 6th grade Reading, Writing, Math, Citizenship, and Science; 7th grade Math; 8th grade Reading and Math; 10th grade Reading, Writing, Math, Social Studies, and Science
72 Rating on PI Only
77 AYP: Adequate Yearly Progress
78 AYP GOAL
- TO SET STANDARDS TO ENSURE THAT ALL STUDENTS WILL BE PROFICIENT IN MATH AND LANGUAGE ARTS BY THE SCHOOL YEAR 2013-2014
79 AYP SETS A CEILING AND A FLOOR FOR THE REPORT CARD RATING
- MET AYP: Cannot be labeled Academic Watch or Academic Emergency
- or -
- DIDN'T MEET AYP: Cannot be labeled Effective or Excellent if missing AYP for three years
80 4 CRITERIA TO MEET AYP
MATH GOAL
READING GOAL
95% PARTICIPATION
93% ATTENDANCE
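Taken together, the four criteria on this slide can be sketched as a single check (the function name and example numbers are ours; real AYP determinations also apply these tests to each qualifying subgroup and allow the alternate routes covered on later slides):

```python
# A simplified sketch of the slide's four AYP criteria: meet the math
# and reading proficiency goals, test at least 95% of eligible
# students, and maintain at least 93% attendance.
def meets_ayp(math_pct, reading_pct, math_goal, reading_goal,
              participation_pct, attendance_pct):
    return (math_pct >= math_goal            # math proficiency goal
            and reading_pct >= reading_goal  # reading proficiency goal
            and participation_pct >= 95      # 95% of eligible students tested
            and attendance_pct >= 93)        # 93% attendance

# Hypothetical district: meets both goals, 96% tested, 94% attendance.
print(meets_ayp(80.0, 82.0, 71.2, 63.23, 96.0, 94.0))  # True
```

Because all four criteria are joined with "and", missing any one of them (for example, testing only 94% of eligible students) means AYP is not met.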
81 Subgroups (30 or more students)
- African American
- American Indian
- Asian
- Hispanic
- Multi-racial
- White
- Economically Disadvantaged
- Limited English Proficient
- Students With Disabilities (requires 45 or more)
82 Multiple Measures: NCLB Adequate Yearly Progress
- 95% of Eligible Students Tested
- 93% Attendance
83 FORMULA FOR DETERMINING THE GOAL
[(3rd Grade Reading Goal) x (# of Third Grade Students)
+ (4th Grade Reading Goal) x (# of Fourth Grade Students)
+ (6th Grade Reading Goal) x (# of Sixth Grade Students)
+ (OGT Reading Goal) x (# of Students taking OGT)]
- DIVIDED BY -
# OF STUDENTS IN ALL AYP GRADES
84 IF THERE WERE 60 STUDENTS IN EACH GRADE TESTED, THE READING GOAL WOULD BE:
(60 x 71.2) + (60 x 46.7) + (60 x 71.8) = 4272 + 2802 + 4308 = 11,382
11,382 / 180 = 63.23
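The same enrollment-weighted average can be sketched in code, reproducing the slide's worked example:

```python
# AYP goal: enrollment-weighted average of the per-grade reading goals.
# Goals and counts reproduce the slide's example (60 students per grade).
goals = [71.2, 46.7, 71.8]   # per-grade reading goals from the slide
students = [60, 60, 60]      # students tested in each grade

total_students = sum(students)  # 180
weighted_goal = sum(g * n for g, n in zip(goals, students)) / total_students
print(round(weighted_goal, 2))  # 63.23
```

With equal enrollment in every grade, the weighted goal collapses to the plain average of the per-grade goals; the weighting only matters when grade sizes differ.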
85 2004-2005 AYP Results
86 Consequences: Schools
87 SPECIAL CONSIDERATION FOR PROGRESS
88 Three Ways to Meet AYP
89 DISTRICTS/BUILDINGS IN ACADEMIC WATCH OR ACADEMIC EMERGENCY THAT IMPROVE THEIR PERFORMANCE INDEX AT LEAST 10 POINTS IN TWO YEARS, WITH AT LEAST THREE POINTS IN THE MOST RECENT YEAR, CAN MOVE UP ONE RATING, BUT NO HIGHER THAN CONTINUOUS IMPROVEMENT
90 Value Added Is Coming!
- Only measure not correlated to socioeconomics
- Measures the performance of students compared to their predicted performance
91 Fort Recovery Data Profile Book
- David R. Riel, Superintendent
92 Professional Development Program Review
- Jeffrey Tuneberg, Coordinator
- Mercer County ESC
93 Today's Outline
- Ask the Question
- Why Here and Why Now?
- Best Practice vs. Current Practice
- Elements of the Review
- Recommendations to the District
94 Ask The Question
- Why Review the Professional Development Program?
95 Ask The Question
- Why Review the Professional Development Program?
- Because a well-conceived, well-planned, and well-implemented professional development program leads to...
96 A well-conceived, well-planned, and well-implemented professional development program leads to...
- An increase in staff understanding of the district's purpose.
- An increase in dedication of staff to district goals.
- An increase in teacher retention.
- An increase in student achievement.
97 Why Here and Why Now? A meta-analysis of over 200 research studies shows that:
- Effective staff development programs are intentional, ongoing, and systemic.
- Ineffective staff development programs focus on documentation rather than results, are too shallow / do not have meaningful indicators of success, and are too brief / do not extend over a period of time.
- What matters less is where and when the training is held. What matters more is the training design.
- Individual teaching styles and value orientation do not often affect teachers' abilities to learn from staff development. The order of change in teachers' practice goes as follows: 1. Teaching practices change, 2. Student learning improves, 3. Teacher attitudes and beliefs change.
98 More Reasons Why
- Change is highly contextual. What works with teachers in one school may not work with teachers in another.
- The average corporation today allocates 5% of employees' time toward staff development. In some cases, like Motorola, the figure is more like 10%. In teaching, the average is much less.
- Most staff development problems do not reside with individual teachers.
- 80-94% of all barriers to teacher improvement reside in the organization's structure and processes and not in the performance of individuals (Deming, 1986). It is imperative that both individual and organizational changes are addressed simultaneously.
99 What We Plan to Investigate
- Best Practice vs. Current Practice
- National Standards - CIP - Reality
- Are the elements present?
- Are the elements good?
- Are the elements used?
- Seeking connectivity
100 National Standards: Recommendations from
- Curriculum Management Systems, inc. (CMSi)
- Phi Delta Kappa
- National Staff Development Council
- Ohio LPDC Advisory Council
101 Current Practice in your District: What will we investigate?
- Governance function
- Administrative function
- Design and Implementation function
102 Current Practice in your District: How will we investigate?
- Governance function
- Curriculum Management Systems, inc. (CMSi)
- Review of District Governance Documents (board policy, continuous improvement plan, board meeting minutes, budget items related to staff development, grants, etc.)
103 Current Practice in your District: How will we investigate?
- Administrative function
- Curriculum Management Systems, inc. (CMSi)
- Review of District Administrative Documents (committee meeting minutes, staff development documentation, staff development planning items, etc.)
104 Current Practice in your District: How will we investigate?
- Design and Implementation Data
- National Staff Development Council
- Ohio LPDC Advisory Council
- Completion of a staff development survey by selected staff members.
105 Best Practice: Curriculum Management Systems, inc.
- Has policy that directs staff development activities and actions to be aligned to, and an integral part of, the district's long-range planning and implementation.
- Requires an evaluation process that includes multiple sources of information, focuses on all levels of the organization, and is based on actual changed behavior and increased student achievement.
- Fosters a norm of improvement and development of a community characterized by professional and personal growth.
- Provides for organizational, unit, and individual development in a systematic manner.
- Is provided for all employees. Requires each principal/supervisor to be a staff developer of those supervised.
- Is based on a thorough analysis of data and is data driven. Uses disaggregated student achievement data to determine adult learning priorities, monitors progress, and helps sustain improvement of each person carrying out his/her work.
106 Best Practice: Curriculum Management Systems, inc.
- Focuses on approaches that have been shown to increase productivity.
- Provides for the following phases: awareness, initiation, implementation, institutionalization, and renewal/support.
- Is based on adult human learning and development theory, directs staff development efforts, and uses a variety of staff development approaches.
- Provides for follow-up and requires the on-the-job application necessary to ensure improvement.
- Provides for system-wide management oversight of staff development efforts.
- Provides support and resources to deliver staff development called for in the district's long-range planning, as reflected in the district budget allocation.
107 Best Practice: NSDC and Ohio LPDC Advisory Council - Quality Professional Development
- increases the capacity of educators to improve student achievement.
- addresses educators' varied experience and learning needs.
- applies knowledge from research, as well as what has been learned from sound educational practice.
- is based on student data, aligned with building and district goals, and focused on a specific set of targeted improvements in student learning.
- is relevant to and embedded in each educator's principal work.
- is a process that occurs over time, with system support for acquiring new skills and incorporating them into practice.
- creates communities of educators that support continuous inquiry, collaboration, and growth.
108 What Will We Review?
- District Governance Documents (board policy, continuous improvement plan, board meeting minutes, budget items related to staff development, grants, etc.)
- District Administrative Documents (committee meeting minutes, staff development documentation, staff development planning items, etc.)
- Design and Implementation Data (survey of staff)
109 What Will You Receive?
- A written report
- A list of findings: where your district stands compared to the standards.
- A list of governance and administrative recommendations.
- An exceptions report
110 Questions?
- Where will it take place?
- When will it take place?
- Who will do the work?
- How long does it take?
- When will you get the report?
- Who receives the report? Confidentiality.
- Who completes the survey?
- Etc.
111 School View
- Jeffrey Tuneberg, Coordinator
- Mercer County ESC
112 Fort Recovery Data Tool Kit
- A brief description of some of the most widely used data tools in Fort Recovery Local Schools
113 DASL
- Curtis Hamrick, Technology Coordinator
114 Use of DASL in the Classroom (Teachers)
- Testing Results
- Classroom Management
- Student Management
115 DASL Testing Results
- Test Results for a Specific District
- Test Results for a Specific Teacher
- Assessment Strand Display
117 Classroom Management
- Student Pictures
- Class List, with or without Pictures
- Demographics: address, parent info
- Contacts
- Medical
- Information for Sub Folder
- Seating Chart
- Student Schedule
- Daily Attendance and Lunch Count
118 Example of Student Profile
119 Student Management
- SIS Student Search by grade, activity, gender, etc.
- Student Proficiency Scores
- Membership Lists
- Transcripts
- Course History
- Attendance and Attendance History
120 District Secretaries
- Student Registration
- Locker Assignment
- EMIS Entry
- Find Students: name, ID, birth date, SSN
- Download for Student Emergency Medical Forms
121 District Central Office
- Download to Transportation Database
- Civil Rights Reporting
- Download to Parent/Student Directory
- SIS Student Search
122 District Tech Coordinator
- Schedules
- Grade Quick interface
- Downloads: Transportation Database, Student Emergency Medical Forms, Parent/Student Book, Cafe Terminal
123 Value Added
- David R. Riel, Superintendent
124 Curriculum Alignment Tool (CAT)
- Curtis Hamrick, Technology Coordinator
125 How the CAT was Born
- The Curriculum Alignment Tool (CAT) was developed by Fort Recovery Local Schools in response to teachers' requests for help with aligning their curriculum to the State standards and grade-level indicators.
126 Contents of the CAT
- The Standards, Benchmarks, and Grade-Level Indicators were all preloaded into the database and cannot be changed by the teacher.
127 Contents of the CAT
- The teacher can then input data in different areas (tabs) that document their curriculum alignment with the specified grade-level indicator.
128 Input Areas
- Input Tabs Include:
- Assessment
- Instructional Strategies
- Unit Themes
- Enrichment
- Adaptations/Modifications
129 Input Areas
- Resources
- Assessment Bank
- Companion Indicators
- Technology
- Reflections
- Vocabulary
130 Assessment Tab
131 Instructional Implications of the CAT
- Alignment to State Standards
- Common Instructional Materials and Assessment Items
- An Avenue for Teachers to Realize Cross-Curricular Connections
- A Wealth of Sources and Strategies from Veteran Teachers, Documented
132 Web Surveyor
- Curtis Hamrick, Technology Coordinator
133 Every person wants to feel like a valued contributor
134 Surveys: Gathering the Data
- Invaluable sources of data
- Gives people a voice
- Online surveys are often truth serums
- Teacher, Student, Parent Surveys
- Technology Benchmark Surveys
- Surveys on various programs: IATs, Gifted, etc.
- Surveys after every in-service session
135 Surveys: Sharing and Using the Data
- Shared with various committees
- Posted on Intranet and Internet
- Analyzed and developed into Action Plans
- Shared with parents and students
- Decisions are made based on this data
- Staff trusts that data is used to improve the district
136 Example of Results from High School Parent Survey
137 Staff Selection Process
- David R. Riel, Superintendent
138 D3A2
- Curtis Hamrick, Technology Coordinator
139 D3A2: Data-Driven Decision-Making for Academic Achievement
- Linking Student Information Systems (DASL/eSIS/etc.) with existing online content (InfOhio, ORC, IMS, etc.)
- D3A2 is the infrastructure that connects the analysis with the resources
- The D3A2 Committee is made up of over 100 members in 4 User Groups
- Long-term project, in pilot phase right now
141 DASL Item Analysis
142 DASL Item Analysis: View Graph
143 DASL Item Analysis: View Item
144 DASL Item Analysis: Annotated Item
145 DASL Item Analysis: View Resources
146 DASL Item Analysis: View Resources
147 Performance Index Calculator
- Curtis Hamrick, Technology Coordinator
148 Why Focus on Performance Index Data?
- The Report Card Rating is based on the higher of Indicator Points or Performance Index
- The Performance Index gives credit for high-performing students
- Differentiated Instruction and Intervention Strategies can be developed based on results
149 Information on Report Card
150 Information from PI Calculator
151 Additional Features of PI Calculator
- Functionality to adjust reported scores to see the effect on the score
- Ability to enter scores to predict future PI scores
- Detailed reports that include student names and scores for each test area and achievement level (coming soon)
152 Question and Answer / Round Table Discussion
- Ed Snyder, High School Principal
153 Action Planning
- Ed Snyder, High School Principal
154 This Presentation is Available Online
- http://www.noacsc.org/mercer/fr/CENOFF/administ.htm
- Link titled "Using Data to Drive Instruction" on the left side of the page