Title: District Audit Tool: A Method for Determining Level of Need for Support to Improvement
1. District Audit Tool: A Method for Determining Level of Need for Support to Improvement
- Comprehensive Assessment Systems Under ESEA Title I (CAS) State Collaborative on Student Standards and Assessments (SCASS)
- and Edvantia (formerly AEL)
- Presented by Ted Jarrell, Delaware
- and Diane Lowery, South Dakota
- CCSSO Summer Leadership Training Conference: Using Data to Improve Instruction
- Boulder, CO
- July 31, 2006
2. How we got here
- Capacity Building for Support and Corrective Action Project began in 2003
- Projections of what was ahead under NCLB
- Law's allowance for prioritizing
- Change of emphasis
- From the same for everyone to variable based on need
- From individual schools to districts
- Defining level of need
3. The District Audit Tool: Intensity and focus of technical assistance
- All procedures and documents based on research
- Prioritize which districts require the most intense and immediate assistance
- Quantitatively and qualitatively judge district status against research-based elements of success
- Identify needs to improve student learning
- Assign/confer with districts on a plan
- How, when, and by whom TA is delivered
- Design progress monitoring of plan implementation
6. Stage 3d: Evaluation of process
8. Stage 5d: Decision point
9. Stage 1
- How Are The Schools That Did Not Make AYP Different?
12. How Bad is Bad?
- How might we calculate the magnitude by which schools missed AYP?
- How might we prioritize schools which did not make AYP?
13. Discriminating among schools based on AYP Information
14. Metric Calculation Explained
- The AYP Achievement metric relies upon the distance between AYP targets and observed group performance.
- In Nevada, confidence intervals are used.
- Distance = Target - (Observed + CI)
15. Example of Distance
(Figure: distances D1-D4 shown for four groups.)
Distance is only calculated for groups that missed the target.
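The distance calculation above can be sketched in code. This is an illustrative implementation, not the official Edvantia program, and it assumes (per the slide) that the confidence interval is applied in the group's favor; the numeric values are hypothetical.

```python
# Illustrative sketch of the distance calculation (not the official
# Edvantia program). A group's observed proficiency is credited with
# the confidence interval; a distance is recorded only if the group
# still misses the AYP target after that adjustment.

def group_distance(target, observed, ci=0.0):
    """Distance for one group, or 0.0 if the group met its target
    once the confidence interval is applied."""
    adjusted = observed + ci   # CI gives the group the benefit of the doubt
    if adjusted >= target:
        return 0.0             # target met: no distance to record
    return target - adjusted

# Hypothetical values: AMO target 45% proficient, observed 38%, 3-point CI
print(group_distance(45.0, 38.0, 3.0))  # 4.0
print(group_distance(45.0, 44.0, 3.0))  # 0.0 (met after CI adjustment)
```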
16. Metric Formula Controls
- Control for school size and heterogeneity
- Divide sum of distances by total number of groups analyzed for achievement
17. Number of AYP Comparisons
18. 2003 AYP Classification by Number of AYP Comparisons
19. 2003 AYP Classification by School Size
20. 2003 AYP Classification by % of School Population Minority
21. 2003 AYP Classification by % of School Population SWD
22. 2003 AYP Classification by % of School Population ELL
23. 2003 AYP Classification by % of School Population FRL
24. Metric Formula Controls
- Control for school size and heterogeneity
- Divide sum of distances by total number of groups analyzed for achievement
- Control for differences in AMO targets across educational levels
- Divide distances by AMO and Safe Harbor targets
25. Achievement Metric Formula

Achievement Metric Value =
  [ Sum(ELA StatusDist / ELA AMO) + Sum(Math StatusDist / Math AMO)
    + Safe Harbor Weight x ( Sum(ELA SafeDist / SH Target) + Sum(Math SafeDist / SH Target) ) ]
  / (# of subgroups evaluated for ELA Achievement (Status/SH) and Math Achievement (Status/SH))

Each sum runs from Group 1 to Group 9.

Explanation of the summation function used in the metric formula: the sum of the distance between observed performance and the goal for all groups that did not meet the relevant target. Groups 1-9 represent the school as a whole, American Indian/Alaskan Native, Asian, Hispanic, Black/African American, White, IEP, Limited English Proficient, and Free or Reduced-Price Lunch groups.
26. 2003 Achievement Metric as a Predictor of 2004 AYP Classification
(Chart: Achievement Metric Quintiles, 2003)
27. Development of this program was co-sponsored by Edvantia (formerly AEL) and the Council of Chief State School Officers (CCSSO).
28. Preparing to Run the Program
File Layout Guide for Source Data File
- Download the program (currently version 1.2) from the Edvantia website: http://www.edvantia.org/aypmetric (complete documentation is provided on the website)
- Install the program and the Microsoft .NET Framework (free download)
- Requirements: Microsoft Excel 2002 or later must be installed
- Create the Source Data File
29. Source Data File
- Required data elements for the data file:
- School/District Name
- School/District Number
- Educational Level (elementary, middle, high)
- ELA & Math Achievement Results for each of 9 groups (Yes, No, TF)
- ELA & Math Percent Proficient Rates for each of 9 groups, for the Status Analysis (corrected with confidence interval or not)
- ELA & Math Percent Reduction in % of Non-Proficient Students for each of 9 groups, for Safe Harbor (corrected with confidence interval or not)
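The required elements above imply a wide, per-group column layout. The sketch below is purely illustrative; the column names are assumptions, and the authoritative layout is the File Layout Guide on the Edvantia website.

```python
# Hypothetical column layout for the source data file described above.
# Column names are illustrative only.
GROUPS = ["All", "AmIndAK", "Asian", "Hispanic", "Black", "White",
          "IEP", "LEP", "FRL"]

columns = ["SchoolDistrictName", "SchoolDistrictNumber", "Level"]
for subject in ("ELA", "Math"):
    for g in GROUPS:
        columns += [
            f"{subject}_{g}_Result",         # Yes / No / TF
            f"{subject}_{g}_PctProficient",  # status-analysis rate
            f"{subject}_{g}_PctReduction",   # safe-harbor reduction
        ]

# 3 identifier columns + 2 subjects x 9 groups x 3 fields = 57 columns
print(len(columns))  # 57
```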
30. Program Interface Example (Level of Need)
31. Results File
- The program creates an Excel results file containing, for each school or district:
- Case Number
- School/District Name
- School/District Number
- Level
- AYP Achievement Metric Value
- AYP Achievement Metric Z-Score
- Total # of Achievement Analyses
- Number of Missed Analyses
- Any Relevant Error Messages
32. Tiered Support Based on Metric
(Chart: AYP Metric Quintiles, 2003)
33. Tiered Support Based on Metric
- State SIP Team
- Technical Advisory Team
- State Remediation Funds
- Participation in State Educational Programs/Initiatives
- School Improvement Plan Data Analysis Workshops
(Chart: AYP Metric Quintiles, 2003)
34. South Dakota Stage 1
- Priority Points determined fall 2004 for all public schools
- Used to determine weighting for allocations of school improvement funds for Title I schools
- 59 Title I schools
- Range from Level 1 (choice) to Level 4 (planning for restructuring)
- Priority Points determined
- Rank ordered and assigned to quintiles
- Placed number of schools in each cell of matrix according to quintile (vertical) and SI status (horizontal)
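The rank-order-and-quintile step above can be sketched as code. This is an illustrative implementation, not South Dakota's actual procedure; the metric values are hypothetical.

```python
# Sketch of rank-ordering schools by metric value and assigning
# quintiles, as described above (illustrative only).

def assign_quintiles(metric_by_school):
    """Map each school to a quintile 1-5; quintile 1 holds the lowest
    metric values (closest to AYP targets), quintile 5 the highest
    (greatest need)."""
    ranked = sorted(metric_by_school, key=metric_by_school.get)
    n = len(ranked)
    return {s: min(5, i * 5 // n + 1) for i, s in enumerate(ranked)}

# Hypothetical metric values for five schools
metrics = {"A": 0.02, "B": 0.10, "C": 0.25, "D": 0.40, "E": 0.55}
print(assign_quintiles(metrics))  # {'A': 1, 'B': 2, 'C': 3, 'D': 4, 'E': 5}
```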
35. Standards Setting Process: DOE staff first set preliminary cut points; the Committee of Practitioners refined cut points
36. SI Allocations (1003 funds)
- Three factors: poverty, enrollment, and level of need
- POVERTY: One half of SI funds allocated based on the number of children eligible for FRL
- ENROLLMENT: One half of SI funds allocated based on weighted school enrollment counts
- Enrollment counts are weighted depending on each eligible school's Level of Need
37. SI Allocations (1003 funds)
- The enrollments are weighted according to the factors in the following table
- Minimum grant amounts have been set at $5,000
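The two-part allocation logic above can be sketched as code. Note the weight factors below are hypothetical, since the actual factor table is not reproduced in this deck; the school data are also invented for illustration.

```python
# Sketch of the 1003 allocation logic described above: half the SI
# funds distributed by FRL counts, half by enrollment weighted by
# Level of Need, with a $5,000 floor applied last.

def si_allocations(schools, total_funds, weights, minimum=5000.0):
    half = total_funds / 2.0
    frl_total = sum(s["frl"] for s in schools)
    wenr_total = sum(s["enrollment"] * weights[s["level"]] for s in schools)
    grants = {}
    for s in schools:
        poverty_share = half * s["frl"] / frl_total
        need_share = half * s["enrollment"] * weights[s["level"]] / wenr_total
        grants[s["name"]] = max(minimum, poverty_share + need_share)
    return grants

weights = {1: 1.0, 2: 1.5, 3: 2.0, 4: 3.0}  # hypothetical factors
schools = [
    {"name": "North", "frl": 120, "enrollment": 300, "level": 4},
    {"name": "South", "frl": 80,  "enrollment": 300, "level": 1},
]
print(si_allocations(schools, 100000.0, weights))
# {'North': 67500.0, 'South': 32500.0}
```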
39. Stage 2a: Determining Level and Type of Assistance
- Stage 2 examines the validity of the decisions made in Stage 1 using additional data
- Takes into account factors other than achievement that may have resulted in the district/school not making AYP
- Other indicators and graduation rates
40. Stage 2b: Desk Audit
- Confirm the validity of the priority level or adjust the level-of-need assignment
- Customize the Desk Audit for the number of districts you have selected
41. Stage 2b Data Sources: Wyoming Sample Page
42. Stage 2c: District Audit Rubric
- Purpose is to identify areas of concern for the on-site visit
- Clarifies the focus of an on-site visit and gives districts an opportunity to review the documentation at the SEA level for currency
43. Elements of the Audit Tool
- Leadership
- Curriculum/Instruction
- Highly Qualified Staff
- Professional Development
- Assessment/Accountability
- School Culture/Climate
- Budget and Resources
- Parents and Community
44. DISTRICT DESK AUDIT RUBRIC: Blank Form
Scale:
4 - Exemplary: This element contributes to the district's success and provides a model for other districts to emulate.
3 - Meets expectation: This element is fully functional and all indicators are evident.
2 - Area of Concern: This element is marginal. Some indicators for this element are evident. Performance in this area should be monitored for change and impact on Areas of Need.
1 - Area of Need: No evidence that this element is met or understood by the district. This element would be identified as a priority for technical assistance.
45. DISTRICT DESK AUDIT RUBRIC: Blank Form
46. DISTRICT DESK AUDIT RUBRIC: Blank Form
47. Stage 2c: District Audit Rubric
- Use two readers to score separately
- Read every word in the scoring section to see if the element is present
- A 3 is a 3 until it is a 4; don't inflate based on assumptions
- In each element, note the data sources used and comment on your findings for future reference
48. Stage 2d Decision Point: Level of Assistance
- Reconsider the group into which a district was placed in the initial stages
- If a district moves to a lower level of need, an on-site visit may not be required
- If a district moves to a higher level of need, there are many other contributing factors and a more comprehensive on-site visit is needed
49. Stage 3d: Evaluation of process
50. SD Stage 3 -- School Level
- District A conducted a school audit of its own middle school
- During its corrective action year
- Facilitated by a consultant (the chosen corrective action)
- Resulted in 5 investigative teams researching 5 major issues that arose from the audit
- Launched plans for restructuring
51. School Audits 2006
- SD Accountability Workbook, Element 1.6
- Schools in Level 4 (restructuring) receive an audit
- All districts will use the school-level audit tool
- District responsibility
- Training and support by SST (School Support Team)
- Conducted in early fall
- Drives the restructuring plan development
52. Nevada: Using the Elements Related to School Success Rubric for CSR Decisions
- Did desk audit with info available
- Gave each school an opportunity to assemble materials
- Conducted on-site visits
- Awarded schools CSR funds
53. SD Stage 3 -- On-Site Visit
- District Level
- Conducted on-site reviews of three districts
- Facilitator was an SST (School Support Team) member
- Team of 4-6:
- SST
- ESA (Educational Service Agency) staff
- SEA staff
54. Prep for On-site
- Facilitator
- Asked district for additional data
- Provided LEA with the audit tool completed with preliminary findings but no scores
- Protocols were not shared in advance
- Determined and notified the district of which schools would be visited
- Randomly selected a teacher and alternate; notified the district of teachers to be interviewed
- Finalized itinerary, prepared field documents
- Ongoing communication with LEA and team
55. Prep for On-site
- District
- Provide additional data to audit facilitator
- Inform schools of upcoming visit
- Review audit tool with preliminary findings, with no scores
- Assemble additional evidence and responses
- NCLB Coordinator
- Secured additional data from the SEA as they were discovered
56. On-Site Visit (Stage 3b)
- AGENDA
- Day One
- Organizational meeting upon entering district
- School visits
- Community/Parent Focus Group
- Team meeting to complete rubric
- Day Two
- Board member interviews
- Additional data review
- Meeting with district leadership team
57. Organizational Meeting
- District administrative team: SI Coordinator, Federal Programs Director, SPED Director, Superintendent
- Curriculum and assessment director was unavailable
- Communicate purpose of the visit: help gather info for writing the TA plan
- A data-gathering process
- Ultimate goal: TA plan, not compliance
58. Organizational Meeting (cont.)
- Schools to visit (all in SI)
- Location, directions, lunch arrangements
- HS, MS, in-town elementary schools (K-3 and 4-5)
- Two outlying schools (K-8)
- One school per reviewer, except that the two in-town elementary schools were reviewed by one person and the MS (Level 4 SI) was reviewed by a team of two
- Teachers to interview
- Timelines
59. On-site visit to schools
- Team member met with principal
- Principal conducted a general walk-through and interview
- Teacher interview
- Observations conducted in as many classrooms as possible
60. Focus Groups
- High School Students
- 12 students interviewed
- Principal selected
- Participants: male/female, older/younger
- Used protocol
- Parent focus group
- 8 parents, all school employees (by choice, not selection)
- Used protocol
- No district parent council
61. Discuss Results of Day's Work
- Debrief: informal over dinner
- Formal meeting in evening
- Conferred on findings from the day, criterion by criterion
- Collectively formulated audit results from the day's observations and interviews
- Noted questions for further review the next morning
62. Day 2
- Board member interviews
- Further documentation review and inquiries
- District Leadership Team (previously scheduled): all building principals, district office staff, and superintendent
- Facilitator reported out initial findings of the audit
- Some feedback and clarification
63. Post On-site Visit (Stage 3c)
- Team members conducted an additional teacher interview with the outlying schools by phone
- Team members scored the rubric individually and sent completed rubrics to the facilitator
- More time to review the scoring
- More independent scoring
- Negotiated discrepancies in scores by email
- One overall score for the district per element and criterion
64. Prior to DOE meeting with LEA
- Facilitator compiled and developed a report of the review
- Facilitator attended the district data retreat as an SST member and gave the LEA the completed rubric
- Followed a similar process for Districts B and C
65. Interim Activities
- District went through the completed rubric
- Identified TA they would like
- Gave LEA time to process and have internal discussion
- DOE set date and time of meeting
- Primary contact in district is the SI Coordinator
- Asked district to provide summary of the district data retreat
66. Reflection and Feedback (Stage 3d)
- Several opportunities for informal feedback
- 2nd day of on-site with district leadership team
- Discussion of audit results during DOE visit
- Formal survey
- Need to develop survey protocol
- For LEA
- For audit team
- Needs to be conducted
68. SD Stage 4: TA Plan (Stage 4a)
- DOE meeting with LEA (District A)
- Purpose of meeting: to discuss results of the audit and determine possible topics for the technical assistance plan
- Facilitator: DOE NCLB Coordinator
- Responsible for school improvement
- Led the discussion
69. Participants
- District Staff
- School Improvement Coordinator
- Special Education Director
- Curriculum and Assessment Director
- Federal Programs Director
- Superintendent
- ESA person who works with the district
- DOE
- (Audit facilitator was unavailable)
70. Resources for discussion
- Audit Tool
- Summary document
- Rubric report
- District team notes from audit results review
- List of TA the district was asking for
- Results of district Data Retreat
- Includes local assessments and other data sources
- District NCLB Report Card
71. LEADERSHIP IMPLICATIONS
72. Agenda
- Purpose of work
- Goal: to develop a TA plan between SEA and LEA
- Defining TA
- More than SST helping develop the SI plan
- Access to resources
- Collaboration
- TA versus PD
- Efficiency
73. Process
- Went through the rubric criterion by criterion
- Read through the rubrics and discussed the meaning of rubric criteria
- Read through the findings and talked about their relationship to the rubric and score
- Feedback on rubric provided throughout the process
- Developed a list of TA needs (issues/concerns) as the rubric was discussed
74. TA Topics to Research
- Each district developed a list of topics
- Topics chosen based upon identified need and interest to the district
- Topics listed in relation to element
- Issues listed for each district
- Two similar districts (on reservations)
- Notice similarities and differences in needs identified
75. South Dakota: Technical Assistance Needs Identified through District Audit Pilots in 2005-2006
76. Leadership TA Topics
- District A
- Support for administrative PD
- Exemplar policies and procedures
- Protocols for decision-making that are efficient
- Monitoring process that is efficient
- District B
- Policies and procedures
- Monitoring
77. Academic Standards TA Topics
- District A
- Systemic strategy to communicate expectations and standards
- Support for implementation and understanding of the elementary report card (standards-based)
- How do we help teachers shift from traditional grading to standards-based, especially MS and HS?
78. Curriculum & Instruction TA Topics
- District A
- Process for leadership team to support implementation of the curriculum
- How to ensure access to grade-level instruction and use of performance descriptors; HS course descriptions needed
79. Curriculum & Instruction TA Topics
- District C
- Curriculum alignment
- Instruction to grade-level standards
- Instructional materials
80. Professional Development TA Topics
- District A
- Further discussion needed about SBR in the long-range PD plan
- Evaluation of PD beyond the link to state assessment results; exemplars
- Teacher issues: restructuring of schools, requiring PD, quality issues vs. HQT
81. Professional Development TA Topics
- District B
- Professional development based on scientifically based research strategies
- District-wide professional development plan using scientifically based research methodologies
- Use assessment data to evaluate the impact of professional development
82. Professional Development TA Topics
- District C
- Method or tool to assist in monitoring implementation of professional development in the classroom
- Method to determine effectiveness of the professional development
- SBR strategies and delivery method
83. Assessment and Accountability TA Topics
- District B
- Quality assessments aligned to state content and achievement standards (waiting for further documentation)
- Use of reports
- District C
- Access to assessment data for ALL students (public and private)
- Reconsider the school structure of the Eagle Center, as it is designed for at-risk students
84. School Culture and Climate TA Topics
- District B
- High expectations of students
- High expectations of teachers
- Safe and drug-free learning environments
- District C
- Guidance and assistance in maintaining a safe and drug-free environment
85. Budget and Resources TA Topics
- District B
- High expectations of students
- High expectations of teachers
- Safe and drug-free learning environments
86. Budget and Resources TA Topics
- District C
- School funding focused on student achievement
- Comparability of funding
- Comparability of teachers
- Monitoring budget priorities
- Guidance on school budget priorities
87. Parents and Community TA Topics
- District B
- Parent communication
- Community communication and involvement
- District C
- Parent communication and involvement
- Community communication and involvement
88. Additional Needs of District A: TA Topics
- Assistance with changing the student data system to use the state system; better access and use of data; accuracy for accountability
- Student Groups
- Students with Disabilities
- LEP
- ESL specific to needs of Native American LEP students
- Early Childhood Support
89. Next Step: Research TA topics
- For each topic:
- What is available
- Where it can be found
- Options, specifics
- Research team collects and compiles info
- DOE contracted with McREL to conduct this research
- McREL presents report to full research team
- RT reports to DOE Administration for discussion and selection
- SEA approval of TA to offer the district
90. Research Team
- Research team members need to be specific to the issues raised
- District SI Coordinator
- DOE staff
- Office of Special Education, Title III, Curriculum, Office of Indian Ed
- Audit facilitator and other SST members
- ESA representatives
- State partners: ed labs, higher ed, consultants
91. Formulating the TA Plan (Stage 4b)
- Department meetings to determine:
- What TA has already been offered to districts
- What additional TA may be available
- Develop plan with responsibilities, timelines, providers
- Voluntary participation by districts
- Plan developed to prioritize challenges and maintain strengths
92. District Technical Assistance Plan Worksheet
District: ______
Research Team Members: ______
93. SD Districts in Corrective Action
- Statewide panel will meet on August 4th
- Develop policy for district corrective action
- SD Accountability Workbook, Element 1.6, requires districts in Level 3 (corrective action) to receive an audit
- Planning to use the same audit tool as piloted for TA
- Longer, more intensive site visit with a larger team
- Results of the audit will guide the decision of which corrective action to take
- Likely implement corrective action in fall 2006
94. Stage 5d: Decision point
95. Progress reports on completion of timeline targets
- Evaluating the Technical Assistance Plan
- Targets completed
- Evaluation of success
- What worked / what did not work
- Proposed changes or adjustments
96. What Can These Metrics Tell Us About Progress Toward School Improvement?
97. Program Interface Example (Progress)
98. Quantitative measurement of progress: AYP Achievement Metric revisited
- Measuring progress using the AYP metric
- Method 1: AYP Progress Ratio
99. What is the AYP Progress Ratio?
- A comparison of progress made by districts or schools across two years of AYP data, using the same AMO targets in each year, expressed as the ratio of the metric values:
- AYP Progress Ratio = Year 2 AYP Metric Value / Year 1 AYP Metric Value
100. When can the AYP Progress Ratio be used?
- When none of the following are true:
- A change in the grades included in the accountability system
- A change in the cut points used to define proficiency
- A new or substantially modified assessment
101. What are the AYP Progress Ratio categories?
- Exemplary improvement (Exemplars)
- Ratio value = 0 (i.e., Year 2 Metric value is 0)
- Interpretation: met all AYP achievement targets
- Example:
- Year 1 Metric value = 0.21
- Year 2 Metric value = 0
- Year 2/Year 1 ratio value = 0/0.21 = 0
102. What are the AYP Progress Ratio categories (cont'd)?
- High improvement
- Ratio value > 0 and <= 0.33
- Interpretation: on track to meet AYP achievement targets next year
- Example:
- Year 1 Metric value = 0.21
- Year 2 Metric value = 0.06
- Year 2/Year 1 ratio value = 0.06/0.21 = 0.29
103. What are the AYP Progress Ratio categories (cont'd)?
- Moderate improvement
- Ratio value >= 0.34 and <= 0.67
- Interpretation: on track to meet AYP achievement targets in the next 1-2 years
- Example:
- Year 1 Metric value = 0.21
- Year 2 Metric value = 0.11
- Year 2/Year 1 ratio value = 0.11/0.21 = 0.52
104. What are the AYP Progress Ratio categories (cont'd)?
- Low improvement
- Ratio value >= 0.68 and < 1
- Interpretation: on track to meet AYP achievement targets in the next 2-3 years
- Example:
- Year 1 Metric value = 0.21
- Year 2 Metric value = 0.16
- Year 2/Year 1 ratio value = 0.16/0.21 = 0.76
105. What are the AYP Progress Ratio categories (cont'd)?
- No improvement or continued regression
- Ratio value >= 1
- Interpretation: not making progress toward AYP achievement targets (value = 1) or moving farther away from AYP achievement targets (value > 1)
- Example:
- Year 1 Metric value = 0.21
- Year 2 Metric value = 0.23
- Year 2/Year 1 ratio value = 0.23/0.21 = 1.10
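The ratio categories above can be sketched as a small classifier. This is an illustrative implementation; the slide boundaries (0.33/0.34, 0.67/0.68) are treated here as thresholds on a continuous ratio.

```python
# Sketch of the AYP Progress Ratio categories from the preceding slides
# (illustrative implementation, not the official program).

def progress_category(year1_metric, year2_metric):
    if year2_metric == 0:
        return "Exemplary"        # met all AYP achievement targets
    ratio = year2_metric / year1_metric
    if ratio <= 0.33:
        return "High"             # on track to meet targets next year
    if ratio <= 0.67:
        return "Moderate"         # on track in the next 1-2 years
    if ratio < 1:
        return "Low"              # on track in the next 2-3 years
    return "None/Regressing"      # no progress, or moving farther away

print(progress_category(0.21, 0.06))  # High (ratio 0.29)
print(progress_category(0.21, 0.11))  # Moderate (ratio 0.52)
print(progress_category(0.21, 0.23))  # None/Regressing (ratio 1.10)
```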
112. Effect of CI on Progress Ratio
114. Quantitative measurement of progress: AYP Achievement Metric revisited
- Measuring progress using the AYP metric
- Method 2: AYP Progress Matrix
115. What is the AYP Progress Matrix?
- A comparison of progress made by districts or schools across two years of AYP data, using a cross-tabulated quintile matrix of the AYP metric data for the two years being compared, with the same AMO targets in each year.
116. When can the AYP Progress Matrix be used?
117. What are the AYP Progress Matrix categories?
- Substantially regressing (red)
- Moderately regressing (orange)
- Making little or no progress (yellow)
- Moderately improving (blue)
- Substantially improving (green) (Exemplars)
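The cross-tabulation and category labels above can be sketched as code. This is an illustrative implementation; in particular, mapping quintile movement to the five categories is an assumption about how the deck's color bands are assigned.

```python
# Sketch of the AYP Progress Matrix: cross-tabulate each school's
# metric quintile in year 1 against its quintile in year 2, and label
# the movement with the categories above (hypothetical mapping).

def progress_matrix(q_year1, q_year2):
    """5x5 count matrix: cell [i][j] counts schools moving from
    quintile i+1 (year 1) to quintile j+1 (year 2)."""
    matrix = [[0] * 5 for _ in range(5)]
    for school, q1 in q_year1.items():
        matrix[q1 - 1][q_year2[school] - 1] += 1
    return matrix

def matrix_category(q1, q2):
    """Label quintile movement; dropping quintiles means improvement."""
    move = q1 - q2
    if move >= 2:
        return "Substantially improving"
    if move == 1:
        return "Moderately improving"
    if move == 0:
        return "Making little or no progress"
    if move == -1:
        return "Moderately regressing"
    return "Substantially regressing"

print(matrix_category(5, 2))  # Substantially improving
```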
118. Achievement Metric Value: Comparisons Across Years
(Chart: Achievement Metric '04 Quintiles)
119. Next Steps?
- Feedback on progress
- Suggestions for changes or improvements
- Continuation or adjustment of plan
- Verifying predictions and identification of exemplars
120. District Audit Tool
- The Comprehensive Assessment Systems for ESEA Title I (CAS) SCASS, in partnership with Edvantia (formerly AEL), has completed a three-year project to help states prioritize their delivery of support to districts and schools that fail to meet AYP targets. The publication, District Audit Tool: A Method for Determining Level of Need for Support to Improvement, is now available on the CCSSO website at http://www.ccsso.org/publications/details.cfm?PublicationID=295