District Audit Tool: A Method for Determining Level of Need for Support to Improvement - PowerPoint PPT Presentation

1
District Audit Tool: A Method for Determining
Level of Need for Support to Improvement
  • Comprehensive Assessment Systems Under ESEA Title
    I (CAS) State Collaborative on Student Standards
    and Assessments (SCASS)
  • And 
  • Edvantia (Formerly AEL)
  • Presented by Ted Jarrell, Delaware
  • and
  • Diane Lowery, South Dakota
  • CCSSO Summer Leadership Training
    Conference: Using Data to Improve Instruction
  • Boulder, CO
  • July 31, 2006

2
How we got here
  • Capacity Building for Support and Corrective
    Action Project began in 2003
  • Projections of what was ahead under NCLB
  • Law's allowance for prioritizing
  • Change of emphasis
  • From same for everyone to variable based on need
  • From individual schools to districts
  • Defining level of need

3
The District Audit Tool: Intensity and focus of
technical assistance
  • All procedures and documents based on research
  • Prioritize which districts require the most
    intense and immediate assistance.
  • Quantitatively and qualitatively judge district
    status per research-based elements of success
  • Identify needs to improve student learning.
  • Assign/confer with districts on a plan
  • How, when and by whom TA is delivered
  • Design progress monitoring on plan
    implementation.

4
(No Transcript)
5
(No Transcript)
6
Stage 3d: Evaluation of process
7
(No Transcript)
8
Stage 5d: Decision point
9
Stage 1
  • How Are The Schools That Did Not Make AYP
    Different?

10
(No Transcript)
11
(No Transcript)
12
How Bad is Bad? How might we calculate the
magnitude by which schools missed AYP? How
might we prioritize schools which did not make
AYP?
13
Discriminating among schools based on AYP
Information
14
Metric Calculation Explained
  • The AYP Achievement metric relies upon the
    distance between AYP targets and observed group
    performance.
  • In Nevada, confidence intervals are used.
  • Distance = Target - (Observed + CI)
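As a rough illustration, the per-group distance just described can be sketched in Python (the function and argument names are ours; the CI term reflects Nevada's benefit-of-the-doubt use of confidence intervals):

```python
def group_distance(target, observed, ci=0.0):
    """Distance = Target - (Observed + CI).
    Only positive distances count: a group that met its
    (CI-adjusted) target contributes no distance."""
    return max(0.0, target - (observed + ci))
```

For example, a group 10 points below a 40% target, with a 5-point confidence interval applied, would contribute a distance of 0.05.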

15
Example of Distance
[Chart: distances D1-D4 between observed group
performance and AYP targets]
Distance only calculated for groups that missed
target.
16
Metric Formula Controls
  • Control for School Size and Heterogeneity
  • Divide sum of distances by total # of groups
    analyzed for achievement

17
Number of AYP Comparisons
18
2003 AYP Classification By Number of AYP
Comparisons
19
2003 AYP Classification By School Size
20
2003 AYP Classification By % of School
Population Minority
21
2003 AYP Classification By % of School
Population SWD
22
2003 AYP Classification By % of School
Population ELL
23
2003 AYP Classification By % of School
Population FRL
24
Metric Formula Controls
  • Control for School Size and Heterogeneity
  • Divide sum of distances by total # of groups
    analyzed for achievement
  • Control for differences in AMO targets across
    educational levels
  • Divide distances by the respective AMO / Safe
    Harbor targets

25
Achievement Metric Formula

Achievement Metric Value =
[ Σ(Groups 1-9) ELA Status Distances / ELA AMO
+ Σ(Groups 1-9) Math Status Distances / Math AMO
+ Safe Harbor Weight × ( Σ(Groups 1-9) ELA Safe
  Harbor Distances / SH Target
+ Σ(Groups 1-9) Math Safe Harbor Distances / SH
  Target ) ]
÷ # of subgroups evaluated for ELA Achievement
(Status/SH) & Math Achievement (Status/SH)

Explanation of Summation Function used in Metric
Formula:
Sum of the distance between observed performance
and the goal for all groups that did not meet the
relevant target. Groups 1-9 represent the school
as a whole, American Indian/Alaskan Native,
Asian, Hispanic, Black/African American, White,
IEP, Limited English Proficient, and Free or
Reduced-Price Lunch groups.
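A minimal Python sketch of this computation, collapsing ELA and Math into a single subject for brevity (the names and argument shapes are our assumptions, not the actual Edvantia program):

```python
def achievement_metric(status_dists, safe_dists, amo, sh_target,
                       sh_weight, n_analyses):
    """status_dists: distances from the status (percent-proficient)
    target for each group that missed it; safe_dists: distances from
    the safe-harbor target. Distances are scaled by the relevant
    target (AMO or SH target), the safe-harbor term carries the
    safe-harbor weight, and the total is divided by the number of
    subgroup analyses to control for school size and heterogeneity."""
    status_term = sum(status_dists) / amo
    safe_term = sh_weight * sum(safe_dists) / sh_target
    return (status_term + safe_term) / n_analyses
```

A school that met every target has no distances and a metric value of 0; larger values indicate greater need.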
26
2003 Achievement Metric as a Predictor of 2004
AYP Classification
Achievement Metric Quintiles (2003)
27
  • Development of this program was co-sponsored by
    Edvantia (formerly AEL) and the Council of Chief
    State School Officers (CCSSO)

28
Preparing to Run the Program
File Layout Guide for Source Data File
  • Download the program (currently version 1.2) from
    the Edvantia website: http://www.edvantia.org/aypmetric
    (complete documentation is provided on the website)
  • Install the program and the Microsoft .NET
    Framework (Free Download)
  • Requirements: Microsoft Excel 2002 or later must
    be installed
  • Create Source Data File


29
Source Data File
  • Required Data Elements for Data File
  • School/ District Name
  • School/ District Number
  • Educational Level (elementary, middle, high)
  • ELA & Math Achievement Results for each of 9
    groups (Yes, No, TF)
  • ELA & Math Percent Proficient Rates for each of 9
    groups for Status Analysis (corrected with
    confidence interval or not)
  • ELA & Math Percent Reduction in # of
    Non-Proficient Students for each of 9 groups for
    Safe Harbor (corrected with confidence interval
    or not)

30
Program Interface Example (Level of Need)
31
Results File
  • Program creates an Excel results file containing,
    for each school or district:
  • Case Number
  • School/ District Name
  • School/ District Number
  • Level
  • AYP Achievement Metric Value
  • AYP Achievement Metric Z-Score
  • Total # of Achievement Analyses
  • Number of Missed Analyses
  • Any Relevant Error Messages

32
Tiered Support Based on Metric
AYP Metric Quintiles (2003)
33
Tiered Support Based on Metric
State SIP Team
Technical Advisory Team
State Remediation Funds
Participation in State Educational Programs /
Initiatives
School Improvement Plan Data Analysis Workshops
AYP Metric Quintiles (2003)
34
South Dakota Stage 1
  • Priority Points determined fall 2004 for all
    public schools
  • Used to determine weighting for allocations of
    school improvement funds for Title I schools
  • 59 Title I schools
  • Range from Level 1 (choice) to Level 4 (planning
    for restructuring)
  • Priority Points determined
  • Rank ordered and assigned to quintiles
  • Placed number of schools in each cell of matrix
    according to quintile (vertical) and SI status
    (horizontal)
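The rank-and-quintile step above can be sketched as follows (a simple illustration; the tie-handling and direction of ranking are our assumptions):

```python
def assign_quintiles(points_by_school):
    """points_by_school: dict of school -> priority points.
    Returns dict of school -> quintile (1 = lowest points,
    5 = highest)."""
    ranked = sorted(points_by_school, key=points_by_school.get)
    n = len(ranked)
    # i * 5 // n maps 0-based rank position i onto quintiles 0..4
    return {school: min(5, 1 + (i * 5) // n)
            for i, school in enumerate(ranked)}
```

Each school's quintile (vertical axis) can then be crossed with its SI status (horizontal axis) to fill the matrix.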

35
Standards-Setting Process: DOE staff first set
preliminary cut points; Committee of Practitioners
refined cut points
36
SI Allocations (1003 funds)
  • Three factors:
  • poverty, enrollment, and level of need
  • POVERTY
  • One half of SI funds allocated based on number of
    children eligible for FRL
  • ENROLLMENT
  • One half of SI funds are allocated based on
    weighted school enrollment counts.
  • Enrollment counts are weighted depending on each
    eligible school's Level of Need.

37
SI Allocations (1003 funds)
  • The enrollments are weighted according to the
    factors in the following table
  • Minimum grant amounts have been set at $5,000.
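A hedged sketch of this allocation rule in Python. The weight table passed in is hypothetical (South Dakota's actual weights are in the table the slide references), and the minimum-grant floor is applied naively; how the state actually funds the floor is not specified on the slides:

```python
def si_allocations(schools, total_funds, level_weights, minimum=5000.0):
    """schools: dict of name -> {'frl': FRL-eligible count,
    'enrollment': enrollment count, 'level': Level of Need}.
    Half of the funds follow FRL counts, half follow
    need-weighted enrollment; grants are floored at the minimum."""
    frl_total = sum(s['frl'] for s in schools.values())
    weighted = {name: s['enrollment'] * level_weights[s['level']]
                for name, s in schools.items()}
    weighted_total = sum(weighted.values())
    half = total_funds / 2
    return {name: max(minimum,
                      half * s['frl'] / frl_total
                      + half * weighted[name] / weighted_total)
            for name, s in schools.items()}
```

With equal enrollment and FRL counts, a school at a higher Level of Need draws a larger share through the weighted-enrollment half.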

38
(No Transcript)
39
Stage 2a Determining Level and Type of
Assistance
  • Stage 2 Examines the validity of the decisions
    made in Stage 1 using additional data.
  • Takes into account factors other than achievement
    that may have resulted in the district/school not
    making AYP
  • Other indicator and graduation rates

40
Stage 2b Desk Audit
  • Confirm the validity of the priority level or
    make adjustments to the level of need assignment.
  • Customize the Desk Audit for the number of
    districts you have selected.

41
Stage 2b Data Sources: Wyoming Sample Page
42
Stage 2c District Audit Rubric
  • Purpose is to identify areas of concern for
    on-site visit.
  • Clarifies the focus of an on-site visit and gives
    districts an opportunity to review the
    documentation at the SEA level for district
    currency.

43
Elements of the Audit Tool
  • Leadership
  • Curriculum/Instruction
  • Highly Qualified Staff
  • Professional Development
  • Assessment/Accountability
  • School Culture/Climate
  • Budget and Resources
  • Parents and Community

44
DISTRICT DESK AUDIT RUBRIC: Blank Form
Scale:
4 Exemplary: This element contributes to the
district's success, and provides a model for
other districts to emulate.
3 Meets expectation: This element is fully
functional and all indicators are evident.
2 Area of Concern: This element is marginal.
Some indicators for this element are evident.
Performance in this area should be monitored for
change and impact on Areas of Need.
1 Area of Need: No evidence that this element is
met or understood by the district. This element
would be identified as a priority for technical
assistance.
45
DISTRICT DESK AUDIT RUBRIC: Blank Form
46
DISTRICT DESK AUDIT RUBRIC: Blank Form
47
Stage 2c District Audit Rubric
  • Use two readers to score separately.
  • Read every word in the scoring section to see if
    the element is present.
  • "A 3 is a 3 until it is a 4." Don't inflate
    based on assumptions.
  • In each element, note data sources used and
    comment on your findings for future reference.

48
Stage 2d Decision Point: Level of Assistance
  • Reconsider the group into which a district has
    been placed from the initial stages.
  • If district moves to a lower level of need, an
    on-site visit may not be required.
  • If district moves to a higher level of need,
    there are many other contributing factors and a
    more comprehensive on-site visit is needed.

49
Stage 3d: Evaluation of process
50
SD Stage 3 -- School Level
  • District A conducted a school audit of its own
    middle school
  • During its corrective action year
  • Facilitated by consultant (the chosen corrective
    action)
  • Resulted in 5 investigative teams researching 5
    major issues that arose from the audit.
  • Launched plans for restructuring

51
School Audits 2006
  • SD Accountability Workbook, Element 1.6
  • Schools in level 4 (restructuring)
  • Receive an audit
  • All districts will use the school level audit
    tool
  • District responsibility
  • Training and Support by SST (School Support Team)
  • Conducted in early fall
  • Drive the restructuring plan development

52
Nevada: Using Elements Related to School Success
Rubric for CSR Decisions
  • Did desk audit with info available
  • Supplied each school with opportunity to assemble
    materials
  • Conducted on-site visits
  • Awarded schools CSR funds

53
SD Stage 3 -- On-Site Visit
  • District Level
  • Conducted on-site reviews of three districts
  • Facilitator was SST (School Support Team) member
  • Team of 4-6
  • SST
  • ESA (Educational Service Agency) staff
  • SEA staff

54
Prep for On-site
  • Facilitator
  • Asked district for additional data
  • Provide LEA with audit tool completed with
    preliminary findings but no scores
  • Protocols were not shared in advance
  • Determine and notify district of which schools in
    district will be visited
  • Randomly select teacher and alternate. Notify
    district of teachers to be interviewed.
  • Finalize itinerary, prepare field documents
  • Ongoing communication with LEA and team

55
Prep for On-site
  • District
  • Provide additional data to audit facilitator
  • Inform schools of upcoming visit
  • Review audit tool with preliminary findings with
    no scores
  • assemble additional evidence, responses
  • NCLB Coordinator
  • Secured additional data from SEA that were
    discovered

56
On-Site Visit
Stage 3b
  • AGENDA
  • Day One
  • Organizational meeting upon entering district
  • School visits
  • Community/Parent Focus Group
  • Team meeting to complete rubric
  • Day Two
  • Board member interviews
  • Additional data review
  • Meeting with district leadership team

57
Organizational Meeting
  • District administrative team
  • SI Coordinator, Federal Programs Director, SPED
    Director, Superintendent
  • Curriculum and assessment director was
    unavailable
  • Communicate purpose of the visit: help gather
    info for writing TA plan
  • Data gathering process
  • Ultimate goal: TA plan
  • Not compliance

58
Organizational Meeting (cont.)
  • Schools to visit (all in SI)
  • Location, directions, lunch arrangements
  • HS, MS, in town elementary schools (K-3 and 4-5)
  • Two outlying schools (K-8)
  • 1 school per reviewer, except the two in-town
    elementary schools were reviewed by one person
    and the MS (Level 4 SI) was reviewed by a team
    of two
  • Teachers to interview
  • Timelines

59
On-site visit to schools
  • Team member met with principal
  • Principal conducted general walk through and
    interview
  • Teacher interview
  • Observations conducted in as many classrooms as
    possible

60
Focus Groups
  • High School Students
  • 12 Students interviewed
  • Principal selected
  • Participants: male/female, older/younger
  • Used protocol
  • Parent focus group
  • 8 parents, all school employees (by choice, not
    selection)
  • Used protocol
  • No district parent council

61
Discuss Results of Day's Work
  • Debrief, Informal over dinner
  • Formal meeting in evening
  • Conferred on findings from the day
  • Criterion by criterion
  • Collectively formulating audit results from day
    of observations and interviews
  • Noted questions for further review the next
    morning

62
Day 2
  • Board member interviews
  • Further documentation review and inquiries
  • District Leadership Team (previously scheduled)
  • all building principals, district office staff,
    and superintendent
  • Facilitator reported out initial findings of the
    audit
  • Some feedback, clarification

63
Post On-site Visit
Stage 3c
  • Team members conducted an additional teacher
    interview at the outlying schools by phone
  • Team members scored rubric individually and sent
    completed rubric to facilitator
  • More time to review the scoring
  • More independent scoring
  • Negotiated discrepancies in scores by email
  • One overall score for district per element and
    criterion

64
Prior to DOE meeting with LEA
  • Facilitator compiled and developed a report of
    the review
  • Facilitator attended district data retreat as an
    SST member and gave the LEA the completed rubric
  • Followed a similar process for Districts B & C

65
Interim Activities
  • District went through the completed rubric
  • Identified TA they would like
  • Gave LEA time to process and have internal
    discussion
  • DOE set date and time of meeting
  • Primary contact in district is SI Coordinator
  • Asked district to provide summary of district
    data retreat

66
Reflection and Feedback
Stage 3d
  • Several opportunities for informal feedback
  • 2nd day of on-site with district leadership team
  • Discussion of audit results during DOE visit
  • Formal survey
  • Need to develop survey protocol
  • For LEA
  • For audit team
  • Needs to be conducted

67
(No Transcript)
68
SD Stage 4 TA Plan
Stage 4a
  • DOE Meeting with LEA (District A)
  • Purpose of meeting
  • to discuss results of the audit and determine
    possible topics for technical assistance plan
  • Facilitator
  • DOE NCLB Coordinator
  • responsible for school improvement
  • led the discussion

69
Participants
  • District Staff
  • School Improvement coordinator
  • Special Education Director
  • Curriculum and Assessment Director
  • Federal Programs Director
  • Superintendent
  • ESA person that works with district
  • DOE
  • (Audit facilitator was unavailable)

70
Resources for discussion
  • Audit Tool
  • Summary document
  • Rubric report
  • District team notes from audit results review
  • List of TA District was asking for
  • Results of district Data Retreat
  • Includes local assessments and other data sources
  • District NCLB Report Card

71
LEADERSHIP IMPLICATIONS
72
Agenda
  • Purpose of work
  • Goal: to develop TA plan between SEA & LEA
  • Defining TA
  • More than SST helping develop SI plan
  • Access to Resources
  • Collaboration
  • TA versus PD
  • Efficiency

73
Process
  • Went through rubric criterion by criterion
  • Read through the rubrics and discussed meaning of
    rubric criteria
  • Read through the findings and talked about
    relationship to the rubric and score
  • Feedback on rubric provided throughout process
  • Developed a list of TA needs (issues/concerns)
    as the rubric was discussed

74
TA Topics to Research
  • Each district developed a list of topics
  • Topics chosen based upon identified need and
    interest to the district
  • Topics listed in relation to element
  • Issues listed for each district
  • Two similar districts (on reservations)
  • Notice similarities and differences in needs
    identified

75
South Dakota: Technical Assistance Needs
Identified through District Audit Pilots in
2005-2006
76
Leadership
TA Topics
  • District A
  • Support for Administrative PD
  • Exemplar policies and procedures
  • Protocols for decision-making that are efficient
  • Monitoring process that is efficient
  • District B
  • Policies and procedures
  • Monitoring

77
Academic Standards
TA Topics
  • District A
  • Systemic strategy to communicate expectations and
    standards
  • Support for implementation and understanding of
    the elementary report card (standards based)
  • How do we assist teachers in shifting from
    traditional grading to standards-based grading,
    especially MS and HS?

78
Curriculum & Instruction
TA Topics
  • District A
  • Process for leadership team to support
    implementation of the curriculum
  • How to assure access to grade level instruction,
    use of performance descriptors, HS course
    descriptions needed

79
Curriculum & Instruction
TA Topics
  • District C
  • Curriculum alignment
  • Instruction to grade level standards
  • Instructional materials

80
Professional Development
TA Topics
  • District A
  • Further discussion needed about SBR on long range
    PD plan
  • Evaluation of PD, beyond link to state assessment
    results, exemplars
  • Teacher issues: restructuring of schools,
    requiring PD, quality issues vs. HQT

81
Professional Development
TA Topics
  • District B
  • Professional development based on scientifically
    based research strategies
  • District-wide Professional Development Plan using
    scientifically based research methodologies
  • Use assessment data to evaluate impact of
    professional development

82
Professional Development
TA Topics
  • District C
  • Method or tool to assist in monitoring
    implementation of professional development in the
    classroom
  • Method to determine effectiveness of the
    professional development
  • SBR strategies and delivery method

83
Assessment and Accountability
TA Topics
  • District B
  • Quality assessments aligned to state content and
    achievement standards (waiting for further
    documentation)
  • Use of reports
  • District C
  • Access to assessment data for ALL students
    (public and private)
  • Reconsider school structure of the Eagle Center
    as it is designed for at risk students

84
School Culture and Climate
TA Topics
  • District B
  • High expectations of students
  • High expectations of teachers
  • Safe & Drug-Free learning environments
  • District C
  • Guidance and assistance in maintaining a safe and
    drug free environment

85
Budget and Resources
TA Topics
  • District B
  • High expectations of students
  • High expectations of teachers
  • Safe & Drug-Free learning environments

86
Budget and Resources
TA Topics
  • District C
  • School funding focused on student achievement
  • Comparability of funding
  • Comparability of teachers
  • Monitoring budget priorities
  • Guidance on school budget priorities

87
Parents and Community
TA Topics
  • District B
  • Parent communication
  • Community communication and involvement
  • District C
  • Parent communication and involvement
  • Community communication and involvement

88
Additional Needs of District A
TA Topics
  • Assistance with changing student data system to
    use state system, better access and use of data,
    accuracy for accountability
  • Student Groups
  • Students with Disabilities
  • LEP
  • ESL specific to needs of Native American LEP
    students
  • Early Childhood Support

89
Next Step Research TA topics
  • For each topic
  • what is available
  • where it can be found
  • Options, specifics
  • Research team collects and compiles info
  • DOE contracted with McREL to conduct this
    research
  • McREL presents report to full research team
  • RT reports to DOE Administration for discussion
    and selection
  • SEA approval of TA to offer district

90
Research Team
  • Research team members need to be specific to the
    issues raised
  • District SI Coordinator
  • DOE staff
  • Office of Special Education, Title III,
    Curriculum, Office of Indian Ed
  • Audit facilitator and other SST members
  • ESA representatives
  • State partners: Ed labs, HEd, consultants

91
Formulating the TA Plan
Stage 4b
  • Department meetings to determine
  • What TA has already been offered to districts
  • What additional TA that may be available
  • Develop plan with responsibilities, timelines,
    providers
  • Voluntary participation by districts
  • Plan developed to prioritize challenges and
    maintain strengths

92
District Technical Assistance Plan
Worksheet
District: _________________________________________
Research Team Members: ____________________________
93
SD Districts in Corrective Action
  • Statewide panel will meet on August 4th
  • Develop policy for district corrective action
  • SD Accountability Workbook, Element 1.6
  • Requires districts in Level 3 (corrective
    action) to receive audit
  • Planning to use same audit tool as piloted for TA
  • Longer, more intensive site-visit with larger
    team
  • Results of audit will guide the decision of
    which corrective action to take
  • Likely implement corrective action in fall 2006

94
Stage 5d: Decision point
95
Progress reports on completion of timeline targets
  • Evaluating the Technical Assistance Plan
  • Targets completed
  • Evaluation of success
  • What worked / What did not work
  • Proposed changes or adjustments

96
What Can These Metrics Tell Us About Progress
toward School Improvement?
97
Program Interface Example (Progress)
98
Quantitative measurement of progress: AYP
Achievement Metric revisited
  • Measuring Progress using the AYP metric
  • Method 1: AYP Progress Ratio

99
What is the AYP Progress Ratio?
  • Comparison of progress made by districts or
    schools across two years of AYP data, using the
    same AMO targets in each year, as the ratio of
    the metric values:
  • Year 2 AYP Metric Value / Year 1 AYP Metric Value

100
When can the AYP Progress Ratio be used?
  • Use only when none of the following is true:
  • A change in the grades included in the
    accountability system
  • A change in the cut points used to define
    proficiency
  • A new or substantially modified assessment

101
What are the AYP Progress Ratio categories?
  • Exemplary improvement (Exemplars)
  • Ratio value = 0 (i.e., Year 2 Metric value is 0)
  • Interpretation: Met all AYP achievement targets
  • Example
  • Year 1 Metric value = 0.21
  • Year 2 Metric value = 0
  • Year 2/Year 1 ratio value = 0/0.21 = 0

102
What are the AYP Progress Ratio categories
(contd)?
  • High improvement
  • Ratio value > 0 and ≤ 0.33
  • Interpretation: On track to meet AYP achievement
    targets next year
  • Example
  • Year 1 Metric value = 0.21
  • Year 2 Metric value = 0.06
  • Year 2/Year 1 ratio value = 0.06/0.21 = 0.29

103
What are the AYP Progress Ratio categories
(contd)?
  • Moderate improvement
  • Ratio value ≥ 0.34 and ≤ 0.67
  • Interpretation: On track to meet AYP achievement
    targets in next 1-2 years
  • Example
  • Year 1 Metric value = 0.21
  • Year 2 Metric value = 0.11
  • Year 2/Year 1 ratio value = 0.11/0.21 = 0.52

104
What are the AYP Progress Ratio categories
(contd)?
  • Low improvement
  • Ratio value ≥ 0.68 and < 1
  • Interpretation: On track to meet AYP achievement
    targets in next 2-3 years
  • Example
  • Year 1 Metric value = 0.21
  • Year 2 Metric value = 0.16
  • Year 2/Year 1 ratio value = 0.16/0.21 = 0.76

105
What are the AYP Progress Ratio categories
(contd)?
  • No improvement or continued regression
  • Ratio value ≥ 1
  • Interpretation: Not making progress toward AYP
    achievement targets (value = 1) or moving farther
    away from AYP achievement targets (value > 1)
  • Example
  • Year 1 Metric value = 0.21
  • Year 2 Metric value = 0.23
  • Year 2/Year 1 ratio value = 0.23/0.21 = 1.10
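The five ratio bands on the preceding slides can be collected into one classification function (the cut values come straight from the slides; the function itself is our sketch):

```python
def progress_category(year1_metric, year2_metric):
    """Classify a school or district by its AYP Progress Ratio
    (Year 2 metric / Year 1 metric)."""
    if year2_metric == 0:
        return "Exemplary improvement"        # met all AYP targets
    ratio = year2_metric / year1_metric
    if ratio <= 0.33:
        return "High improvement"             # on track next year
    if ratio <= 0.67:
        return "Moderate improvement"         # on track in 1-2 years
    if ratio < 1:
        return "Low improvement"              # on track in 2-3 years
    return "No improvement or regression"     # ratio >= 1
```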

107
  • Overall Progress

108
(No Transcript)
109
  • Progress
  • by
  • Content Area

110
(No Transcript)
111
(No Transcript)
112
  • Effect of CI on Progress Ratio

113
(No Transcript)
114
Quantitative measurement of progress: AYP
Achievement Metric revisited
  • Measuring Progress using the AYP metric
  • Method 2: AYP Progress Matrix

115
What is the AYP Progress Matrix?
  • Comparison of progress made by districts or
    schools across two years of AYP data, using a
    cross-tabulated quintile matrix of the AYP
    metric values for the two years to be compared,
    whether or not the AMO targets were the same in
    each year.
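One way the cross-tabulated quintile matrix might be read off in code (the category names come from the categories slide that follows; the mapping from quintile change to category is our illustrative assumption):

```python
def matrix_category(q_year1, q_year2):
    """q_year1, q_year2: need quintile in each year
    (1 = least need ... 5 = most need)."""
    change = q_year1 - q_year2   # positive = moved toward less need
    if change <= -2:
        return "Substantially regressing"
    if change == -1:
        return "Moderately regressing"
    if change == 0:
        return "Making little or no progress"
    if change == 1:
        return "Moderately improving"
    return "Substantially improving"          # change >= 2
```

Because quintiles are relative ranks within each year, this comparison works even when targets or assessments change between years.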

116
When can the AYP Progress Matrix be used?
  • No restrictions

117
What are the AYP Progress Matrix categories?
  • Substantially regressing (red)
  • Moderately regressing (orange)
  • Making little or no progress (yellow)
  • Moderately improving (blue)
  • Substantially improving (green) - (Exemplars)

118
Achievement Metric Value: Comparisons Across Years
Achievement Metric '04 (Quintiles)
119
Next Steps?
  • Feedback on progress
  • Suggestions for changes or improvements
  • Continuation or adjustment of plan
  • Verifying predictions and identification of
    exemplars

120
District Audit Tool
  • The Comprehensive Assessment Systems Under ESEA
    Title I (CAS) SCASS, in partnership with Edvantia
    (formerly AEL), has completed a three-year project
    to help states prioritize their delivery of
    support to districts and schools that fail to
    meet AYP targets. The publication, District Audit
    Tool: A Method for Determining Level of Need for
    Support to Improvement, is now available on the
    CCSSO website at
    http://www.ccsso.org/publications/details.cfm?PublicationID=295