Transcript and Presenter's Notes

Title: TEA Performance Based Monitoring (PBM)


1
TEA Performance Based Monitoring (PBM) for 2004-2005
A Brief Overview of Where We Are
Craig Henderson (craig.henderson@esc13.txed.net)
JoAnn Schatz (joann.schatz@esc13.txed.net)
John Fessenden (john.fessenden@esc13.txed.net)
2
  • PBM is TEA's New Monitoring System
  • Background to PBM
  • New Legislation
  • New Agency Organization
  • New Strategies for Monitoring
  • PBM Analyses of District Effectiveness

3
New Legislation: HB 3459, 78th Texas Legislature, Regular Session
  • Establishes limits on TEA monitoring with the
    exception of special education
  • Includes new section on bilingual education
  • Includes new local board of trustees responsibilities for ensuring school district compliance with all applicable requirements of state programs
  • EMPHASIZES DATA INTEGRITY

4
New Agency Organization
  • Revised alignment of Agency functions
  • Shift away from process toward results
  • Focus on coordinated approach to agency
    monitoring
  • More creative application of sanctions and interventions, with on-site monitoring as the intervention of last resort
  • STRONG EMPHASIS ON DATA INTEGRITY

5
Texas Education Agency
Proudly Serving the Students, Parents, and Communities of Texas
Organization chart, effective 9-1-2004. Policy development: Governor; State Board of Education. Policy implementation: the agency offices listed below.

Commissioner of Education: Shirley Neeley
Internal Audit (1): Bill Wilson
General Counsel: David Anderson
Chief Investment Officer, Permanent School Fund: Holland Timmins
Governmental Relations (2)
Chief Deputy Commissioner: Robert Scott
P-16 Coordination
Communications: Debbie Graves Ratcliffe
Education Initiatives: Christi Martin
SBOE Support: Renee Jackson
Associate Commissioner, Standards and Programs: Susan Barnes
Associate Commissioner, Accountability and Data Quality: Criss Cloudt
Associate Commissioner, Support Services/School Finance: Ernest Zamora
Associate Commissioner, Planning, Grants, and Evaluation: Nora Hancock
Associate Commissioner, Operations and Fiscal Management: Adam Jones
Deputy Associate Commissioner, Standards and Alignment: Sharon Jackson
Deputy Associate Commissioner, School Finance and Fiscal Analysis: Joe Wisnoski
Deputy Associate Commissioner, Strategy and Grants Management: Dan Arrigona
Deputy Associate Commissioner, Operations: Walter Tillman
Deputy Associate Commissioner, Special Programs, Monitoring, and Interventions: Gene Lenz
Deputy Associate Commissioner, Support Services: Karen Case
Deputy Associate Commissioner, Accountability and Performance Monitoring: Nancy Stevens
Curriculum (3): George Rislov
Charter Schools: Mary Perry
Fiscal Management: Shirley Beaulieu
Information Systems: Brian Rawson
Agency Infrastructure: Bill Hoppe
Human Resources: Harvester Pope
Planning and Evaluation: Mike Jensen
IDEA Coordination (4): Kathy Clayton
High School Completion and Student Support (6): Greg Travillion
State Funding (8): Vacant
Performance Reporting: Shannon Housson
Accountability Research: Karen Dvorak
PEIMS: Vacant
Accountability Development: Vacant
Student Assessment: Lisa Chandler
Financial Audits (9): Tom Canby
Program Monitoring and Interventions (5): Laura Taylor
Driver Training: Mike Peebles
Education Services (7): Philip Cochran
Project Management Office: Rick Goldgar
Budget: Dana Aikman
Discretionary Grants Administration: Earin Martin
Information Analysis: Nina Taylor
Textbooks: Robert Leos
NCLB Program Coordination: Cory Green
Performance-Based Monitoring: Rachel Harrington
Governance: Ron Rowell
Accounting: Dianne Wheeler
Technology Operations: Dale Krueger
Formula Grants Administration (10): Ellsworth Schave
Policy Coordination: Cristina De La Fuente-Valadez
Purchasing and Contracts: Norma Barrera

(1) Includes Performance Audit
(2) Includes TCWEC
(3) Includes Career and Technology, Advanced Academics, Bilingual Education, Ed. Technology, and Educator Development
(4) Includes Special Education Programs, Special Education Complaints, and Deaf Services
(5) Includes Program Monitoring, Data Inquiry, EEO, Complaints Mgmt., and Interventions
(6) Includes GED, PEP, Guidance Counseling, Safe Schools, CIS, In-School GED, and In-School Driver Training
(7) Includes ESC Liaison, Waivers, and Adult Education
(8) Includes Child Nutrition
(9) Includes Grant Monitoring
(10) Includes NCLB, Migrant, IDEA, CATE, and ESC Funding
6
New Strategies for Monitoring
Monitoring Redefined. Monitoring is:
1. Using a data-driven, performance-based model to observe, evaluate, and report on the public education system at the individual student group, campus, local education agency, regional, and statewide levels, across diverse areas including program effectiveness; compliance with federal and state law and regulations; financial management; and data integrity, for the purpose of assessing whether student needs are being met;
2. Promoting diagnostic and evaluative systems in LEAs that are integrated with the agency's desk and on-site monitoring processes; and
3. Relying on a research-based framework of interventions that ensures compliance and enhances student success.
7
New Strategies for Monitoring
  • Moving toward an umbrella evaluation that can
    integrate different agency components
  • New state accountability system
  • Federal accountability provisions (AYP under
    NCLB)
  • New performance-based monitoring system
  • School FIRST rating system
  • Federal program and fiscal compliance
  • Other financial audits

8
State Evaluation Components
[Diagram: state evaluation components spanning program effectiveness, finance, and compliance, with auxiliary monitoring responsibilities. Elements shown: student performance, data integrity, data analyses, state accountability ratings, federal program fiscal compliance, AYP, PBM, School FIRST ratings, other state financial audits, and a shared interventions framework.]
9
New Strategies for Monitoring
  • Overall Goals
  • Achieve an integration of indicators and
    interventions
  • Deliver a consistent and coordinated response
    focused on areas that need improvement
  • Take into account both the extent and the duration of a district's area(s) of low performance/program ineffectiveness

10
PBM Analyses of District Effectiveness
Annual Analyses Based on Two Types of Indicators:
1. Student Performance/Core Areas
2. Data Integrity
11
PBM Analyses of District Effectiveness
Development Process
12
Student Performance/Core Area Indicators
  • Indicators of student performance are currently being developed for:
  • Bilingual education
  • Special education
  • CATE
  • Title I, Part A (economically disadvantaged)
  • Title I, Part C (migrant education)
  • Title II (highly qualified)
  • Title III (limited English proficient students)
  • Title IV, Part A (safe and drug-free schools)

13
Proposed 2004-05 PBM Student Performance/Core
Area Indicators by Program Area
14
PBM Student Performance/Core Area Indicators by
Program Area
  • Bilingual Education 2004-05
  • 1. LEP performance on TAKS
  • 2. Performance on Spanish TAKS (Gr. 3-6)
  • 3. LEP participation in TAKS
  • 4. LEP student exemption rate from TAKS
  • 5. Analysis of TAKS performance one year after
    exit
  • 6. LEP Gr. 7-12 annual dropout rate
  • 7. RPTE performance
  • 8. LEP students receiving recommended or
    distinguished achievement diploma

15
PBM Student Performance/Core Area Indicators by
Program Area
  • CATE 2004-05
  • 1. CATE performance on TAKS
  • 2. CATE SpEd performance on TAKS
  • 3. CATE LEP participation in TAKS
  • 4. CATE economically disadvantaged performance on
    TAKS
  • 5. CATE Tech Prep performance on TAKS
  • 6. CATE completion/dropout rate
  • 7. Non-traditional course completion by females
  • 8. Non-traditional course completion by males

16
PBM Student Performance/Core Area Indicators by
Program Area
  • NCLB Title Programs 2004-05
  • 1. Title I, Part A AYP performance
  • 2. Title I, Part C Migrant performance on TAKS
  • 3. Title I, Part C Migrant completion/dropout rate
  • 4. Title II District and/or campus analysis of % met highly qualified standards (REPORT ONLY for 2004-05)
  • 6. Title III % of LEP students making progress of at least one proficiency level a year in acquiring English language proficiency, as determined by state AMAOs
  • 7. Title III % of LEP students attaining English language proficiency, as determined by state AMAOs
  • 8. Title IV LEA incident rate

17
PBM Student Performance/Core Area Indicators by
Program Area
  • Special Education 2004-05
  • 1. Identification of students with disabilities
  • 2. Identification of African American students as
    SpEd
  • 3. Identification of Hispanic students as SpEd
  • 4. Identification of LEP students as SpEd
  • 5. SpEd student participation in TAKS
  • 6. SpEd student performance on TAKS
  • 7. Participation in SDAA
  • 8. ARD-exemption rate
  • 9. Students taking SDAA at a level within 2 years of assigned grade level
  • 10. 3-11 year old SpEd students with instructional arrangement codes 40 (mainstream) or 41 (less than 21% resource room/services)

18
PBM Student Performance/Core Area Indicators by
Program Area
Special Education 2004-05 (cont.)
11. 12-21 year old SpEd students with instructional arrangement codes 40 (mainstream) or 41 (less than 21% resource room/services)
12. Disproportionate discretionary DAEP placements compared to general education
13. Disproportionate discretionary expulsions compared to general education
14. Disproportionate removals to ISS compared to general education
15. Grade 7-12 SpEd annual dropout rate
16. SpEd population dismissed in Grades 2-10 who pass TAKS in the year after exit
17. SpEd students receiving recommended or distinguished achievement diploma
19
Data Integrity Indicators
  • Integrity means:
  • An unimpaired condition: soundness of the data
  • The quality or state of being complete: completeness of the data
  • Firm adherence to standards: incorruptibility of the data
  • In sum: data that are error-free and truthful
  • Data integrity indicators are needed due to:
  • Statute (HB 3459)
  • Data integrity is CRITICAL
  • Many financial, accountability, and other decisions are made based on data, so data must be ACCURATE
  • If the various state evaluation components are increasingly reliant on data rather than on-site visits, the data that feed into the system must be RELIABLE
  • Inaccurate or unreliable data submitted by a district does not release the district from responsibility or mean that an intervention or sanction won't occur

20
Data Integrity Indicators
  • Indicators of data integrity are currently being developed for 2004-05 for the following areas:
  • Analysis of Leaver Records
  • Analysis of State Assessment Data
  • Safe Schools/DAEP Evaluation
  • Other Data Integrity Analyses

21
Proposed 2004-05 PBM Data Integrity Indicators by
Area
22
Data Integrity Indicators by Area
  • Analysis of Leaver Records 2004-05 (a minimal code sketch follows this list)
  • 100% single leaver code use: LEA reported all leavers with the same code
  • High use of leaver codes: LEA at or above the 97.5th percentile of use of one or more leaver codes compared to the same district type
  • Zero dropouts and high use of leaver intent codes: LEA reports 0 dropouts and is at or above the 97.5th percentile of use of intent codes compared to the same district type
  • Official in-state leavers: LEA at or above the 97.5th percentile of official in-state leavers compared to district type
  • Underreported leavers: underreported leaver count exceeds 500 or 5% of total leavers reported
  • Trend analysis: precipitous decrease in dropout rates over a 3-year period
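
To make the percentile screens concrete, the sketch below shows one way flags like these could be computed. It is a minimal illustration, not TEA's actual methodology: the column names, sample data, and pandas-based approach are all assumptions, while real PBM analyses run on PEIMS leaver records against TEA-defined district types and standards.

# Minimal sketch of percentile-based leaver screens (illustrative only);
# all field names and sample values are hypothetical.
import pandas as pd

leavers = pd.DataFrame({
    "district":      ["A", "B", "C", "D", "E"],
    "district_type": ["Rural"] * 5,        # TEA district type (see slide 24)
    "code_98_count": [10, 3, 0, 55, 2],    # leavers reported with one code
    "total_leavers": [40, 30, 25, 60, 20],
    "underreported": [1, 0, 30, 2, 0],     # underreported leaver count
})

# Rate of use of one specific leaver code per LEA.
leavers["code_rate"] = leavers["code_98_count"] / leavers["total_leavers"]

# "High use of leaver codes": LEA at/above the 97.5th percentile of use
# of a code, compared only to LEAs of the same district type.
cutoff = leavers.groupby("district_type")["code_rate"].transform(
    lambda s: s.quantile(0.975))
leavers["flag_high_use"] = leavers["code_rate"] >= cutoff

# "100% single leaver code use": every leaver reported with the same code.
leavers["flag_single_code"] = leavers["code_98_count"] == leavers["total_leavers"]

# "Underreported leavers": count exceeds 500 or 5% of total leavers reported.
leavers["flag_underreported"] = (
    (leavers["underreported"] > 500)
    | (leavers["underreported"] > 0.05 * leavers["total_leavers"]))

print(leavers[["district", "flag_high_use", "flag_single_code",
               "flag_underreported"]])

Grouping by district type before taking the percentile mirrors the "compared to same district type" language above and the pooling of similar LEAs described on slides 23-24.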

23
Data Integrity Indicators by Area
  • Analysis of Leaver Records Use of District Type
  • Comparison of LEAs based on relevant norms (e.g.,
    state, regional, urban/rural)
  • District type is a variable in the
    comprehensive ANALYZE system
  • Combines several elements (size, growth rate,
    student economic status, proximity to urban
    areas)
  • Used by TEA to analyze enrollment and retention
    patterns
  • Used by independent contractors to analyze
    statewide data
  • Stable over time
  • Shows enough variation for comparison between
    types
  • Each LEA compared to a pool of similar LEAs

24
Data Integrity Indicators by Area
  • Analysis of Leaver Records - Nine District Types
  • 1. Major Urban Austin ISD
  • 2. Major Suburban Goose Creek ISD, Castleberry
    ISD
  • 3. Other Central City Brownsville ISD, McAllen
    ISD
  • 4. Other Central City Suburban Port Arthur ISD,
    Harlingen ISD
  • 5. Independent Town Victoria ISD, Winnsboro ISD
  • 6. Non-Metro Fast Growing Somerset ISD, Harper
    ISD
  • 7. Non-Metro Stable Snyder ISD, Sheldon ISD
  • 8. Rural Valley View ISD (049903), Veribest ISD
  • 9. Charter Waco Charter School, George I.
    Sanchez Charter School

25
Data Integrity Indicators by Area Use of
District Type
26
Data Indicators by Area
  • Analysis of State Assessment Data 2004-05 (a minimal code sketch follows this list)
  • Eligible vs. Takers: 25% difference between 6th six-week attendance and the number of test documents received
  • Excessive Absences: % of documents coded "Absent" more than 5 points higher than the appropriate six-week absence rate, OR any ethnic group with 100% attendance and at least 5 documents coded as "Absent"
  • Excessive Alternative Assessments: high percentage of SpEd students assessed through SDAA
  • Excessive ARD Exemptions: high percentage of SpEd students exempt from TAKS (statutory)
  • Excessive LEP Exemptions: high percentage of LEP students exempt
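
The sketch below expresses two of these screens as simple threshold checks. The variable names, sample figures, and the exact form of the comparisons are assumptions for illustration; the actual indicator definitions and standards are TEA's.

# Minimal sketch of two assessment-data screens (illustrative only);
# all names and sample values are hypothetical.
attendance_rate = 0.94       # campus attendance rate, 6th six weeks
eligible_testers = 1200      # students expected to test
docs_received = 800          # answer documents actually received

# "Eligible vs. Takers": flag a 25%+ gap between the testers implied by
# sixth six-week attendance and the documents received.
expected = eligible_testers * attendance_rate
flag_eligible_vs_takers = abs(expected - docs_received) / expected >= 0.25

# "Excessive Absences": % of documents coded Absent more than 5 points
# above the six-week absence rate.
absent_doc_pct = 0.12
absence_rate = 1 - attendance_rate
flag_excessive_absences = absent_doc_pct > absence_rate + 0.05

print(flag_eligible_vs_takers, flag_excessive_absences)  # True True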

27
Data Indicators by Area
  • Safe Schools/DAEP Evaluation 2004-05 (a minimal code sketch follows this list)
  • Placement in DAEP or Expulsion: mandatory DAEP placement or discretionary expulsion without a correct Disciplinary Action Code (DAC)
  • Mandatory Expellable Behavior: mandatory expulsion without a correct DAC
  • Student Expulsion: reported expulsion was for a reason NOT permitted by law
  • Length of Out-of-School Suspension: students were reported as OSS for more than 3 school days
  • Expulsion to JJAEP: expulsion to JJAEP in large counties without a correct DAC
  • Zero Reported Incidents: one or more campuses with no reported discipline data for 2 or more years
  • Discretionary DAEP Placement: high percentage of discretionary DAEP placements
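
A sketch of how discipline-record screens like these could be expressed is below. The record layout, code values, and sample data are hypothetical stand-ins; actual analyses use PEIMS disciplinary action data and the Chapter 37 coding rules.

# Minimal sketch of discipline-data screens (illustrative only);
# the record layout and sample values are hypothetical.
from dataclasses import dataclass

@dataclass
class DisciplineRecord:
    campus: str
    action: str               # e.g. "OSS", "DAEP", "EXPULSION"
    days: int                 # length of action in school days
    valid_action_code: bool   # Disciplinary Action Code (DAC) correct?

records = [
    DisciplineRecord("001", "OSS", 5, True),
    DisciplineRecord("001", "EXPULSION", 30, False),
    DisciplineRecord("002", "OSS", 2, True),
]

# "Length of Out-of-School Suspension": OSS reported for more than
# 3 school days.
oss_too_long = [r for r in records if r.action == "OSS" and r.days > 3]

# "Mandatory Expellable Behavior": expulsion reported without a correct DAC.
bad_expulsions = [r for r in records
                  if r.action == "EXPULSION" and not r.valid_action_code]

print(len(oss_too_long), len(bad_expulsions))  # 1 1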

28
Data Indicators by Area
  • Safe Schools/DAEP Evaluation
  • Analysis for 2004-05 will be REPORT ONLY using
    2002-03 data
  • Analysis accounts for unique considerations
    regarding the application of Chapter 37
    requirements to charter schools
  • Will not include continuations from one district
    to another

29
Data Indicators by Area
  • Other Data Integrity Areas 2004-05
  • 1. Attendance Audits
  • 2. PID Errors

30
Data Integrity Indicators
FUTURE CONSIDERATIONS
a. Reduction of minimum size criteria over time
b. Adjustment of standards over time
c. Validation of indicators
d. Validation of system