Federal, State, and Local Accountability: Using Data to Guide Instruction
1
Federal, State, and Local Accountability: Using Data to Guide Instruction
  • San Antonio ISD Nuts and Bolts
  • July 2004
  • Office of Research, Evaluation, and Assessment

2
Federal Accountability: No Child Left Behind (NCLB) and Adequate Yearly Progress
  • AYP performance requirements are met if the level of proficiency for all students and for each student group, summed across grades 3-8 and 10 in reading/language arts and mathematics, meets or exceeds the AYP targets.
  • The 2003-2004 targets are 47% for reading and 33% for math.
  • The targets for 2004-2005 and 2005-2006 are 54% for reading/language arts and 42% for mathematics, based on a U.S. Department of Education requirement.
  • AYP participation requires that 95 percent of all students and of each student group be tested, calculated separately for reading and math. Participation for 2003 and 2004 will also be averaged, and if this average reaches 95 percent, the standard will be met. Students taking any of the following tests will be counted for participation: TAKS, SDAA, LDAA, RPTE (if the student was LEP exempt), and the local math test for LEP-exempt students.

3
Other AYP Issues
  • Other AYP requirements must be met for all students: 70% graduation rates for high schools and 90% attendance rates for middle and elementary schools.
  • The TEA request to include continuers in the graduation rate was denied.
  • Of note:
  • The federal government denied TEA's request to use a "do not evaluate" for participation if fewer than 5 students were absent in a group.
  • SDAA baseline tests with no ARD expectations are counted as failing for performance.
  • Limited English Proficient students are not counted for performance in their first year in the U.S. LEP students are counted after the first year, and RPTE may be used for a reading score, but the local mathematics test for 2nd- and 3rd-year LEP students will count as failing.
  • A sufficient performance gain from 2003 to 2004 (10 percentage points) may act as required improvement, if gains are also made in attendance or graduation rates.

4
How will AYP Performance be calculated?
  • Calculate 1% of your total tested population, which is grades 3-8 and 10, with the exception of first-year immigrants. (This will probably be less than 6 students.)
  • Add the following together:
  • Students who meet ARD requirements on SDAA
  • Students who meet ARD requirements on LDAA
  • LEP students who took RPTE and advanced a proficiency level in year two or three (counts for reading; the local math assessment does not count). By year 3, students must be at the Advanced level.
  • If the number of students who met alternative testing requirements is 1% or less of the total tested population, then all of those students are counted as passers. Any additional students beyond the 1% cap are counted as failing regardless of their performance.
  • Add the number of passers from the calculation above to the number of students who passed TAKS, and place this number over the total tested population.
  • This calculation is made at the all-students level and for each group that meets the minimum size requirement (50 students and 10%, or 200 students or more).
  • It may be possible to appeal if the 1% cap caused the campus to miss AYP.
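The steps above can be sketched as a short calculation. This is an illustrative reading of the rule, not TEA's implementation; the function name and the treatment of the 1% cap as a fractional count are assumptions.

```python
def ayp_performance_rate(total_tested, taks_passers, alt_passers):
    """Sketch of the AYP performance calculation described above.

    total_tested : all tested students in grades 3-8 and 10,
                   excluding first-year immigrants
    taks_passers : number of students who passed TAKS
    alt_passers  : students who met the alternative testing
                   requirements (SDAA/LDAA meeting ARD expectations,
                   or LEP students with qualifying RPTE gains)
    """
    # Alternative-assessment passers are capped at 1% of the tested
    # population; students beyond the cap count as failing.
    cap = 0.01 * total_tested
    counted_alt = min(alt_passers, cap)
    return (taks_passers + counted_alt) / total_tested
```

For example, on a campus with 1,000 tested students, 400 TAKS passers, and 20 alternative-assessment passers, only 10 of the 20 count, giving a rate of 41 percent.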

5
State Accountability: Brief Explanation
  • The Texas Assessment of Knowledge and Skills (TAKS) calls for students to answer more questions correctly each year for the first three years of implementation.
  • For the purpose of evaluation, we examine data at the 1 Standard Error of Measurement (1 SEM) standard for 2003-2004 (grade 11 is at 2 SEM).
  • For the purpose of planning, we examine data at the Panel Recommendation (PR) standard for 2004-2005.
  • The PR standard for 2004-2005 should represent the systematic advancement in testing standards.

Standard Error of Measurement: If a single student were to take the same test repeatedly (with no new learning taking place between testings and no memory of the questions), the standard deviation of his/her repeated test scores would be the SEM.
6
Texas Assessment of Knowledge and Skills (TAKS)
Indicators
Subjects         Grades   2004    2005    2006
Reading/ELA      3-9      1 SEM   PR      PR
Reading/ELA      10       1 SEM   PR      PR
ELA              11       2 SEM   1 SEM   PR
Writing          4, 7     1 SEM   PR      PR
Mathematics      3-10     1 SEM   PR      PR
Mathematics      11       2 SEM   1 SEM   PR
Social Studies   8, 10    1 SEM   PR      PR
Social Studies   11       2 SEM   1 SEM   PR
Science          5, 10    1 SEM   PR      PR
Science          11       2 SEM   1 SEM   PR
Science          8        n/a     n/a     TBD
7
State Performance is Measured for:
  • 1. All students
  • 2. African-American students
  • 3. Hispanic students
  • 4. White students
  • 5. Economically Disadvantaged students
  • 6. Special Education students (currently through the State-Developed Alternative Assessment measure only)

See http://www.tea.state.tx.us/perfreport/account/2004/manual/ for additional information.
8
Current and Prospective TAKS Standards
(percent passing)          2004         2005       2006        2007      2008      2009
Exemplary                  90           90         90          90        90        90
Recognized                 70           70         70 or 80*   80        80        80
Academically Acceptable:
  R/ELA, W, SS             50           50         50          **        **        **
  Mathematics              35           35         35          **        **        **
  Science                  25           25         25          **        **        **
Student Passing Standard   3-10 1 SEM   3-10 PR    3-11 PR     3-11 PR   3-11 PR   3-11 PR
                           11 2 SEM     11 1 SEM

* The Recognized standard for 2006 has not been finalized.
** Academically Acceptable standards increase incrementally until the standards for all subjects reach 70. The timeline for phasing in the higher standards will be developed once data on performance gains are evaluated.
9
State-Developed Alternative Assessment (SDAA)
Indicator
  • Single performance indicator for SDAA (grades 3-8 in 2004): the number of tests on which students met ARD expectations, divided by the number of tests taken, summed across grades and subjects.
  • All Students level only; the result is one single percentage.
  • Because SDAA is administered for three subjects (reading, writing, and mathematics) and results are summed across subjects and grades, the 30-test minimum size requirement can represent as few as 10 students.
  • 2004 Standard: Exemplary 90%, Recognized 70%, Academically Acceptable 50%.
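As a minimal sketch, the single-percentage SDAA indicator can be computed like this (the data layout and names are hypothetical):

```python
def sdaa_indicator(results, min_tests=30):
    """results: iterable of (tests_meeting_ard, tests_taken) pairs,
    one per grade/subject combination. Returns one percentage summed
    across grades and subjects, or None below the minimum size."""
    met = sum(m for m, _ in results)
    taken = sum(t for _, t in results)
    if taken < min_tests:        # 30-test minimum size requirement
        return None              # indicator is not evaluated
    return 100.0 * met / taken
```

Because each student can contribute up to three tests, 30 tests can come from as few as 10 students, as the slide notes.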

10
Phase-In of New State Assessment Program: Required Improvement
  • Schools may move from Academically Unacceptable
    to Academically Acceptable if they show enough
    improvement on the deficient measure to meet the
    Acceptable standard in two years.
  • Schools may move from Academically Acceptable to
    Recognized if they show enough improvement on the
    deficient measure to meet the Recognized standard
    in two years. In addition, the current score
    must be 65 or higher.
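One way to read the "meet the standard in two years" rule above is to project the latest annual gain forward. This is an interpretation for illustration only; TEA's actual Required Improvement formula may differ, and the names below are invented.

```python
def meets_required_improvement(prior, current, standard, years=2, floor=None):
    """Does the current rate of annual gain project to `standard`
    within `years` years?  `floor` is an extra minimum the current
    score must already meet (65 for the Recognized provision above)."""
    if floor is not None and current < floor:
        return False
    annual_gain = current - prior
    # Linear projection of the latest year-over-year gain.
    return current + annual_gain * years >= standard
```

A school moving from 40 to 50 on the deficient measure projects to 70 in two years and would qualify under this reading; one moving from 48 to 50 would not.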

11
Phase-In of New State Assessment Program: Exceptions
  • Exceptions Provision: An automatic exception provision applied to selected assessment measures used in the accountability system, and only when a campus or district fails to receive an Academically Acceptable rating solely because it did not meet the accountability criteria on those measures. The exception is granted automatically if the campus or district meets all of the other conditions described below.
  • Measures: Applies to 26 assessment measures: 25 TAKS measures (5 subjects x 5 student groups) plus the SDAA measure.

12
Maximum Exceptions
  • Maximum Exceptions: The maximum number of exceptions granted depends on the number of assessment measures on which the campus or district is evaluated.

Assessment Measures Evaluated   Maximum Exceptions
2 - 6                           0 exceptions
7 - 12                          1 exception
13 - 18                         2 exceptions
19 - 26                         3 exceptions
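As a sketch, the exceptions table maps to a small lookup (the top band is taken as 19-26 measures):

```python
def max_exceptions(measures_evaluated):
    """Maximum automatic exceptions allowed, per the table above."""
    if measures_evaluated <= 6:
        return 0
    if measures_evaluated <= 12:
        return 1
    if measures_evaluated <= 18:
        return 2
    return 3  # 19-26 measures evaluated
```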
13
Exceptions Provision, cont.
  • Ratings Criteria: Applied to move from Academically Unacceptable to Academically Acceptable.
  • Performance Floor: Performance on the exception measure must be no more than 5 percentage points below the accountability standard for the Academically Acceptable rating level.
  • One-Time Use: Allowed only one time per cell.
  • Annual Review: Reevaluated in 2005 and annually thereafter.
  • Note: Check for Academically Unacceptable campuses. A district with Academically Unacceptable campuses cannot receive an Exemplary or Recognized rating.

14
Progress/Proficiency Measure for English Language
Learners (ELL)
  • Annual measurable achievement objective (AMAO): a longitudinal measure that follows LEP students from entry into a Texas public school until they score at the Met Standard level on the English TAKS reading test for 2 consecutive years (required under Title III of NCLB).
  • A progress measure in English language proficiency for LEP students will be developed.
  • In 2005, the measure will be reported and accountability standards will be set.
  • The measure will be used for ratings in 2006 or 2007.
  • Progress is measured as movement from one proficiency level to a higher level on the Reading Proficiency Tests in English (RPTE), and from RPTE Advanced (Level 3) to Met Standard on the English TAKS.

15
Completion Rate (Grades 9-12) and Annual Dropout
Rate (Grades 7-8) Indicators
  • NCES Definition: For 2005-06 leavers, TEA uses the National Center for Education Statistics (NCES) dropout definition.
  • Completion Rate Indicator: includes graduates and continuing students (students who return to school for a fifth year) in the definition of high school completer for the accountability completion rate beginning in 2006.
  • The indicator counts GED recipients as completers in 2003-2004 for the 2004-2005 ratings, but any student receiving a GED in 2004-2005 will NOT be counted as a completer for the following year.
  • The completion rate indicator is completers as a percent of total students in the class (graduates, continuing students, and dropouts).
  • The Annual Dropout Rate will be used for grades 7 and 8 and is likely to be used at all grades in Performance-Based Monitoring.
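The completion rate definition above can be sketched as follows. Whether GED recipients remain in the class denominator once they stop counting as completers is not stated on the slide, so the flag below is an assumption, as are all the names.

```python
def completion_rate(graduates, continuing, dropouts, ged=0, ged_completes=True):
    """Completers as a percent of the class.

    Completers are graduates and continuing students (fifth-year
    returners); GED recipients count as completers only through the
    2004-2005 ratings, controlled here by `ged_completes`.
    """
    completers = graduates + continuing + (ged if ged_completes else 0)
    class_total = graduates + continuing + dropouts + ged
    return 100.0 * completers / class_total
```

For a class with 80 graduates, 10 continuing students, and 10 dropouts, the rate is 90 percent.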

16
Completion Rate Standards (Grades 9-12)

Rating year:              2004    2005    2006    2007    2008    2009
(class of)                2003    2004    2005    2006    2007    2008
(9th grade in)            99-00   00-01   01-02   02-03   03-04   05-06
Exemplary                 95      95      95      TBD     TBD     TBD
Recognized                85      85      85      TBD     TBD     TBD
Academically Acceptable   75      75      75      TBD     TBD     TBD
Completer definition      Grads + GED + Cont. HS (2004-2005); Grads + Cont. HS (2006-2009)
17
Completion Rate (Grades 9-12): Required Improvement
  • Improvement Measure: the gain from the prior year in completion rate required to reach a predetermined accountability standard in a set number of years.
  • The completion rate Required Improvement measure will not be finalized until summer 2004, to better align with other required improvement measures.
  • Minimum Size Requirements: meets minimum size requirements for completion rate in the current year and has at least 10 students in the completion rate class in the prior year.
  • Use in Ratings: campuses and districts can demonstrate Required Improvement to meet the Acceptable/Academically Acceptable absolute standard for completion rate.

18
Annual Dropout Rate (Grades 7-8)
  • Annual Dropout Rate Indicator: dropouts as a percent of total students enrolled in grades 7 and 8 in a single school year.
  • Campus and District Ratings: use the grade 7-8 annual dropout rate.
  • Student Groups: all groups.
  • Minimum Size Requirements:
  • (1) At least 5 dropouts, and
  • (2) 30/10/50.
  • In 2004-2005 the minimum size was 10 dropouts; the movement to 5 as the minimum will have a major impact on schools.

19
Annual Dropout Rate Standard (Grades 7-8)
Rating year:              2004      2005      2006      2007      2008      2009
(dropouts from)           2002-03   2003-04   2004-05   2005-06   2006-07   2007-08
Exemplary                 0.2       0.2       0.2       TBD       TBD       TBD
Recognized                0.7       0.7       0.7       TBD       TBD       TBD
Academically Acceptable   2.0       1.0       1.0       TBD       TBD       TBD
Dropout definition        current state definition (2004-2006); NCES definition (2007-2009)
20
Data Quality Requirements
  • Data quality is considered in completion rate and annual dropout rate appeals.
  • The PID error rate is used to monitor the quality of PEIMS data submissions.
  • A longitudinal underreported-students indicator linked to the completion rate calculation may replace the annual data quality indicator in accountability ratings.
  • Underreported Students Standards:
  • 2004 standards: a district with > 100 underreported students or > 5% underreported students receives no Exemplary or Recognized rating.
  • 2005 standards: a district with > 100 underreported students or > 2% underreported students receives no Exemplary or Recognized rating.
  • Districts that fail to meet these standards will be investigated; this may prevent a district from being rated Academically Acceptable.

21
Gold Performance Acknowledgement (GPA)
  • Measures from 2002 that are still likely to be used in 2004:
  • Advanced Course Completion
  • AP/IB Results
  • Attendance Rate
  • Commended Performance Reading/ELA
  • Commended Performance Mathematics
  • Commended Performance Writing
  • Commended Performance Science
  • Recommended High School Program
  • SAT/ACT Results

22
Alternative Campuses
  • 2004: Registered alternative education campuses are rated under Alternative Education procedures.
  • 2005: Options to be developed.
  • Accountability must be based on data from standard data submission processes (PEIMS) or from the state test contractor.
  • Measures must be appropriate for the alternative programs offered; lower standards on the same measures used in the regular ratings are not an option.
  • A TAKS Growth Index is being considered as a possible measure.
  • Performance of students at alternative education campuses is included in district ratings in 2004.

23
Local Accountability: SAISD 2003-2004. Where are we?
  • Campuses and the District were held to a three-step measurement process.
  • Step One: Did we achieve at the level of excellence outlined by our state accountability system (70% or more passing), or were our gains greater than the standard?
  • Step Two: Are we on track to meet a Recognized standard in 2006?
  • Step Three: Has the District closed the gap on the state average, or has the campus closed the gap on the District average at the appropriate level (elementary, middle school, high school)?

24
District/Campus Analysis: Step One
  • Cells measured for all students and for each student group.
  • 90-100 (5 pts. each): 1 x 5 = 5
  • 80-89 (4 pts. each): 9 x 4 = 36
  • 70-79 (3 pts. each): 6 x 3 = 18
  • Total: 59 pts.
  • Points lost for insufficient gain: -3 x number of such cells.
  • Areas where we lagged behind the state by 5 percentage points or more and were not at 70% or greater in the cell: deduct 3 points.
  • Divide by the number of cells measured (small-numbers rule).
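Step One can be sketched as a scoring function. This is a simplified reading of the rubric above, not the District's actual program; the insufficient-gain deduction is passed in as a precomputed count, and all names are illustrative.

```python
def step_one_score(cells, comparison, insufficient_gain_cells=0):
    """Sketch of the SAISD Step One index described above.

    cells: passing percentages, one per measured cell
           (subject x student group); small groups excluded.
    comparison: matching state/District averages for each cell.
    insufficient_gain_cells: cells losing the -3 gain penalty.
    """
    points = 0
    for pct, avg in zip(cells, comparison):
        if pct >= 90:
            points += 5
        elif pct >= 80:
            points += 4
        elif pct >= 70:
            points += 3
        # Deduct 3 for cells lagging the comparison by 5 or more
        # percentage points while below 70.
        if pct < 70 and pct <= avg - 5:
            points -= 3
    points -= 3 * insufficient_gain_cells
    return points / len(cells)
```

Applied to the sample school that follows (19 measured cells, 40 base points, four lagging cells), this yields 28/19, about 1.47.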

25
Sample Elementary School 2004

Test      All Students   Econ. Disadv.   African American   Hispanic   White
Reading   82             74              79                 81         85
Writing   90             82              88                 89         Small
Math      58             52              54                 59         90
Science   43             39              43                 41         44

90-100: 2 x 5 = 10
80-89: 6 x 4 = 24
70-79: 2 x 3 = 6
Total: 40 pts.
Cells 5 pts. or more below the District average (and below 70): 4, so 4 x -3 = -12.
40 - 12 = 28; 28 / 19 cells = 1.47 overall score.
26
District/Campus Analysis: Step Two and Step Three
  • Campuses were ranked, and those in the lowest quartile on the numerical scale were then measured in terms of their gains, to see whether they were on track to meet the Recognized standard in 2006 at either the 70% or 80% standard.
  • Many campuses have impressive gains that indicate they are on target.
  • Campuses were then measured against District gains (or the District against state gains) to see if we were closing the gap.
  • If the campus closed the gap in performance with the District in 70% or more of all cells, this was considered significant.

27
Sample Elementary School (1 SEM to 1 SEM), 2003-2004: Step Two

Test      2003   2004   Difference   Needed for 80 in 2006   Reached?   Needed for 70 in 2006   Reached?
Reading   74.1   82.3   8.2          -                       Yes        -                       Yes
Writing   82.5   90.0   7.5          -                       Yes        -                       Yes
Math      49.1   58.4   9.0          21.6 (10.8/yr)          No         11.6 (5.8/yr)           Yes
Science   39.7   43.2   3.5          36.8 (18.4/yr)          No         26.8 (13.4/yr)          No
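The Step Two "on track" test in the table reduces to: is the latest annual gain at least the per-year gain still needed? This simple linear projection is an assumed reading of the table, not TEA's or the District's published formula.

```python
def on_track(prior, current, target, years_left=2):
    """Is the annual gain enough to reach `target` in `years_left` years?"""
    still_needed = target - current
    if still_needed <= 0:
        return True  # already at or above the target
    annual_gain = current - prior
    return annual_gain >= still_needed / years_left
```

For the math row above: the roughly 9-point gain beats the 5.8 per year needed to reach 70, but not the 10.8 per year needed to reach 80.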
28
Sample Elementary School 2003-2004: Step Three
29
Comparison Framework: An Example
  • The following slides show achievement across two years of TAKS Science.
  • The first slide in the series shows the actual accountability numbers from year to year (2 SEM to 1 SEM), disregarding the changes in the level of difficulty of the test.
  • The second slide in the series shows growth measured at the same level of difficulty across the two years (1 SEM to 1 SEM). This is true, apples-to-apples growth.
  • The third slide in the series shows growth measured against the state: are we making disproportionate gains?

30
Sample Elementary, 2003 (2 SEM) to 2004 (1 SEM): SAISD TAKS Science, Grade 5

[Chart: Sample Campus Data]
31
2003 (1 SEM) to 2004 (1 SEM): SAISD TAKS Science, Grade 5

[Chart: Sample Campus Data]
32
Science Comparison of Campus to District (Pluses Indicate Closing of the Gap)

[Chart: Sample Campus Data]
33
Goal Setting: Where are we going?
  • While we used the comparison of performance at
    1SEM to 1SEM for evaluative purposes, we
    anticipated 2004-2005 by setting goals at the
    more rigorous Panel Recommendation level.
  • The District and the Campus will set overall
    goals, as well as goals by grade level and group.