Using Data to Improve Student Achievement - PowerPoint PPT Presentation
Transcript and Presenter's Notes

Title: Using Data to Improve Student Achievement


1
Using Data to Improve Student Achievement to
Close the Achievement Gap
  • Tips & Tools for Data Analysis
  • Spring 2007

2
TRANSPORTATION
BEFORE/AFTER SCHOOL ACTIVITIES/DUTIES
ATTITUDES
EXPERIENCES
PRIOR SUCCESS/ FAILURE
LIVING SITUATION/ FAMILY STRUCTURE/ FAMILY SIZE
SOCIOECONOMIC STATUS
MOBILITY
PHYSICAL, MENTAL, SOCIAL HEALTH
SPECIAL NEEDS
BACKGROUND KNOWLEDGE
LANGUAGE FLUENCY
BELIEFS
3
Looking at the BIG Picture
4
(No Transcript)
5
Multiple Measures
  • Demographics
  • Enrollment, attendance, drop-out rate, ethnicity,
    gender, grade level
  • Perceptions
  • Perceptions of the learning environment, values,
    beliefs, attitudes, observations
  • Student Learning
  • Standardized tests (NRT/CRT), teacher
    observations of abilities, authentic assessments
  • School Processes
  • Description of school programs & processes

6
Criterion-Referenced Data
  • What's required?
  • Proficiency percentages for the combined population
    & identifiable subgroups by
  • Test
  • Year (for the latest 3 years)
  • Analysis of test by
  • Passage type & type of response for literacy
  • Writing domain & multiple choice for literacy
  • Strand & type of response for math
  • in order to identify trends and draw conclusions
    based on results over a 3-year period (a minimal
    sketch of such a trend check follows)
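As an illustration of the kind of three-year trend check described above, here is a minimal Python sketch; the subgroup names and proficiency percentages are hypothetical placeholders, not data from this district:

```python
# A hedged sketch of a three-year CRT proficiency trend check.
# Subgroup names and percentages below are illustrative only.
proficiency_pct = {
    "combined": {2004: 71, 2005: 61, 2006: 40},
    "economically_disadvantaged": {2004: 50, 2005: 62, 2006: 44},
}

for group, by_year in proficiency_pct.items():
    years = sorted(by_year)
    changes = [by_year[b] - by_year[a] for a, b in zip(years, years[1:])]
    if all(c > 0 for c in changes):
        trend = "improving"
    elif all(c < 0 for c in changes):
        trend = "declining"
    else:
        trend = "mixed"
    values = " -> ".join(str(by_year[y]) for y in years)
    print(f"{group}: {trend} ({values})")
```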

7
Norm-Referenced Data
  • What's required?
  • National percentile rank & standard score for the
    combined population & identifiable subgroups by
  • Test
  • Year
  • Analysis of test by
  • Content, subskill, & skill cluster
  • in order to identify trends, measure growth, and
    draw conclusions based on results over a 2-year
    period

8
Disaggregated Data Tools
  • CRT
  • ACSIP Template: # and % of students
    non-proficient/proficient for combined and
    subgroup populations (see the sketch below)
  • ACSIP Strand Performance Report: combined and
    subgroup performance averages by test, passage
    type/domain/strand, and type of response
  • Data Analysis Set: cwatts@afsc.k12.ar.us
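As referenced in the ACSIP Template bullet above, here is a minimal sketch of tallying the number and percentage of non-proficient vs. proficient/advanced students for the combined population and each subgroup; the student records and subgroup labels are hypothetical:

```python
from collections import defaultdict

# Hypothetical student records: (subgroup label, proficient-or-advanced flag).
students = [
    ("caucasian", True), ("caucasian", False),
    ("ell", False), ("ell", True),
    ("economically_disadvantaged", True),
]

counts = defaultdict(lambda: {"NP": 0, "P": 0})
for subgroup, proficient in students:
    key = "P" if proficient else "NP"
    counts["combined"][key] += 1   # combined population
    counts[subgroup][key] += 1     # identifiable subgroup

for group, c in counts.items():
    total = c["NP"] + c["P"]
    print(f"{group}: NP {c['NP']} ({100 * c['NP'] / total:.0f}%), "
          f"P {c['P']} ({100 * c['P'] / total:.0f}%)")
```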

9
DATA SUMMARY REPORT: MATH BENCHMARK RESULTS
COMBINED POPULATION

              |        2004          |        2005          |        2006
GRADE LEVEL   | # NP  %NP   # P   %P | # NP  %NP   # P   %P | # NP  %NP   # P   %P
3             |   --   --    --   -- |   22   69    10   31 |   13   38    21   62
4             |   12   29    30   71 |   12   39    19   61 |   21   60    14   40
5             |   --   --    --   -- |   27   60    18   40 |   12   40    18   60
6             |   28   78     8   22 |   19   65    10   34 |   23   51    22   49
7             |   --   --    --   -- |   27   73    10   27 |   20   53    18   47
8             |   32   74    11   26 |   23   64    13   36 |   27   66    14   34
EOC ALGEBRA   |    5   17    25   83 |    9   28    23   72 |   12   34    23   66
EOC GEOMETRY  |   11   38    18   62 |   14   39    22   61 |    6   20    24   80

KEY: # = actual number of students; NP = percentage of non-proficient students;
P = percentage of proficient & advanced students
10
4TH GRADE MATH BENCHMARK RESULTS

                      |        2004          |        2005          |        2006
SUB-GROUP             | # NP  %NP   # P   %P | # NP  %NP   # P   %P | # NP  %NP   # P   %P
AFRICAN-AMERICAN      |   17   46    20   54 |   19   48    21   52 |   22   63    13   37
CAUCASIAN             |   12   29    29   71 |   12   39    19   61 |   21   62    13   38
HISPANIC              |   18   60    12   40 |   17   47    19   53 |    9   41    13   59
SPECIAL SERVICES      |    1   50     1   50 |    5   83     1   17 |    6  100     0    0
ECONOMICALLY DISAD.   |   11   50    11   50 |    5   38     8   62 |   10   56     8   44
ELL                   |   16   64     9   36 |   12   67     6   33 |   10   56     8   44

KEY: # = actual number of students; NP = percentage of non-proficient students;
P = percentage of proficient & advanced students
11
(Performance averages by passage type & type of response)

COMBINED POPULATION
                 |     LITERARY     |     CONTENT      |     PRACTICAL
                 | 2004  2005  2006 | 2004  2005  2006 | 2004  2005  2006
Average          |   50    49    38 |   63    61    50 |   62    52    59
M/C              |   67    56    50 |   71    69    60 |   72    64    65
O/R              |   33    42    26 |   56    54    40 |   53    40    54

CAUCASIAN
                 |     LITERARY     |     CONTENT      |     PRACTICAL
                 | 2004  2005  2006 | 2004  2005  2006 | 2004  2005  2006
Average          |   52    51    39 |   65    61    54 |   64    53    64
M/C              |   69    58    51 |   72    68    65 |   73    64    70
O/R              |   35    44    27 |   58    54    43 |   54    42    58

SPECIAL SERVICES
                 |     LITERARY     |     CONTENT      |     PRACTICAL
                 | 2004  2005  2006 | 2004  2005  2006 | 2004  2005  2006
Average          |   36    34    44 |   47    20    31 |   38    16    17
M/C              |   42    38    55 |   57    40    43 |   53    33    34
O/R              |   31    30    53 |   36     0    20 |   22     0     0

ECONOMICALLY DISADVANTAGED
                 |     LITERARY     |     CONTENT      |     PRACTICAL
                 | 2004  2005  2006 | 2004  2005  2006 | 2004  2005  2006
Average          |   42    35    43 |   49    41    43 |   47    38    45
M/C              |   62    53    67 |   76    62    61 |   70    55    61
O/R              |   22    17    19 |   23    20    25 |   24    20    28

KEY: M/C = multiple choice; O/R = open response
12
Disaggregated Data Tools
  • NRT
  • ITBS ACSIP Report: % of students performing
    above the 50th percentile on each test and
    content subskill for combined & subgroup
    populations
  • Performance Profile: standard score & NPR on
    each test and content subskill for the combined
    population
  • School Coded Summary: standard score & NPR on
    each test for subgroup populations
  • Data Analysis Set: cwatts@afsc.k12.ar.us

13
NRT Growth Assessment
      2003   2004   2005   2006
SS     180    200    215    230

Standard scores show relative development over time.
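A small sketch of turning the standard scores above into year-to-year gains (the scores simply repeat the slide's values):

```python
# Standard scores (SS) by year, as shown on the slide above.
standard_scores = {2003: 180, 2004: 200, 2005: 215, 2006: 230}

years = sorted(standard_scores)
for prev, curr in zip(years, years[1:]):
    gain = standard_scores[curr] - standard_scores[prev]
    print(f"{prev} -> {curr}: +{gain} SS")

print(f"Total growth {years[0]}-{years[-1]}: "
      f"+{standard_scores[years[-1]] - standard_scores[years[0]]} SS")
```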
14
(No Transcript)
15
(No Transcript)
16
(No Transcript)
17
(No Transcript)
18
(No Transcript)
19
(No Transcript)
20
SLE Analysis
SLE     ITEM    PERCENTAGE CORRECT
1.5        2                  81.5
1.5       11                  74.1
1.5       36                  95.6
TOTAL: 83.7 (total the percentage-correct column and divide by 300, in this case)
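A minimal sketch of the calculation described above, using the three item percentages shown for SLE 1.5:

```python
# Percentage correct on each item aligned to SLE 1.5 (values from the slide).
item_pct_correct = {2: 81.5, 11: 74.1, 36: 95.6}

# Total the percentage-correct column and divide by (number of items x 100),
# then express the result as a percentage: 251.2 / 300 = 83.7%.
total = sum(item_pct_correct.values()) / (len(item_pct_correct) * 100) * 100
print(f"SLE 1.5 percentage meeting standard: {total:.1f}")  # 83.7
```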
21
SLE Analysis
Percentage Meeting Standard | Suggested Action to be Taken
0-34   | Align curriculum & classroom instruction; the curriculum has not been taught or does not exist; an indication that instruction is textbook-driven
35-49  | Coordinate curriculum objectives across grade levels & subject areas, making sure all objectives are taught (horizontal/vertical alignment)
50-69  | Implement high-yield instructional strategies in all classrooms; there is probably a high percentage of lecture, whole-group, direct teaching
70-84  | Spend more quality time on instructional strategies to yield greater results; check learning minutes in the schedule & the nature of tasks on which students spend their time
85-100 | Provide aligned enrichment; add depth & breadth; review pacing; reteach for mastery; be sure distributed practice is occurring
Source: Learning 24/7
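A hedged sketch of applying the bands above programmatically; the band boundaries follow the table, and the action text is abbreviated:

```python
# Suggested-action bands from the table above (lower bound, abbreviated action).
BANDS = [
    (85, "Provide aligned enrichment; review pacing; reteach for mastery"),
    (70, "Spend more quality time on instructional strategies"),
    (50, "Implement high-yield instructional strategies in all classrooms"),
    (35, "Coordinate curriculum objectives across grade levels & subjects"),
    (0,  "Align curriculum & classroom instruction"),
]

def suggested_action(percent_meeting_standard: float) -> str:
    """Return the suggested action for a given percentage meeting the standard."""
    for lower_bound, action in BANDS:
        if percent_meeting_standard >= lower_bound:
            return action
    return BANDS[-1][1]

print(suggested_action(83.7))  # falls in the 70-84 band
```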
22
Digging Deeper
  • CRT Item Analysis
  • Content Standard
  • Language of Question
  • Level of Questioning
  • Distracters

23
Content Standard
  • What is it that the student must know or be able
    to do?
  • When is this introduced in the curriculum?
  • How is it paced?
  • Is it a power standard?
  • What instructional strategies are used to help
    students master this standard?
  • Have I given students the "tools" (e.g.,
    calculator skills, writing tips, test-taking
    skills, etc.) necessary to respond appropriately?
  • Can this standard easily be integrated into other
    curricular areas?

24
Language of Question
  • How is the question worded on the test?
  • Are there vocabulary words used that may hinder
    comprehension?
  • Do I teach and test using the same language?
  • Do I have word/learning walls in my content area
    to support this standard and related vocabulary?

25
Level of Questioning
  • According to Bloom's taxonomy, what is the level
    of questioning used to measure mastery of the
    standard?
  • Highlight the verb(s) in the question. Do I use
    those same verbs in my teaching and testing?
  • Have I taught "key" or "clue" words that will
    help students understand what is being asked of
    them?
  • Is the question multi-layered?

26
Distracters
  • Are there items that distract the student from
    identifying what is being asked, or are there
    items that may confuse the student as he/she
    makes an answer choice?
  • Labels
  • Additional information
  • Multi-layered tasks
  • Conversions
  • Not

27
  • SLE Correlation: NPO 1.3 (prior to the 2004
    revisions), which states:
  • "Apply and master counting, grouping, place value,
    and estimation."
  • Item Analysis
  • What must the student know or be able to do?
    (Content Standard)
  • How is the question worded on the test?
    (Language of the Question)
  • According to Bloom's taxonomy, what is the level of
    questioning used to measure mastery of the
    standard? (Level of Questioning)
  • Are there items that distract the student from
    identifying what is being asked, or are there
    items that may confuse the student as he/she
    makes an answer choice? (Distracters)

28
(No Transcript)
29
Digging Deeper
  • NRT Item Analysis
  • Building Item Analysis
  • Identify items that have a negative value of 10
    or more, as indicated by the bar falling to the
    left of the 0 mark
  • Analyze results of all related items (a minimal
    sketch of the flagging rule follows)
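As noted in the last bullet, here is a minimal sketch of the flagging rule; the item numbers and difference values are hypothetical:

```python
# Hypothetical building item-analysis differences: item number -> how far the
# building's result sits above (+) or below (-) the national comparison.
item_differences = {1: 4, 2: -12, 3: -3, 4: -15, 5: 8}

# Flag items at -10 or more below the comparison (bar left of the 0 mark).
flagged = sorted(item for item, diff in item_differences.items() if diff <= -10)
print("Items to analyze further:", flagged)  # [2, 4]
```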

30
(No Transcript)
31
Peeling the Data: Levels of Looking at Data
  • District
  • K-12 Feeder Patterns
  • School Levels
  • Grade Level
  • Programs & Tracks
  • Classroom-teacher
  • Student

32
"Data analysis should not be about just gathering
data. It is very easy to get 'analysis paralysis'
by spending time pulling data together and not
spending time using the data." (Bernhardt,
2004, p. 19)
33
Peeling the Data: Questions to Ask
  • Are there any patterns by racial/ethnic groups?
    By gender? By other identifiers?
  • What groups are doing well?
  • What groups are behind? What groups are on
    target? Ahead?
  • What access and equity issues are raised?
  • Do the data surprise you, or do they confirm your
    perceptions?
  • How might some school or classroom practices
    contribute to successes and failures? For which
    groups of students?
  • How do we continue doing what's working and
    address what's not working for students?

34
Peeling the Data: Dialogue to Have
  • How is student performance described? (by
    medians, quartiles, levels of proficiency, etc.)
  • How are different groups performing? Which
    groups are meeting the targeted goals?
  • What don't the data tell you?
  • What other data do you need?
  • What groups might we need to talk to? (students,
    teachers)
  • What are the implications for the following?
  • Developing or revising policies
  • Revising practices and strategies
  • Reading literature
  • Visiting other schools
  • Revising, eliminating, adding programs
  • Dialogues with experts
  • Professional development, goal setting, and
    monitoring progress
  • How do we share and present the data to various
    audiences?

35
Sample Questions from a School's Data Team
  • Are there patterns of achievement based on
    Benchmark scores within subgroups?
  • Are there patterns of placement for special
    programs by ethnicity, gender, etc.?
  • What trends do we see with students who have
    entered our school early in their education vs.
    later? Is there a relationship between number of
    years at our school and our Benchmark scores?

36
Sample Questions from a School's Data Team
  • Is there a relationship between
    attendance/tardiness and achievement?
  • How do students who have been retained do later?
  • How do our elementary students do in middle
    school?
  • Do findings in our NRT results support findings
    in our CRT results?
  • Can our findings be directly linked to
    curriculum? instruction? assessment?
  • What are our next steps?

37
Making It Personal for Teachers
  • Teachers can use their own data to
  • Identify the strengths of their own students
  • Identify the challenges of their own students
  • Identify common misconceptions & error patterns
  • Identify their own successful teaching methods
  • Pinpoint areas needed for professional development

38
(No Transcript)
39
(No Transcript)
40
Making It Personal for Students
  • Students can use their own data to
  • Reflect on their own knowledge & test-taking
    strategies
  • Reflect on their strengths & weaknesses
  • Set goals for improvement

41
(No Transcript)
42
KNOW HOW
SUCCESS
TIME
WANT TO
LEADERSHIP
43
Candie Watts (cwatts@afsc.k12.ar.us)
  • Arch Ford Education Service Cooperative
  • http://af1.afsc.k12.ar.us