1
Challenges in the Design and Presentation of
Large-scale, Multi-site Education Program
Evaluation Results: Reading First in Georgia
  • Southeast Evaluation Association 18th Annual
    Conference, February 2-3, 2006, Tallahassee, FL
  • Presenter: Dorothy Harnish, College of
    Education, University of Georgia

2
Purpose of this Session
  • Present information about the design of a
    multi-year, multi-site, statewide evaluation
    project in early reading:
  • Evaluation questions and design
  • Data collection instruments and methods
  • Evaluation activities
  • Data analysis from multiple sources
  • Identify issues and concerns with summary
    presentation of results
  • Solicit discussion on alternatives, experience of
    other evaluators

3
Overview: Reading First in Georgia
  • Federal initiative under NCLB to ensure all
    students are reading at grade level - grades K-3
  • 106 funded K-3 schools in Georgia
  • Participants:
  • 35,000 students
  • 2,400 teachers
  • Literacy Coach in each school

4
Overview: Reading First in Georgia
  • State RF Office in state education agency:
  • RF Director
  • Program Coordinators (3)
  • Regional RF Consultants (12)
  • Professional Development Architects
  • Summer training academies provided by state for
    all RF teachers
  • Ongoing regional training for teachers and
    literacy coaches provided by state

5
External Evaluation
  • UGA College of Education contracted with Ga.
    Dept. of Education as external evaluator for RF
  • Three year evaluation of implementation,
    progress, impact of Reading First in Georgia
  • Evaluation Team:
  • Dept. of Language/Literacy Education: 2 faculty
    members, 5-7 doctoral graduate students
  • Georgia Assessment Center/Test Services:
    research analyst, graduate student
  • College of Education: project director,
    professional staff

6
Purpose of Evaluation
  • Collect, analyze, and report data to answer the
    following questions about Reading First (RF)
    implementation and impact in Georgia:
  • IMPLEMENTATION
  • Is the Reading First program being implemented by
    schools as intended in the Georgia Reading First
    plan?
  • How does the level of implementation of Reading
    First relate to the results being achieved in
    Reading First schools?
  • Is the level of Reading First implementation
    positively correlated with higher reading
    achievement?
  • Are Reading First teachers more knowledgeable
    about scientifically based reading research after
    the three years of professional learning
    experiences?

7
Purpose of Evaluation
  • PROGRESS
  • What progress is being made by Reading First
    schools in improving student reading achievement?
  • Where progress is not apparent, what are the
    reasons for this?
  • What interventions are required?
  • IMPACT
  • What is the impact of Reading First on student
    achievement in reading as measured by
    standardized test scores?
  • Is reading achievement in Reading First schools
    higher than in non-Reading First schools?

8
Evaluation Design: Implementation
  • Observations of classroom instruction in 106 RF
    schools by teams of UGA observers (Fall and
    Spring)
  • Monthly online surveys and end-of-year interviews
    with Literacy Coaches
  • Surveys of RF Teachers, RF School Administrators,
    Parents of RF students, Literacy Coaches, and
    Regional RF Consultants (Spring 2005)
  • RF Teacher Knowledge Survey: pre-assessment
    (Summer 2004)
  • End-of-year summative report of findings

9
Evaluation Design: Progress and Impact
  • Student progress in reading from beginning to end
    of school year, based on DIBELS test scores in
    grades K-3 and on PPVT in Kindergarten
  • Student reading achievement gains from 2004 to
    2005, based on ITBS reading tests in grades 1-3
    (grade level and cohort analyses)
  • Confirmatory evidence on student achievement
    gains from 2004 to 2005, based on CRCT reading
    tests in grades 1-3
  • Comparison of RF schools with matched non-RF
    comparison schools on ITBS and CRCT reading test
    gains

10
Data Collection: Classroom Observations
  • Observation instrument: Instructional Content
    Emphasis (developed by U. Texas), aligned with
    the 5 essential elements of RF scientifically
    based reading research
  • Two-day observer team training by instrument
    developer; UGA team (10 observers), State RF
    staff team (12 observers)
  • All schools visited once Fall semester, half
    visited again Spring semester (all schools
    observed once by UGA team, 366 total visits)
  • One hour observation period in each of 2 randomly
    selected RF classrooms per grade level each visit
  • Teacher activities coded as phonological
    awareness, phonics-word study, fluency,
    vocabulary, comprehension, related literacy
    activities, transitions, directions/procedures,
    non-Reading First activities.
  • Reported number of events and number of minutes
    observed in each category (tallying sketched
    below)
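
A minimal sketch (Python with pandas assumed; file and column names
are hypothetical, not the actual ICE coding fields) of how coded
events could be tallied into counts and minutes per category:

    import pandas as pd

    # One row per coded classroom event
    obs = pd.read_csv("observations.csv")

    summary = obs.groupby("category").agg(
        events=("category", "size"),  # number of coded events
        minutes=("minutes", "sum"),   # total observed minutes
    )
    print(summary)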

11
Data Collection: Literacy Coaches
  • Literacy Coach at each RF school (106)
  • Monthly Reports
  • Online web-based survey reporting system through
    UGA
  • Set of questions varies each month, forced-choice
    and open-ended, August-April, with state RF staff
    input on questions
  • Assessment of ongoing implementation process,
    issues, concerns, accomplishments
  • Summary of aggregate responses sent to state RF
    staff for formative feedback monthly, shared with
    coaches online
  • Annual Interviews
  • End-of-year telephone interviews with a random
    sample of 30% of coaches for in-depth information
    to supplement surveys

12
Data Collection: Stakeholder Surveys
  • PARENTS
  • 13 questions about parent perceptions of child's
    reading behaviors, information from schools,
    involvement with reading; 7 demographics
    questions; one open-ended question
  • 25% random sample of parents in each grade level
    in each RF school (approx. 8,750; sampling
    sketched below)
  • Paper copy distributed through Literacy Coaches
    to students in schools (Spanish and English
    versions)
  • Completed surveys mailed to UGA by parents in
    self-addressed stamped envelopes
  • 2,768 completed surveys received (650-700 per
    grade level)
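
A minimal sketch (Python with pandas assumed; roster layout and
column names are hypothetical) of drawing a 25% random sample of
parents within each grade level of each school:

    import pandas as pd

    # One row per enrolled student/parent
    parents = pd.read_csv("parent_roster.csv")

    # 25% within each school-by-grade cell; fixed seed
    # so the draw is reproducible
    sample = parents.groupby(["school", "grade"]).sample(
        frac=0.25, random_state=42
    )
    print(len(sample))  # approx. 8,750 statewide per the design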

13
Data Collection: Stakeholder Surveys
  • TEACHERS
  • All RF teachers K-3 (approx. 2,400)
  • Online web-based survey through UGA
  • 54 forced-choice response items; 3 open-ended
    items
  • Item categories: essential domains of reading,
    student assessment, the RF classroom, support for
    RF, professional development, student progress,
    challenges/help needed
  • 1,755 completed surveys received (350-400 per
    grade level K-3)

14
Data Collection: Stakeholder Surveys
  • LITERACY COACHES
  • Online web-based survey, 54 forced-choice, 3
    open-ended items, same questions as the RF
    Teacher survey
  • Each coach completed a separate survey for
    kindergarten, first grade, second grade, and
    third grade teachers
  • Received 100 surveys for each of the 4 grade
    levels

15
Data Collection: Stakeholder Surveys
  • PRINCIPALS
  • All RF schools, online web-based survey through
    UGA
  • 12 forced-choice questions, 1 open-ended item on
    RF familiarity, implementation, support, and
    involvement
  • 111 completed surveys received
  • REGIONAL COORDINATORS
  • Paper survey, emailed as an attached file
  • 14 forced-choice questions, 2 open-ended on how
    RRFCs worked with schools, challenges, gains
  • 12 completed surveys received (100% response)

16
Data Collection: Teacher Knowledge Survey
  • Assessment of change in teacher knowledge of
    reading practices, pre-post measure, all K-3
    reading teachers
  • Administered by UGA evaluators at each of 5 RF
    teacher training academies, June-July 2004,
    during the opening session or prior to start of
    workshops (600-800 teachers/session)
  • Instrument: Content Knowledge for Teaching
    Reading measures (developed by U. Michigan); 48
    items; questions about classroom teaching
    scenarios; grade, prior reading training, and
    years teaching also collected
  • 2,324 surveys completed from 99 schools
  • To be re-administered after the third year of
    the RF grant

18
Analysis of Student Test Data: DIBELS
  • Dynamic Indicators of Basic Early Literacy Skills
    (DIBELS)
  • Short, individually administered to each student
    by the teacher 3 times/year for screening,
    diagnostic assessment, intervention, and progress
    monitoring; data entry by teachers using Palm
    Pilot
  • Common outcome measure of progress in 5 critical
    reading areas, at each grade level, in all RF
    schools
  • Statewide database provided to UGA evaluators by
    state education agency after each testing

19
Analysis of DIBELS
  • Comparison of RF students' DIBELS reading scores
    at beginning, mid-year, and end of school year to
    identify progress in reading
  • Has the percent of students meeting the benchmark
    goals for each DIBELS measure improved from the
    beginning to the end of the school year?
    (tabulation sketched below)
  • How do differences in improvement within each
    year vary for each grade level and for each
    reading measure?
  • Which schools are making the greatest and least
    progress, based on DIBELS scores?
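
A minimal sketch (Python with pandas assumed; column names and the
"beginning"/"middle"/"end" period labels are hypothetical) of
tabulating the percent of students meeting benchmark at each testing
period, by grade:

    import pandas as pd

    # One row per student per testing period;
    # "benchmark_met" is a hypothetical 0/1 flag
    dibels = pd.read_csv("dibels_scores.csv")

    pct = (
        dibels.groupby(["grade", "period"])["benchmark_met"]
              .mean()
              .mul(100)
              .unstack("period")
    )
    pct["change"] = pct["end"] - pct["beginning"]
    print(pct.round(1))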

20
Analysis of Student Test Data: ITBS
  • Iowa Test of Basic Skills, norm-referenced test,
    administered in all RF schools to grades 1, 2,
    and 3
  • Three types of comparisons:
  • Grade level comparisons from spring 2004 to 2005
  • Cohort comparisons 2004-05
  • Comparison of RF students to non-RF students in a
    matched sample of schools
  • Analyses of data:
  • Percent of students reading at/above grade level
  • Changes in NCE mean scores

21
Grade Level Analysis of ITBS Data
  • Grade level analysis of ITBS results from Spring
    2004 testing to Spring 2005 testing
  • Has the percent of students reading at/above
    grade level improved in RF schools compared to
    the previous year for each grade level?
  • Is there an improvement in ITBS mean NCE scores
    for students in RF schools compared to the same
    grade level in the previous year?
  • Percent of students in each grade scoring at or
    above the 25th and 50th percentiles
  • Independent t-test by grade level (1, 2, 3) of
    mean score change for each of 7 ITBS Reading
    subscales (sketched below)
  • Not all students and schools had two years of
    data to analyze
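
A minimal sketch of the grade-level comparison (Python with pandas
and SciPy assumed; file and column names are hypothetical, and the
unequal-variance Welch variant is my choice, not necessarily the one
used):

    import pandas as pd
    from scipy import stats

    # One row per student per year, with an NCE score
    itbs = pd.read_csv("itbs_nce.csv")

    # Example: grade 3, one Reading subscale
    g2004 = itbs.query("grade == 3 and year == 2004")["nce"]
    g2005 = itbs.query("grade == 3 and year == 2005")["nce"]

    t, p = stats.ttest_ind(g2005, g2004, equal_var=False)
    print(f"t = {t:.2f}, p = {p:.4f}")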

22
Cohort Analysis of ITBS Data
  • Cohort (same students) analysis of ITBS results
    from Spring 2004 testing to Spring 2005 testing
  • Has the percent of students reading at/above
    grade level improved in RF schools compared to
    the performance of these same students in the
    previous year?
  • Is there an improvement in ITBS mean NCE scores
    for students in RF schools compared to the same
    students in the previous year?

23
Cohort Analysis of ITBS Data
  • Second grade and third grade cohorts
  • Students matched by ID from 2004 and 2005 ITBS
    data files (38% match)
  • Dependent-groups t-test used to evaluate the
    hypothesis that the change in NCE mean scores
    from pre-test (2004) to post-test (2005) on the
    ITBS Reading subscales was significantly
    different from zero (see sketch below)
  • Statistical significance and effect size reported
    for each cohort and subtest
  • Disaggregated analysis provided, but limited by
    small numbers in subgroups
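
A minimal sketch of the cohort analysis (Python with pandas and
SciPy assumed; file and column names are hypothetical, and paired
Cohen's d is one common effect-size choice, not necessarily the one
reported):

    import pandas as pd
    from scipy import stats

    y04 = pd.read_csv("itbs_2004.csv")  # student_id, nce
    y05 = pd.read_csv("itbs_2005.csv")

    # Match the same students across years by ID
    cohort = y04.merge(y05, on="student_id",
                       suffixes=("_04", "_05"))

    diff = cohort["nce_05"] - cohort["nce_04"]
    t, p = stats.ttest_rel(cohort["nce_05"], cohort["nce_04"])
    d = diff.mean() / diff.std(ddof=1)  # paired Cohen's d
    print(f"n = {len(cohort)}, t = {t:.2f}, "
          f"p = {p:.4f}, d = {d:.2f}")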

24
Comparative Analysis of ITBS Data
  • Comparison of RF 3rd grade students with those in
    non-RF schools
  • Is student achievement on third grade ITBS
    reading tests different in schools using RF and a
    sample of schools not using RF?
  • Schools administering ITBS spring 2004 and spring
    2005 in third grade, matched by race/ethnicity,
    LEP, and economic disadvantage
  • NCE mean score comparisons, year-to-year
  • 2x2 (year by group) analysis of variance for five
    ITBS subtests for the matched sample of 66 RF and
    non-RF schools (sketched below)
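
A minimal sketch of the 2x2 ANOVA (Python with pandas and
statsmodels assumed; the school-level data layout is hypothetical;
the year-by-group interaction is the term that speaks to an RF
effect):

    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.formula.api import ols

    # One row per school per year; group is "RF" or "non-RF"
    df = pd.read_csv("itbs_school_means.csv")

    model = ols("nce ~ C(year) * C(group)", data=df).fit()
    print(sm.stats.anova_lm(model, typ=2))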

25
Analysis of Student Test Data: PPVT
  • Comparison of Peabody Picture Vocabulary Test
    (PPVT) scores of RF kindergarten students at
    beginning and end of school year to identify
    gains in oral vocabulary
  • What progress did kindergarten students make as
    measured by the PPVT?
  • Which schools made the greatest and least
    progress?
  • Data submitted by RF schools to UGA via online
    survey/spreadsheet
  • Mean NCE score differences, fall 2004 to spring
    2005, with statistical significance and effect
    size

26
Analysis of Student Test Data: CRCT
  • Comparison of RF and non-RF schools on Criterion
    Referenced Competency Test (CRCT) reading tests
    as confirmatory evidence of RF impact
  • Is student achievement in reading as measured by
    CRCT different in schools using RF and those not
    using RF?
  • What percentage of students showed improvement
    within each group from spring 2004 to spring
    2005?
  • How does the percent of students meeting or
    exceeding state standards in CRCT reading improve
    for cohorts of RF students each year?
  • Comparisons for 5 reading subscales on CRCT;
    performance level change (improved, same, worse);
    percent of cohort meeting/exceeding state
    standard year-to-year; matched RF and non-RF
    sample comparison (change classification sketched
    below)
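
A minimal sketch (Python with pandas and NumPy assumed; column names
are hypothetical, with performance levels coded as ordered integers)
of classifying each matched student's year-to-year change:

    import numpy as np
    import pandas as pd

    # One row per matched student: level_04, level_05
    crct = pd.read_csv("crct_matched.csv")

    delta = crct["level_05"] - crct["level_04"]
    crct["change"] = np.select(
        [delta > 0, delta < 0],
        ["improved", "worse"],
        default="same",
    )
    print(crct["change"].value_counts(normalize=True)
          .mul(100).round(1))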

27
Presentation of Results
  • Implementation
  • Monthly reports of online literacy coach surveys
    to state director, RF staff
  • Baseline report on Teacher Knowledge Survey
  • Mid-year report on observation findings
  • Final comprehensive report on annual results of
    evaluation, July 2005
  • Progress and Impact
  • Mid-year data from DIBELS
  • Final comprehensive report on annual results of
    evaluation, July 2005
  • Website posting of Year One RF Evaluation Report:
    http://www.glc.k12.ga.us/pandp/readingfirst/homepg.htm

28
Issues
  • How to compile and present complex, detailed
    findings from both qualitative and quantitative
    data collection in a format that can be easily
    understood and accessed by multiple audiences?
  • How are results used by the state agency staff
    responsible for oversight, development, and
    outcomes of RF grant?
  • Level of detail desired: statewide, regional,
    grade level, school level?

29
Options
  • Organization of report to follow evaluation
    questions
  • Use of simplified tables, charts, graphs, and
    other visuals wherever possible
  • Summary tables pulling together and relating
    information from different areas of the
    evaluation
  • Reporting survey results together with interview
    and observation findings, use of conceptual
    categories for qualitative data, triangulation of
    information

30
Options
  • Charts to compare and contrast responses of
    different groups to similar questions, analyses
    based on this comparison
  • Simplified statistical descriptions and
    definitions of analyses used with quantitative
    data
  • Regular meetings, emails, and phone contact with
    state clients to discuss evaluation needs and
    preliminary findings; attendance at state staff
    meetings and training sessions
  • Presentation/discussion of findings by evaluators
    with key groups at state, regional levels

31
Discussion
  • Q & A
  • Suggestions from other evaluators about how to
    address issues of large-scale, multi-site
    evaluations

32
Further Information
  • Contact information:
  • Dr. Dorothy Harnish
  • Project Director, Occupational Research Group,
    College of Education, University of Georgia,
    Athens, GA
  • Phone: 706-542-4690
  • E-mail: harnish@uga.edu