North Carolina Department of Public Instruction

Transcript and Presenter's Notes


1
  • North Carolina Department of Public Instruction
  • NC State University/Friday Institute for
    Educational Innovation
  • SERVE Center at the University of North
    Carolina-Greensboro
  • SETDA

2
Quasi-Experimental, Experimental, and Case Studies
  • North Carolina State University/
  • Friday Institute for Educational Innovation

3
Comprehensive Multi-method Evaluation Design
Case Study
Quasi-Experimental Design
Experimental Design
4
Quasi-Experimental Design Process
  • Quasi-experimental approach
  • A matched-groups, mixed between/within
    longitudinal design
  • In 2003, 11 comparison schools were selected
    based on
  • Grade structure
  • Geographical proximity
  • 2001-02 End of Grade (EOG) Scores
  • Student Demographics
  • Size
  • For all variables, hypotheses were framed as
    μ1d - μ1a > μ2d - μ2a (see the sketch below)
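    Read as a gain-score comparison (group 1 = IMPACT, group 2 =
    comparison, with a and d as earlier and later measurement
    occasions, which is an assumption), the hypothesis can be
    illustrated with a minimal Python sketch. All data and variable
    names below are hypothetical, and the actual analyses used a
    mixed between/within longitudinal model rather than this
    simplified one-sided t-test.

    import numpy as np
    from scipy import stats

    # Hypothetical baseline (a) and follow-up (d) scores for the two matched groups.
    rng = np.random.default_rng(0)
    impact_a = rng.normal(250, 10, 60)            # group 1, earlier occasion
    impact_d = impact_a + rng.normal(6, 5, 60)    # group 1, later occasion
    comp_a = rng.normal(250, 10, 60)              # group 2, earlier occasion
    comp_d = comp_a + rng.normal(3, 5, 60)        # group 2, later occasion

    # The hypothesis mu1d - mu1a > mu2d - mu2a compares mean gains between groups.
    gain_impact = impact_d - impact_a
    gain_comp = comp_d - comp_a

    # One-sided two-sample t-test on the gains (scipy >= 1.6 for `alternative`).
    t, p = stats.ttest_ind(gain_impact, gain_comp, alternative="greater")
    print(f"mean gains: {gain_impact.mean():.2f} vs {gain_comp.mean():.2f}, "
          f"t = {t:.2f}, p = {p:.4f}")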

5
Quasi-Experimental Design Variables and
Measures (selected)
  • Measures
  • EOG tests
  • Tech skills surveys
  • NETS-T survey
  • TAC and TAT surveys
  • AOI survey
  • SOC questionnaire
  • Leadership Practices Inventory
  • STNA
  • Variables
  • Student achievement
  • Student technology skills
  • Teacher tech skills
  • Attitudes toward technology
  • Instructional strategies
  • Stages of Concern
  • Leadership Style
  • Implementation of model

6
Quasi-Experimental Design IMPACT and Comparison
Student Demographics, 2005-06
  • Source: DPI 2005-06 School Report Cards,
    http://www.ncreportcards.org/src/.

7
Quasi-Experimental Design Teacher Quality
Statistics
8
Quasi-Experimental Design IMPACT and Comparison
Teacher Retention, Year 1-3
9
Quasi-Experimental Design Implementation (STNA)
IMPACT schools were rated more highly by teachers
in all 13 areas
  • Vision and leadership
  • Technology planning, budgeting, evaluation
  • Supportive environment for risk-taking
  • Resources: media, software, tools
  • Community linkages
  • Professional development
  • Classroom practice-instructional strategies
  • Classroom practice-planning
  • Student activities
  • Teaching practices
  • Student outcomes (perceived)

All effects significant at p < .001; partial η²
ranged from 0.05 to 0.43
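Partial eta-squared reported here is the proportion of
effect-plus-error variance attributable to the effect. A minimal
sketch of the calculation from ANOVA sums of squares (the numbers
below are illustrative, not taken from the LANCET analyses):

    def partial_eta_squared(ss_effect: float, ss_error: float) -> float:
        """Partial eta-squared: SS_effect / (SS_effect + SS_error)."""
        return ss_effect / (ss_effect + ss_error)

    # Illustrative values only.
    print(round(partial_eta_squared(ss_effect=12.0, ss_error=120.0), 3))  # 0.091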
10
Quasi-Experimental Design Implementation
(IMPACT rubric)
IMPACT schools were rated more highly by teachers
in 8/16 areas over 3 years
  • Instruction
  • Collaboration
  • Needs assessment
  • Managing resources
  • Designing facilities
  • Policies
  • Planning
  • Evaluation

All effects significant at p < .05; partial η²
ranged from 0.33 to 0.68
11
Quasi-Experimental Design Leadership Ratings
(LPI)
  • All IMPACT principals who were present for all
    three years of the grant were rated more highly
    in Year 3 than in Year 1 on all 5 constructs
    (Challenging the Process, Inspiring a Shared
    Vision, Enabling Others to Act, Modeling the Way,
    and Encouraging the Heart).
  • These principals grew most in Challenging the
    Process and Inspiring a Shared Vision.

12
Quasi-Experimental Design LPI Principal
Ratings, Year 1-Year 3
13
Quasi-Experimental Design Leadership Team
Ratings on LPI
  • On all 5 constructs, media coordinators
    out-scored principals, in absolute terms.
  • On 4 of 5 constructs, technology facilitators
    out-scored principals, in absolute terms.
  • These findings indicate that IMPACT teachers
    value the leadership qualities of media
    coordinators and technology facilitators, and
    that these individuals are seen as better
    leaders, in some respects, than school
    principals.

14
Quasi-Experimental Design LPI Ratings for Media
Coordinators, Principals, and Technology
Facilitators
15
Quasi-Experimental Design Teacher Outcomes (ISTE
NETS-T)
16
Quasi-Experimental Design Teacher Outcomes
(attitudes toward technology)
IMPACT teachers showed stronger change in
attitudes or more positive attitudes overall on
  • Perceived utility of IT
  • Email
  • Internet
  • Multimedia
  • Productivity-teacher
  • Productivity-student

17
Quasi-Experimental Design Teacher Outcomes
(attitudes toward technology)
IMPACT teachers showed stronger change in
attitudes or more positive attitudes overall on
18
Quasi-Experimental Design Teacher Stages of
Concern Years 1-3
19
Quasi-Experimental Design Student Use of
Computers in Grades 3-5, 2004-05
20
Quasi-Experimental Design IMPACT v. Comparison
Media Center Visitation, Year 1-Year 3
21
Quasi-Experimental Design IMPACT v. Comparison
Math Achievement
Effect significant at p < .0001, controlling for
grade, race, exceptionality, free/reduced lunch,
sex, and absenteeism
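A minimal sketch of an analysis in this spirit: regressing math EOG
scores on group membership while adjusting for covariates with
statsmodels. The synthetic data, column names, and reduced covariate
set below are hypothetical; the evaluation's actual model
specification may differ.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical student-level data; columns and values are illustrative only.
    rng = np.random.default_rng(3)
    n = 400
    df = pd.DataFrame({
        "group": rng.choice(["IMPACT", "comparison"], n),
        "grade": rng.choice([3, 4, 5], n),
        "frl": rng.choice(["yes", "no"], n),
        "sex": rng.choice(["F", "M"], n),
        "absences": rng.poisson(5, n),
    })
    df["math_eog"] = (250 + 3 * (df["group"] == "IMPACT")
                      - 0.5 * df["absences"] + rng.normal(0, 5, n))

    # Group effect on math EOG, adjusting for the covariates (race and
    # exceptionality omitted here only to keep the synthetic data small).
    model = smf.ols(
        "math_eog ~ C(group) + C(grade) + C(frl) + C(sex) + absences",
        data=df,
    ).fit()
    print(model.summary())  # the C(group) term estimates the adjusted group difference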
22
Quasi-Experimental Design Reading Growth
2003-2005, by Grade
EOG growth from baseline to Year 2
Effect significant at p < .05, controlling for
free/reduced lunch, race, exceptionality, sex,
absenteeism, and parent education
23
Case study process
  • In the 2004-2005 school year, a preliminary case
    study of one intervention school's community
    outreach program was conducted
  • Data sources included
  • Phone interviews with patrons
  • Structured interviews with staff
  • Archival documents (e.g. attendance data, course
    offerings, budget data)
  • In 2005, funds for the case study were redirected
    to the experimental design component

24
Case study outcomes
  • Findings suggest that low-cost technology
    alternatives can be beneficial to school-based
    community outreach programs
  • At the same time, personal attributes of key
    staff played a pivotal role in the success of
    programming.

25
Experimental Design Process
  • In 2004, Schools Attuned was selected as the
    intervention in the experimental design
  • However, a different intervention (IRCMS) was
    approved and implemented, beginning in the spring
    of 2005.

26
Experimental Design Process (continued)
27
Experimental Design Student Measures
  • Measures
  • Gates-MacGinitie Reading Comprehension Test
  • Reading EOG
  • Metacomprehension Strategy Index
  • Jr. MAI (Metacognitive Awareness Inventory)
  • Reading Efficacy
  • Teachers' ratings of student metacognition

28
Experimental Design Teacher Measures
  • Measures
  • Technology use survey
  • TAC survey
  • DeFord Theoretical Orientation to Reading Profile
    (TORP) (pretest only)
  • MAI (Metacognitive Awareness Inventory)
  • Teachers' Sense of Efficacy Scale (TSES)
  • Teaching Reading Efficacy

29
Formative Evaluation
  • SERVE Center at the University of North
    Carolina-Greensboro

30
LANCET Implementation and Outcomes
  • Capacity for Applying Project Evaluation (CAPE)
  • www.serve.org/evaluation/capacity/
  • Elizabeth Byrom, SERVE
  • Jenifer Corn, SERVE

31
CAPE is
  • A suite of resources, tools, and professional
    development activities, designed to help
    educators collect and use data to make decisions
    that will help them improve the implementation
    and impact of their technology projects.

32
SERVE's Role
  • Collaborate with partners
  • Identify or develop resources and tools
  • Design and facilitate on-going professional
    development and support for school/district team
  • Document lessons learned about capacity building
    for project evaluation

33
School/District Team's Role
  • Create a project logic map
  • Develop an evaluation plan for their EETT project
  • Implement evaluation plan
  • Collect and analyze data
  • Use data to make informed decisions
  • Make adjustments to project implementation

34
Capacity for Evaluation
  • Formative Evaluation: used to monitor and
    adjust projects, to the ultimate benefit of
    students
  • Capacity: the organizational wherewithal to
    undertake project evaluation, more than just
    skills and knowledge for individuals

35
CAPE Components
  • A Theoretical Framework for Capacity Building
  • An Evaluation Framework
  • A Professional Development Model

36
Framework for Capacity Building
37
(No Transcript)
38
Evaluation Planning Tools
  • Logic mapping activities and templates
  • Strategy and objective planning templates and
    guides
  • Data-collection planning guides

39
Evaluation Planning
  • Map project logic
  • Clarify strategies and objectives
  • Define evaluation questions
  • Propose benchmarks
  • Select methods and measures
  • Conduct the evaluation
  • Draw inferences from data

40
IMPACT Model School Logic Map
41
IMPACT Model School Objective Planning Guide
42
Data Sources for EETT Projects
  • Technology Needs Assessment
  • Classroom Observation
  • Technology-Partnership Survey
  • Professional Development Questionnaire
  • Rubrics for lesson plans and student products
  • Teacher Reflection Log

43
CAPE Instruments and Protocols
  • School Technology Needs Assessment (STNA)
  • Professional Development Questionnaire (PDQ)
  • Looking for Technology Integration (LoFTI)
    drop-in protocol
  • Technology and School-Family-Community
    Partnership survey

44
School Technology Needs Assessment (STNA)
45
Online STNA
  • Bar graphs
  • Repeated use indicates changing
    needs over time
  • Used in about 200 schools to date, with more than
    7914 respondents
  • Now in Version 3.0

46
STNA Report
47
STNA Research Study
  • Internal Consistency Reliability (N = 2,094)
  • Data analyses showed each of STNA's constructs and
    subconstructs to have high internal consistency
    reliability (alpha ranged from .807 to .967; see
    the sketch after this list).
  • These results indicate that STNA is a high
    quality survey instrument that provides schools
    and districts with information that can be used
    to make decisions about each of the constructs
    and subconstructs.
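    A minimal sketch of how internal consistency (Cronbach's alpha)
    can be computed for the items of one construct, assuming a
    hypothetical respondents-by-items matrix of Likert responses (not
    the actual STNA data):

    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """Cronbach's alpha for a respondents-by-items matrix."""
        k = items.shape[1]                           # number of items
        item_vars = items.var(axis=0, ddof=1).sum()  # sum of item variances
        total_var = items.sum(axis=1).var(ddof=1)    # variance of summed scores
        return (k / (k - 1)) * (1 - item_vars / total_var)

    # Hypothetical 5-point responses for a 4-item subconstruct.
    rng = np.random.default_rng(1)
    base = rng.integers(1, 6, size=(200, 1))
    responses = np.clip(base + rng.integers(-1, 2, size=(200, 4)), 1, 5).astype(float)
    print(round(cronbach_alpha(responses), 3))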

48
STNA Research Study
  • Exploratory Factor Analysis (N = 2,050)
  • The initial analyses revealed 13 factors with an
    eigenvalue greater than one, accounting for
    62.32% of the total variance (see the sketch
    after this list).
  • Ten of the 13 factors were largely the same
    constructs initially identified for STNA.
  • These results provided strong support for the
    validity of the constructs identified within
    STNA.
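    A minimal sketch of the eigenvalue-greater-than-one (Kaiser)
    criterion used to decide how many factors to retain, applied to
    the item correlation matrix of simulated survey data (not the
    STNA responses):

    import numpy as np

    # Simulated stand-in for a respondents-by-items response matrix.
    rng = np.random.default_rng(2)
    n_respondents, n_items = 2050, 20
    loadings = rng.normal(size=(n_items, 4))            # 4 underlying factors
    factor_scores = rng.normal(size=(n_respondents, 4))
    data = factor_scores @ loadings.T + rng.normal(size=(n_respondents, n_items))

    # Eigenvalues of the item correlation matrix, largest first.
    eigvals = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]

    retained = eigvals[eigvals > 1.0]                   # Kaiser criterion
    print(f"{retained.size} factors retained, explaining "
          f"{100 * retained.sum() / eigvals.sum():.1f}% of total variance")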

49
Professional Development Questionnaire (PDQ)
50
PDQ
  • Easily adapted to specific settings or activities
  • Assesses participants' perceptions of the quality
    of professional development implementation
  • Does not provide data about the impact of PD
    activities, i.e., whether they made a difference

51
LoFTI: Looking for Technology Integration
Classroom technology observation protocol
52
LoFTI
  • Designed through collaboration with team of
    school practitioners
  • Reports a profile of technology use at the school
    level
  • Paper-pencil version available
  • Palm version almost ready

53
School-Family-Community Survey
54
School-Family-Community Survey
  • Designed for a range of stakeholders: staff,
    parents, others
  • Use results in making decisions about technology
    for supporting family and community involvement
    efforts
  • Version 1.0 is available online or in
    paper-pencil form

55
CAPE Professional Development
56
CAPE Professional Development
  • Academies and Institutes
  • Workshops
  • Virtual Meetings, conference calls and
    videoconferences (CMPDs)
  • Presentations
  • Online community of practice
  • Teams sharing successes and lessons learned
  • Technical Assistance

57
CAPE Professional Development
  • CMPD Topics
  • Initial Implementation of Evaluation Plan
  • Evaluation Management Plans
  • Baseline Data Collection
  • Maximizing School Buy-in and Community Support
  • Data Analysis and Interpretation

58
Notes to Project Leaders
  • Identify and address the challenges and costs of
    evaluating projects/programs.
  • Use team-based planning and implementation of
    evaluations.
  • Recognize that collecting data is relatively
    easy; analyzing and using data is the hard part.
    Both require a lot of time.

59
Notes to Project Leaders
  • Communicate to generate buy-in.
  • Define and share the evaluation purpose: needs
    assessment, required reporting, data-driven
    planning, or program improvement?
  • Reach consensus on a definition of the program or
    project being evaluated.

60
Notes to Project Leaders
  • Separate project implementation from impact, and
    measure both.
  • Define the evaluation questions that matter to
    the evaluation purpose.
  • Plan for and collect all of the data, and only
    the data necessary to answer the questions.
  • Manage the evaluation process.

61
Capacity for Evaluation
62
Capacity Building NC DPI
  • IMPACT I Schools
  • IMPACT II Schools
  • 1-2-1 Grant Schools
  • IMPACT Academies based on the SERVE ATA Model
  • Collaboration Toolkit
  • IMPACT Video Series
  • NC LEA and Charter School Educational Technology
    Plans

63
Dissemination Activities
64
LANCET Dissemination
NC State Dissemination Activities
  • Ellen Vasu
  • Jason Osborne
  • Lisa Grable

65
NC State Dissemination Activities
  • Publications
  • Corbell, K.A., Osborne, J.W., & Grable, L.L. (in
    press). Examining the Performance Standards for
    Inservice Teachers: A confirmatory factor analysis
    of the Assessment of Teachers' NETS-T Expertise.
    Computers in Schools.
  • Osborne, J.W., Overbay, A., & Vasu, E.S. (in
    press). Designing grant proposals and evaluation
    plans in the age of No Child Left Behind. Journal
    of the American Association of Grant
    Professionals.
  • Overbay, A., Grable, L.L., & Vasu, E.S. (2006).
    Scientifically-based research: Postcards from the
    edge. Journal of Technology and Teacher Education
    (JTATE), 14(3), 623-632.

66
NC State Dissemination Activities
  • Manuscripts in Preparation
  • Measuring Teacher Attitudes Toward Computers and
    Teacher Attitudes Towards Information Technology
  • Dimensions of Technology Skills
  • Learning Styles and Resistance to Change

67
NC State Dissemination Activities
  • Presentations
  • 26 National and International Conference
    Presentations and Workshops
  • 6 State and Regional Conference Presentations

68
SERVE Dissemination Activities
  • CAPE Website: http://www.serve.org/Evaluation/Capacity/
  • 5,434 hits since November 10, 2006
  • Manuscripts in Preparation
  • CAPE Framework, STNA, CAPE PD Model, Data Sources
    for Evaluating Technology Projects
  • Presentations
  • 13 Evaluation Academies/Institutes/Workshops
  • 13 National Conference Presentations and
    Workshops
  • 9 State Conference Presentations and Workshops

69
SERVE Dissemination Activities
  • Instruments
  • School Technology Needs Assessment (STNA)
    (n = 7,914)
  • Professional Development Questionnaire (PDQ)
  • Looking for Technology Integration (LoFTI)
    drop-in protocol
  • Technology and School-Family-Community
    Partnership survey (n = 88)

70
SERVE Dissemination Activities
  • Building Evaluation Capacity Studies
  • Microsoft Partners in Learning
  • Irvine Foundation study participant
  • Spread
  • REL-SERVE Evidence-Based Education
  • National Center for Homeless Education
  • SETDA-Polyvision Study
  • Graduate Courses: NCSU, UCF, Johns Hopkins
  • Dissertation/Thesis: NCSU, UNC

71
Dissemination NC DPI
  • IMPACT Grants
  • 1-2-1 Grants
  • IMPACT Guidelines revision
  • IMPACT for Administrators
  • IMPACT Website

72
Dissemination NC DPI
  • North Carolina State Board of Education
    Future-Ready Students
  • Future-Ready Classrooms initiative

73
Roadmap to Replicability
  • NC State University/Friday Institute for
    Educational Innovation

74
Previously Validated Instruments
  • State End-of-Grade tests (grades 3-8)
  • NC Writing Test (grades 4 8)
  • NC Computer Skills Test (grade 8)
  • Gates-MacGinitie Reading Test (Grade 2, primary
    schools only)
  • Computer Attitude Questionnaire (4-8)
  • Young Children's Computer Inventory (K-3)
  • Teacher attitude toward technology integration
    (TAT)
  • Teacher attitude toward computers (TAC)
  • Stages of concern questionnaire
  • Resistance to Change
  • Leadership Practices Inventory (LPI)

75
Reviewed Instruments
  • Examined the factor structures of
  • Teachers' Attitudes Toward Computers (TAC)
  • Teachers' Attitudes Towards Information
    Technology (TAT)
  • Performance Standards for Inservice Teachers
  • Technology Skills Checklist 3-5
  • Technology Skills Checklist 6-8
  • School Technology Needs Assessment (STNA)
  • Activities of Instruction Survey was also reviewed

76
Other Instruments Used
  • Classroom Climate (3-8)
  • Teacher and Administrator Demographic surveys
  • NETS-A Performance Profile (Administrators)
  • IMPACT Rubric
  • IMPACT Implementation Checklist
  • Classroom Equipment Inventory
  • Media and Technology Inventory

77
Treatment and Control Considerations
  • Competitive grant application process
  • Comparison group incentives
  • Cross contamination
  • Time intensive matching process
  • Requires personal contact with all groups
  • Attrition

78
Assessing Students in K-2
  • State prohibition of primary grade standardized
    academic assessment
  • Expense of appropriate instruments and cost of
    extra testing administrators
  • Group administration requires one-on-one
    attention
  • Young ELLs

79
Exposure Issues
  • Teacher concerns about observation/evaluation
  • Only a few schools involved in a very high
    profile project
  • Desire to look good

80
Data Collection
  • Paper and pencil or electronic
  • Computer access, reduced response rate
  • Logistics of distribution and collection of paper
    surveys
  • Middle school students- no single classroom
    teacher
  • Student information systems
  • Formative v. external evaluation
  • Site visits, no normal school days

81
Navigating the Regulations
  • Obtaining disaggregated student information and
    interpreting policy
  • Family Educational Rights and Privacy Act
  • Department of Agriculture controls Free and
    Reduced Lunch Information

Overbay, A., Grable, L.L., & Vasu, E.S. (2006).
Scientifically-based research: Postcards from the
edge. Journal of Technology and Teacher
Education (JTATE), 14(3), 623-632.
82
Roadmap to Replicability
  • SERVE Center at the University of North
    Carolina-Greensboro

83
LANCET Roadmap to Replicability
  • Inferred Insights into Capacity Building for
    Project Evaluation: Lessons Learned from the
    IMPACT Schools
  • The SERVE Center at UNCG
  • Elizabeth Byrom
  • Jenifer Corn

84
Lessons Learned
  • Lessons learned are derived from a content
    analysis of qualitative data from focus groups
    and individual interviews with educators in the
    IMPACT schools.

85
Framework for Capacity Building
86
Lesson
  • Project evaluation is a complicated process
    requiring cooperation among multiple people; it
    is important that everyone involved speak the
    same language.

87
Hint for evaluation capacity builders
  • Help project management and/or project evaluation
    teams establish a glossary of evaluation terms
    that will be used for their project.
  • It's more important for evaluation teams to use
    the same definitions than it is for them to use
    the right definitions.

88
Lesson
  • In order to build capacity for evaluation, the
    purpose or purposes of any evaluation effort must
    be meaningful, explicit, and understood by
    everyone involved.
  • It helps tremendously if everyone involved
    believes in the purpose of the evaluation.

89
Hint for evaluation capacity builders
  • Because purposes for an evaluation may differ at
    various levels (SEA, LEA, school, IHE), it's
    important to clarify the different purposes.
    Make sure that everyone participating in the
    evaluation understands each organization's
    purposes, roles, and responsibilities.

90
Lesson
  • Learning how to evaluate a project requires
    change, and change takes time and energy.

91
Change, cont'd
  • Evaluations can change not only projects, but
    also the people implementing the projects.

92
Hint for evaluation capacity builders
  • Help educators understand that they are going
    through a change process. From time to time, help
    them reflect on where they are in the process.
  • Show project leaders how they can use
    already-dedicated time when asking teachers to
    participate in evaluation efforts.

93
Hint for evaluation capacity builders
  • Understand and prioritize the changes being asked
    of education project participants by recognizing
    that some changes are harder than others.

94
Lesson
  • Some specific knowledge and skills will help make
    project participants' evaluation efforts more
    valuable, effective, and efficient.

95
Hints for evaluation capacity builders
  • Actively teach educators how to collect, analyze,
    and interpret data.
  • Help educators formalize informal data and
    evaluation practices.
  • Show teachers how to use technology to access
    evaluation data previously not readily available.

96
Hints for evaluation capacity builders
  • Educators who are inexperienced with project
    evaluation tend to collect the wrong data or too
    much data. Show them how to select and use data
    sources that will be the most meaningful for
    their projects.

97
Hints for evaluation capacity builders
  • Find out what data educators are already
    collecting, and if appropriate and feasible, show
    them how they might use the data for their
    project evaluation.

98
Hints for evaluation capacity builders
  • Don't be surprised if some project stakeholders
    are reluctant to provide necessary data. This
    can happen especially when stakeholders do not
    see value in the evaluation.

99
Hints for evaluation capacity builders
  • Help educators learn to provide feedback to
    stakeholders, showing the results and findings of
    the data collected.

100
Hints for evaluation capacity builders
  • Don't be surprised if administrators and teachers
    streamline data collection procedures or
    instruments.
  • Don't be surprised if teachers use their new
    evaluation knowledge and skills in their own
    teaching.

101
Lesson
  • Success of a project evaluation, and likely of
    the project itself, depends on participants
    sharing a sense of identity around the effort.

102
Identity, cont'd
  • Leadership of project evaluation might come from
    unexpected individuals, but regardless of where
    it comes from, leadership is most effective when
    shared.

103
Hint for evaluation capacity builders
  • Help educators develop a plan for actively
    sharing their project and evaluation plans,
    activities, and results with stakeholders.

104

Hint for evaluation capacity builders
  • Help project participants and those who are
    evaluating the project reach a consensus
    understanding of how the project is supposed to
    work.

105
Hint for evaluation capacity builders
  • If logic mapping is considered worthwhile, show
    educators how to use it early in the project
    planning process. Allow enough flexibility for
    teams to illustrate their actual understanding of
    how their project works, i.e., don't be rigid
    about their using a particular logic map design.

106
Lesson
  • The leadership, shared understandings, and sense
    of community required for effective project
    evaluation are heavily dependent on good
    communication.

107
Hint for evaluation capacity builders
  • Help educators develop a plan for communication
    among everyone involved, such that communication
    is early, often, and in ways that support their
    efforts.

108
NCDPI Roadmap to Replicability
  • IMPACT Products
  • SBE Future-Ready Agenda
  • Future-Ready Classrooms

109
  • In compliance with federal laws, N C Public
    Schools administers all state-operated
    educational programs, employment activities and
    admissions without discrimination because of
    race, religion, national or ethnic origin, color,
    age, military service, disability, or gender,
    except where exemption is appropriate and allowed
    by law.