Transcript and Presenter's Notes

Title: Good Data as a Tool to Good Outcomes


1
Good Data as a Tool to Good Outcomes
  • Kathy Hebbeler
  • Early Childhood Outcomes Center
  • SRI International

Measuring Child and Family Outcomes, Baltimore,
MD August, 2007
2
Using Data
  • What are the goals for children and families?
  • How does a state achieve the goal?
  • How does a state know if it has achieved the
    goals? Completely? Where? For whom?
  • How do you identify what your state needs to do
    if you are not there yet?

3
  • What is the goal?

4
Goal for Children
  • Active and successful participants now and in the
    future across a variety of settings and
    situations
  • Have positive social relationships
  • Acquire and use knowledge and skills
  • Take appropriate actions to meet their needs

5
Goal for Families
  • Enable families to provide appropriate care for
    their child and have resources they need to
    participate in community activities
  • Understand their child's strengths, abilities,
    and special needs
  • Know their rights and advocate effectively for
    their children
  • Help their children develop and learn
  • Have support systems
  • Access desired services, programs, activities in
    their community

6
  • Who does the goal apply to?

7
Children in age range who meet your state's
eligibility criteria and their families
13,000?
1000?
5000?
8
  • How does a state achieve these goals?

9
System for Producing Good Child and Family
Outcomes
Adequate funding
Good outcomes for children and families
High quality services and supports for children
0-5 and their families
Good Federal policies and programs
Good State policies and programs
Good Local policies and programs
Strong Leadership
  • Prof'l Development
  • Preservice
  • Inservice

10
(No Transcript)
11
  • Proposition: A healthy system of services and
    supports produces good outcomes for children and
    families

12
What constitutes a healthy system?
  • Services and supports provided
  • By qualified personnel
  • In a timely manner
  • Consistent with recommended practices,
    evidence-based practices, etc.
  • Transdisciplinary
  • Family-centered
  • Build on natural learning opportunities and
    everyday routines

13
System for Producing Good Child and Family
Outcomes
Adequate funding
Good outcomes for children and families
High quality services and supports for children
0-5 and their families
Good Federal policies and programs
Good State policies and programs
Good Local policies and programs
Strong Leadership
  • Prof'l Development
  • Preservice
  • Inservice

14
What else constitutes a healthy system?
  • Culture of accountability/shared responsibility
  • Willingness (even an eagerness) to regularly use
    data to examine how the system is functioning
  • Data are available to the people who need them,
    analyzed the way they are needed, when they are
    needed.
  • Who needs access to data reports?
  • You! Local administrators? Practitioners?
  • Data regularly are used to make decisions and
    take action

15
FMA
  • Findings
  • Meanings
  • Action

16
Findings
  • Findings are the numbers
  • 10% of families responded
  • 45% of children in OSEP category b
  • The numbers are not debatable
  • Data need to be analyzed in interpretable ways.
  • There are choices with regard to how to analyze
    and present findings.
  • Some choices are better than others.

17
Meaning
  • The interpretation put on the numbers
  • Is this finding good news? Bad news? News we
    can't interpret?
  • Meaning is debatable and reasonable people can
    reach different conclusions from the same set of
    numbers
  • Stakeholder involvement can be helpful in making
    sense of findings

18
Action
  • Given the meaning put on the findings, what
    should be done?
  • Recommendations or action steps
  • Action is always debatable and often is debated
  • Another role for stakeholders

19
Building the capacity to use data
  • Administrators and practitioners need to be able
    to ask questions about the system and the
    outcomes being achieved, get the data to address
    those questions, and make decisions based on what
    was learned.
  • Capacity building
  • Ability to ask good questions
  • Have good data to answer the questions
  • Have the data analyzed in meaningful ways
  • Be able to interpret what the data mean
  • Be able to decide on appropriate actions

20
  • How does a state know if it has a good system of
    services and supports?
  • How does a state know if the state system is
    producing good outcomes?

21
The Answer
  • Asking good questions and answering them with data

22
Children in age range who meet your state's
eligibility criteria and their families
13,000?
1000?
5000?
23
Children in age range who meet your state's
eligibility criteria
24
Children in age range who meet your state's
eligibility criteria
25
Children in age range who meet your state's
eligibility criteria
26
What are your state's proportions?
27
(No Transcript)
28
Hypothetical State Data: OSEP Categories
Unless otherwise indicated, data in this
presentation are made up
29
Ways to Look at Child Outcome Data
  • By locality (Program or LEA)
  • Which local programs are doing well?
  • Which are not?
  • What other information are you going to want if
    you find some local programs have poorer outcomes
    than others? What are you going to want to know
    so you will know what actions to take?
  • NOTE: This is not about blame. It is about
    providing programs the support they need to do a
    good job. (A simple by-locality breakdown is
    sketched below.)

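A minimal sketch of such a breakdown in Python with pandas, assuming a hypothetical child-level table with "program" and "progress_category" columns (OSEP progress categories a-e); an illustration, not the presenter's method:

import pandas as pd

# Hypothetical child-level records; "progress_category" holds the
# OSEP progress category (a-e) for one outcome area.
children = pd.DataFrame({
    "program": ["North", "North", "South", "South", "South", "East"],
    "progress_category": ["b", "d", "c", "e", "a", "d"],
})

# Children who changed developmental trajectories or maintained
# age-appropriate functioning fall in categories c, d, and e.
children["met_cde"] = children["progress_category"].isin(["c", "d", "e"])

# Percent of children in categories c, d, e by local program.
by_program = children.groupby("program")["met_cde"].mean().mul(100).round(1)
print(by_program.sort_values())
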
30
Hypothetical State Data: OSEP Categories
31
Looking at Data by Locality
% of Children Who Changed Developmental Trajectories or Maintained Age-Appropriate Functioning (c, d, e)
32
Ways to Look at Outcome Data
  • By locality (Program or LEA)
  • By child or family characteristic
  • Which children are doing well?
  • Child characteristics
  • Disability related information
  • Health
  • Demographic information (gender, race, ethnicity,
    language)
  • Family Characteristics
  • Demographic information (maternal education,
    poverty, employment, immigrant)

33
Sample questions related to child and family
characteristics
  • Do children with only a speech-language disorder
    have better outcomes than other groups?
  • How do minority children's outcomes compare to
    outcomes for other children?
  • Do child outcomes vary as a function of maternal
    education? Are children of more highly educated
    mothers experiencing better outcomes?
  • Are family outcomes lower for families in
    poverty? (One way to tabulate such comparisons is
    sketched below.)

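A hedged sketch of tabulating one of these questions, assuming a hypothetical table that pairs each child's progress category with a maternal education level (column names are illustrative):

import pandas as pd

# Hypothetical records pairing each child's OSEP progress category
# with the mother's education level.
df = pd.DataFrame({
    "maternal_education": ["<HS", "HS", "HS", "Some college", "BA+", "BA+"],
    "progress_category": ["b", "c", "d", "d", "e", "c"],
})

# Share of children in categories c, d, e within each education group.
df["met_cde"] = df["progress_category"].isin(["c", "d", "e"])
summary = (
    pd.crosstab(df["maternal_education"], df["met_cde"], normalize="index")
    .mul(100)
    .round(1)
)
print(summary)
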
34
Looking at Data by Child Characteristics: Disability
% of Children Who Changed Developmental Trajectories or Maintained Age-Appropriate Functioning (c, d, e)
35
Ways to Look at Outcome Data
  • By locality (Program or LEA)
  • By child or family characteristic
  • By outcome area
  • Are we doing a better job helping children make
    progress in Outcome 1 (social relationships) than
    Outcome 2 (knowledge and skills)?

36
Looking at Data by Outcome Area
% of Children Who Changed Developmental Trajectories or Maintained Age-Appropriate Functioning (c, d, e)
37
Looking at Data by Outcome Area
% of Children Who Changed Developmental Trajectories or Maintained Age-Appropriate Functioning (c, d, e)
38
Children in age range who meet your state's
eligibility criteria
39
Putting Meaning on the Data
  • What are alternative explanations for the
    finding?
  • Are there other ways of looking at the data that
    might provide insight into a possible
    explanation?

40
Ways to Look at Outcome Data
  • By locality (Program or LEA)
  • By child or family characteristic
  • By outcome area
  • By service characteristics
  • Type of service
  • Intensity (minutes of service per week; see the
    sketch after this list)
  • Duration (Months of service)
  • Total minutes of service
  • Service coordination model
  • Location of services
  • Makeup of team

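A minimal sketch of looking at outcomes by one service characteristic, intensity, assuming pandas and hypothetical "minutes_per_week" and "progress_category" columns; the intensity bands are arbitrary illustrations:

import pandas as pd

# Hypothetical records: scheduled minutes of service per week and the
# child's OSEP progress category.
df = pd.DataFrame({
    "minutes_per_week": [30, 45, 60, 90, 120, 150, 240, 60],
    "progress_category": ["a", "c", "d", "e", "d", "b", "e", "c"],
})

# Group intensity into bands, then look at the share of children in
# categories c, d, e within each band.
df["intensity_band"] = pd.cut(
    df["minutes_per_week"],
    bins=[0, 60, 120, 10_000],
    labels=["<=60 min/wk", "61-120 min/wk", ">120 min/wk"],
)
df["met_cde"] = df["progress_category"].isin(["c", "d", "e"])
print(df.groupby("intensity_band", observed=True)["met_cde"].mean().mul(100).round(1))
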
41
Looking at Data by Service Characteristic
% of Children Who Changed Developmental Trajectories or Maintained Age-Appropriate Functioning (c, d, e)
42
Looking at Data by Service Characteristic
% of Children Who Changed Developmental Trajectories or Maintained Age-Appropriate Functioning (c, d, e)
43
Looking at Service Characteristics by Level of
Outcome Attainment
Average Total Minutes of Scheduled Service per
Week for Children in Each OSEP Category for
Outcome 2
44
Putting Meaning on the Data
  • What are alternative explanations for the
    finding?
  • A reasonable alternative explanation for an
    outcome pattern is that the groups of children
    are different.
  • It only makes sense to compare outcomes across
    groups if you are comfortable that the groups
    were comparable to begin with.
  • Are there other ways of looking at the data that
    might provide insight into a possible
    explanation?
  • How could you use your data to check on the
    comparability of the groups? (One simple check is
    sketched below.)

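A minimal sketch of one such comparability check, assuming pandas and hypothetical columns describing characteristics measured at program entry; it illustrates the idea rather than a prescribed procedure:

import pandas as pd

# Hypothetical child records for two groups being compared (here, two
# local programs), with characteristics measured at entry.
df = pd.DataFrame({
    "program": ["North"] * 4 + ["South"] * 4,
    "age_at_entry_months": [10, 14, 20, 25, 28, 30, 31, 33],
    "speech_only": [True, False, False, True, False, False, False, False],
})

# Profile each group on entry characteristics. Large differences here
# weaken any comparison of the groups' later outcomes.
profile = df.groupby("program").agg(
    n=("age_at_entry_months", "size"),
    mean_age_at_entry=("age_at_entry_months", "mean"),
    pct_speech_only=("speech_only", "mean"),
)
profile["pct_speech_only"] *= 100
print(profile.round(1))
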
45
Ways to Look at Outcome Data
  • By locality (Program or LEA)
  • By child or family characteristic
  • By outcome area
  • By service characteristics
  • By family outcomes/family involvement/family's
    perception of help
  • Have to be able to link individual child outcomes
    to that child's family outcomes (a sketch of such
    a link follows)

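A minimal sketch of that link, assuming pandas and a shared child identifier across hypothetical child-outcome and family-outcome tables (the column names are illustrative):

import pandas as pd

# Hypothetical extracts from two tables in a state data system.
child_outcomes = pd.DataFrame({
    "child_id": [101, 102, 103],
    "progress_category": ["c", "e", "b"],
})
family_outcomes = pd.DataFrame({
    "child_id": [101, 102, 104],
    "knows_how_to_help_child": [True, True, False],
})

# Linking requires the same child identifier in both tables.
linked = child_outcomes.merge(family_outcomes, on="child_id", how="inner")

# Child progress can then be examined by the family outcome.
linked["met_cde"] = linked["progress_category"].isin(["c", "d", "e"])
print(linked.groupby("knows_how_to_help_child")["met_cde"].mean().mul(100))
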
46
Looking at Data by Family Outcome
% of Children Who Changed Developmental Trajectories or Maintained Age-Appropriate Functioning (c, d, e)
Family Report of Knowing How to Help Child
Develop and Learn
47
Ways to Look at Outcome Data
  • By locality (Program or LEA)
  • By child or family characteristic
  • By outcome area
  • By service characteristics
  • By family outcomes/family involvement/familys
    perception of help
  • Longitudinally

48
National Early Intervention Longitudinal Study
Former EI Participants' Need for Special Education and Disability Status at Kindergarten
49
National Early Intervention Longitudinal Study: Percentage of Children Rated by Kindergarten Teachers as Intermediate or Proficient in Language and Literacy Skills, by IEP Status, Compared with General Kindergarten Population (ECLS-K data)
50
System for Producing Good Child and Family
Outcomes
Adequate funding
Good outcomes for children and families
High quality services and supports for children
0-5 and their families
Good Federal policies and programs
Good State policies and programs
Good Local policies and programs
Strong Leadership
  • Prof'l Development
  • Preservice
  • Inservice

51
Formulating good questions
  • All of the question and analysis examples have
    focused on outcomes
  • Many important questions to be asked about
    services and supports
  • Are we identifying all eligible children?
  • Are services delivered in a timely manner?
  • Are services family-centered?

52
Using Indicators to Examine Services
  • Indicators are a small number of important
    markers selected to give a picture of how a
    system is doing
  • Indicators do not provide a comprehensive look
  • Your system must be able to produce the data for
    all of the OSEP indicators
  • Is that all you want to know?

53
What is on your data system wish list?
  • Have a data system in place that regularly
    provides valid data on
  • Characteristics of the children and families
    being served
  • Characteristics of the services and supports
    provided
  • Family outcomes
  • Child outcomes
  • Can link each of these
  • Can follow children longitudinally

54
  • Are the outcome data valid?

55
Validity encompasses
  • Quality assurance procedures
  • Examining the validity of the data

56
Procedures to Promote Quality
  • Preparing for data collection
  • Adequate training and communication
  • During data collection
  • Commitment to the data collection
  • System of supports for the data providers
  • After data collection
  • Data entry
  • Data follow up
  • Data analysis

57
Preparing for data collection
  • Training and Communication
  • Is there a process for checking whether all of
    the data providers understand what they are to
    do?
  • Is there a process for checking whether they do
    it?
  • Do they know why they are doing it?
  • What do we know about one-shot trainings?

58
During Data Collection
  • Commitment to the data collection
  • Do providers understand the importance of the
    activity?
  • Has the system been designed so providers (and
    families) will receive benefit from collecting
    and providing data?
  • Do providers know someone will be checking on
    what they are doing?
  • Supports
  • Has the process been designed to make it as easy
    as possible and to take as little time as
    possible? (Can any part be streamlined?)
  • Is a knowledgeable person observing or tracking
    data collection activities and providing feedback
    in a timely manner?
  • Is there a way for providers to get ongoing
    questions addressed?

59
After Data Collection
  • Data entry
  • Are there safeguards to minimize data entry
    errors?
  • Data follow up
  • Verification: Is there a process in place for
    checking a sample of records for accuracy and
    completeness?
  • Is there a process for providing timely feedback
    when errors are discovered?
  • Data analysis
  • Cleaning individual data: Are there procedures
    for identifying out-of-range values, anomalies,
    and incomplete data? (Some basic checks are
    sketched below.)
  • Is there a plan for looking at the aggregate data
    in various ways to identify unexplainable
    variations, strange patterns, etc.?
  • Is there a process for providing timely feedback
    when errors are discovered?

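A minimal sketch of such record-level checks, assuming pandas and a hypothetical extract; the valid score range and category codes are illustrative assumptions:

import pandas as pd

# Hypothetical child-level extract with the kinds of problems a
# cleaning step should surface.
df = pd.DataFrame({
    "child_id": [101, 102, 102, 103, 104],
    "entry_score": [4, 11, 11, None, 6],          # assumed valid range: 1-7
    "progress_category": ["c", "d", "d", None, "z"],
})

valid_categories = {"a", "b", "c", "d", "e"}

# Out-of-range or invalid values (missing values are flagged separately).
bad_scores = df[df["entry_score"].notna() & ~df["entry_score"].between(1, 7)]
bad_categories = df[df["progress_category"].notna()
                    & ~df["progress_category"].isin(valid_categories)]

# Incomplete records and duplicate child IDs.
incomplete = df[df.isna().any(axis=1)]
duplicates = df[df.duplicated(subset="child_id", keep=False)]

for name, frame in [("out-of-range scores", bad_scores),
                    ("invalid categories", bad_categories),
                    ("incomplete records", incomplete),
                    ("duplicate child IDs", duplicates)]:
    print(name, len(frame))
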
60
Building quality into the state system
  • Keep errors from occurring in the first place
  • Catch them when they occur
  • Provide ongoing feedback to programs and providers

61
Validity Checks
  • Validity refers to the use of the information
  • Does evidence and theory support the
    interpretation of the data for the proposed use?
  • Standards for Educational and Psychological
    Testing (1999) by American Educational Research
    Association, American Psychological Association,
    National Council on Measurement in Education

62
Validity Checks
  • What are the uses to which the child outcomes
    data will be put?
  • Federal level: By OMB, for reaching conclusions
    about the effectiveness of the program.
  • If the data allow OMB to reach the right
    conclusion about the effectiveness of these
    programs, the data are valid
  • State level
  • Conclusions about the effectiveness of the
    program?
  • Identifying areas for program improvement
  • Do the data identify the right areas?

63
Looking at Data by Locality: Scenario 1
% of Children Who Changed Developmental Trajectories or Maintained Age-Appropriate Functioning (c, d, e)
64
Looking at Data by Locality: Scenario 2
% of Children Who Changed Developmental Trajectories or Maintained Age-Appropriate Functioning (c, d, e)
65
Looking at Data by Locality: Scenario 3
% of Children Who Changed Developmental Trajectories or Maintained Age-Appropriate Functioning (c, d, e)
66
Validity Problem vs. Interpretation Error
  • Are the data capturing real differences
    accurately?
  • (or, are the differences the data have captured
    real?)
  • Are there alternative explanations for the
    differences that you have not considered?
  • Programs need help with service provision
  • vs.
  • Programs serving different kinds of children or
    families

67
Stability of Data
% of Children Who Changed Developmental Trajectories or Maintained Age-Appropriate Functioning (c, d, e)
68
Validity Argument
  • Accumulation of evidence from a series of
    if-then propositions about the data
  • If the data are valid, then, e.g.,
  • Data should not vary wildly across programs
    serving the same kinds of children
  • Data for children with certain kinds of
    disabilities should look different than data for
    other children
  • Etc.
  • Are there sensible patterns in the data? (A
    rough check of two such propositions is sketched
    below.)
  • ECO proposal on validity of the Child Outcome
    Summary Form data

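A minimal sketch of checking two such propositions, assuming pandas and a hypothetical summary of percent of children in categories c, d, e by program and disability group; the expectation that speech-only delays show more progress echoes the earlier sample question and is an assumption here, not a finding:

import pandas as pd

# Hypothetical summary: percent of children in categories c, d, e by
# program and disability group.
summary = pd.DataFrame({
    "program": ["North", "North", "South", "South"],
    "disability_group": ["speech only", "other", "speech only", "other"],
    "pct_cde": [78.0, 55.0, 74.0, 58.0],
})

# If the data are valid, results should not vary wildly across programs
# serving similar children (here: within the same disability group).
spread = summary.groupby("disability_group")["pct_cde"].agg(["min", "max"])
spread["range"] = spread["max"] - spread["min"]
print(spread)

# And groups expected to differ (e.g., speech-only vs. other
# disabilities) should in fact look different.
print(summary.groupby("disability_group")["pct_cde"].mean())
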
69
To Dos
  • Create a culture that values accountability and
    using data for decision-making
  • Continue to put procedures in place for
    collecting outcomes and systems data
  • Ensure the quality and validity of the data
    through a variety of checks and processes

70
To Dos
  • Provide appropriate access to data to local
    administrators and practitioners
  • Identify the key questions to address about your
    state system (today, someday)
  • Analyze data, interpret data, build better
    systems based on what is learned

71
  • Be patient: this is a long, iterative process
  • Be persistent: keep pushing for what needs to
    happen to eventually get good data
  • Be vigilant: don't let this ship get off course
  • Be proud: celebrate the incredible
    accomplishments of the last 2 years!!

72
Finally
  • We look forward to another great year of working
    with you
  • Watch the ECO web site for guidance on how to
    organize your APR/SPP material
  • Contact us for help with data quality and data
    analysis issues
  • Conference calls coming on data analysis
  • Data workshop at the National Early Childhood
    Meeting
  • Contact information
  • www.the-ECO-Center.org
  • or e-mail staff@the-eco-center.org