American Institutes for Research - PowerPoint PPT Presentation

About This Presentation
Title:

American Institutes for Research

Description:

What is Data Literacy? ... Don't Overlook Trends Data: Because trends have clear direction, ... PLAN IMPLEMENT EVALUATE IMPROVE ... Guiding Principles: Vision, ... – PowerPoint PPT presentation

Slides: 98
Provided by: HeideW6
Learn more at: https://nrsweb.org
Transcript and Presenter's Notes

Title: American Institutes for Research


1
Learning to Be an NRS Data Detective: The Five
Sides of the NRS
  • American Institutes for Research
  • June-July 2006

7/20/2015
L. Condelli/M.Corley
2
Objectives
  • (Refer to H-01)
  • By the end of this training, participants will
    be able to:
  • Identify the characteristics of good data
    collection procedures and database systems
  • Describe NRS requirements for assessment, goal
    setting, and follow-up procedures
  • Identify ways to motivate state and local staff
    to take an interest in data and become data
    literate

3
Objectives (Cont.)
  • (Refer to H-01)
  • Use data reports to highlight data quality
    problems and promote program improvement
  • Create a suite of data reports for program
    quality and improvement and
  • Develop a dissemination plan.

4
Broad Concepts of this Training
  • Effective processes and procedures for collecting
    data, both for the NRS and for state and local
    purposes
  • The role of understanding and motivation: the
    human elements
  • How to use data for program improvement
  • How to become a data detective: look for clues
    and identify potential problems with data

5
Agenda
  • Day 1
  • Welcome, introductions, objectives, agenda
  • Warm-up activity
  • Overview: The Five Sides of the NRS
  • Tools every data detective needs
  • Day 2
  • Developing your suite of reports
  • Day 3
  • Sharing your reports
  • Disseminating your reports
  • Developing your action plan for program
    improvement

(Refer to H-02)
6
Introductions
  • Each member of state team: introduce self (name,
    title, role re NRS)
  • One team member: name one thing state has done to
    improve data quality and the biggest challenge
    you face re data quality
  • Another team member: name one policy decision
    state has made as a result of reviewing data
    and/or one thing state plans to do for
    continuous program improvement
  • (Refer to H-03. Take 5 minutes to prepare
    responses.)

7
Warm-up Activity: List as Many Statements as You
Can
  • One statement per blue Post-It Note
  • What is the value of using data in adult
    education programs? (5
    minutes)
  • One statement per purple Post-It Note
  • How can we influence/create a state and local
    program culture in which adult educators use data
    continuously, collaboratively, and effectively?
    (5 minutes)
  • Following whole group discussion, post notes to
    appropriate wall charts
  • (Refer to H-04)

8
Overview of Guide
  • Learning to be a Data Detective
  • The Five Sides of the NRS
  • On the Top 10 Non-fiction
  • Best-Seller List for 2006

9
What are the 5 Sides of the NRS?
  • Foundational Elements
  • Solid Database for Recording
    and Retrieving Data
  • Sound Data Collection
    Procedures and Policies
  • Policies and Procedures for Collecting Core
    Outcome Data
  • Assessment
  • Goal Setting
  • Follow-up Measures

5 Easy Pieces ?
10
Side 1
  • Solid Database for Recording
  • and
  • Retrieving
  • Data

11
Characteristics of an Effective Data System
  • Tracks a relevant and complete set of data based
    on needs you anticipate
  • Provides tools for detecting missing data and for
    identifying potential data quality problems
  • Provides data that is up-to-date and accurate.

12
Data Reports, Elements, Functions
  • (Refer to H-05)
  • Review the NRS-required data reports, data
    elements, and system functions listed on H-05.
    Then consider the following questions:
  • What data system are you using?
  • Please complete the Data System Inventory chart
    on the wall for your state (by the end of the
    day).
  • Does your data system:
  • Enable you to produce each of these reports?
  • Contain all the necessary data elements?
  • Perform all the required functions?

13
NRS Data System Reports
Report | Purpose | Notes
NRS Tables (Statewide) | Reporting | Required NRS reports
NRS Tables (by Program) | Program Monitoring | Enables state to review performance of individual programs
Class Lists | Instruction | Provides basic contact information for use by teachers
Student Profile Report | Instruction | Enables program staff to review individual student needs, goals, and achievements
Program Profile | Program Monitoring | Enables state to review demographic snapshot of each program; useful for planning and understanding data trends
Attendance Report (by Class) | Instruction | Enables teachers to monitor student attendance for their classes
Student Goals and Achievements | Follow-up | Provides detailed student information for conducting follow-up surveys
Student Posttest Planning Report | Instruction | Provides list of students nearing need for posttesting, based on contact hours
14
Data Elements
Student Contact Information
Student Demographics
Student Goals and Achievements
Student Enrollment Information
Student Assessment Information
Student Attendance
Staff Information
15
NRS Data System Functions
Function | Description
Intake | Collects basic demographics for NRS reporting
Testing and Placement | Provides place to record test scores and automatically places student in a level
Enrollment | Registers student in class
Attendance | Provides way of entering contact hours for each student
Achievement | Provides way of recording student achievements (such as earning a GED), goals, needs, and contact information about student (for retaining employment, etc.)
Separation | Provides means for recording student separation from a program
Reporting | Provides reports to meet NRS requirements, program monitoring, or program operations
16
Developing an NRS Data System
  • (Refer to H-06a and b)
  • In the process of designing or developing a
    database? Then you may wish to use this checklist
    as a guide to help in writing the requirements
    document.
  • Already have a database that meets your needs?
    Then you may wish to use this checklist to
    consider potential adjustments to your database
    or to congratulate yourselves that your system is
    solid and contains all required features.

17
Side 2
  • Data Collection
  • Procedures
  • and Policies

18
Good Data Collection
  • A series of regimented procedures and policies
    that people must perform routinely and with
    little error.
  • So what's the problem here?

19
The Data Equation
  • Data = Procedures + People
  • with many opportunities for error

20
Simplified View of Data Flow

(diagram: Students → Teachers → Clerical staff → Program Database → State Level → Federal Level)
21
(No Transcript)
22
4 Keys to the Success of a
Good Data Collection System
  • Many people working together as a team
  • Each person has specific role and
    ongoing training
  • Different levels of staff review data, look
    for clues, and decipher them to identify problems

23
4 Keys to the Success of a
Good Data Collection System
  1. Standardization of definitions, forms, and coding
    categories tied to the database to ensure that
    all members of the team operate from a common
    understanding
  2. There are various checkpoints and feedback loops
    for correcting errors and providing missing
    information
  3. Constant monitoring and adjustment

24
Do You Have Each of These Essential Elements in
Place?
  • (Refer to H-07)
  • Staff knowledge and training
  • Standard forms and definitions
  • Error checking
  • Data entry
  • If not, what's missing?

25
Questions for Consideration
  • (Refer to H-08a and H-08b)
  • How good is your data collection system?
  • Do you have total confidence in the quality of
    your data? Why/Why not?
  • Where are the points along your data flow process
    at which error can be introduced?
    At the state level?
    At the local level?
  • Who reviews data along each step of the data
    collection and reporting process?
    At the state level? At the local level?
  • How can you improve your data collection
    processes and system?
    At the state level? At the local
    level?

26
Side 3
  • Assessment
  • Policies
  • and
  • Procedures

27
Policies and Procedures for Collecting Core
Outcome Measures
  • Assessment
  • Select tests that:
  • Are standardized
  • Have different but equivalent pre- and posttest
    forms
  • Provide formative and summative information
  • Evaluate overall performance at various levels
    (e.g., class, program, state).
  • Determine students' educational gain and level
    advancement.
  • Administer tests within the appropriate
    timeframe:
  • Between program entrance and pre-test
  • Between pre- and posttest.

28
Side 4
  • Goal Setting
  • Policies
  • and
  • Procedures

29
Policies and Guidelines for Learner Goal
Setting
  • Four outcome (follow-up) measures are
    goal-dependent:
  • Receive a secondary credential
  • Enter postsecondary education
  • Enter employment
  • Retain employment
  • Have clear, documented procedures for helping
    learners to set realistic goals, both short-term
    and long-term
  • SMART goals:
  • Specific, Measurable, Attainable, Reasonable,
    Time-limited
  • Help learners revisit and revise goals, as needed

30
Side 5
  • Follow-up
  • Policies
  • and
  • Procedures

31
Policies and Procedures for Collecting Outcome
(Follow-up) Measures
  • Database must have ability to identify students
    who exited program and had one of the following
    goals:
  • Obtaining a job
  • Retaining current job
  • Obtaining a secondary diploma or passing the GED
    Tests
  • Entering postsecondary education or training.
  • Collect data either through data matching or by
    conducting student survey.
  • Identify students for follow-up:
  • Process for identifying and contacting students
    from database
  • Policy for sampling procedures for survey, if
    appropriate
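The "identify students for follow-up" step above can be sketched as a simple filter over exported student records. The field names and goal labels below are illustrative assumptions, not an NRS schema:

```python
# Illustrative sketch: select exited students whose goal is one of the
# four goal-dependent follow-up measures. Field names are assumptions.

FOLLOW_UP_GOALS = {
    "Obtain a job",
    "Retain current job",
    "Obtain secondary diploma / pass GED Tests",
    "Enter postsecondary education or training",
}

def follow_up_cohort(students):
    """Return exited students whose goal requires a follow-up measure."""
    return [s for s in students if s["exited"] and s["goal"] in FOLLOW_UP_GOALS]

students = [
    {"id": 1, "exited": True,  "goal": "Obtain a job"},
    {"id": 2, "exited": False, "goal": "Obtain a job"},          # still enrolled
    {"id": 3, "exited": True,  "goal": "Improve basic skills"},  # not goal-dependent
]
print([s["id"] for s in follow_up_cohort(students)])
```

A real system would run this query against the state database and then route the resulting cohort to the survey or data-matching process.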

32
Procedures for Follow-up Survey and Data Matching
  • Collect data: Survey
  • Survey conducted at proper time
  • Uniform survey instrument used statewide
  • Staff trained to conduct the survey
  • Resources available to conduct survey
  • Procedures to improve response rates
  • If sampling is used, use randomization procedure
    to draw the sample.
  • Collect data: Data Matching
  • Data matching requires 3 pieces of student info:
  • SSN, student goal, and exit quarter for
    employment outcomes
  • Data in proper format for matching to external
    database
  • Manage and report follow-up data:
  • State database and procedures for reporting
    results.
  • Data archived for multi-year reporting.

33
What is Data Literacy?
  • The ability to:
  • Examine multiple measures and multiple layers of
    data,
  • Draw sound inferences,
  • Engage in reflective dialogue, and
  • Design program improvement and evaluation
    strategies

34
What are Some Reasons for
Staff Resistance to Data?
  • Lack of Proper Training
  • Lack of Time
  • Feast or Famine
  • Fear of Evaluation
  • Fear of Exposure
  • Confusing a Technical Problem (Lack of Know-how)
    with a Cultural Problem (Lack of Data-use
    Culture!)
  • Source: Holcomb, E. (1999).
  • Getting Excited about Data.
  • Thousand Oaks, CA: Corwin Press.

35
The Best Data Collection Procedures
  • by themselves are not enough.
  • Your approach to data collection may also
    empower and motivate program administrators and
    teachers.

36
Data Use in the Classroom
  • For example, teachers may use the data to
  • Check their implementation
  • Learn more about their students
  • Learn more about their teaching
  • Use that learning to be a better teacher!

37
Six Psychological Motivators
Pane, N. (2004). The Data Whisperer: Strategies
for Motivating Raw Data Providers. In A. R.
Roberts & K. R. Yeager (Eds.), Evidence-Based
Practice Manual. Oxford University Press.
38
Motivator Examples
  • Compete: How do I compare?
  • Rank
  • Anonymous comparisons
  • Reward: Can I make it to the top?
  • Reward top 1-5
  • Learn: What am I doing well and what might I do
    better?
  • Benchmarks linked to resources

39
(No Transcript)
40
Even anonymous comparisons can make a point
41
Motivator Examples
  • Compete: How do I compare?
  • Rank
  • Anonymous comparisons
  • Reward: Can I make it to the top?
  • Reward top 1-5
  • Learn: What am I doing well and what might I do
    better?
  • Benchmarks linked to resources

42
Teachers of the Year!
  • Teachers who had the most GEDs!
  • Teachers who had the largest student gains!
  • Teachers who had the best retention!

43
Motivator Examples
  • Compete: How do I compare?
  • Rank
  • Anonymous comparisons
  • Reward: Can I make it to the top?
  • Reward top 1-5
  • Learn: What am I doing well and what might I do
    better?
  • Benchmarks linked to resources

44
(No Transcript)
45
How Can You Use These Motivators?
(Refer to H-09)
  • How can you give teachers a voice and a lens for
    looking at data?
  • Are your state and local staff members Data
    Literate?
  • It is only with teachers as change agents that we
    will begin to see real improvement.

46
Motivating Staff and Teachers Building Data
Literacy
  • (Refer to H-09)
  • In your state team, brainstorm strategies you
    might employ to motivate local program staff to
    become data literate and to use their data.
  • Take 10 minutes. List your ideas on H-09.
  • Select one team member to record and one to be
    prepared to report your ideas to the whole group.

47
Put Your Data Fears on the Table
  • What concerns you most about using data to
    make policy decisions?
  • Afraid you don't understand the data?
  • Afraid your questions will sound silly?
  • Afraid the truth about your data will make your
    program look bad?
  • Afraid people will take data out of context for
    their own agendas?
  • Afraid your data might not be valid and reliable?
  • Other?

48
Don't Get Stuck in a Data Swamp
(graphic: a swamp of "DATA")
49
Data are Merely Numbers
  • To turn data into information,
  • we must first:
  • Organize the data
  • Describe the data
  • Interpret the data

50
  • "Businesses don't keep data that's useless,
    that doesn't inform them of anything. Yet, in
    education, we have data that just runs all over
    us. We have to target it and organize it in such
    a way that it informs us so that we can make
    better decisions."
  • - David Benson, Superintendent
  • Blue Valley (KS) School District

51
The Importance of Data
  • "The importance of data for administrators,
    policymakers, and teachers in the classroom, to be
    able to break data down and know where the
    strengths and weaknesses reside, is crucial if you
    want to make any kind of improvement."
  • - Monte Moses, Superintendent,
  • Cherry Creek (CO) School District

52
Transforming Data into Information
(diagram: Database → Transformation of Data → Information and Analysis → Insight → Knowledge → Decisions → Improvement)
53
Quote of the Day/Year
  • "Data must be converted to information,
    knowledge, understanding, and wisdom. Then
    data-based decisions can be made at the level of
    understanding and wisdom."
  • - Russell Ackoff, Professor Emeritus,
    Wharton School

54
AND NOW
  • What every state director and local program
    manager wants!
  • Easy steps for becoming...

55
A Data Detective


56
You, Too, Can
  • Have effortless access to accurate and up-to-date
    data!
  • Spend less time and energy collecting data and
    more time actually using it for program
    improvement!

57
To Crack a Case
  • The data detective needs info that is:
  • Relevant and complete
  • Accurate
  • Timely

58
Some Tools for Monitoring for Data Quality
  • Perform error and validity checks
  • Look for trends over time
  • Compare data within and across
    programs
  • Look for the red flags

59
1. Perform Error and Validity Checks
  • In basic reports of number of students in each
    demographic category, number with pre- and
    posttest scores, and number of students by goal
  • Look for:
  • Out-of-range values
  • Incomplete or missing data
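The two checks named above (out-of-range values; incomplete or missing data) can be scripted against exported student records. A minimal sketch follows; the field names and valid ranges are illustrative assumptions, not NRS definitions:

```python
# Illustrative error and validity checks on student records.
# Field names and ranges are assumptions for this sketch only.

VALID_ETHNICITIES = {"Asian", "African American", "Latino", "White", "Other"}

def check_record(student):
    """Return a list of problems found in one student record."""
    problems = []
    # Out-of-range values
    if not (16 <= student.get("age", -1) <= 110):
        problems.append("age out of range")
    if student.get("ethnicity") not in VALID_ETHNICITIES:
        problems.append("unknown ethnicity code")
    # Incomplete or missing data
    for field in ("pretest_score", "goal"):
        if student.get(field) is None:
            problems.append(f"missing {field}")
    return problems

students = [
    {"age": 34, "ethnicity": "Latino", "pretest_score": 512, "goal": "Enter Employment"},
    {"age": 7, "ethnicity": "XX", "pretest_score": None, "goal": None},
]
for i, s in enumerate(students):
    for p in check_record(s):
        print(f"student {i}: {p}")
```

Run nightly against the program database, a report like this surfaces data-entry errors while they are still easy to correct.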

60
For Example
  • Teachers can enter student data in a timely
    manner and review class rosters to ensure that
    they accurately reflect number of students
  • In each demographic category
  • With both pre- and posttest scores
  • By goal.

61

62
So Whatcha Gonna Do?
  • Dig more deeply
  • Ask more questions

63


64
2. Look For Trends over Time
  • For data measures that have been relatively
    stable and predictable (e.g., number of contact
    hours, types of goals set, educational gains
    within levels)
  • Look for:
  • Sudden or unexpected changes
  • Consider:
  • Are there political, social, or economic factors
    in the community that explain these sudden or
    unexpected changes?
  • If not, dig more deeply. What's really going on?

65
Superficial data analysis can be worse than none
  • What's wrong with this picture?
  • Story of One Texas High School
  • What one data set showed
  • Decision the school almost made
  • What they found when they dug deeper into the
    data

66
Lesson Learned.
  • Don't be too quick to JUMP to conclusions!

67
Lesson Learned: Don't Overlook Trends Data
  • "Because trends have clear direction, instead of
    causing turbulence, they actually help reduce it
    because they have a significant amount of
    predictability." - Joel Barker
  • "The farther backward you can look,
    the farther forward you are likely to see."
  • - Winston Churchill

68
Lesson Learned
  • There is no decision
  • that should be made
  • without looking
  • at more data.
  • One source of data is not enough.

69
3. Make Comparisons Within and Across Programs
  • Within a program, look for internal
    consistency of data. Does some measure seem
    out-of-whack?
  • For similar programs, are there comparable data
    results?
  • Look for the red flags!

70
Example Number and Percent of Student Goals by
Ethnic Group
Student Ethnicity | Number of Students | Enter Employment: Number with Goal | Enter Employment: % of Students with Goal | Enter Postsecondary: Number with Goal | Enter Postsecondary: % with Goal
Asian | 64 | 16 | 25 | 57 | 89
African American | 200 | 17 | 9 | 8 | 4
Latino | 750 | 89 | 12 | 0 | 0
White | 125 | 25 | 20 | 40 | 32
Total | 1139 | 147 | 13 | 105 | 9
Where's the red flag? What's out-of-whack?
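A red-flag screen like the one this table invites can be automated: recompute each group's rate and flag groups far below the overall rate. This sketch uses the counts shown on the slide; the one-third threshold is an illustrative choice, not a standard:

```python
# Screen the goal-setting table for red flags: groups whose
# postsecondary-goal rate is zero or far below the overall rate.
# Counts are taken from the slide; the threshold is an assumption.

rows = {
    # group: (students, number with employment goal, number with postsecondary goal)
    "Asian":            (64, 16, 57),
    "African American": (200, 17, 8),
    "Latino":           (750, 89, 0),
    "White":            (125, 25, 40),
}

total_students = sum(n for n, _, _ in rows.values())
total_psec = sum(p for _, _, p in rows.values())
overall_rate = total_psec / total_students  # about 9%

for group, (n, _emp, psec) in rows.items():
    rate = psec / n
    # Zero counts, or rates far below the overall rate, deserve a closer look
    if psec == 0 or rate < overall_rate / 3:
        print(f"red flag: {group} postsecondary goal rate {rate:.0%} vs overall {overall_rate:.0%}")
```

Here the screen singles out the Latino row (0 of 750 students with a postsecondary goal), which is exactly the anomaly the slide asks you to spot.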
71
Questions to Ask of Your Data
  1. What do these data seem to tell me?
  2. What do they not tell me?
  3. What else do I need to know to get a complete
    picture?
  4. What good news is here for me to celebrate?
  5. What needs for continuous program improvement
    arise from these data?

72
Why Disaggregate Your Data?
  • Robert Reich, former U.S. Secretary of Labor,
    once quipped that he (at 5 feet) and Shaquille
    O'Neal (at 7 feet) had an average height of 6
    feet, but the coach would be well advised to
    consider more than their combined average before
    putting Reich on the basketball team.

73
The Drill-Down Process of Disaggregating Data
  • First-layer Disaggregations: How many students
    are there? What do you want to know about these
    students?
  • Male v. female
  • ABE v. ESL v. ASE
  • Ethnicities
  • Ages
  • Second-layer Disaggregations: How have the
    demographics changed over time?
74
The Drill-Down Process of Disaggregating Data
  • Third-layer Disaggregations:
  • What percentage of students experienced increased
    learning gains or achieved their goals?
  • Is this equally distributed among genders and
    ethnicities?
  • Fourth-layer Disaggregations:
  • Do students with higher attendance have greater
    learning gains?
  • Do classes meeting for more hours a week than
    others have greater percentages of students with
    increased learning gains or greater percentages
    of students meeting their goals?

75

Disaggregation
  • Not a problem-solving strategy
  • But a problem-finding strategy.
  • And one of the data detective's useful tools
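The drill-down process can be sketched as a grouping computation: pick a key (first layer: gender; fourth layer: attendance band) and compare completion rates across groups. The records and field names below are illustrative assumptions:

```python
# Illustrative drill-down: completion rate disaggregated by an arbitrary key.
# Records and field names are assumptions for this sketch only.
from collections import defaultdict

students = [
    {"gender": "F", "hours": 80, "completed": True},
    {"gender": "F", "hours": 20, "completed": False},
    {"gender": "M", "hours": 90, "completed": True},
    {"gender": "M", "hours": 15, "completed": False},
    {"gender": "M", "hours": 70, "completed": False},
]

def rate_by(key, records):
    """Completion rate for each group produced by the key function."""
    groups = defaultdict(list)
    for r in records:
        groups[key(r)].append(r["completed"])
    return {k: sum(v) / len(v) for k, v in groups.items()}

# First layer: disaggregate by gender
print(rate_by(lambda r: r["gender"], students))

# Fourth layer: do students with higher attendance complete more often?
def band(r):
    return "50+ hours" if r["hours"] >= 50 else "<50 hours"
print(rate_by(band, students))
```

Because the key is just a function, each new layer of disaggregation is one more call with a different key rather than a new report.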

76
Data Carousel Exercises
77
Data Carousel Exercises (Cont.)
  • Directions for Part I (refer to H-10a–e)
  • Divide into 5 groups
  • Each group will note
  • Observations,
  • Possible Causes, and
  • Next Steps on flipcharts around the room.
  • Each group will report its conclusions.

78
Data Carousel Exercises (Cont.)
  • Directions for Part II (refer to H-11)
  • Your team will be assigned to one of the sites
    within the program (Marple Meadows, Fells Point,
    Poirot, Holmestead, or Wimseyville).
  • In your team, look across all five graphs posted
    on flipcharts around the room.
  • Be prepared to report out the story of your
    program and respond to questions on H-11.

79
Data Analysis Helps You
  • Understand where your program is now with respect
    to student achievement (Overview)
  • Examine who is and who is not meeting the
    agency/program and state standards
  • Predict the causes of failures and successes
  • Learn what needs to change instructionally to
    prevent future failure and to ensure future
    successes.
  • "If you know why, you can figure out how."
  • - W. Edwards Deming

80
What is Data-driven Decision-making?
  • Collecting data
  • Analyzing data
  • Reporting data
  • Using data for program improvement
  • Communicating through data
  • (Involves both problem-finding and
    problem-solving)

81
What Do You Want to Know From Your Data? How Will
You Use What You Learn?
  • (Refer to H-12a and H-12b)
  • Depending on whether the info you learn from your
    data is good news or not-so-good news, how will
    you use what you learn?
  • What actions can state and local program staff
    take to spread the good news or work toward
    program improvement?

82
Sleuthing for DQ and PI Issues
  • (Refer to H-13a and H-13b)
  • The next five slides pose questions about
  • data quality and
  • program improvement
    in the following areas:
  • Assessment
  • Goal setting
  • Follow-up
  • Review questions in each area and place a check
    beside each question that you want to ask of
    your data. Then, in your state teams, prioritize
    your top 2 questions in each of the five areas.

83
Questions about Data Quality
  • Assessment
  • How many students have pre- and posttest data?
  • How has the percentage of students with pre- and
    posttest data changed over time?
  • Which students are not tested?
  • Are pre- and posttests given at the right time?
  • Are the right tests given?
  • Are the percentages of completers relatively
    stable?

84
Questions about Data Quality
  • Goal Setting
  • Which students are setting goals and how do they
    compare over time?
  • Are the percentages of students setting
    educational attainment goals consistent with
    their NRS level and program goals?
  • Does the percentage of students setting the goal
    of entering employment reflect the percentage of
    students who are unemployed?
  • How does goal setting differ by subgroup?

85
Questions about Data Quality
  • Follow-up (Survey and Data Matching)
  • How do response and data matching rates compare
    across programs and to the state average or
    standard and how have they changed over time?
  • How do response and data matching rates differ by
    subgroup?
  • Were the times for collecting Entered and
    Retained Employment data consistent with NRS
    requirements?
  • Are the percentages of students obtaining
    follow-up outcomes relatively stable?

86
Questions about Program Improvement
  • Assessment
  • How do program completion rates compare with the
    state average, state standard, and/or other
    programs?
  • What are the trends in completion rates and how
    do they compare with the state average, state
    standard, and/or other programs?
  • What are completion rates by student goal?
  • How do completion rates of subgroups compare
    within a program?
  • How have completion rates for subgroups changed
    over time?
  • What is the relationship of completion to
    attendance?
  • What is the investment per completer (program
    efficiency) and how does it compare by program?
  • How has efficiency changed over time?

87
Questions about Program Improvement
  • Follow-up
  • How do goal attainment rates compare among
    programs?
  • What are the trends in goal attainment rates and
    how do they compare across programs and with the
    state average and standard?
  • How do subgroups compare on goal attainment and
    how has that changed over time?
  • What is the investment per goal attained (program
    efficiency)?

88
Sample Data Analysis Exercises
  • (Refer to H-14a–H-14i)
  • Facilitators model process using first exercise
    (H-14a)
  • Eight teams: each completes one exercise
    (H-14b–H-14i)
  • Sample responses from each area (assessment, goal
    setting, follow-up survey and data matching)
  • Questions/Clarification

89
Steps in Solving a Problem


90
Planning Your Work
  • and Working Your Plan
  • Planning is essential.
  • No one plans to fail.
  • But many fail to plan.
  • Same results in the end.

91
Developing Goals and a Plan
for Program Improvement
  • (Refer to H-15)
  • What's your most urgent or pressing problem?
  • What outcome do you want 5 years from now? 1 year
    from now?
  • Develop a 1-year goal statement to address the
    problem.
  • What will your data look like when you've
    achieved this goal?
  • What do you need to achieve this goal?
  • What barriers might prevent you from reaching
    this goal?
  • How can you overcome these barriers?
  • What specific actions will you take to achieve
    this goal? Timeline? Persons responsible?
  • How will you evaluate the success of your
    actions?
  • (Now refer to H-16 and develop your action plan.)

92
Focusing the Data
RANDOM ACTS OF IMPROVEMENT


Bernhardt, V. (2004). Data Analysis for
Continuous School Improvement. Larchmont, NY:
Eye on Education.
93
Focusing the Data
FOCUSED IMPROVEMENT


Bernhardt, V. (2004). Data Analysis for
Continuous School Improvement. Larchmont, NY:
Eye on Education.
94
10 Steps to Using
Data for Program Improvement
  1. Convene an agency-based data team
  2. Review various data reports of the agency
  3. Analyze data patterns
  4. Ask questions, identify problem(s). Create a
    visual image that helps the team see the
    problem(s)
  5. Review multiple data sources to verify problem(s)
  6. Generate hypotheses of root cause of each
    problem, according to evidence
  7. Brainstorm solutions and prioritize
  8. Develop program improvement goal(s)
  9. Design action plan: strategies, timeline,
    persons responsible, and evaluation criteria
  10. Make the commitment to follow through

95
Plan for Disseminating Your Reports and for
Rolling Out this Training
  • (Refer to H-17a–b)
  • In your state teams, review your suite of
    reports.
  • Will you disseminate them? How and to whom?
  • Do you need to create additional reports? Which
    ones?
  • Will you roll out this training for local
    program staffs?
  • If so, how will you make it happen? Who will
    conduct training? When? For which audiences? What
    do you expect participants to be able to do as a
    result of the training? How will you evaluate the
    success of the training?

96
  • In Conclusion
  • "Data-driven decision-making is not someplace
    we've arrived; it's still a journey and always
    will be. But it's a journey where people
    understand how you buy the ticket and get on the
    train. And if you're not on the train, you need
    to find another place to be, because this is the
    way we have to operate."
  • - Yvonne Katz, Superintendent
  • Beaverton (OR) School District

97
Thank you
  • Great Audience!
  • Great Participation!
  • Great Ideas!
  • Live Long and Prosper!
  • Good Luck!!