Transcript and Presenter's Notes

Title: American Institutes for Research


1
NRS Institute on Developing State and Local
Report Cards for Adult Education
  • American Institutes for Research
  • Miami, Florida
  • February 28-March 2, 2007

2
Goals of the Institute
  • By the end of the training, participants will be
    able to:
  • Discuss the purposes of, and audiences for,
    report cards for state and local adult ed
    programs
  • Identify essential elements of a report card
  • Distinguish between effective and ineffective
    report cards
  • Use a template to develop state and local report
    cards
  • Create a rubric for evaluating performance and
  • Develop a dissemination plan.

3
Agenda for Wednesday, Feb. 28
  • Lessons Learned: States' Experiences with Report
    Cards
  • A Word about Quality: States' Plans to Improve
    Quality
  • Step 1: Define Purpose of and Audience for State
    and Local Report Cards
  • Step 2: Select Data Elements/Measures for State
    and Local Report Cards
  • Step 3: Select Evaluative Criteria or Rubric
  • Step 4: Design and Format Report Cards
  • State Planning Time and Status Reports
  • Introducing the User Guide and Report Card
    Templates
  • Wrap-up and Evaluation for Day 1

4
Agenda for Thursday, March 1
  • Navigating the Template
  • A Trial Run with a Hypothetical State
  • Q&A about the Template
  • State Teams Build Local Report Cards
  • Progress Checks and Support
  • State Teams Demonstrate Local Report Cards
  • Parking Lot/Discussion, Wrap-up, and Evaluation
    for Day 2

5
Agenda for Friday, March 2
  • State Teams Build State Report Cards
  • Progress Checks and Support
  • State Teams Demonstrate State
    Report Cards
  • Step 5: Disseminate and Promote Program
    Improvement
  • State Teams Develop Dissemination Plans
  • Next Steps, Wrap-up, and Evaluation

6
Why Report Cards?
  • Accountability: Measure Adequate Yearly Progress
  • Program Improvement
  • Inform and Advocate for Program

7
Questions for Consideration
  • What would a report card on
    hospitals tell you?
  • What about one on surgeons?
  • What about one on airlines?
  • What about one on stocks?

8
What is a Report Card?
  • A concise presentation of data and other
    information about a school or program that
    assesses performance
  • The focus on evaluating performance is what makes
    report cards a unique type of accountability
    report.

9
Characteristics of Effective Report Cards
  • Focus on outcomes and other data that reflect
    program quality
  • Provide a basis of comparison for evaluating
    these data and
  • Present contextual data or interpretive
    information that aid interpretation and promote
    understanding.

10
Characteristics of Ineffective Report Cards
  • Provide poor indicators of quality
  • Do not support program improvement
  • Are not informative

11
Three Primary Factors of Ineffective Report Cards
  • Wrong measures
  • Insufficient measures
  • Unreliable or low quality data

12
A Word about Aggregated v. Disaggregated
Data
  • We weighed each of you as you entered this room,
    so we know that the average weight of
    everyone in this room is 195 lbs. And we
    posted that average on the wall.
  • Is this good or bad news?
  • What does this tell us about our entire
    population?
  • What does it tell us about males v. females?
  • Is this a weight loss group or a steroid users
    support group?

13
Disaggregated Data
  • Help us understand patterns of success and
    failure in the student population
  • Help us separate the whys from the whines
  • Caution: When we report information on student
    learning as a mean, it tells us about as much as
    the weight sign on our wall (see the sketch
    below).
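
To make the caution concrete, here is a minimal sketch in Python. The
subgroup names and weights are hypothetical, chosen so the aggregate
mean matches the 195 lbs posted on the wall.

    # Hypothetical weights: the aggregate mean hides two very
    # different subgroups.
    weights = {
        "group_a": [135, 142, 150, 128],
        "group_b": [240, 255, 248, 262],
    }

    everyone = [w for group in weights.values() for w in group]
    print(f"Aggregate mean: {sum(everyone) / len(everyone):.0f} lbs")
    # -> Aggregate mean: 195 lbs

    # Disaggregating reveals the pattern the aggregate conceals.
    for name, group in weights.items():
        print(f"{name} mean: {sum(group) / len(group):.0f} lbs")
    # -> group_a mean: 139 lbs
    # -> group_b mean: 251 lbs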

14
NCLB Requirements for State Report Cards
  • Concise, understandable, and uniform format
  • Information on student achievement, both
    aggregated and disaggregated
  • Comparison between actual achievement levels and
    the state's annual goals/objectives
  • Most recent trends in student achievement
  • Performance of local agencies regarding AYP, with
    names of schools identified for school improvement
  • Professional qualifications of teachers
  • Description of the state's accountability system

15
NCLB Requirements for Local District Report Cards
  • The same data required for the state report card,
    as applied to the local education agency
  • Performance of the local district's students
    compared to performance of students in the state
    as a whole
  • Performance of local agencies regarding AYP, with
    names of schools identified for school improvement

16
Differences Between Uses of State and Local
Report Cards
  • Local
  • Include a broader range of outcome measures to
    evaluate the program
  • More suited for program improvement efforts
  • Use evaluative standards such as the program's
    past performance, the performance of similar
    programs, or a local performance standard
  • State
  • Use a more limited range of outcome measures
  • Use evaluative standards such as the state's past
    performance, state performance standards, or
    national averages

17
NRS Outcome Measures: The Centerpiece of the
Report Card
  • Educational gain
  • Receipt of a secondary credential
  • Entered and retained employment
  • Entry into postsecondary education

18
Other Measures?
  • Student attendance and persistence data
  • Student and employer satisfaction with program
  • Learner accomplishments
  • Number of hours of instruction
    offered
  • Average class size
  • Teacher qualifications

19
Contextual Data that Aid Interpretation
  • Student demographics
  • Socioeconomic data about the community
  • Per student cost
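
Slides 17 through 19 together suggest the shape of the data behind a
report card. Here is a minimal sketch in Python of one local program's
record; the program name, field names, and all values are hypothetical
illustrations, not NRS definitions.

    # A hypothetical local program record combining NRS outcome
    # measures, other measures, and contextual data.
    report_card_record = {
        "program": "Example Adult Learning Center",  # hypothetical
        "outcomes": {  # NRS outcome measures (percent of students)
            "educational_gain": 48.0,
            "secondary_credential": 62.0,
            "entered_employment": 55.0,
            "retained_employment": 71.0,
            "entered_postsecondary": 18.0,
        },
        "other_measures": {
            "avg_hours_of_instruction": 94,
            "avg_class_size": 12,
        },
        "context": {  # aids interpretation
            "pct_low_income_students": 40.0,
            "per_student_cost_usd": 610,
        },
    }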

20
Basis for Evaluation
  • To evaluate program quality, you must have a
    standard or basis of comparison
  • Compare students and programs to each other and
    to fixed standards
  • Examples: letter grades, past performance,
    average state performance

21
Five Steps to Developing a Report Card for Adult
Education
  • Define purpose and audience
  • Select measures
  • Select evaluative criteria or rubric
  • Design and format
  • Disseminate and promote program improvement

22
1. Define Purpose and Audience
  • Purpose: evaluation, program improvement, or
    information?
  • State or local report card?
  • Audience: local program staff, state staff,
    funding agencies, legislators, general public?
  • Which information to show/not show? Is this
    different for different audiences?
  • In what areas are your programs doing well/doing
    poorly? Why?
  • What messages do you want to send to the
    audience, and what do you want them to do about
    it?

23
2. Select Measures
24
Matrix of Audience, Report Card Purposes,
Measures, and Comparisons
25
3. Select Evaluative Criteria or Rubric
26
What is a Rubric?
  • A scheme for classifying products or behaviors
    into categories that vary along a continuum
  • A vehicle for describing varying levels of
    quality, from excellent to poor, or from meets
    expectations to unacceptable
  • Examples of rubrics we are familiar with: letter
    grades or the GED essay scoring guide

27
Advantages of Scoring Rubrics
  • Clarify expectations about the characteristics of
    quality programs
  • Set standards for program performance
  • Provide an objective measure for examining and
    evaluating programs
  • Can provide formative feedback to programs
  • Can be used by programs for self-assessment and
    improvement
  • Can lead to shared standards among staff and the
    public about what makes a good program.

28
Writing Descriptions of Rubrics
  • Keep criteria or indicators specific, objective,
    and value neutral
  • Describe what the levels of quality look like,
    without using judgmental language
  • Be sure indicators define progress along a
    continuum from lowest to highest quality.

29
Seven Steps to Developing a Rubric
  • Identify the measure for the rubric and a
    possible range of responses.
  • Identify the highest possible range of scores to
    define the top category.
  • Define an unacceptable level.
  • Define the lowest level of acceptable
    performance.
  • Define an intermediate level, between the top and
    the lowest.
  • Label the categories.
  • Test and refine.
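
Here is a minimal sketch in Python of where those seven steps can land
for a single measure; the cut points and category labels are
hypothetical, not from the presentation.

    # A rubric for one measure (percent of students achieving an
    # educational gain), with hypothetical cut points and labels.
    RUBRIC = [
        (60.0, "Exemplary"),             # top category (step 2)
        (45.0, "Proficient"),            # intermediate level (step 5)
        (30.0, "Minimally acceptable"),  # lowest acceptable (step 4)
        (0.0, "Unacceptable"),           # unacceptable level (step 3)
    ]

    def rate(pct_gain: float) -> str:
        """Return the labeled category (step 6) for a gain rate."""
        for cutoff, label in RUBRIC:
            if pct_gain >= cutoff:
                return label
        return RUBRIC[-1][1]

    print(rate(52.3))  # -> Proficient; now test and refine (step 7)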

30
4. Design and Format
  • Design Principles
  • Use high-quality graphics and photos
  • Simple, short, succinct
  • Data Presentation
  • Avoid needless data breakdowns and disaggregation
  • Simplify data for your audience
  • Be careful with percentages, base numbers, and
    response rates (see the sketch below)
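
The caution about percentages and base numbers matters because a
percentage computed on a tiny base is volatile. A minimal sketch in
Python, with hypothetical figures:

    # Hypothetical figures: the same percentage can mean very
    # different things depending on the base number of students.
    programs = [
        {"name": "Program A", "gains": 4, "enrolled": 5},
        {"name": "Program B", "gains": 160, "enrolled": 200},
    ]

    for p in programs:
        pct = 100 * p["gains"] / p["enrolled"]
        print(f"{p['name']}: {pct:.0f}% achieving gains "
              f"(n = {p['enrolled']})")
    # Both print 80%, but one more or one fewer success in Program A
    # moves its rate by 20 points; report the base with the percentage.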

31
Graphic Displays Should
  • Show the data
  • Induce the viewer to think about the substance,
    not the methodology or graphic design
  • Avoid distorting what the data have to say
  • Make large data sets coherent
  • Encourage the eye to compare different pieces of
    data
  • Reveal data at several layers of detail, from
    a broad overview to the fine structure
  • Be closely integrated with the statistical
    and verbal descriptions of a data set.

32
Some Dos and a Don't
  • Do make report card text short and easy to read.
  • Do make student performance prominent but report
    more than test scores.
  • Do be cautious about assigning labels to
    programs.
  • Don't overdo displays of demographic data.

33
Questions to Ask When Framing the Message
  • What does this audience segment need to know?
  • What are the few key points that best illustrate
    what you want this audience to know about your
    program?
  • What data best support your message?
  • Remember: How you say it does matter.

34
Don't Use Killer Words
  • Buy, Contract, Hard, Lose, Worry, Liability,
    Taxes, Sell, Cost, Deal
35
Don't Use Jargon
  • Disaggregated data
  • Standards-based tests
  • Alternative assessment
  • Performance index
  • Chi-square, p-value, theta, coefficient
  • Others?

36
The 30-3-30 Formula
  • Audience time spent perusing a product:
  • 30 seconds: most people
  • 3 minutes: a smaller segment that will read
    headings, subheads, illustrations, and opening
    and summary statements
  • 30 minutes: the smallest segment, which will read
    the whole product
  • Guess which of the above applies to OVAE staff?

37
Does Your Message Pass the Joe Six-Pack Test?
  • (i.e., Does your neighbor understand the
    message?)
  • If not, radically oversimplify.
  • If you don't, the media will, and will inevitably
    get it wrong.

38
User's Guide for Creating Report Cards
  • User's Guide for Creating and Formatting State
    and Local Report Cards

39
User's Guide for Creating Report Cards
40
Getting Started: Create a Local Report Card
  • Purpose: What is the purpose of this report card
    (e.g., training or monitoring)?
  • Audience: Who will review this report card? What
    message should this audience take away from the
    report card about program quality?
  • Measures: What combination of measures will
    provide locals with a clear message, given the
    purpose?

41
Getting Started: Create a Local Report Card
  • Rubric: What criteria or rubric will indicate
    the quality of the outcomes programs are
    achieving, or measure their progress?

42
Getting Started: Create a Local Report Card
  • Design
  • Which graphs, tables, text, and pictures will
    most effectively communicate the intended
    message?
  • Dissemination
  • Begin to consider how the report card will be
    disseminated to the audience. How will that
    affect the design, including how data might be
    interpreted by various audiences?

43
Action!
  • Begin by selecting a Local Report Card template
  • Use examples as guides
  • The Report Card Templates tab contains sample NRS
    report card templates for guidance
  • The Other Report Cards tab contains current Adult
    Ed and K-12 examples for ideas.
  • Report out
  • Share the Local Report Card draft
  • Highlight purpose, audience, measures, rubric,
    and design choices.

44
5. Disseminate and Promote Program Improvement
  • Disseminate
  • Develop a distribution plan.
  • Vehicle: meetings, mailings, Web site, etc.
  • Audience: legislators, funding agencies, public,
    etc.
  • Make the report cards accessible.
  • Provide guidance in interpreting the data.
  • Use credible messengers in the community.
  • Avoid defensiveness.
  • Make the report card easy to find on Web sites or
    easy to access by mail or phone.
  • Distribute to legislators and funding agencies.
  • Cultivate relationships with the media, if
    permitted.

45
Questions to Answer Before Disseminating Report
Cards
  • Who gets the report card (state or local)?
  • By what means (distribution channel)?
  • By what date?
  • How will you evaluate the effectiveness:
  • Of the report card? Did it serve the purpose you
    planned or hoped for?
  • Of the distribution channel? Did the message
    reach the intended audience effectively and
    efficiently?

46
Questions to Answer Before Disseminating Report
Cards
  • How will the report cards be packaged?
  • How will staff handle incoming requests for the
    report cards? Will there be a charge for postage?
  • Will the report cards be made available in PDF
    format on your Web page?
  • Who will maintain the Web page?

47
Questions to Answer Before Disseminating Report
Cards
  • Will you arrange for media coverage upon the
    report cards' release?
  • If so, how? What media sources?
  • Will you work through your system's or district's
    public relations/public affairs officer?
  • After the report cards are out there, will you
    have a system to accept input from the audience
    on the quality of the reports and the
    dissemination system?

48
Distribution Channels
  • Direct mail
  • Email
  • Web Pages (Your home page)
  • Newspapers and other print media
  • TV/radio
  • Board meetings and other appropriate community
    meetings
  • Credible spokespersons (representatives of the
    intended audience segment)

49
The Bottom Line in Dissemination
  • Do not rely on any single messenger or
    distribution method.
  • The most effective communications strategy uses
    multiple messengers and channels.

50
5. Disseminate and Promote Program Improvement
(Cont.)
  • Promote Program Improvement
  • Promote use and understanding
  • Opportunities for technical assistance
  • Recognition for good performance

51
Questions for Consideration
  • As you look at your data:
  • What do these data seem to tell you?
  • What do they not tell you?
  • What else do you need/want to know?
  • What good news is here for you to celebrate?
  • What needs for program improvement arise from
    these data?

52
Next Steps?
  • How can we help you?
  • www.nrsweb.org

53
Best Wishes in Your Continuing Quest for
Quality...
  • Thank You and
  • Good Luck!
