Transcript and Presenter's Notes

Title: Developmental Disabilities Program Independent Evaluation (DDPIE) Project


1
Developmental Disabilities Program Independent
Evaluation (DDPIE) Project
  • UCEDD Meeting Technical Assistance Institute
  • May 31, 2007
  • Lynn Elinson, Ph.D.
  • Project Director

2
Developmental Disabilities Program Independent
Evaluation (DDPIE) Project
  • Also known as ADD Independent Evaluation

3
Purpose of PowerPoint
  • To understand the background and progress of the
    ADD independent evaluation
  • To obtain background and context for giving
    feedback on ADD independent evaluation materials

4
PowerPoint Outline
  • 1. Background of ADD Independent Evaluation
    • A. Purpose of the DDPIE Project
    • B. Challenges
  • 2. Research design
  • 3. Project implementation
    • A. Overview
    • B. Project activities
    • C. Evaluation tools
    • D. Validation
  • 4. Seeking individualized input
  • 5. Progress and timing

5
1. Background
6
A. Purpose of the DDPIE Project
  • Demonstrate impact of DD Network programs on
  • - Individuals
  • - Families
  • - Service providers
  • - State systems
  • Provide feedback to ADD to help improve the
    effectiveness of its programs and policies
  • Promote positive achievements of DD Network
    programs by storytelling
  • Promote accountability to the public

7
Why the independent evaluation?
  • In 2003, ADD conducted a Program Assessment Rating
    Tool (PART) self-assessment under OMB guidance.
  • PART is a series of questions designed to provide
    a consistent approach to rating programs across
    the Federal Government.
  • PART has four parts: (1) Program Purpose and
    Design, (2) Strategic Planning, (3) Program
    Management, and (4) Program Results.
  • PART 4 asks whether an agency has conducted an
    independent evaluation of sufficient scope and
    quality to indicate that the program is effective
    and achieving results.
  • ADD answered no, which lowered its overall score.

8
Challenges
  • Each UCEDD program is unique.
  • The challenge is to develop performance standards
    that
  • - are relevant to all UCEDD programs,
  • - capture the differences among the programs
    (variability), and
  • - will be useful to ADD in demonstrating impact.

9
2. Research design
10
Design Considerations
  • PART prefers experimental or quasi-experimental
    research designs
  • The structure of the ADD programs does not lend
    itself to conducting randomized trials or pre-
    and post-tests.

11
Research Design: Standards-Based Evaluation
  • NOT a randomized controlled trial or
    quasi-experimental design
  • IS a standards-based evaluation to
  • - Set national standards
  • - Determine levels that characterize the extent to
    which national standards are being met
  • - Determine the impact DD Network programs (and
    collaboration among programs) are having on
    people with developmental disabilities, family
    members, State systems, and service providers

12
Reporting at national level
  • Data will be collected on individual programs and
    rolled up to the national level (a minimal roll-up
    sketch follows below).
  • The independent evaluation will NOT compare
    programs to one another.
  • The independent evaluation will NOT replace MTARS,
    which is specific to individual programs.
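
To make "rolled up" concrete, here is a minimal sketch, assuming hypothetical program names and performance levels (none of which come from the presentation), of how program-level results could be summarized nationally without ranking programs against one another:

    from collections import Counter

    # Hypothetical program-level results; the names and levels are invented
    # for illustration and are not actual DDPIE data.
    program_levels = {
        "State A UCEDD": "high",
        "State B UCEDD": "medium",
        "State C UCEDD": "high",
    }

    # The national report would summarize counts by performance level only;
    # no program-to-program comparison or ranking is produced.
    national_summary = Counter(program_levels.values())
    print(dict(national_summary))  # {'high': 2, 'medium': 1}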

13
2 Types of Standards
  • Evidence-based
  • Consensus-based
  • Performance standards for DDPIE are
    consensus-based
  • Performance standards will be developed for each
    DD Network program and for collaboration among the
    three DD Network programs

14
Key assumptions for designing performance
standards
  • State programs vary in their level of performance
    across the standards.
  • Consistently high performance across the
    standards is related to better outcomes.
  • Consistently low performance across the standards
    is related to poor outcomes.

15
Research design seeks input and participation
from stakeholders
  • Seeks input from
  • - Project Advisory Panel
  • - DD Network Program Working Groups
  • - All State programs
  • - Validation Panels
  • - The public

16
Role of Advisory Panel
  • To provide balance, impartiality, and expertise
  • To provide advice on
  • - DDPIE process
  • - Benchmarks, indicators, performance standards,
    and performance levels
  • - Data collection protocols
  • - Pilot study
  • - Synthesis of findings and recommendations

17
Composition of Advisory Panel
  • Self-advocates
  • Family members
  • Representatives from the 3 programs: Richard Carroll
    from the Arizona UCEDD
  • Child/disability advocates
  • Evaluation expert
  • Federal representative (for PAIMI evaluation)

18
Working Groups
  • 4 Working Groups (PA, UCEDD, DD Council,
    Collaboration)
  • Process: In-person and telephone meetings
  • Role:
  • - To assist Westat in understanding the programs
  • - To provide feedback on benchmarks,
    indicators, and performance standards

19
UCEDD Working Group members
Carl Calkins (Kansas City, MO)
Tawara Goode (Washington, DC)
Gloria Krahn (Portland, OR)
David Mank (Bloomington, IN)
Fred Orelove (Richmond, VA)
Fred Palmer (Memphis, TN)
Lucille Zeph (Orono, ME)
Collaboration Working Group
20
3. Project implementation
21
A. Overview
22
Phases of DDPIE Project
  • DDPIE will be conducted in 2 phases.
  • - Phase 1: development and testing of
    evaluation tools (measurement matrices and data
    collection protocols)
  • - Phase 2: full-scale evaluation
  • Westat was contracted by ADD to implement
    Phase 1.
  • - Project began: September 30, 2005
  • - End of contract: September 29, 2008
  • Phase 2 will be funded upon completion of
    Phase 1.

23
B. Project activities
24
Steps in Phase 1
  • Construct evaluation tools (measurement matrices
    and data collection protocols) that contain
    performance standards and performance levels
  • Conduct Pilot Study to test evaluation tools
    (measurement matrices and data collection
    protocols)
  • Revise evaluation tools

25
C. Evaluation tools
26
2 types of evaluation tools
  • Measurement matrices, which include
  • - Key functions, benchmarks, indicators,
    performance standards
  • - Performance levels
  • Data collection protocols

27
Definitions of key terms in measurement matrices
  • Key functions
  • Benchmarks
  • Indicators
  • Performance standards
  • - Outcome performance standards
  • - Program performance standards

28
Logic model/format for measurement matrices
Key Functions → Benchmarks → Indicators → Performance Standards
(logic-model diagram on the slide; hierarchy sketched below)
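
As a way to picture that hierarchy, the minimal sketch below represents one slice of a measurement matrix as nested records; the class names, fields, and example wording are illustrative assumptions, not the actual DDPIE instruments.

    # Hypothetical representation of the measurement-matrix hierarchy;
    # all names and example text are invented for illustration.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Indicator:
        text: str
        kind: str  # "outcome", "output", "process", or "structural"

    @dataclass
    class Benchmark:
        text: str
        indicators: List[Indicator] = field(default_factory=list)
        performance_standards: List[str] = field(default_factory=list)

    @dataclass
    class KeyFunction:
        name: str
        benchmarks: List[Benchmark] = field(default_factory=list)

    # One illustrative entry (wording invented for the example)
    training = KeyFunction(
        name="Interdisciplinary pre-service training and continuing education",
        benchmarks=[
            Benchmark(
                text="Trainees demonstrate competencies relevant to the DD Act",
                indicators=[
                    Indicator("Number of trainees completing a program", "output"),
                ],
                performance_standards=[
                    "Outcome standard: the expected training outcomes are met",
                ],
            )
        ],
    )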
29
Key Functions
  • Groups of activities carried out by DD Network
    programs
  • Cover all aspects of program activity
  • 5 UCEDD key functions
  • The first four key functions were identified by the
    Working Group (core functions in the DD Act)
  • Governance and Management: relevant to the other
    four key functions
  • Benchmarks, indicators, and performance standards
    are being developed for all key functions.

30
UCEDD Key Functions
  1. Interdisciplinary pre-service training and
    continuing education
  2. Conduct of basic and/or applied research
  3. Provision of community services
  4. Dissemination of information
  5. Governance and management

31
Benchmarks
  • Broad, general statements
  • Set bar for meeting expected outcome(s) of each
    key function
  • About 20 UCEDD benchmarks
  • 3-4 benchmarks for each key function

32
Indicators
  • Identify what gets measured to determine extent
    to which benchmarks and performance standards are
    being met
  • 4 types of indicators: outcome, output, process,
    structural
  • Will guide the development of data collection
    instruments

33
Performance standards
  • Criterion-referenced (measurable)
  • Consensus-based
  • 2 types
  • - Outcome performance standards
  • - Program performance standards

34
Outcome performance standards
  • Linked to expected outcomes of each key function
  • Answer the questions:
  • - Were the expected outcomes met?
  • - To what extent?

35
Program performance standards
  • What the program should achieve, have, and do to
    effectively
  • - meet the principles and goals of the DD Act
    and
  • - have an impact on people with developmental
    disabilities, family members, State systems,
    service providers

36
Program performance standards (continued)
  • Linked to the structures, processes, and outputs
    of UCEDD program
  • Answer the questions:
  • - What structures should be in place to carry
    out UCEDD network key functions? What should
    they be like?
  • - What processes should be used? What should
    they be like?
  • - What should the UCEDD network produce? What
    should products be like? To what extent should
    they be produced (e.g., how often, how many)?

37
D. Validation
38
Overview of validation
  • There is no gold standard for an effective
    UCEDD, so another approach needs to be used to
    identify performance standards.
  • The ADD independent evaluation uses a consensus
    approach.
  • This implies participation in the process and
    validation from a wide variety of stakeholders.
  • There will be several opportunities for
    validation throughout the development of
    performance standards.
  • Stakeholders hold a variety of perspectives and,
    therefore, may not always agree with one another.

39
Validation approach for DDPIE project
  • Consists of obtaining input, feedback, and
    consensus
  • Consists of validating measurement matrices
    (indicators and performance standards) and data
    collection instruments
  • Is a multi-step process
  • Provides validation opportunities to several
    types of stakeholders (e.g., consumers, family
    members, program representatives, advocates,
    evaluation experts)
  • Provides opportunities for validation at
    different points in the process

40
Opportunities for validation
  • Working Group process
  • Advisory Panel meetings
  • State programs (at TA meetings, by telephone, in
    writing)
  • Validation Panel process
  • OMB process
  • Pre-test and pilot study

41
Validation Panels
  • There will be 4 Validation Panels (UCEDDs, PAs,
    DD Councils, Collaboration).
  • Process:
  • - Telephone call orientation
  • - Paper approach (not face-to-face);
    accommodation will be provided
  • - Opportunity for discussion by telephone

42
Criteria for Validation Panel selection
  • Stakeholder groups (e.g., people with
    developmental disabilities, family members,
    advocates, programs, service providers)
  • Researchers

43
Criteria for Validation Panel selection
(continued)
  • Understands consumer needs
  • Understands DD Network programs
  • Diverse composition (gender, race/ethnicity)
  • Mix of junior and senior program staff
  • Urban and rural representation

44
Focus of Validation Panel process
  • Will achieve consensus
  • Formal process
  • Builds in objective methodology (e.g., criteria
    for accepting or eliminating indicators and
    performance standards; see the hypothetical
    sketch below)
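
The presentation does not spell out those criteria, so the following is only a hypothetical sketch of one objective acceptance rule: tally panel ratings for each indicator and keep it only if a fixed share of panelists rates it as both important and feasible. The indicators, votes, and threshold are all assumptions for illustration.

    # Hypothetical acceptance rule for Validation Panel ratings; the data,
    # threshold, and rule are invented for illustration only.
    ratings = {
        # indicator: list of (important, feasible) votes from panelists
        "Number of trainees completing a program": [(True, True), (True, True), (True, False)],
        "Count of website hits": [(False, True), (False, True), (True, True)],
    }

    THRESHOLD = 2 / 3  # accept only if at least two-thirds rate it important AND feasible

    def accept(votes):
        agree = sum(1 for important, feasible in votes if important and feasible)
        return agree / len(votes) >= THRESHOLD

    for indicator, votes in ratings.items():
        print(indicator, "->", "accept" if accept(votes) else "eliminate")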

45
OMB approval process is another form of validation
  • OMB approval process results from the Paperwork
    Reduction Act
  • The Act is administered by the Office of Management
    and Budget (OMB)
  • The purpose of the Act is to ensure that information
    collections from the public minimize burden and
    maximize public utility
  • All Federal agencies must comply

46
OMB approval process (continued)
  • When contemplating data collection from the
    public, Federal agencies must seek approval from
    OMB.
  • Agencies must submit an OMB package consisting of a
    description of the study and data collection effort,
    an estimate of burden, and the data collection
    instruments.
  • The approval process consists of making the data
    collection instruments available for public
    comment in the Federal Register.
  • ADD will be submitting an OMB package; all
    interested parties will have an opportunity to
    comment during the public comment period.

47
Pre-test and Pilot Study: additional form of
validation
  • Data collection protocols will be pre-tested in
    one state.
  • A pilot study will be conducted in up to 4
    states.
  • Pilot study states will be chosen randomly.
  • Pilot study will test reliability and validity of
    measurement matrices and feasibility of data
    collection.

48
4. Seeking individualized input
49
Opportunities for individualized input
  • UCEDD TA meeting (May 31, 2007)
  • - Distribution of draft benchmarks, indicators,
    and a few examples of performance standards
  • - Small group discussions facilitated by AUCD
  • Telephone meetings scheduled in June and July
  • In writing

50
Small Group Discussions at UCEDD Technical
Assistance Meeting (May 31, 2007)
  • Westat will
  • - Distribute draft performance standards on
    UCEDD Network and Collaboration
  • - Review organization of materials
  • - Describe feedback process for individual UCEDD
    programs
  • - Answer questions on process for feedback
  • UCEDD programs will
  • - Continue to meet in small groups to discuss
    the materials (facilitated by AUCD)
  • - Report out in a large group on first
    impressions

51
Type of Input Sought
  • Benchmarks and indicators: Are they the concepts
    that need to be addressed?
  • Benchmarks and performance standards: Do they
    represent what the programs should be
    achieving/should have/should do in order to be
    effective in meeting the principles and goals of
    the DD Act and have an impact on people with
    developmental disabilities, families, State
    systems, and service providers?
  • Indicators: Which seem the most important and
    feasible to measure? Which could be eliminated?
  • If not these, then what?

52
5. Progress and Timing
53
Progress to Date
  • Meetings with ADD, heads of national associations,
    and TA contractors: November 2006
  • Site visit to programs in one state: December 2006
  • Review of background materials (provided by ADD,
    Working Groups, national websites, other):
    October 2005 to February 2007
  • Meetings with Working Groups: March 2006 to
    September 2006
  • Meetings with Advisory Panel: March 2006,
    October 2006, March 2007
  • Synthesis of all information by Westat:
    September 2006 to February 2007
  • Draft benchmarks, indicators, and performance
    standards: February 2007

54
Upcoming DDPIE Project Milestones
Feedback from UCEDD Working Group: April to May 2007
UCEDD TA meeting: May 31, 2007
Feedback from all UCEDD programs: June to July 2007
UCEDD Validation Panel: Sept. to Dec. 2007
DD Council Validation Panel: Oct. to Jan. 2008
PA Validation Panel: Nov. to Feb. 2008
Collaboration Validation Panel: Feb. to April 2008
55
DDPIE Project Milestones (continued)
Data collection instruments: June 2008
Measurement matrices: July 2008
Final report (with evaluation tools): Sept. 2008
OMB Comment Period
Pilot Study: New contract