Improving Prevention Effectiveness
The Maryland Alcohol and Drug Abuse Administration Annual Management Conference, October 5, 2006
1
Improving Prevention Effectiveness
The Maryland Alcohol and Drug Abuse Administration
Annual Management Conference, October 5, 2006
  • William B. Hansen, Ph.D.
  • Tanglewood Research, Inc.
  • Greensboro, NC

2
SAMHSA's Strategic Prevention Framework
Assessment
Evaluation
Capacity
Planning
Implementation
3
SAMHSA's Strategic Prevention Framework
Assess
Select a Strategy
Evaluate
Develop Capacity
Implement
4
What Do Programs Want To Do?
Assess
Select a Strategy
Evaluate
Develop Capacity
Implement
5
What Do Programs Want To Do
When They Are Required to Evaluate?
Assess
Select a Strategy
Evaluate
  1. The quality of delivery.

2. The effects achieved.
Develop Capacity
Implement
6
What Do We Mean by Quality of Delivery?
  • Dosage
  • How much?
  • How often?
  • Adherence
  • Was the program delivered as intended?
  • Was new content added?
  • Was important content deleted or modified?
  • Relevance
  • Was the program engaging to participants?
  • Did the program meet participants' needs?

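The three dimensions above (dosage, adherence, relevance) can be captured in a simple per-session record. A minimal Python sketch; the `SessionLog` schema and its field names are illustrative assumptions, not part of any published instrument:

```python
from dataclasses import dataclass

# Hypothetical per-session fidelity record covering the three dimensions:
# dosage, adherence, and relevance.
@dataclass
class SessionLog:
    session_number: int
    minutes_delivered: int     # dosage: how much was delivered
    objectives_planned: int    # adherence denominator
    objectives_met: int        # adherence numerator
    engagement_rating: int     # relevance: 1 (low) to 5 (high)

    @property
    def adherence_pct(self) -> float:
        """Percent of planned objectives actually covered."""
        return 100.0 * self.objectives_met / self.objectives_planned

log = SessionLog(session_number=1, minutes_delivered=45,
                 objectives_planned=8, objectives_met=6,
                 engagement_rating=4)
print(round(log.adherence_pct, 1))  # 75.0
```

Aggregating such records across sessions gives the dosage and adherence summaries discussed in the slides that follow.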
7
Dosage 1
  • Meta-analysis of 25 SAMHSA model programs.
  • Programs that were delivered more frequently
    generally had larger effects.

8
Dosage 2
  • Same 25 SAMHSA model programs
  • Programs that had more opportunities for contact
    were generally more effective.

9
All Stars Dosage Tracking
10
All Stars Dosage Tracking
11
All Stars Dosage Tracking
12
Adherence
  • An evaluation of Life Skills Training
  • Observers rated the percent of objectives met and
    lesson points covered
  • High-fidelity classes (>60% adherence) did best.

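The >60% cutoff can be applied mechanically to observer ratings to flag high-fidelity classes. A small sketch; the class names and percentages are invented for illustration:

```python
# Observer-rated adherence (percent of objectives/lesson points covered)
# for a few classes; the values are made-up examples.
observed_adherence = {"class_a": 72.0, "class_b": 55.0, "class_c": 64.0}

HIGH_FIDELITY_CUTOFF = 60.0  # percent, per the LST evaluation finding

# Classes above the cutoff count as high-fidelity implementations.
high_fidelity = sorted(name for name, pct in observed_adherence.items()
                       if pct > HIGH_FIDELITY_CUTOFF)
print(high_fidelity)  # ['class_a', 'class_c']
```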
13
Adherence
  • An evaluation of Life Skills Training
  • Observers rated the percent of objectives met and
    lesson points covered
  • High-fidelity classes (>60% adherence) did best.

14
Adherence
  • An evaluation of Life Skills Training
  • Observers rated the percent of objectives met and
    lesson points covered
  • High-fidelity classes (>60% adherence) did best.

15
Local Adherence
  • Drug Strategies assessed adherence of Life Skills
    Training implemented in Baltimore.
  • Teachers re-taught lessons they had previously
    delivered.
  • Observers rated adherence.
  • Teachers implemented 65% of objectives (range
    45-100%).
  • Teachers implemented 58% of main points (range
    38-93%).

16
Local Adaptation
  • All teachers made adaptations
  • 3.5 definable adaptations, on average, per
    observed session (range 1 to 7)
  • Overall, 63% of adaptations were judged to be
    negative

17
Helpful Adaptations
  • The addition of reading material, videos, and
    testimonials
  • Changes in methods to make them more interactive
  • Inclusion of examples for cultural relevance or
    interest

18
Important Correlates of Adherence
  • Teachers' Understanding of LST
  • (r = .784, p < .01)
  • Quality of Process
  • (r = .663, p = .03)
  • Level of Experience
  • (r = .756, p < .01)

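Correlates like these are ordinary Pearson coefficients, which can be computed directly from paired ratings. A self-contained sketch; the understanding and adherence scores below are invented, not the study's data:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical ratings: teacher understanding (1-5) vs. percent adherence.
understanding = [3, 5, 2, 4, 5]
adherence = [60, 85, 50, 70, 90]
r = pearson_r(understanding, adherence)  # strongly positive, near 1
```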
19
All Stars Approach to Assessing Adherence
20
Relevance
  • As part of the SAMHSA program meta-analysis, we
    coded relevance.
  • Three aspects of relevance were significantly
    correlated with outcomes

Developmental Appropriateness r = .28
Cultural Appropriateness r = .27
Engaging to Students r = .30
21
All Stars Approach to Assessing Student Engagement
22
Quality of Implementation Summary
  • Data are needed to assess
  • Whether a sufficient dose has been delivered
  • How closely delivery adhered to design
  • The relevance of implementation for participants
  • Gathering and reporting data will improve quality
    of implementation.

23
Outcome Evaluation
  • Everybody is afraid of outcome evaluation.
  • Why?
  • No one likes to fail.
  • It is perceived to be mysterious, complex, and
    expensive.
  • Outcomes are not controllable.
  • It shows you are normal.

24
What Are Prevention Goals?
  • The goal of prevention is not behavior change but
    one of the following
  • Maintaining non-behavior (continued non-use)
  • Delaying the onset of use
  • Reducing the intensity of use

25
How Do Programs Work?
  • All programs are based on a logic model.
  • The program changes a mediator
  • Characteristics of the participant (skill,
    motivation)
  • Characteristics of the social environment
  • Characteristics of the physical environment
  • Characteristics targeted for change affect
    behavior.

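The logic-model idea can be made concrete with a toy two-step calculation: the program's effect on the mediator (path a) times the mediator's effect on behavior (path b) gives the indirect effect. A sketch with invented data; a real mediation analysis would add covariates and significance tests:

```python
def slope(xs, ys):
    """OLS slope of ys regressed on xs (covariance / variance)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

program = [0, 0, 0, 1, 1, 1]    # 0 = comparison, 1 = received program
mediator = [2, 3, 2, 4, 5, 4]   # e.g. a normative-beliefs score
behavior = [5, 4, 5, 2, 1, 2]   # e.g. an intention-to-use score

a = slope(program, mediator)    # program raises the mediator
b = slope(mediator, behavior)   # higher mediator, lower use intention
indirect_effect = a * b         # negative: program reduces intentions
```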
26
All Stars Logic Model Example
  • All Stars Core targets
  • Lifestyle incongruence (idealism)
  • Normative beliefs
  • Commitment
  • Bonding to school
  • Positive parental attentiveness

27
Mediators in Prevention
  • Motivation
  • Attitudes
  • Bonding
  • Beliefs about consequences
  • Commitment
  • Normative beliefs
  • Lifestyle incongruence
  • Personal Competencies
  • Academic skills
  • Decision-making skills
  • Emotional self-regulation
  • Goal setting skills
  • Self-esteem
  • Social Competencies
  • Resistance skills
  • Media literacy
  • Communication skills
  • Social problem solving skills
  • Social skills
  • Environment
  • Availability, access, enforcement
  • Alternatives
  • Classroom management
  • Family management
  • Monitoring and supervision
  • Positive peer affiliations
  • Support and involvement

28
The Easy Part of Outcome Evaluation
  • Collecting survey data is easy.
  • After over 30 years of development, there are
  • Many measures for assessing alcohol, tobacco, and
    drug use, and the consequences of use.
  • Many measures for assessing mediators (risk and
    protective factors targeted for change).

29
Sample Outcome Results: All Stars in a Community
Setting
30
Sample Outcome Results: All Stars in a Community
Setting
31
Sample Outcome Results: All Stars in a Community
Setting
32
What Do You Do When Programs Succeed?
Assess
Select a Strategy
Evaluate
Develop Capacity
Implement
33
Results Looking at Mediators
34
Results of the All Stars Community Trial
35
Sample Local Evaluation Results from a Community
Program in MN
36
What Do You Do When You Have Not Yet Succeeded?
Assess
Select a Strategy
Evaluate
Develop Capacity
Implement
37
Working Backwards
Evaluate
Implement
Develop Capacity
38
How To Develop Capacity?
  • Two predictors of quality implementation
  • Experience
  • Training

39
Experience Counts!
  • Teachers with more experience were
  • Most adherent when they taught LST
  • (r = .630)
  • More likely to meet objectives
  • (r = .590)
  • More likely to cover major points
  • (r = .756)
  • More likely to make positive adaptations
  • (r = .577)

40
Developing Skill
41
Training Counts, Too!
  • Teachers' understanding of LST was strongly
    correlated with adherence (r = .784).

42
Improving Understanding
  • Program-specific training
  • Coaching and feedback
  • Independent study

43
Program-Specific Training
  • Introductory training
  • Basics of theory and methods
  • Technical assistance
  • Help with specific issues and adaptations
  • Booster training
  • When important questions are asked
  • Certification of Mastery
  • A process of demonstration and certification

44
Does Training Matter?
  • Video Training Project
  • Two conditions
  • 3-hour course without video
  • 3-hour course with video
  • Topic: Norm Setting
  • Knowledge pretest-posttest survey

http://www.PreventionABCs.com
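The pretest-posttest design above reduces to comparing average knowledge gains across the two training conditions. A sketch with invented scores (the project's actual results are not shown here):

```python
# (pretest, posttest) knowledge scores per teacher, by training condition.
# All numbers are hypothetical illustrations.
scores = {
    "no_video":   [(60, 70), (55, 65), (50, 58)],
    "with_video": [(58, 78), (62, 80), (54, 75)],
}

def mean_gain(pairs):
    """Average posttest-minus-pretest gain for one condition."""
    return sum(post - pre for pre, post in pairs) / len(pairs)

gains = {cond: mean_gain(pairs) for cond, pairs in scores.items()}
# In this toy data the video condition shows the larger mean gain.
```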
45
All Stars Certification of Mastery
  1. Basic All Stars facilitator training
  2. Implement one cycle of All Stars
  3. Videotape implementation with feedback about
    mechanisms of delivery
  4. Videotape implementation with feedback about
    interactivity
  5. Implement Strategies for Success
  6. Implement Parent Intervention
  7. Complete online course (Prevention ABCs)
  8. Videotape to demonstrate expert delivery
  9. Improved student outcomes

46
Just-In-Time Support
  • New project
  • Life Skills Training
  • Emailed helpful hints just before you teach
  • Links to a streaming video demonstration
  • Recruiting test schools
  • lindadusenbury@tanglewood.net
  • 1-888-692-8412

47
Conclusion
  • I was asked to answer two questions
  • How can prevention programs use data to improve
    program effectiveness?
  • How can programs become data-driven?

48
Improving Effectiveness
  • How can prevention programs use data to improve
    program effectiveness?
  • Quality of implementation data
  • Behavioral outcome data
  • Targeted mediating variable data
  • All improve the potential of a program to be
    implemented with greater rigor

49
Improving Effectiveness
  • How can programs become data-driven?
  • Collect data
  • Look at the data you have collected
  • Start with modest expectations
  • Find meaning in the data
  • Find alternatives in the data