Title: Improving Prevention Effectiveness. The Maryland Alcohol and Drug Abuse Administration Annual Management Conference, October 5, 2006
1. Improving Prevention Effectiveness
The Maryland Alcohol and Drug Abuse Administration Annual Management Conference, October 5, 2006
- William B. Hansen, Ph.D.
- Tanglewood Research, Inc.
- Greensboro, NC
2. SAMHSA's Strategic Prevention Framework
Assessment
Evaluation
Capacity
Planning
Implementation
3. SAMHSA's Strategic Prevention Framework
Assess
Select a Strategy
Evaluate
Develop Capacity
Implement
4. What Do Programs Want To Do?
Assess
Select a Strategy
Evaluate
Develop Capacity
Implement
5. What Do Programs Want To Do When They Are Required to Evaluate?
Assess
Select a Strategy
Evaluate
1. The quality of delivery.
2. The effects achieved.
Develop Capacity
Implement
6. What Do We Mean by Quality of Delivery?
- Dosage
- How much?
- How often?
- Adherence
- Was the program delivered as intended?
- Was new content added?
- Was important content deleted or modified?
- Relevance
- Was the program engaging to participants?
- Did the program meet participants' needs?
7. Dosage (1)
- Meta-analysis of 25 SAMHSA model programs.
- Programs that were delivered more frequently generally had larger effects.
8. Dosage (2)
- Same 25 SAMHSA model programs.
- Programs that had more opportunities for contact were generally more effective.
9. All Stars Dosage Tracking
12. Adherence
- An evaluation of Life Skills Training.
- Observers rated the percent of objectives met and lesson points covered.
- High-fidelity classes (>60% adherence) did best.
15. Local Adherence
- Drug Strategies assessed adherence of Life Skills Training implemented in Baltimore.
- Teachers re-taught lessons they had previously delivered.
- Observers rated adherence.
- Teachers implemented 65% of objectives (range 45-100%).
- Teachers implemented 58% of main points (range 38-93%).
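The percent-of-objectives figures above come from observer checklists. A minimal sketch of the scoring arithmetic, with an invented checklist (the items and values are illustrative, not part of the Drug Strategies protocol):

```python
# Illustrative sketch: scoring one observed session from an observer checklist.

def adherence_percent(checklist):
    """Percent of checklist items the observer marked as covered."""
    if not checklist:
        return 0.0
    return 100.0 * sum(checklist) / len(checklist)

# One observed session: True = objective met, False = skipped or modified.
session = [True, True, False, True, False, True, True, True, False, True]
print(round(adherence_percent(session), 1))  # 70.0
```

Averaging such per-session scores across teachers yields summary figures like the 65% of objectives reported above.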
16. Local Adaptation
- All teachers made adaptations.
- There were 3.5 definable adaptations, on average, per observed session (range 1 to 7).
- Overall, 63% of adaptations were judged to be negative.
17. Helpful Adaptations
- The addition of reading material, videos, and testimonials
- Changes in methods to make them more interactive
- Inclusion of examples for cultural relevance or interest
18. Important Correlates of Adherence
- Teachers' understanding of LST (r = .784, p < .01)
- Quality of process (r = .663, p = .03)
- Level of experience (r = .756, p < .01)
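Correlations like those above are ordinary Pearson coefficients between paired teacher scores. A minimal sketch of the computation, using invented data rather than the actual LST evaluation scores:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

# Hypothetical paired scores: understanding rating vs. adherence percent.
understanding = [3, 5, 2, 4, 5, 1]
adherence = [60, 85, 50, 75, 90, 40]
print(round(pearson_r(understanding, adherence), 2))
```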
19. All Stars Approach to Assessing Adherence
20. Relevance
- As part of the SAMHSA program meta-analysis, we coded relevance.
- Three aspects of relevance were significantly correlated with outcomes:
  - Developmental appropriateness (.28)
  - Cultural appropriateness (.27)
  - Engaging to students (.30)
21. All Stars Approach to Assessing Student Engagement
22. Quality of Implementation Summary
- Data are needed to assess:
  - Whether a sufficient dose has been delivered
  - How closely delivery adhered to design
  - The relevance of implementation for participants
- Gathering and reporting data will improve quality of implementation.
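The three kinds of data in the summary above could be collected in one record per delivery. A minimal sketch, assuming illustrative field names and a 1-5 engagement scale (neither is prescribed by the Strategic Prevention Framework):

```python
from dataclasses import dataclass

@dataclass
class SessionQuality:
    """One program delivery's implementation-quality data (illustrative)."""
    sessions_delivered: int   # dosage: how much was delivered
    sessions_planned: int     # dosage: how much was intended
    adherence_pct: float      # percent of objectives/main points covered
    engagement_rating: float  # relevance proxy: rating on a 1-5 scale

    def dosage_pct(self):
        """Dose delivered as a percent of the planned dose."""
        return 100.0 * self.sessions_delivered / self.sessions_planned

record = SessionQuality(sessions_delivered=9, sessions_planned=12,
                        adherence_pct=65.0, engagement_rating=3.8)
print(round(record.dosage_pct(), 1))  # 75.0
```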
23. Outcome Evaluation
- Everybody is afraid of outcome evaluation. Why?
  - No one likes to fail.
  - It is perceived to be mysterious, complex, and expensive.
  - Outcomes are not controllable.
- It shows you are normal.
24. What Are Prevention Goals?
- The goal of prevention is not behavior change but rather:
  - Maintenance of non-behavior (continued non-use)
  - Delay in onset
  - Reduction in the intensity of use
25. How Do Programs Work?
- All programs are based on a logic model.
- The program changes a mediator:
  - Characteristics of the participant (skill, motivation)
  - Characteristics of the social environment
  - Characteristics of the physical environment
- The characteristics targeted for change affect behavior.
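The mediator-to-behavior chain above can be illustrated with a toy model: the program raises a mediator score, and the mediator in turn lowers predicted use. Every number below (effect size, baseline, slope) is invented for illustration only:

```python
# Toy logic-model chain: program -> mediator -> behavior. All values invented.

def apply_program(mediator_score, program_effect=1.0):
    """The program shifts the mediator (e.g., strength of normative beliefs)."""
    return mediator_score + program_effect

def predicted_use(mediator_score, baseline=0.40, slope=-0.05):
    """Higher mediator scores predict a lower probability of use."""
    return max(0.0, baseline + slope * mediator_score)

before = predicted_use(2.0)                # without the program
after = predicted_use(apply_program(2.0))  # with the program
print(round(before, 2), round(after, 2))   # 0.3 0.25
```

Measuring the mediator directly tests the first link in this chain even before behavioral effects appear.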
26. All Stars Logic Model Example
- All Stars Core targets:
  - Lifestyle incongruence (idealism)
  - Normative beliefs
  - Commitment
  - Bonding to school
  - Positive parental attentiveness
27. Mediators in Prevention
- Motivation
  - Attitudes
  - Bonding
  - Beliefs about consequences
  - Commitment
  - Normative beliefs
  - Lifestyle incongruence
- Personal competencies
  - Academic skills
  - Decision-making skills
  - Emotional self-regulation
  - Goal-setting skills
  - Self-esteem
- Social competencies
  - Resistance skills
  - Media literacy
  - Communication skills
  - Social problem-solving skills
  - Social skills
- Environment
  - Availability, access, enforcement
  - Alternatives
  - Classroom management
  - Family management
  - Monitoring and supervision
  - Positive peer affiliations
  - Support and involvement
28. The Easy Part of Outcome Evaluation
- Collecting survey data is easy.
- After more than 30 years of development, there are:
  - Many measures for assessing alcohol, tobacco, and drug use, and the consequences of use.
  - Many measures for assessing mediators (risk and protective factors targeted for change).
29. Sample Outcome Results: All Stars in a Community Setting
32. What Do You Do When Programs Succeed?
Assess
Select a Strategy
Evaluate
Develop Capacity
Implement
33. Results: Looking at Mediators
34. Results of the All Stars Community Trial
35. Sample Local Evaluation Results from a Community Program in MN
36. What Do You Do When You Have Not Yet Succeeded?
Assess
Select a Strategy
Evaluate
Develop Capacity
Implement
37. Working Backwards
Evaluate
Implement
Develop Capacity
38. How Do You Develop Capacity?
- Two predictors of quality implementation:
  - Experience
  - Training
39. Experience Counts!
- Teachers with more experience were:
  - Most adherent when they taught LST (r = .630)
  - More likely to meet objectives (r = .590)
  - More likely to cover major points (r = .756)
  - More likely to make positive adaptations (r = .577)
40. Developing Skill
41. Training Counts, Too!
- Teachers' understanding of LST was strongly correlated with adherence (r = .784).
42. Improving Understanding
- Program-specific training
- Coaching and feedback
- Independent study
43. Program-Specific Training
- Introductory training
- Basics of theory and methods
- Technical assistance
- Help with specific issues and adaptations
- Booster training
- When important questions are asked
- Certification of Mastery
- A process of demonstration and certification
44. Does Training Matter?
- Video Training Project
- Two conditions:
  - 3-hour course without video
  - 3-hour course with video
- Topic: Norm Setting
- Knowledge pretest-posttest survey
- http://www.PreventionABCs.com
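A pretest-posttest knowledge survey like the one described is typically scored as mean gain per condition. A minimal sketch with invented scores (these are not results from the Video Training Project):

```python
from statistics import mean

def mean_gain(pre, post):
    """Average posttest-minus-pretest gain across participants."""
    return mean(b - a for a, b in zip(pre, post))

# Hypothetical knowledge scores for four participants per condition.
no_video_gain = mean_gain([10, 12, 9, 11], [12, 13, 11, 12])
video_gain = mean_gain([10, 11, 9, 12], [14, 15, 13, 15])
print(no_video_gain, video_gain)  # 1.5 3.75
```

Comparing the two gains (with an appropriate significance test for real data) answers whether the video condition improved knowledge more.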
45. All Stars Certification of Mastery
- Basic All Stars facilitator training
- Implement one cycle of All Stars
- Videotape implementation with feedback about mechanisms of delivery
- Videotape implementation with feedback about interactivity
- Implement Strategies for Success
- Implement Parent Intervention
- Complete online course (Prevention ABCs)
- Videotape to demonstrate expert delivery
- Improved student outcomes
46. Just-In-Time Support
- New project
- Life Skills Training
- Emailed helpful hints just before you teach
- Links to a streaming video demonstration
- Recruiting test schools
- lindadusenbury@tanglewood.net
- 1-888-692-8412
47. Conclusion
- I was asked to answer two questions:
  - How can prevention programs use data to improve program effectiveness?
  - How can programs become data-driven?
48. Improving Effectiveness
- How can prevention programs use data to improve program effectiveness?
  - Quality of implementation data
  - Behavioral outcome data
  - Targeted mediating variable data
- All three improve the potential of a program to be implemented with greater rigor.
49. Improving Effectiveness
- How can programs become data-driven?
  - Collect data.
  - Look at the data you have collected.
  - Start with modest expectations.
  - Find meaning in the data.
  - Find alternatives in the data.