Title: Contributing to the Research Base on Abstinence Education: Making the Most of the Upcoming Program Evaluation Year
1. Contributing to the Research Base on Abstinence Education: Making the Most of the Upcoming Program Evaluation Year
- Dennis McBride, Ph.D.
- Stan Weed, Ph.D.
- Harry Piotrowski, M.S.
- Olivia Silber Ashley, Dr.P.H.
October 15, 2009
2. Contributing to the Research Base on Abstinence Education: Making the Most of the Upcoming Evaluation Year
- Presented by Dennis McBride, Ph.D.
- October 15, 2009
3. Refocusing the Evaluation
- Consider limiting the program components to evaluate
- Consider limiting the length of time to follow clients
- Consider modifying your design
- Be constructive
- Be critical
4. Consider limiting program components to evaluate
- If data collection is to continue, ensure that ample time is given to obtain usable data for analysis. This may mean refocusing your evaluation and eliminating some of its components.
5. Consider limiting the length of time to follow clients
- If you are early in or midway through your project, focus efforts on your process evaluation (outputs) and short-term outcomes, while offering supportive links (via the literature) to intermediate and longer-term outcomes.
6. Consider modifying your design
- If you are in the middle of data collection and will have smaller sample sizes than anticipated, consider augmenting your design with a manageable qualitative assessment, such as focus groups or in-depth case histories.
7. Be Constructive
- Construct a lessons-learned document that can inform the field on the dos and don'ts of implementing abstinence education projects and their evaluations.
8. Be Critical
- The evaluations for these projects, like most evaluations of these types of programs, generally have weak designs. Based upon your experiences, include in your report the challenges you have had in constructing more rigorous designs.
- Was the support that you received from the funder adequate?
- Is the current approach to evaluating these programs sufficient, or should other approaches be considered?
9. Contributing to the Research Base on Abstinence Education: Making the Most of the Upcoming Evaluation Year
- Presented by Stan Weed, Ph.D.
- October 15, 2009
10. Webinar Focus
- This webinar addresses the question: What might be the best use of evaluation resources if 2010 ends up being the final year of CBAE funding?
- The answer to this question really depends on a program's current status and history.
11. Program Status
- You may be a fairly new program, recently funded, and still getting your feet on the ground.
- You may have been around for a while, but your data to date have not shown program impact, and you want to rethink your program strategy and approach.
- You may have introduced new components to your program and want to know how they are being received and to what extent they are making an impact.
12. Qualitative Evaluation Strategy
- These examples lead to an evaluation strategy that relies not only on pre-post tests, which target general impact, but also on formative and qualitative methods that target the more specific and early indicators of program potential.
13. Getting to the First Step
- The first step is for students to see the learning experience as helpful, meaningful, relevant, and interesting.
- If this happens, we can expect to see program impact on short-term and intermediate outcomes that should lead to behavioral changes:
- Retained knowledge
- Values
- Attitudes
- Behavioral intentions
14. Qualitative Research Methods
- The early "first step" questions can be answered through formative and qualitative research methods: interviews, focus groups, observation, etc.
- These methods are typically underutilized in the field, in part because of the common belief that employing a questionnaire is sufficient.
- Quantitative data, when coming from a rigorous research design, carry more weight.
- Pre- and post-testing with change score analysis can answer important questions, but it should not be the only arrow in the evaluation quiver.
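As a minimal sketch of the change score analysis mentioned above, the example below computes the mean pre-to-post change and a paired t statistic. All scores here are invented for illustration; a real evaluation would use validated instruments and larger samples.

```python
import math

def paired_t_test(pre, post):
    """Paired t-test on pre/post scores; returns (mean change, t statistic)."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance of changes
    t = mean / math.sqrt(var / n)
    return mean, t

# Hypothetical knowledge scores for 8 students before and after instruction.
pre = [10, 12, 9, 14, 11, 10, 13, 12]
post = [13, 14, 10, 16, 13, 12, 15, 13]
mean_change, t_stat = paired_t_test(pre, post)
```

A large t statistic would suggest the average change is unlikely to be chance alone, which is exactly the kind of general-impact question the slide notes a pre-post design can (and cannot) answer on its own.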
15. Ask the Early Questions
- 1. Target Population
- What is the population that I am serving?
- What do they think and do regarding sexual activity?
- What basis do I have for answering these questions if I have not tapped into the thoughts of my target population?
- How many untested assumptions have I made about the adolescents in my target group?
16. Ask the Early Questions
- 2. Which messages or concepts will be most relevant and useful to my target group? Why?
- Are the individual components/activities/messages of the intervention well-suited to the target group?
- Do they speak to the audience in a compelling, relevant way?
- Do they engage the audience, persuade them, move them?
- Are some components more compelling, more engaging, more persuasive, more relevant than others? How do you know?
17. Ask the Early Questions
- 3. Are the features of program implementation well designed?
- Is there a practical, doable procedure in place to reach the target audience as intended and to ensure the integrity of the program?
- What is the minimum level of implementation required to produce an expected program outcome? Is that level of implementation being reached?
18. Ask the Early Questions
- 4. What other competing forces, events, and exposures are occurring in youths' lives concurrent with the intervention?
- Other health and sex education programs?
- Media messages?
- What is the content, duration, recency, and intensity of that exposure?
- These questions can be addressed early and quickly with formative and qualitative methods.
19. Ask the Early Questions
- 5. If my short-term and intermediate results leave me disappointed (assuming I have measured those that are known predictors of risk behavior), how can I identify the reasons for this shortcoming?
- Don't assume you know the reasons unless you have compelling evidence to support your premise. Formative and qualitative methods can provide a richer and deeper look into the classroom experience of students.
20. Ask the Early Questions
- 6. I have added a new component to my program. How do I know whether it is working?
- Utilize the same formative and qualitative methods mentioned earlier to determine whether you are getting results.
21. Contributing to the Research Base on Abstinence Education: Making the Most of the Upcoming Evaluation Year
- Presented by Harry Piotrowski, M.S.
- October 15, 2009
22. Objectives
- Increase knowledge about possible natural comparison groups and explore how they might fit your own CBAE evaluation design.
- Explore how comparison groups can be derived or formed within your pre-post abstinence classroom instruction evaluation design, thereby improving your ability to make causal statements about impact.
23. Terminology
- Experimental Design
- Quasi-Experimental Design
- Longitudinal (changes in individuals, matched whenever possible)
- Cross-Sectional (differences between treatment/no-treatment groups)
24. Experimental Design
- Youth randomly assigned to abstinence classes or no abstinence classes
- Classes randomly assigned to abstinence or no abstinence instruction
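Random assignment at the individual level, as described above, can be sketched in a few lines. The roster names and the seed below are hypothetical; in practice the same idea applies to assigning whole classes.

```python
import random

def randomize(students, seed=2009):
    """Shuffle the roster and split it in half: treatment vs. control."""
    rng = random.Random(seed)       # fixed seed so the assignment is reproducible
    pool = list(students)
    rng.shuffle(pool)
    half = len(pool) // 2
    return pool[:half], pool[half:]  # (abstinence instruction, no instruction)

# Hypothetical roster of 20 students.
roster = [f"student_{i}" for i in range(20)]
treatment, control = randomize(roster)
```

Because every student has the same chance of landing in either group, differences observed later can be attributed to the instruction rather than to pre-existing group differences.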
25. Quasi-Experimental Design
- Matched comparison group (matched on age, gender, ethnicity, or STI rate, at the individual, school, or community level)
- Same school
- Different schools
- Different city
- Different state
26. Scenario One: Mann Elementary School
- 2-year AE program; pre-test, post-test, follow-up design
- Cohort 1: 6th Grade Instruction Group, Oct 2008
- Pre-Test
- Post-Test
- Follow-up at the beginning of the next year, 7th grade
- Cohort 2: 7th Grade Instruction Group, Oct 2008
- Pre-Test
- Post-Test
- Follow-up at the beginning of the next year, 8th grade
27. Scenario Two: Mann Elementary School
- 3-year program; pre-test, post-test, follow-up design
- Cohort 1: 6th Grade Instruction Group, Oct 2008
- Pre-Test
- Post-Test
- Follow-up at the beginning of the next year, 7th grade
- Cohort 2: 7th Grade Instruction Group, Oct 2009
- Pre-Test
- Post-Test
- Follow-up at the beginning of the next year, 8th grade
- Cohort 3: 8th Grade Instruction Group, Oct 2010
- Pre-Test
- Post-Test
- Follow-up at the beginning of high school, 9th grade
28. Scenario Three: Instruction Provided to High School Freshmen
- Feeder schools include Mann Elementary School
- Pre-test all students before they receive instruction
- Longitudinal: the test is a follow-up for those who were taught 1, 2, or 3 years in elementary school
- Cross-sectional: the test is a comparison of students who were or were not taught in previous years
29. References
- http://www.socialresearchmethods.net/kb/quasioth.php
- http://www.ecs.org/html/educationIssues/Research/primer/appendixA.asp
30. Contributing to the Research Base on Abstinence Education: Making the Most of the Upcoming Evaluation Year
- Presented by Olivia Silber Ashley, Dr.P.H.
- October 15, 2009
31. Importance of Disseminating Your Findings Beyond a Final Report
- "The evaluation is not completed until you have written up your findings and sent them to a journal." - Karl Bauman
32. Change Your Focus to Analysis and Writing
- Consider switching gears to focus on analysis and dissemination of what you already have, rather than continuing to collect data with no time to analyze and disseminate it.
33. Analysis
- Use your evaluation funding to bring on a statistician to conduct the most rigorous outcome evaluation analyses possible, to maximize the chances of publishing in a peer-reviewed journal
- Analyze the psychometric properties of measures and use these results to inform outcome measure selection
- Use state-of-the-art techniques to address missing data
- If you have a comparison group:
- Test for baseline differences between groups
- Conduct multivariate analyses controlling for baseline differences
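A test for baseline differences between groups, as suggested above, can be as simple as a two-sample (Welch) t statistic on pre-test scores. The scores below are hypothetical, purely to show the shape of the calculation:

```python
import math

def welch_t(a, b):
    """Welch two-sample t statistic comparing two group means."""
    def mean_var(x):
        m = sum(x) / len(x)
        v = sum((xi - m) ** 2 for xi in x) / (len(x) - 1)  # sample variance
        return m, v
    ma, va = mean_var(a)
    mb, vb = mean_var(b)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

# Hypothetical baseline (pre-test) scores for program and comparison groups.
program = [11, 13, 12, 10, 14, 12]
comparison = [10, 12, 11, 11, 13, 12]
t_baseline = welch_t(program, comparison)  # a value near zero suggests comparable groups
```

If a meaningful baseline difference does appear, the multivariate follow-up analysis the slide recommends would include the baseline score as a covariate rather than ignoring it.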
34. Writing Up Your Findings
- "Writing is easy. All you do is stare at a blank sheet of paper until drops of blood form on your forehead." - Gene Fowler
35. Tips for Publishing from a Journal Editor
- Think small
- Think simple
- Don't strive for perfection
- Source: Thyer, B.A. (2009). Moving ahead with publishing Adolescent Family Life projects. Webinar presented to Office of Adolescent Pregnancy Programs Prevention and Care Grantees.
36. Assess Your Staffing Needs
- Make sure at least one person on the team has experience writing journal articles
37. Decide on the Focus of Your Article
- Identify gaps in the literature
- What has been done?
- What can you offer that is unique?
- If writing a qualitative or quantitative results paper:
- Complete analyses
- Summarize key findings in simple, plain English
- Share with program participants/stakeholders
- Consider what is unique about your findings
- Statistically significant results
- New variables
- New population
- Unexpected null findings
- Revisit your research questions/aims and the literature to make sure you have something unique and justifiable
38. Select an Appropriate Journal
- Select an appropriate target journal early in the process
- Read about each journal's purpose and mission
- Read some of its articles to assess fit
- Select based on your evaluation's limitations
- Find a model article in the selected journal that looks similar to your design
- Quasi-experiment
- No comparison group
- Non-behavioral measures
- Attrition rate
39. Option: Describe the Program
- Target population, and why the focus is on this population
- Theoretical framework, conceptual model
- Formative research
- Program components
- Topics and activities
- Implementation characteristics
- Delivery characteristics
- Participation rates
- Fidelity assessment
- Program cost
- Breakdown of costs to implement over a specified period of time
- Evaluation design
40. An Example
- Description of a jail and community program to reduce drug use, HIV risk, and re-arrest among adolescent males returning home from jail
- The program helps participants:
- Examine alternative paths to manhood
- Consider racial/ethnic pride as a source of strength
- Address assets
- Address challenges
- Source: Daniels, J., Crum, M., Ramaswamy, M., & Freudenberg, N. (2009). Creating REAL MEN: Description of an Intervention to Reduce Drug Use, HIV Risk, and Re-arrest Among Young Men Returning to Urban Communities From Jail. Health Promotion Practice. Epub ahead of print.
41. Option: Formative Research about the Program
- Describe how you developed your program
- Theoretical basis
- Lessons learned
- How you adapted your program to a specific target population
- Present qualitative findings about the program
- What participants thought about it
- Challenges
- Successes
42. An Example
- Found that:
- School workers do not have time to read the health information they receive
- Providing too much information can be counterproductive and result in low attention and limited behavior change
- School employees did not perceive asthma as a threat to their students
- Many school staff believed they knew what to do about asthma without training
- Identified innovative characteristics of the program and their basis in formative findings and theory
- Source: Goei, R., Boyson, A. R., Lyon-Callo, S. K., Schott, C., Wasilevich, E., & Cannarile, S. (2006). Developing an asthma tool for schools: The formative evaluation of the Michigan Asthma School Packet. Journal of School Health, 76, 259-265.
43. Publishing in Peer-Reviewed Journals
- Staff with experience publishing journal articles
- Optimism
- A good idea or innovation
- A gap in the field
- Clear presentation, explanation, and justification
- Prioritizing
- Perseverance
- Knowledge of the journal and its audience
- Revise and resubmit
- Read reviewers' comments, revise if sensible and workable, promptly prepare a letter of submission to another journal, and resubmit immediately (Thyer, 2009)
44. Strategies
- Review the literature
- Review other articles like yours
- Make a generous timeline for analysis and writing if you don't have publication experience
- Utilize internal review
- Read as if for the first time, with a critical eye
45. For more information, contact CREAE at info@abstinenceevaluation.org