1
Do After-school Programs Affect Important Youth
Outcomes? If So, Do We Know Why?
Robert C. Granger, Ed.D.
Remarks prepared for Making a Difference in
After-school: Measuring and Improving Program
Quality
Sacramento, CA / March 17, 2009
2
Two questions
  • Do after-school programs improve academic
    performance?
  • Do we know why some programs make a difference
    while others do not?

3
Two answers
  • Yes
  • Starting to
  • Yes, but

4
The review
Background: Policymakers and practitioners want
to know if after-school programs affect academic
achievement.
Goal: Review strong evidence regarding the
effects of after-school programs and examine the
practices of effective programs.
Method: Summarize the results from three rigorous
reviews of over 90 evaluations of after-school
programs.
5
Granger, R. C. (2008, April). After-school
Programs and Academics: Implications for Policy,
Practice, and Research (Social Policy Report,
Vol. XXII, No. 2). Ann Arbor, MI: Society for
Research in Child Development.
Society for Research in Child Development. (2008,
April). Improving After-school Programs in a
Climate of Accountability (Social Policy Report
Brief, Vol. XXII, No. 2). Ann Arbor, MI.
http://www.srcd.org/spr.html
6
The findings
  • On average, after-school programs improve
    important academic outcomes like test scores and
    grades.
  • A subset of evaluated programs that achieved
    outstanding results accounts for the overall
    positive picture.
  • The most effective programs had explicit goals,
    aligned their activities with those goals, and
    got youth actively involved in their own
    learning.

7
The two most important questions facing
policymakers and practitioners in education and
youth programs
  • What do effective teachers, youth workers, or
    mentors do differently than their less effective
    colleagues?
  • Can you make teachers, youth workers, or mentors
    more effective?

8
Sources of useful information about both questions
  • Practitioner consensus on best practices (Forum
    for Youth Investment, 2003)
  • In-depth studies of program practices (Halpern,
    Larson, Hirsch)
  • Practitioner efforts to improve program
    effectiveness (Many)
  • Measures of program quality (Forum for Youth
    Investment, 2009)

9
Measuring what matters
  • Importance of the point-of-service.
  • Good measures have clear, unambiguous items.
  • The best measures also teach.

10
Making a Difference in After School: Measuring
and Improving After School Quality
Nicole Yohalem, Forum for Youth Investment
Sacramento, CA / March 17, 2009

11
Quality assessment tools
  • Assessing Afterschool Program Practices Tool
    (APT): National Institute on Out-of-School Time
    and the MA Department of Education
  • CORAL Observation Tool (CORAL): Public/Private
    Ventures
  • Out-of-School Time Observation Instrument (OST):
    Policy Studies Associates
  • Program Observation Tool (POT): National
    Afterschool Association
  • Program Quality Observation (PQO): Deborah
    Vandell and Kim Pierce
  • Promising Practices Rating Scale (PPRS): WI
    Center for Education Research and Policy Studies
    Associates, Inc.
  • Quality Assurance System (QAS): Foundations Inc.
  • Program Quality Self-Assessment Tool (QSA): New
    York State Afterschool Network
  • School-Age Care Environment Rating Scale
    (SACERS): Frank Porter Graham Child Development
    Center, UNC
  • Youth Program Quality Assessment (YPQA):
    High/Scope Educational Research Foundation

Measuring Youth Program Quality: A Guide to
Quality Assessment Tools (updated January 2009)
12
Quality assessment tools
  • There is a lot of similarity in how quality
    practice is defined. All tools assess:
  • Relationships
  • Environment
  • Engagement
  • Social/Behavioral Norms
  • Skill Building Opportunities
  • Routine/Structure

Note: The CA self-assessment tool includes items
that address these areas.
13
Measuring what matters
  • Importance of the point-of-service.
  • Good measures have clear, unambiguous items.
  • The best measures also teach.

14
Emphasis on point-of-service
  • CA Tool: 16 of 77 items focus on POS
  • SACERS & NAA: less than half focus on POS
  • APT & YPQA: more than half focus on POS

15
Clear and unambiguous?
  • Examples from the CA tool
  • High inference
  • Ensures staff/volunteers have respectful
    interactions with participants' families.
  • Low inference
  • Regularly provides families with program
    information in multiple languages and at multiple
    literacy levels.

16
Measures that teach?
  • Examples from the CA Tool
  • Diagnostic
  • Provides opportunities/support for participants
    to take on leadership roles.
  • Diagnostic and prescriptive
  • Regularly provides collaborative partners with
    program information, such as program progress and
    evaluation reports and information about program
    events, in a variety of formats and in multiple
    languages if appropriate.

17
Quality improvement
  • Key components of quality improvement systems
  • Quality standards that include what should happen
    at the point of service
  • Ongoing assessment of how well services compare
    to the standards
  • Targeted plans for how to improve
  • Training and coaching that fits improvement plans

18
Emerging examples and lessons
  • Afterschool Program Assessment System (APAS)
  • National Institute on Out-of-School Time
  • Youth Program Quality Intervention (YPQI)
  • Weikart Center for Youth Program Quality

19
APAS pilot
  • Conducted by NIOST, Wellesley College
  • October 2006-July 2008
  • Atlanta, Boston, Charlotte, Middlesex County, NJ
  • 65 individuals, 28 programs, 3 intermediaries
  • Well-established K-8 after-school programs
  • Low stakes
  • Emphasis on continuous improvement, flexibility

20
Core APAS tools and supports
  • Tools
  • Survey of Afterschool Youth Outcomes Tool (SAYO)
  • Assessing Afterschool Program Practices Tool
    (APT)
  • Web-Based Data Management System
  • Supports
  • Training (2 days up front, online training
    ongoing)
  • 1-day site visit
  • Local coach

21
Findings from the APAS pilot
  • APAS helped programs identify areas for
    improvement and staff development
  • Most sites said they made program changes as a
    result.
  • Coaches are key to implementation and useful to
    sites
  • Engagement across staff levels is important
  • Engaging funders is important (even with low
    stakes)
Findings based on follow-up phone interviews with
sites and coaches.
For more on APAS: www.niost.org/content/view/1654/282/

22
Youth Program Quality Intervention
  • Systemic quality improvement systems (QIS)
    anchored by the YPQA are being developed in:
  • Statewide strategies: MI, ME, RI, KY, NM, AR, MN,
    IA, WA, NY
  • Cities and counties: Austin, Chicago, Rochester,
    Detroit, Grand Rapids, Palm Beach County,
    Baltimore, Nashville, St. Louis, Louisville,
    Georgetown Divide/Sacramento, Columbus IN,
    Indianapolis IN, Tulsa OK

[Map: YPQI sites across the U.S., including
Seattle, Rochester, Minneapolis, Minnesota, Grand
Rapids, Washington, Maine, Chicago, New York,
Detroit, Iowa, Rhode Island, Indianapolis,
Sacramento/Georgetown Divide, Columbus,
Baltimore, St. Louis, Kentucky, Oklahoma,
Nashville, Arkansas, New Mexico, Austin, and West
Palm Beach County]
23
YPQI Focus: POS quality in context
  • Youth PQA Form A: Engagement, Interaction,
    Support, Safety
  • PLC: Professional Learning Community
  • Youth PQA Form B:
  • Org policies/practices
  • Management values
  • Performance feedback
  • Continuity/staffing
  • Standards and metrics
  • Staff development
  • SAE: System Accountability Environment
24
The Providence AfterSchool Alliance (PASA)
Quality Improvement Strategy
  • Quality Standards: what exists, what we know,
    what works; based on national examples
  • Improvement Efforts: learning communities, site
    visits, model curricula, school alignment
  • Quality Indicators: measures of the standards,
    promising practices, provider/community input
  • Capacity Building/Professional Development:
    Staffing & Prof. Dev. Survey; workshop series
    tied to the RIPQA; BEST Youth Worker Training;
    standards workshops aligning academics with
    enrichment; partnership with High/Scope
  • Self-Assessment Tool: Rhode Island Program
    Quality Assessment Tool (RIPQA), adopted by the
    21st CCLC initiative and in use statewide
  • Tracking Tool: Youthservices.net, a citywide
    data management system with participation and
    retention data
25
Incentivizing participation
  • PASA-endorsed programs must
  • Maintain certain enrollment and retention
    benchmarks
  • Have a written curriculum
  • Undergo self-assessment using RIPQA annually
  • In exchange for
  • Streamlined grant application process
  • Small administrative funding supplement

26
Requiring participation
Excerpt from the Rhode Island 21st CCLC RFP:
"Applicants must participate in the 21st CCLC
Rhode Island Youth Program Quality Assessment
Process (RIPQA), which includes the use of a
self-assessment tool, outside observations,
development and implementation of action plans to
strengthen the program over time, and working
with a Technical Advisor, including designation
of staff to coordinate the process."
27
Rhode Island 21st CCLC pilot
  • Assessment & Planning
  • Kick-off, 2-day training on RIPQA
  • Quality Advisor (QA) meets with programs
    individually to orient
  • Observation visits (3-8 programs per site)
  • QA develops progress report, teams meet with
    instructors to share reports and develop action
    plans
  • ED and other key staff complete Form B
    individually
  • QA summarizes, meets with team to discuss scores
    and improvement strategies
  • QA generates overall report on strengths and
    improvement steps
  • Training & Technical Assistance
  • Series of 2-hour workshops focused on RIPQA
    content
  • Additional training on behavior management
  • AYD training (32 hours) offered twice annually
  • 4-session supervisor training
  • 5 hours of on-site coaching per site from QA

28
RI 21st CCLC pilot lessons
  • Lessons Learned
  • Programs liked the tool and found the process
    worthwhile
  • Initial data collection model was time-consuming
  • Timing is important to ensure changes get
    implemented
  • Needs across sites are very similar
  • Strong desire for on-site TA/coaching
  • Adjustments for Cohort 2
  • Smaller observation teams, fewer observations per
    site
  • One program report as opposed to individualized
    reports
  • Additional TA/training
  • Start with Form B, then observations (Form A)
For more information: www.mypasa.org/pasa-strategies

29
Palm Beach County QIS Pilot
  • Centerpiece of the Prime Time Initiative
  • 38 providers in pilot; now working with 90
  • January 2006 to fall 2007
  • Based on the PBC-PQA
  • Financial incentives for programs

30
Findings from the Palm Beach pilot
  • Most programs completed all phases of the QIS
  • Quality improved
  • Quality improvement is a long-term process
  • On-site TA is a very important component
  • Clarity of purpose is critical

Source: Spielberger & Lockaby, 2008
(www.chapinhall.org)

31
Coaching
  • Characteristics
  • Willing to listen
  • Experienced
  • Accessible
  • Flexible
  • Responsive
  • Creative
  • Resourceful
  • Roles/functions
  • Keep programs engaged
  • Deliver training
  • Answer questions on tools, process
  • Participate in observations
  • Generate reports
  • Facilitate improvement planning
  • Provide on-site feedback, modeling
  • Key considerations
  • Program vs. system-level coaching, role of
    intermediaries
  • Dosage

32
Purposes and methods
Smith, Devaney, Akiva & Sugar (forthcoming), New
Directions for Youth Development
33
Lessons for California
  • Have well defined purposes for the system.
  • Focus on the point of service.
  • Anchor quality improvement efforts with data
    about the POS.
  • Create incentives for continuous improvement.
  • Build in on-site, ongoing technical
    assistance/coaching.
  • Be intentional about pilot participation.
  • Build learning communities.
  • Recognize that management is a key lever.
  • Worry about the quality of your measures and
    data.

34
For more information:
Nicole Yohalem, Program Director
Forum for Youth Investment
nicole@forumfyi.org
www.forumfyi.org