1
Cancer Disparities Research Partnership Program
Process & Outcome Evaluation
2
Purpose of Evaluation
  • Measure the relevance, effectiveness, and impact
    of the CDRP Program in a consistent fashion
  • Use evaluation findings to identify Program
    attributes that can be applied to other settings
  • Produce meaningful annual reports for stakeholders
    within and outside the NCI

3
Type of Program Evaluation
  • 3-year evaluation
  • Process components
  • Is the program being implemented as planned?
  • Is the targeted audience being reached?
  • Outcome components
  • Were short-term and intermediate objectives
    achieved?

4
Evaluation Design Focus of Key Questions
  • Improvement and conduct of radiation oncology
    clinical research in community-based health care
    institutions serving large populations
    experiencing health disparities
  • Clinical trial participation by target
    populations
  • Influence of partnerships between awardee
    institutions and academic research centers
  • Influence of TELESYNERGY
  • Influence of Patient Navigation

5
CDRP Program Conceptual Framework

6
Target Populations for Evaluation Data Collection
  • Cancer patients from targeted populations
    experiencing cancer health disparities
  • Principal Investigators, participating radiation
    oncologists, and awardee institution staff
  • Patient Navigators

7
Data Sources
  • Archival Data
  • CDRP Database
  • Program-related written documents
  • PI meeting minutes, organizational records and
    charts
  • Historical data on local cancer incidence,
    prevalence, mortality
  • Baseline data on clinical trials and research
    activities at participating sites
  • New Data
  • In-depth structured interviews with PIs and
    partners
  • Patient focus groups
  • Group interviews with Patient Navigators and
    Navigator support staff
  • Surveys on issues related to recruitment to
    clinical trials

8
Comparison Group
  • 6 similar community-based radiation oncology
    facilities
  • Comparison sites will be chosen based on the same
    criteria as stated in the RFA
  • Conduct a brief survey about independent and
    collaborative clinical research capabilities and
    current participation in radiation oncology
    clinical trials

9
Data Analysis
  • Each program component will be analyzed
  • Clinical trials
  • Partnerships between awardee and academic partner
  • TELESYNERGY
  • Patient Navigation
  • Sources of information will be combined to
    analyze the overall Program and the linkages
    between and among its components (see the sketch
    after this list)
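
As a rough illustration of what combining these sources could look like in practice, the sketch below links a hypothetical extract of CDRP accrual records with hypothetical site-level survey responses. The file names, column names, and the navigator-staffing question are illustrative assumptions, not part of the evaluation plan.

# Hypothetical sketch of linking an archival source (CDRP accrual records)
# with a new source (site-level survey responses). File and column names
# are illustrative assumptions, not actual CDRP data structures.
import pandas as pd

# Archival source: one row per patient accrued to a trial at a CDRP site.
accrual = pd.read_csv("cdrp_accrual.csv")            # site_id, trial_id, patient_id

# New source: one row per site from the recruitment survey.
survey = pd.read_csv("site_recruitment_survey.csv")  # site_id, navigator_fte, barrier_score

# Summarize accrual per site, then attach the survey responses.
accrual_by_site = (
    accrual.groupby("site_id")
           .agg(patients_accrued=("patient_id", "nunique"),
                trials_open=("trial_id", "nunique"))
           .reset_index()
)
linked = accrual_by_site.merge(survey, on="site_id", how="left")

# Example linkage question: does accrual track Patient Navigator staffing?
print(linked[["site_id", "patients_accrued", "navigator_fte"]])
print(linked["patients_accrued"].corr(linked["navigator_fte"]))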

10
Data Analysis (continued)
  • Quantitative
  • Descriptive statistics (see the sketch at the end
    of this slide)
  • Complex analyses and causal modeling
  • Resource cost-allocation analysis
  • Qualitative: Thematic Analysis
  • Intended and unintended successes
  • Failures
  • Critical incidents
  • Lessons learned
  • Recommendations for future programs
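
A minimal sketch of the descriptive-statistics step, using invented accrual counts by site and target population purely for illustration; the real analysis would draw on the CDRP database described under Data Sources.

# Minimal descriptive-statistics sketch: share of accrual coming from the
# target populations at each site. All numbers are invented for illustration.
import pandas as pd

records = pd.DataFrame({
    "site_id":    ["A", "A", "B", "B", "C", "C"],
    "population": ["target", "other", "target", "other", "target", "other"],
    "accrued":    [14, 9, 22, 30, 7, 5],
})

summary = records.pivot_table(index="site_id", columns="population",
                              values="accrued", aggfunc="sum")
summary["target_share"] = summary["target"] / (summary["target"] + summary["other"])
print(summary.round(2))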

11
Evaluation Advisory Committee
  • Norman Coleman, M.D., NCI/DCTD/RRP
  • Larry Solomon, Ph.D., NCI/OSPA
  • Paul Johnson, Ph.D., NICHD/OSPAC
  • Ted Trimble, M.D., NCI/DCTD/CTEP
  • Martin Ojong-Ntui, M.D., Chief, Radiation
    Oncology, GWU Cancer Center
  • Frank Govern, Ph.D., NCI/DCTD/RRP

12
Evaluation Advisory Committee: Roles and
Responsibilities
  • Provide advice on programmatic needs and the
    types of information RRP requires
  • Provide guidance to ensure validity of evaluation
    design, approach, and data analysis
  • Data Quality Subcommittee
  • Review data collection methods and data analyses
    for accuracy, biases, interpretations, and
    conclusions
  • Make recommendations for maintenance and/or
    improvement of data management and data quality
    assurance

13
Products of Evaluation: Reports
  • Annual process/progress reports
  • Program implementation, accomplishments, and
    progress to date
  • Needed midcourse changes to meet objectives
  • Final outcome report
  • Evidence about the Program's attainment of goals
    and expected outcomes
  • Summarize plausible mechanisms of change
  • Delineate temporal sequences between activities
    and effects

14
Evaluation Timeline