Introduction to Improving the Patient Experience Series
1
Introduction to Improving the Patient Experience Series
Part 2, March 9, 2011
  • Measuring the Patient Experience
  • Tammy Fisher, MPH
  • Director, Quality & Performance Improvement
  • San Francisco Health Plan

2
Agenda
  • Purposes of Measurement
  • Measurement to identify areas for improvement
  • Tools, methodologies, frequency
  • Measurement for testing and implementing changes
  • Data collection strategies, tools, and methodologies
  • Measurement to spread and sustain improvements
  • Tools, methodologies, frequency
  • Lessons Learned from the field
  • San Francisco Health Plan

3
Purposes of Measurement
Aspect | Improvement | Accountability | Research
Aim | Improvement of care | Comparison, choice, reassurance | New knowledge
Test observability | Test observations | Evaluate current performance, no test | Test blinded
Bias | Consistent bias | Measure and adjust to reduce bias | Design to eliminate bias
Sample size | "Just enough" data | 100% of data | "Just in case" data
Flexibility of hypothesis | Flexible, changes as learning takes place | No hypothesis | Fixed hypothesis
Testing strategy | Sequential tests | No tests | 1 test
Is change an improvement? | Run or control charts | No change focus | Hypothesis tests (F-test, t-test, chi-squared, p-value)
Confidentiality of data | Only used by those involved in improvement | Available for public consumption | Identities protected
4
Applying it to Patient Experience
  • Research
  • Source for changes to try
  • Helps build will to try changes
  • Improvement
  • Understand impact of changes quickly
  • Provide rapid feedback (engagement strategy)
  • Convince others to try changes
  • Accountability
  • Sustainability: public reporting, pay for performance

5
Measurement Continuum for Improvement
6
Identify Areas and People for Improvement
  • Robust surveys
  • Robust measurement methodologies
  • Review trended results
  • Data at the organization and individual provider
    level
  • Look at composites strongly correlated with
    overall ratings of experience
  • Align areas with strategic goals and organizational or clinic energy

7
Example of a Priority Matrix for CAHPS Health Plan Survey Results
8
Surveys
  • Clinician & Group CAHPS Survey
  • https://www.cahps.ahrq.gov/content/products/CG/PROD_CG_CG40Products.asp?p1021s213
  • PBGH Short PAS Survey
  • PAS website: http://www.cchri.org/programs/programs_pas.html
  • Short PAS survey: http://www.calquality.org/programs/patientexp/documents/Short_Form_Survey_PCP_feb2010.doc
  • Other surveys: Press Ganey and Avatar

9
Survey Options
Vendor options (method of administration, cost, considerations, and groups using it):
  • MTC (telephonic; ph 800-295-9681, ask for Guy Swenson). Cost: $5-10 per completed survey. Considerations: can customize the survey; development costs are low and turnaround is quick; rapid feedback (usually within two weeks of survey completion); reporting is limited, so internal resources are needed to manipulate the data for reporting purposes. Groups using it: MG, John Muir Physician Associates, Camino Medical Group, CQC doctors in the first Collaborative.
  • Sullivan/Luallin (mailed survey; ph 619.283.8988 or www.sullivan-luallin.com). Cost: variable. Considerations: recognized by CAPG; good reporting capabilities; in wide use by multiple groups; option for customization. Groups using it: many CA groups (including Beaver and Sharp).
  • Press Ganey (mailed survey; www.pressganey.com). Cost: call for a quote. Considerations: robust survey, good reputation; excellent reporting capability, especially in hospitals/home care, less so in outpatient settings. Groups using it: UCSF.
  • PBGH doctor-level survey (mailed survey once a year; Ted VonGlahn, ph 415-615-6318). Cost: $185 per doctor. Considerations: very robust reporting, including a detailed, actionable physician-level report; robust algorithms for selecting random samples; limited for QI purposes. Groups using it: 40 groups in CA.
  • AMGA (point-of-service survey; http://www.amga.org/QMR/PSAT/index_psat.asp). Cost: check costs on their website; a little complicated. Considerations: in wide use; provides feedback regularly; analytic and reporting capabilities; good benchmarks; includes methodologies for assuring a random sample; once data are forwarded, the report follows 5-6 weeks later. Groups using it: a large number of national and CA groups.
  • Avatar (mailed survey; www.avatar-intl.com). Cost: ask for a quote. Considerations: in wide use nationally; provides feedback regularly; includes methodologies for assuring a random sample; good benchmarks; analytic and reporting capabilities. Groups using it: St. Joseph Heritage Medical Group.
10
Robust Methodologies
  • Mail administration
  • 3 waves of mailing (initial mail, postcard
    reminder, second mail)
  • Telephone administration
  • At least 6 attempts across different days of the week and times of day (see the scheduling sketch after this list)
  • Mixed mail and telephone administration
  • Boost mail survey response by adding telephone
    administration
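The telephone protocol above can be captured as a simple call-attempt schedule. This is a minimal sketch in Python, assuming each non-respondent gets at least six attempts spread across different weekdays and times of day; the specific day/time slots and the function name are illustrative, not part of the original presentation.

```python
# Minimal sketch of the telephone protocol above: at least six call attempts
# spread across different days of the week and times of day.
# The slots below are illustrative, not a prescribed calling schedule.
ATTEMPT_SLOTS = [
    ("Mon", "morning"), ("Tue", "evening"), ("Wed", "afternoon"),
    ("Thu", "evening"), ("Fri", "morning"), ("Sat", "afternoon"),
]

def schedule_attempts(patient_id, slots=ATTEMPT_SLOTS):
    """Return one planned call attempt per slot; stop calling once the
    survey is completed or the slots are exhausted."""
    return [f"patient {patient_id}: attempt {i} on {day} ({part})"
            for i, (day, part) in enumerate(slots, start=1)]

for line in schedule_attempts("0001"):
    print(line)
```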

11
Tips
  • Survey
  • Include questions that matter most to consumers
  • Questions that ask about care experience
  • Applicability across heterogeneous populations
  • Demonstrates strong psychometric properties
  • Sufficient response categories (4-point to 6-point scales)
  • Reporting
  • Includes internal and external benchmarks
  • Methodology
  • Appropriate sampling (reduce bias, large samples)
  • Standardized protocols
  • Timeframe: in the last 12 months
  • Frequency
  • Annually

12
Measurement for Quality Improvement
13
Purposes of Measurement
  1. For Leadership to know if changes have an impact
    and to build a compelling case to spread changes
    to others
  2. For providers and staff to get rapid feedback on
    tests of change to understand their progress
    towards their own aims and to spread to others in
    the clinic

14
Three Key Questions
  • What are we trying to accomplish? (Aim)
  • How will we know that a change is an improvement?
    (Measure)
  • What changes can we make that will result in an
    improvement? (Change)

15
AIM Statement
16
Selected Changes
17
PDSA Rapid Cycle Improvement
Adapted from the Institute for Healthcare
Improvement Breakthrough Series College
18
Repeated Uses of PDSA Cycle
Adapted from the IHI Breakthrough Series College
19
Evaluate Impact of Changes
  • Data collection strategies/tools specific to the changes tested and implemented
  • Methodologies that allow for sequential testing: small samples, less standardization
  • Data given to the individuals testing changes
  • Enough data to know a change is an improvement and to convince others to try it
  • Frequent feedback during testing: daily or weekly, collecting data over time
  • Inexpensive methods

20
(No Transcript)
21
Monthly Telephonic Surveys
22
Data Collection Tools
  • Point of service surveys
  • Telephonic surveys
  • Comment cards
  • Patient exit surveys
  • Focus groups
  • Kiosks, via web
  • Feedback from people doing the changes
  • Observation
  • Patient Advisory Boards

23
Point of Service
  • Focus on meaningful measures tied to AIM
    statement
  • Have 4-6 response choices
  • Include enough measures to appropriately evaluate
    aspect of care
  • Consistent methodology: train the staff collecting the information
  • Collect just enough data
  • Need 15 measurement points for a run chart (see the run chart sketch after this list)
  • Data collection can be burdensome!
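As a companion to the point-of-service tips above, here is a minimal run chart sketch in Python, assuming roughly 15 weekly "top box" percentages collected during testing; the data values are invented and matplotlib is the only dependency.

```python
# Run chart sketch: weekly % of patients answering "Yes, definitely" to
# "Did the doctor listen carefully to you?" (made-up numbers, 15 points).
import matplotlib.pyplot as plt

weekly_pct = [62, 58, 65, 60, 63, 59, 66, 70, 72, 69, 74, 71, 75, 73, 76]

median = sorted(weekly_pct)[len(weekly_pct) // 2]   # center line of the run chart

plt.plot(range(1, len(weekly_pct) + 1), weekly_pct, marker="o")
plt.axhline(median, linestyle="--", label=f"median = {median}")
plt.xlabel("Week of testing")
plt.ylabel('% "Yes, definitely"')
plt.title("Run chart: doctor listened carefully")
plt.legend()
plt.show()

# A common signal of improvement is a shift: 6 or more consecutive points on
# the same side of the median (points exactly on the median are skipped).
sides = [p > median for p in weekly_pct if p != median]
run = longest = 0
prev = None
for side in sides:
    run = run + 1 if side == prev else 1
    longest = max(longest, run)
    prev = side
print("longest run on one side of the median:", longest)
```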

24
Telephonic Surveys
  • More rapid feedback than mailed surveys
  • Typically less expensive
  • Outside vendors do it and provide reports
  • Easy to manipulate data for reporting
  • Less frequent: monthly data at best
  • Literature suggests more bias than mailed surveys
    (not so important when testing)

25
Sample Comment Card
  • Comment Card
  • We would like to know what you think about your
    visit with Doctor X.
  • ☐ Yes, definitely   ☐ Yes, somewhat   ☐ No
  • Did Dr. X listen carefully to you?
  • Did Dr. X explain things in a way that was easy to understand?
  • Is there anything you would like to comment on
    further?
  • Thank you. We are committed to improving the care
    and services we provide our patients.

26
Patient Exit Interviews
  • Rapid feedback on changes tested
  • Not burdensome to collect data
  • Uncover new issues which may go unreported in
    surveys
  • Requires translation of information into
    actionable behaviors
  • Providers see the feedback
  • Include 3-5 questions: a mix of specific measures and open-ended questions

27
Patient Visit Walk-through
Through the Eyes of Your Patients
Tips for making the "Walk Through" most productive:
1. Determine with your staff where the starting and ending points should be, taking into consideration making the appointment, the actual office visit process, follow-up, and other processes.
2. Two members of the staff should role play, each playing a role: patient and partner/family member.
3. Set aside a reasonable amount of time to experience the patient journey. Consider doing multiple experiences along the patient journey at different times.
4. Make it real. Note the parts of the visit: time with registration, time in the waiting room, time with the MA/MEA, time with the provider, discharge. Wear what the patient wears. Make a realistic paper trail including chart, lab reports, and follow-up.
5. During the experience note both positive and negative experiences, as well as any surprises. What was frustrating? What was gratifying? What was confusing? Again, an audio or video tape can be helpful.
6. Debrief your staff on what you did and what you learned.

Walk-through record: date, staff members, when the walk-through begins and when it ends.

Example findings:
SIGNING IN / POINT-OF-SERVICE FEE
  Positives: none
  Negatives: takes forever; made a copy of the driver's license; staff had no change for the point-of-service fee
  Surprises: the number of steps involved to register a patient
  Frustrating/Confusing: was not directed to the waiting room; didn't know what to do next
  Gratifying: finally sitting down in the waiting room
TIME WITH PROVIDER
  Positives: spent enough time; all questions were answered during the visit
  Negatives: none
  Surprises: I liked the Agenda-Setting Form the provider used
  Frustrating/Confusing: when the provider left, I didn't know what was going to happen next
  Gratifying: all my questions were answered by the provider
28
Spreading & Sustaining Improvements
  • Survey
  • Include questions that matter most to consumers
  • Questions that ask about care experience
  • Applicability across heterogeneous populations
  • Demonstrates strong psychometric properties
  • Reporting
  • Comparisons within peer group
  • Methodology
  • Appropriate sampling (reduce bias, large samples)
  • Standardized protocols
  • Risk adjustment (see the adjustment sketch after this list)
  • Frequency
  • Monthly, Quarterly
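Because reporting for spread relies on risk-adjusted comparisons, the sketch below shows one common adjustment idea ("observed minus expected plus average") in Python. The covariates (age, self-rated health), the toy scores, and the pooled linear model are assumptions for illustration only; they are not SFHP's actual risk-adjustment model.

```python
# Minimal case-mix adjustment sketch: fit a pooled model of score on patient
# characteristics, then report each provider's observed-minus-expected score
# added back to the overall mean. Data and covariates are invented.
import numpy as np

# (score, age, self_rated_health 1-5, provider)
rows = [
    (75, 34, 4, "A"), (100, 58, 2, "A"), (50, 71, 1, "A"),
    (100, 29, 5, "B"), (75, 62, 3, "B"), (100, 45, 4, "B"),
]
y = np.array([r[0] for r in rows], dtype=float)
X = np.array([[1.0, r[1], r[2]] for r in rows])   # intercept, age, health

beta, *_ = np.linalg.lstsq(X, y, rcond=None)      # pooled fit across providers
expected = X @ beta                               # case-mix expected score
adjusted = y - expected + y.mean()                # "O - E + A" adjustment

for provider in ("A", "B"):
    idx = [i for i, r in enumerate(rows) if r[3] == provider]
    print(provider,
          "raw %.1f" % y[idx].mean(),
          "adjusted %.1f" % adjusted[idx].mean())
```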

29
Another Look at Data
  • Medical Group in Los Angeles

30
Lessons Learned: San Francisco Health Plan
31
Areas for Improvement
  • Provider-patient communication (PPC), office staff, access to care
  • Performed in the lowest quartile
  • PPC and access strongly correlated with overall ratings of care
  • Office staff support provider-patient communication (team approach)

32
Improvement Project
  • AIM: To improve CAHPS scores by achieving the 50th percentile in the following composites by MY 2012
  • Access to care
  • Provider-patient communication
  • APPROACH:
  • Begin with 10 community clinics
  • Spread to most clinics by MY 2011

33
Purposes for Measurement
  1. For Leadership to know if changes have an impact
    and to build a compelling case to spread changes
    to other clinics
  2. For Clinics to get rapid feedback on tests of
    change to understand their progress towards their
    own aims and to spread to others in the clinic

34
Purpose 1 (for Spread): Measures & Approach
Measures | Methodology | Frequency | Reports
Patients' ratings of their care (at provider level with roll-up to clinic) | Point-of-care survey, about 30 questions, using a nationally recognized tool | Quarterly | Risk-adjusted data, delineating statistical significance; data shown over time
Clinic site satisfaction | Online survey instrument | Quarterly | Data over time; anonymous
35
CAHPS Survey Results
For this provider, there was an 89% confidence of change in the 13% improvement for the measure "Doctor Spends Enough Time with the Patient."
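A "confidence of change" figure like this can be read as the probability that the true top-box rate actually improved between two fielding periods. Below is a generic sketch of that calculation in Python using a normal approximation to the difference of two proportions; the counts are invented to roughly mirror the slide (about a 13-point improvement, giving roughly 89% confidence), and this is not the risk-adjusted method SFHP used.

```python
# Probability that a provider's top-box rate truly improved, under a normal
# approximation to the difference of two independent proportions.
import math

def confidence_of_improvement(x1, n1, x2, n2):
    """x1/n1 top-box responses at baseline, x2/n2 at follow-up."""
    p1, p2 = x1 / n1, x2 / n2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    z = (p2 - p1) / se
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))   # standard normal CDF at z

# Made-up example: 24/40 top box at baseline (60%) vs. 32/44 at follow-up (~73%)
print(round(confidence_of_improvement(24, 40, 32, 44), 2))   # ~0.89
```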
36
Patient Ratings of their Care
  • Standardized survey instrument based on the
    Clinician-Group CAHPS visit survey, about 30
    questions
  • Administered at the point of care by clinic
  • SFHP provides surveys in 3 languages (English,
    Spanish, Chinese) and picks up surveys on Friday
    of each week
  • Defined methodology: all patients, given after the visit
  • Three fielding periods: April 2010, Oct 2010, Jan 2011
  • Each fielding period is 4 weeks
  • Risk-adjusted results at the provider level with roll-up at the clinic level
  • Patient incentives: two movie tickets per survey
  • Extra incentives: up to $500 per clinic

37
Clinic/Practice Site Satisfaction
  • Survey instrument based on the Dartmouth and Tantau & Associates instruments, about 20 questions
  • Administered online by SFHP
  • SFHP sends a link to complete the survey online
  • Anonymous, results can be aggregated by role
  • Five fielding periods: March 2010, June 2010, Sept 2010, Dec 2010, March 2011
  • Each fielding period is 2 weeks
  • Results at the clinic level 2 weeks following the
    close of the measurement period

38
Purpose 2 (for Clinics): Measures & Approach
Measures | Methodology Options | Frequency | Reports
Patients' ratings of their care (select 5-7 measures based on the AIM statement) | Point-of-service survey; telephonic survey; patient exit interviews; Patient Advisory Boards | Weekly or monthly | Clinics document experience and results in a narrative
39
Point of Care Survey
40
Staff & Patient Feedback
  • "During today's visit, my experience was excellent! Before today my appointments were not that great, but today I noticed an improvement. A big change! Very helpful, thank you."
  • "During today's visit, I noticed the staff with a better attitude towards their work, especially at the front desk."
  • "Our staff and patients are loving the electronic patient summary discharge. The patients are saying, 'I now have something to reference back to about my visit. It makes it easy on me to remember what I need to do to take care of my health. I feel that I am responsible for my health; I have a contract with my doctor.'"

41
Challenges & Lessons Learned
  • Adapted the CAHPS Visit-Based Survey: low reliabilities and less variation (few response categories)
  • Point-of-care methodology introduced a lot of bias
  • Incentives were extremely helpful
  • Low-literacy patients needed help with the survey
  • Very high scores on the survey: switched from mean to proportional scoring (see the scoring sketch after this list)
  • Providers trusted "just enough" data to implement change with their patients
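The switch from mean scoring to proportional ("top box") scoring mentioned above can be illustrated with a small example in Python; the 3-point response scale, the assigned point values, and the counts are invented for illustration.

```python
# When almost everyone answers near the top of the scale, a mean score looks
# flatteringly high, while the top-box proportion leaves visible room to improve.
responses = ["Yes, definitely"] * 42 + ["Yes, somewhat"] * 6 + ["No"] * 2

values = {"Yes, definitely": 100, "Yes, somewhat": 50, "No": 0}
mean_score = sum(values[r] for r in responses) / len(responses)
top_box = sum(r == "Yes, definitely" for r in responses) / len(responses)

print(f"mean scoring: {mean_score:.1f} / 100")            # 90.0
print(f"proportional (top box) scoring: {top_box:.0%}")   # 84%
```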
