Performance Management Presentation: Maintain Safe Working Environment (Radiation Safety)

Source: https://ors.od.nih.gov

Transcript and Presenter's Notes


1
Performance Management Presentation: Maintain
Safe Working Environment (Radiation Safety)
  • Team Leader: Nancy Newman
  • Team Members: Douglas Carter, Janet Thomson,
    Victor Voegtli
  • ORS
  • National Institutes of Health
  • Date: February 23, 2005

2
Table of Contents
  • PM Template
  • Customer Perspective
  • Internal Business Process Perspective
  • Learning and Growth Perspective
  • Financial Perspective
  • Conclusions and Recommendations
  • Customer Satisfaction Survey Results

3
Table of Contents
  • Survey Background
  • Satisfaction Ratings on Specific Service Aspects
  • Importance Ratings on Specific Service Aspects
  • Comments
  • Summary
  • Recommendations

4
(No Transcript)
5
(No Transcript)
6
Customer Perspective
7
Customer Perspective (cont.)
8
Customer Perspective (cont.)
9
C2 Enhance Communications with customers
  • Measures
  • C2a: Number of visits to DRS Portal
  • Currently unable to distinguish AU visits from
    DRS employee visits, but the upgrade will make
    this possible.
  • C2b: Length of time on Portal
  • Measure eliminated, since no important data was
    retrieved
  • C2c: Tasks performed via Portal

10
C2 Enhance Communications with customers
  • C2c Tasks performed via Portal based on
    frequency
  • Material Disposals
  • User Changes
  • Monthly Memo Printing
  • Waste Pickup Requests
  • NIH 88-1 submission
  • User Registrations
  • Lab Changes

11
C2 Enhance Communications with customers
C2d Tasks performed via Portal
12
C2 Enhance Communications with customers
  • Initiative and Measures for FY05
  • Increase auditing capabilities of Portal usage
  • Improve usability of Portal function
  • Increase transactions of infrequent tasks such as
    88-1 form submission

13
Customer Perspective (cont.)
14
C3 Percentage of people training on-line
15
C3 Percentage of people training on-line
  • Goal: increase on-line training
  • FY04 Initiative: on-line refresher training for
    AUs
  • Data show a decrease in on-line training
  • Cause: elimination of the on-line training module
    for nurses
  • FY06 Initiative: new on-line training module for
    nurses

16
Relationship Among Performance Objectives
  • Enhancing communication with our customers would
  • Maintain compliance with regulations
  • Increase customer satisfaction

17
Internal Business Process Perspective
18
Internal Business Process Perspective
19
IB1a Number of Security Violations
20
IB1b Number of Non-security Violations
21
IB1a and b Number of security and non-security
violations
22
  • IB2 Improve effectiveness of radioactive waste
    pick-up scheduling

23
Internal Business Process Perspective (cont.)
24
Improve Effectiveness of Radioactive Waste
Pick-up Scheduling
On-line Scheduling of Radioactive Waste Pickups
[Chart: percentage of pickups scheduled on-line, 0.9% in FY'03 vs. 2.4% in FY'04; y-axis 0-2.5%.]
25
IB2 Percentage of radioactive waste pickups
scheduled on-line
  • Baseline: 0.9%
  • Target: 5%
  • Achieved: 2.7%

26
Internal Business Process Perspective (cont.)
27
IB3 Ensure timely return of dosimeters
28
IB3 Ensure timely return of dosimeters
29
IB3 Ensure timely return of dosimeters
30
Internal Business Process Perspective (cont.)
  • The Focus Group (FG) average absentee rate is
    within 1-sigma of the target rate when comparing
    FG absent dosimeters to FG dosimeters issued.
  • FG absentee rate compares favorably to other
    medical/research institutions with dosimetry
    programs of similar size and type.
  • A primary concern is that the FG comprises only
    11 of the 70 badge groups at NIH, yet accounts
    for 44% of the missing dosimeters.
  • None of the corrective actions implemented to
    date have made a substantial impact on
    alleviating the problem.
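
The 1-sigma comparison above can be sketched as a quick check using the normal approximation to the binomial for the spread of a proportion. The counts and target rate below are hypothetical, since the presentation does not give the raw figures:

```python
import math

def within_one_sigma(absent, issued, target_rate):
    """Return True if the observed absentee rate (absent / issued)
    lies within one standard deviation of target_rate, treating each
    issued dosimeter as an independent trial (normal approximation
    to the binomial)."""
    observed = absent / issued
    # Standard deviation of a proportion estimated from `issued` trials.
    sigma = math.sqrt(target_rate * (1 - target_rate) / issued)
    return abs(observed - target_rate) <= sigma

# Hypothetical figures for illustration only.
print(within_one_sigma(absent=52, issued=1000, target_rate=0.05))
```

The same style of check applies to the Delinquent Analysis rate discussed later; only the counts and target change.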

31
Internal Business Process Perspective
  • Actions taken
  • Reorganized badge groups by size and location to
    make them more manageable.
  • Offered to buy and install badge boards to aid
    with distribution and collection of dosimeters.
  • Distributed informational handouts detailing the
    importance of timely collection of dosimeters and
    the importance of individual roles within the
    program to Authorized Users and Dosimeter
    Custodians.
  • Implemented a program of hand delivery and
    pick-up of dosimeters for all badge groups
    residing on the main campus.

32
Internal Business Process Perspective
  • Actions pending
  • Develop and implement an on-line training program
    for Dosimeter Custodians.
  • Actions to be considered
  • Levy a per-dosimeter charge against the parent
    institutes to offset the missing-dosimeter fees
    imposed upon us by our contractor (these fees
    consume 5% of our annual dosimetry budget).
  • Consider revoking individual user privileges for
    program participants who persistently fail to
    comply with program requirements.

33
Internal Business Process Perspective (cont.)
34
IB4 Increase awareness of requirement for DRS
review of Animal Study Program (ASP) proposals
Increased awareness is intended to reduce the
number of ASPs involving radioactive materials or
radiation-producing equipment that have not been
reviewed by DRS. A baseline study of the FY03 ASP
program found that 90% of ASPs involving
radiation were reviewed by DRS. To be effective,
this initiative relies heavily on cooperation
from DRS, the ACUC coordinator, DOHS reps, and
the PIs.
35
IB4 Increase awareness of requirement for DRS
review of Animal Study Program (ASP) proposals
  • Steps taken to increase awareness
  • Added information to the DRS website as well as
    the Office of Animal Care and Use (OACU) website.
  • Audited each institute's ASP files and compared
    them to DRS files.
  • Surveyed each ACUC coordinator to better
    understand their role in the ASP review process.
  • Created a pre-screening checklist to help ACUC
    coordinators determine whether DRS review is
    needed.

36
IB4 Increase awareness of requirement for DRS
review of Animal Study Program (ASP) proposals
  • Steps taken to increase awareness (cont.)
  • Created a list of buzzwords to help DOHS reps
    become more familiar with terminology used in
    ASPs involving radiation.
  • Developing a database to track ASPs
  • Annual reviews of existing and new ASPs

37
IB4 Increase awareness of requirement for DRS
review of Animal Study Program (ASP) proposals
38
IB4 Increase awareness of requirement for DRS
review of Animal Study Program (ASP) proposals
  • On the whole, the level of awareness has
    increased by 3%.
  • We hope to achieve a higher level of awareness
    once the ASP database comes online.
  • ASPs will then be tracked, and reviews will be
    conducted on an annual basis.
  • The annual review should also enhance
    communication between the PI and DRS and become
    another mechanism to heighten awareness.

39
Internal Business Process Perspective (cont.)
40
IB5 Ensure HPs have critical data in a timely
manner
41
IB5 Ensure HPs have critical data in a timely
manner
42
IB5 Ensure HPs have critical data in a timely
manner
43
Internal Business Process Perspective (cont.)
  • The Delinquent Analysis rate falls easily within
    1-sigma and is just slightly above the current
    target rate of 5%.
  • The current target rate should be attainable now
    that the process has been established and the
    mindset of involved personnel is such that
    meeting specific timing goals is given
    appropriate priority.

44
Internal Business Process Perspective (cont.)
  • After the current target rate is achieved
    consistently, our long-range goal is to lower the
    target rate incrementally until it falls below 1%.

45
  • Learning and Growth Perspective

46
Learning and Growth Perspective
47
LG1 Determine and Maintain Effective Staffing
Level
48
LG1 Determine and Maintain Effective Staffing
Levels
  • Reduced FTEs by 2
  • Saved approximately $180,000
  • 3 employees are now elsewhere at NIH
  • Reasons: career transitions and/or promotions
  • Conducted workshops to enhance teamwork
  • Recruited 2 employees
  • Developed questions for QuickHire

49
LG1 Maintain Effective Staffing Levels
50
Learning and Growth Perspective (cont.)
51
LG2 Number of Awards and Dollars per Award
  • Unable to collect meaningful data
  • No centralized tracking system
  • Difficult to determine value of different types
    of awards
  • Discontinue this objective and measure

52
Learning and Growth Perspective (cont.)
53
LG3a Number of training hours per HP
  • Data collected were incomplete
  • Seminars, workshops, etc., not funded by DRS were
    not tracked
  • Implemented a new tracking mechanism to capture
    total hours of training for each HP

54
Financial Perspective
55
Financial Perspective (cont.)
56
F1 Minimize cost at a defined service level for
radiation safety
57
Financial Perspective
  • 36% increase in unit cost
  • Cause: incorporation of the cost of acquisition
    and distribution of radionuclides, formerly under
    Fee For Service
  • DRS is now 100% Membership Service

58
Process Maps
59
IB4 Increase awareness of requirement for DRS
review of Animal Study Program (ASP) proposals
60
Conclusions
61
Conclusions from PMP
  • Our customers are highly satisfied with the
    services we provide
  • Upgrade tracking system for portal usage
  • Develop on-line training module for nurses by
    FY06
  • Try to decrease number of security violations by
    developing on-line training module on security
  • Successful in reducing non-security violations,
    perhaps due to AU refresher training

62
Conclusions (cont.)
  • Decrease missing dosimeters by training Dosimeter
    Custodians
  • Benchmark dosimetry return issues
  • Create database to track Animal Study Proposals
  • Expedite sample preparation to reduce turnaround
    time for analysis

63
Conclusions (cont.)
  • Develop mechanism for tracking all training hours
    for HPs
  • Continue to re-evaluate necessary staffing level
    and adjust as necessary
  • Continue to look for cost-cutting opportunities

64
Division of Radiation Safety (DRS) Dosimetry Survey
Joe Wolski, Office of Quality Management, Office of Research Services
Janice Rouiller, Ph.D., and Laura Stouffer, SAIC
6 January 2005
65
  • Survey Background

66
Survey Background: Purpose
67
Survey Background: Methodology
68
Survey Background: Distribution
Number of surveys distributed:
Number of respondents: 11
Response Rate:

69
  • Satisfaction Ratings on Specific Service Aspects

70
FY04 Satisfaction Ratings on Specific Service
Aspects
[Chart: mean response for each of nine service aspects, N = 11; scale (1) Unsatisfactory to (10) Outstanding.]
71
FY04 Satisfaction Ratings: Available Services (Frequency of Response)
N = 11; Mean = 9.30; Median = 10
[Chart: 100% rated 8-10, 0% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
72
FY04 Satisfaction Ratings: Quality (Frequency of Response)
N = 11; Mean = 9.22; Median = 9
[Chart: 100% rated 8-10, 0% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
73
FY04 Satisfaction Ratings: Timeliness (Frequency of Response)
N = 11; Mean = 9.10; Median = 10
[Chart: 100% rated 8-10, 0% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
74
FY04 Satisfaction Ratings: Reliability (Frequency of Response)
N = 11; Mean = 9.20; Median = 10
[Chart: 100% rated 8-10, 0% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
75
FY04 Satisfaction Ratings: Staff Availability (Frequency of Response)
N = 11; Mean = 8.71; Median = 9
[Chart: 86% rated 8-10, 14% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
76
FY04 Satisfaction Ratings: Responsiveness (Frequency of Response)
N = 11; Mean = 9.00; Median = 9
[Chart: 89% rated 8-10, 11% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
77
FY04 Satisfaction Ratings: Convenience (Frequency of Response)
N = 11; Mean = 9.18; Median = 10
[Chart: 91% rated 8-10, 9% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
78
FY04 Satisfaction Ratings: Competence (Frequency of Response)
N = 11; Mean = 9.20; Median = 10
[Chart: 90% rated 8-10, 10% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
79
FY04 Satisfaction Ratings: Handling of Problems (Frequency of Response)
N = 11; Mean = 8.38; Median = 9
[Chart: 75% rated 8-10, 25% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
80
  • Importance Ratings on Specific Service Aspects

81
FY04 Importance Ratings on Specific Service
Aspects
[Chart: mean importance rating for each of nine service aspects, N = 10; scale (1) Unsatisfactory to (10) Outstanding.]
82
FY04 Importance Ratings: Available Services (Frequency of Response)
N = 10; Mean = 8.89; Median = 9
[Chart: 89% rated 8-10, 11% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
83
FY04 Importance Ratings: Quality (Frequency of Response)
N = 10; Mean = 8.88; Median = 9
[Chart: 88% rated 8-10, 12% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
84
FY04 Importance Ratings: Timeliness (Frequency of Response)
N = 10; Mean = 9.00; Median = 9
[Chart: 89% rated 8-10, 11% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
85
FY04 Importance Ratings: Reliability (Frequency of Response)
N = 10; Mean = 9.00; Median = 9
[Chart: 89% rated 8-10, 11% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
86
FY04 Importance Ratings: Staff Availability (Frequency of Response)
N = 10; Mean = 8.57; Median = 9
[Chart: 71% rated 8-10, 29% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
87
FY04 Importance Ratings: Responsiveness (Frequency of Response)
N = 10; Mean = 8.89; Median = 9
[Chart: 78% rated 8-10, 22% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
88
FY04 Importance Ratings: Convenience (Frequency of Response)
N = 10; Mean = 8.90; Median = 10
[Chart: 80% rated 8-10, 20% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
89
FY04 Importance Ratings: Competence (Frequency of Response)
N = 10; Mean = 8.89; Median = 9
[Chart: 78% rated 8-10, 22% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
90
FY04 Importance Ratings: Handling of Problems (Frequency of Response)
N = 10; Mean = 8.57; Median = 9
[Chart: 71% rated 8-10, 29% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
91
  • Comments

92
Survey Comments
  • A total of 6 respondents (55% of respondents)
    provided at least one comment
  • A total of 8 comments were made on 3 general
    questions
  • What was done particularly well?
  • What needs to be added or improved?
  • Other comments
  • Remember that comments are qualitative data
  • Comments provide a different type of information
    from your customers regarding their satisfaction
  • Comments are NOT representative of the
    perceptions of all your customers
  • Review them, but don't overreact to an individual
    comment
  • Comments are a great source for ideas on how to
    improve

93
Survey Comments: What was done particularly well? (N = 5)
  • As long as things work, I am happy.
  • I get a report every month. Somebody delivers it
    to me.
  • Availability of Radiation Safety officers. Doing
    excellent job! Thank you.
  • Have only had to handle the exchange of the
    monitoring badge.
  • The items are packaged well. When items are
    missing, they have been promptly replaced.

94
Survey Comments: What needs to be added or improved? (N = 2)
  • Returning badges through the NIH internal mail is
    somewhat risky. The loss of a badge is such a
    headache, perhaps a more secure return system
    could be developed.
  • It is all fine and well to get reports on my
    exposure, but my main concern is how much
    exposure I'm getting and what that means. The
    reports I get are not very informative. They use
    symbols that are not included in any key, so they
    are basically meaningless to me. Since I work
    with a high energy Gamma emitter, I would like to
    really know I'm safe. I don't really get that
    from the reports.

95
Survey Comments: Other Comments (N = 1)
  • My RSO, John Jacohus, goes out of his way to help
    me with any problems/issues. Very competent,
    responsive and reliable.

96
Summary
97
Summary
  • Respondent Characteristics
  • __ of recipients responded to the survey.
  • Satisfaction Ratings on Specific Service Aspects
  • Respondents were asked to rate their satisfaction
    with the following aspects of Dosimetry services
  • Available Services
  • Quality
  • Timeliness
  • Reliability
  • Staff Availability
  • Responsiveness
  • Convenience
  • Competence
  • Handling of Problems
  • The scale ranged from (1) Unsatisfactory to (10)
    Outstanding. Satisfaction mean ratings range
    from a high of 9.30 on Available Services to a
    low of 8.38 on Handling of Problems. Notice that
    the lowest mean rating (8.38) is still well above
    the midpoint of a 10-point scale. In general,
    respondent perceptions are quite positive.

98
Summary (cont.)
  • Satisfaction Ratings on Specific Service Aspects
    (cont.)
  • Response frequencies for each service aspect were
    computed, and responses of 8, 9, and 10 were
    grouped as indicating outstanding performance.
    For each service aspect, at least 75% of
    respondents perceived that aspect to be
    outstanding. For 4 service aspects (Available
    Services, Quality, Timeliness, and Reliability),
    all respondents indicated that the service was
    outstanding.
  • None of the respondents found service to be
    unsatisfactory (responses of 1, 2, or 3) in any
    of the service aspects.
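
The mean, median, and grouped response frequencies reported in these slides can be reproduced with a short script. The ratings list below is illustrative only; the raw survey responses are not included in the presentation:

```python
from statistics import mean, median

def summarize(ratings):
    """Summarize 1-10 ratings the way the report does: mean, median,
    and the percentage of responses in the outstanding (8-10),
    middle (4-7), and unsatisfactory (1-3) bands."""
    def pct(lo, hi):
        # Share of ratings falling in the inclusive band [lo, hi].
        return round(100 * sum(lo <= r <= hi for r in ratings) / len(ratings))
    return {
        "N": len(ratings),
        "mean": round(mean(ratings), 2),
        "median": median(ratings),
        "outstanding": pct(8, 10),
        "middle": pct(4, 7),
        "unsatisfactory": pct(1, 3),
    }

# Illustrative data only -- not the actual survey responses.
print(summarize([10, 9, 10, 9, 8, 10, 9, 8, 9, 10, 10]))
```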

99
Summary (cont.)
  • Importance Ratings on Specific Service Aspects
  • Respondents were asked to rate the importance of
    the following aspects of Dosimetry services
  • Available Services
  • Quality
  • Timeliness
  • Reliability
  • Staff Availability
  • Responsiveness
  • Convenience
  • Competence
  • Handling of Problems
  • The scale ranged from (1) Unsatisfactory to (10)
    Outstanding. Importance mean ratings range from
    a high of 9.00 on Timeliness and Reliability to a
    low of 8.57 on Staff Availability and Handling of
    Problems. Notice that the lowest mean rating
    (8.57) is still well above the midpoint of a
    10-point scale. In general, respondents find all
    service aspects to be very important.

Note: In future surveys, the scale anchors for importance ratings should be changed to (1) Unimportant through (10) Very Important.
100
Summary (cont.)
  • Importance Ratings on Specific Service Aspects
    (cont.)
  • Response frequencies for each service aspect were
    computed, and responses of 8, 9, and 10 were
    grouped as indicating highest importance. For
    each service aspect, at least 71% of respondents
    perceived that aspect to be of the highest
    importance.
  • None of the respondents found any of the service
    aspects to be unimportant (responses of 1, 2, or
    3).

101
Recommendations
102
Recommendations
  • Interpret ORS Customer Scorecard data in terms of
    other PM data gathered
  • Does the customer satisfaction data, when
    compared to data in other perspectives, show
    potential relationships?
  • Review comments in presentation for specific
    issues that can be tackled
  • Take the time to read through all comments
  • If appropriate, generate potential actions based
    on what you have learned from the data
  • Can you make changes to address issues raised?
  • How might you implement those actions?
  • Communicate the survey results (and intended
    actions) to important stakeholder groups
  • ORS Senior Management
  • Radiation Safety staff
  • Survey respondent pool
  • Conduct follow-up survey (within next 2 years) to
    check on improvements

103
Division of Radiation Safety (DRS) Analytical Laboratory Survey
Joe Wolski, Office of Quality Management, Office of Research Services
Janice Rouiller, Ph.D., and Laura Stouffer, SAIC
6 January 2005
104
Table of Contents
  • Survey Background
  • Satisfaction Ratings on Specific Service Aspects
  • Importance Ratings on Specific Service Aspects
  • Comments
  • Summary
  • Recommendations

105
  • Survey Background

106
Survey Background: Purpose
107
Survey Background: Methodology
108
Survey Background: Distribution
Number of surveys distributed:
Number of respondents: 8
Response Rate:

109
  • Satisfaction Ratings on Specific Service Aspects

110
FY04 Satisfaction Ratings on Specific Service
Aspects
[Chart: mean response for each of nine service aspects, N = 8; scale (1) Unsatisfactory to (10) Outstanding.]
111
FY04 Satisfaction Ratings: Available Services (Frequency of Response)
N = 8; Mean = 8.75; Median = 9
[Chart: 100% rated 8-10, 0% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
112
FY04 Satisfaction Ratings: Quality (Frequency of Response)
N = 8; Mean = 8.63; Median = 9
[Chart: 100% rated 8-10, 0% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
113
FY04 Satisfaction Ratings: Timeliness (Frequency of Response)
N = 8; Mean = 8.13; Median = 8
[Chart: 50% rated 8-10, 50% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
114
FY04 Satisfaction Ratings: Reliability (Frequency of Response)
N = 8; Mean = 8.75; Median = 9
[Chart: 88% rated 8-10, 12% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
115
FY04 Satisfaction Ratings: Staff Availability (Frequency of Response)
N = 8; Mean = 8.25; Median = 8
[Chart: 88% rated 8-10, 12% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
116
FY04 Satisfaction Ratings: Responsiveness (Frequency of Response)
N = 8; Mean = 8.50; Median = 9
[Chart: 75% rated 8-10, 25% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
117
FY04 Satisfaction Ratings: Convenience (Frequency of Response)
N = 8; Mean = 8.75; Median = 9
[Chart: 88% rated 8-10, 12% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
118
FY04 Satisfaction Ratings: Competence (Frequency of Response)
N = 8; Mean = 8.75; Median = 9
[Chart: 100% rated 8-10, 0% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
119
FY04 Satisfaction Ratings: Handling of Problems (Frequency of Response)
N = 8; Mean = 8.43; Median = 8
[Chart: 86% rated 8-10, 14% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
120
  • Importance Ratings on Specific Service Aspects

121
FY04 Importance Ratings on Specific Service
Aspects
[Chart: mean importance rating for each of nine service aspects, N = 8; scale (1) Unsatisfactory to (10) Outstanding.]
122
FY04 Importance Ratings: Available Services (Frequency of Response)
N = 8; Mean = 9.00; Median = 9
[Chart: 88% rated 8-10, 12% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
123
FY04 Importance Ratings: Quality (Frequency of Response)
N = 8; Mean = 8.75; Median = 9
[Chart: 88% rated 8-10, 12% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
124
FY04 Importance Ratings: Timeliness (Frequency of Response)
N = 8; Mean = 8.88; Median = 9
[Chart: 88% rated 8-10, 12% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
125
FY04 Importance Ratings: Reliability (Frequency of Response)
N = 8; Mean = 8.88; Median = 9
[Chart: 100% rated 8-10, 0% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
126
FY04 Importance Ratings: Staff Availability (Frequency of Response)
N = 8; Mean = 8.63; Median = 9
[Chart: 75% rated 8-10, 25% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
127
FY04 Importance Ratings: Responsiveness (Frequency of Response)
N = 8; Mean = 8.63; Median = 9
[Chart: 75% rated 8-10, 25% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
128
FY04 Importance Ratings: Convenience (Frequency of Response)
N = 8; Mean = 7.75; Median = 8
[Chart: 50% rated 8-10, 50% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
129
FY04 Importance Ratings: Competence (Frequency of Response)
N = 8; Mean = 9.00; Median = 10
[Chart: 88% rated 8-10, 12% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
130
FY04 Importance Ratings: Handling of Problems (Frequency of Response)
N = 8; Mean = 8.38; Median = 9
[Chart: 75% rated 8-10, 25% rated 4-7, 0% rated 1-3; scale (1) Unsatisfactory to (10) Outstanding.]
131
  • Comments

132
Survey Comments
  • A total of 7 respondents (88% of respondents)
    provided at least one comment
  • A total of 18 comments were made on 3 general
    questions
  • What was done particularly well?
  • What needs to be added or improved?
  • Other comments
  • Remember that comments are qualitative data
  • Comments provide a different type of information
    from your customers regarding their satisfaction
  • Comments are NOT representative of the
    perceptions of all your customers
  • Review them, but don't overreact to an individual
    comment
  • Comments are a great source for ideas on how to
    improve

133
Survey Comments: What was done particularly well? (N = 6)
  • Never any confusion on the results.
  • Following SOP for counting requests, providing
    consistent results.
  • Lab manager is widely available and willing to
    talk about issues HPs would have that require
    their (TSB's) services.
  • I have always received excellent service and
    quick response to questions and problems.
  • Everything.
  • Doug really takes pride in running the lab well.
    Vince is always a smiling face in the lab.

134
Survey Comments: What needs to be added or improved? (N = 6)
  • Turn-around time and transition from one lab
    worker to the next could be improved.
  • Direct communication with dosimetry custodians
    regarding missing dosimetry. (I do understand
    that efforts are underway to improve this.)
    Closer tracking of situations where missing
    dosimetry requires a close estimation.
    Explaining why users are receiving their annual
    exposure report and what it means.
  • Ability to perform whole body scanning on someone
    with highly contaminated hands.
  • Contractor prep of samples could be more timely
    on occasion.
  • Nothing.
  • I would recommend taking the Analytical Lab
    services back from the contractor and just do the
    function in-house. We have the staff already.

135
Survey Comments: Other Comments (N = 6)
  • Overall, pretty good job!
  • Timeliness of HP notification has improved
    greatly, takes pressure off HPs.
  • Scale for importance not relevant in survey.
    Better form for evaluation needs to be developed.
  • Overall analytical lab service has improved
    greatly since the hiring of a lab manager.
  • Great work! Keep it up.
  • Get those SOPs done!

136
Summary
137
Summary
  • Respondent Characteristics
  • __ of recipients responded to the survey.
  • Satisfaction Ratings on Specific Service Aspects
  • Respondents were asked to rate their satisfaction
    with the following aspects of Analytical
    Laboratory services
  • Available Services
  • Quality
  • Timeliness
  • Reliability
  • Staff Availability
  • Responsiveness
  • Convenience
  • Competence
  • Handling of Problems
  • The scale ranged from (1) Unsatisfactory to (10)
    Outstanding. Satisfaction mean ratings range
    from a high of 8.75 on Available Services,
    Reliability, Convenience, and Competence to a low
    of 8.13 on Timeliness. Notice that the lowest
    mean rating (8.13) is still well above the
    midpoint of a 10-point scale. In general,
    respondent perceptions are quite positive.

138
Summary (cont.)
  • Satisfaction Ratings on Specific Service Aspects
    (cont.)
  • Response frequencies for each service aspect were
    computed, and responses of 8, 9, and 10 were
    grouped as indicating outstanding performance.
    For each service aspect, at least 50% of
    respondents perceived that aspect to be
    outstanding. For 3 service aspects (Available
    Services, Quality, and Competence), all
    respondents rated the service as outstanding.
  • None of the respondents found service to be
    unsatisfactory (responses of 1, 2, or 3) in any
    of the service aspects.

139
Summary (cont.)
  • Importance Ratings on Specific Service Aspects
  • Respondents were asked to rate the importance of
    the following aspects of Analytical Laboratory
    services
  • Available Services
  • Quality
  • Timeliness
  • Reliability
  • Staff Availability
  • Responsiveness
  • Convenience
  • Competence
  • Handling of Problems
  • The scale ranged from (1) Unsatisfactory to (10)
    Outstanding. Importance mean ratings range from
    a high of 9.00 on Available Services and
    Competence to a low of 7.75 on Convenience.
    Notice that the lowest mean rating (7.75) is
    still well above the midpoint of a 10-point
    scale. In general, respondents find all services
    to be quite important.

Note: In future surveys, the scale anchors for importance ratings should be changed to (1) Unimportant through (10) Very Important.
140
Summary (cont.)
  • Importance Ratings on Specific Service Aspects
    (cont.)
  • Response frequencies for each service aspect were
    computed, and responses of 8, 9, and 10 were
    grouped as indicating highest importance. For
    each service aspect, at least 50% of respondents
    perceived that aspect to be of the highest
    importance. For Reliability, all respondents
    indicated this service aspect to be of the
    highest importance.
  • None of the respondents found any service aspect
    to be unimportant (responses of 1, 2, or 3).

141
Recommendations
142
Recommendations
  • Interpret ORS Customer Scorecard data in terms of
    other PM data gathered
  • Does the customer satisfaction data, when
    compared to data in other perspectives, show
    potential relationships?
  • Review comments in presentation for specific
    issues that can be tackled
  • Take the time to read through all comments
  • If appropriate, generate potential actions based
    on what you have learned from the data
  • Can you make changes to address issues raised?
  • How might you implement those actions?
  • Communicate the survey results (and intended
    actions) to important stakeholder groups
  • ORS Senior Management
  • Radiation Safety staff
  • Survey respondent pool
  • Conduct follow-up survey (within next 2 years) to
    check on improvements