1
FY02 ASA Presentation Ensure Integrity of NIH
Facilities
  • Presented by
  • Mehryar Ebrahimi
  • Office of Research Services
  • National Institutes of Health
  • 18 November 2002

2
Table of Contents
  • Main Presentation
  • ASA Template ... 4
  • Customer Perspective ... 6
  • Customer Segmentation ... 8
  • Customer Satisfaction ... 10
  • Internal Business Process Perspective ... 12
  • Service Group Block Diagram ... 13
  • Conclusions from Discrete Services Deployment Flowcharts ... 14
  • Process Measures ... 15
  • Learning and Growth Perspective ... 21
  • Conclusions from Turnover, Sick Leave, Awards, EEO/ER/ADR Data ... 23
  • Analysis of Readiness Conclusions ... 24
  • Financial Perspective ... 25
  • Unit Cost ... 26
  • Asset Utilization ... 28
  • Conclusions and Recommendations ... 34
  • Conclusions from FY02 ASA ... 35
  • Recommendations ... 37

3
Table of Contents
  • Appendices
  • Page 2 of your ASA Template
  • Customer survey results and graphs
  • Process maps
  • Learning and Growth graphs

4
(No Transcript)
5
Discrete Services
  • We propose slight changes in the wording of the Discrete Services, as follows:
  • Old DS1: Ensure campus infrastructure integrity
  • New DS1: Ensure Integrity of Campus Utility Infrastructure
  • Old DS2: Ensure building and building system integrity
  • New DS2: Ensure Integrity of Building Utilities Infrastructure

6
Customer Perspective
7
WHO IS THE CUSTOMER?
  • A key point: the team concluded that our primary customer for both of these discrete services is PWB.
  • Of course, PWB's primary customers are the ICs, who are the ultimate recipients of the utility services.

8
Customer Segmentation
  • DS1: Ensure Integrity of Campus Utility Infrastructure
  • Our customers represent the majority of the supervisors, managers, and employees working within the Central Utilities Section of PWB.

Total Population: 64 | Total Customers: 15
9
Customer Segmentation
  • DS2: Ensure Integrity of Campus Building Utility Infrastructure
  • Our customers represent a portion of the supervisors, managers, and other employees within the Building Maintenance Sections of PWB.

Total Population: 222 | Total Customers: 12
10
Customer Satisfaction
  • We sent 27 customer surveys to our target customer population; however, the response rate was not very high.
  • Due to the low response, we cannot yet draw any objective conclusions from our survey on DS-2, which deals with Building Utilities Integrity.
  • We have reviewed the comments and the Scatter Diagram for Discrete Service 1 (Ensure the Integrity of Campus Utility Infrastructure), and we will focus on the areas rated high on importance and low on satisfaction.
  • We plan to monitor customer satisfaction and participation rates throughout FY03.
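The screening step described above can be sketched as follows. This is an illustrative sketch only: the item names and ratings are hypothetical (the actual survey data are in the appendix charts), and the cutoffs stand in for the chart's quadrant boundaries. Ratings use the deck's 1-10 scales.

```python
# Hypothetical survey items and their (importance, satisfaction) ratings
# on the 1-10 scales used in the deck; values are illustrative only.
ratings = {
    "Timeliness": (9, 4),
    "Quality": (8, 8),
    "Communication": (7, 7),
    "Closeout": (8, 3),
}

def focus_items(ratings, min_importance=7, max_satisfaction=5):
    """Return items rated at/above the importance cutoff and at/below
    the satisfaction cutoff -- the 'high importance, low satisfaction'
    quadrant of the scatter diagram."""
    return sorted(
        item
        for item, (importance, satisfaction) in ratings.items()
        if importance >= min_importance and satisfaction <= max_satisfaction
    )

print(focus_items(ratings))  # ['Closeout', 'Timeliness']
```

In practice the two cutoffs would come from the mid-scale lines on the scatter diagram or from thresholds the group sets itself.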

11
Scatter Diagram for DS-1: Ensure Campus Integrity
FY02 Customer Importance and Satisfaction Ratings: A Closer Look
Note: A smaller portion of the chart is shown so that the individual data points can be labeled.
12
Internal Business Process Perspective
13
Service Group Block Diagram
  • Ensure Integrity of Campus Utilities Infrastructure and Ensure Integrity of Campus Building Infrastructure together consume approximately 25% of the project officers' time in DCAB. Our primary interface points are with Perform Facilities Maintenance and Operations and Provide Facilities Management Services.

14
Conclusions from Discrete Services Deployment
Flowcharts
  • Our Service Group completed two deployment flowcharts, one for each discrete service.
  • Based on the deployment flowcharts we created and analyzed, we believe we need to increase the involvement of our customers (PWB) and ensure that we receive, and utilize, their comments during Planning, Design, Construction, and Closeout.

15
Process Measures
  • Process measures for each discrete service:
  • DS1: Percent of projects impacting Campus Utilities vs. the number submitted to PWB for review at different stages.
  • DS1: Percent of projects submitted to PWB for review vs. the number of reviews returned to DCAB, and their timeliness.
  • DS2: Percent of projects impacting Building Utilities vs. the number submitted to PWB for review at different stages.
  • DS2: Percent of projects submitted to PWB for review vs. the number of reviews returned to DCAB, and their timeliness.
  • DS2: The Building 10 utility infrastructure team (DCAB Team 7) has established process measures that include tracking of the activities performed by the team. It is important to note that this team takes the lead in coordinating projects with PWB and incorporating their comments, especially during pre-design and design activities.
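A minimal sketch of how the DS1/DS2 submission measures above might be computed, assuming a hypothetical list of project records (the deck does not specify a data format or a tracking system for these measures):

```python
# Hypothetical project records: (impacts_utilities, submitted_to_pwb).
projects = [
    (True, True),
    (True, False),
    (True, True),
    (False, False),
    (True, True),
]

def submission_rate(projects):
    """Of the projects flagged as impacting utilities, return the share
    that was actually submitted to PWB for review."""
    impacting = [p for p in projects if p[0]]
    submitted = [p for p in impacting if p[1]]
    return len(submitted) / len(impacting)

print(f"{submission_rate(projects):.0%}")  # 75%
```

The same shape of calculation would apply per review stage (planning, design, construction, closeout) once the tracking system proposed in the Recommendations is in place.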

16
Process Map for Ensuring Building Utility
Integrity
17
Process Map for Ensuring Building Utility Integrity (cont.)
Note: There is little to no involvement from PWB after completion of the design.
18
Process Measures
  • Other than the Building 10 utility infrastructure tracking done by DCAB Team 7, we did not implement any of these proposed process measures in FY02; however, we will implement the above measures in FY03 in cooperation with PWB.

19
Process Measures: Building 10 Utilities Infrastructure Group (DCAB Team 7)
20
Process Measures: Building 10 Utilities Infrastructure Group (DCAB Team 7)
21
Learning and Growth Perspective
22
Summary of L&G Data for Service Group 34
  • The services provided by this service group are part of a larger function performed by DCAB; therefore, the L&G data for DCAB as a whole are presented here.
  • 3% employee turnover
  • Over 1 award per employee
  • About 4 days of sick leave per employee
  • 0 EEO complaints, 2 ER cases, and 5 ADR cases out of 111 employees

23
Conclusions from Turnover, Sick Leave, Awards,
EEO/ER/ADR Data
  • Based on the Learning and Growth data provided, we found that our turnover rate, sick leave rate, and EEO/ER/ADR rates were below average compared with other discrete services. We were slightly above average on the number of awards received by our employees. Since this is the first year we have examined the data in this way, we will have better comparative data at the end of FY03.
  • In general, the data indicate that DCAB is a good place to work.

24
Analysis of Readiness Conclusions
  • We feel that our staff is fully qualified to perform the work related to our discrete services, now and for the foreseeable future. We do, however, need to monitor the early planning process and the forecasts for new construction and alterations. A sudden increase in workload in these areas would have an immediate impact on our ability to spend sufficient time investigating the condition of the existing utilities infrastructure and planning for its future.

25
Financial Perspective
26
Unit Cost Measures
  • Unit cost for DS1: Ensure Integrity of Campus Utility Infrastructure
  • The per-square-foot cost of providing this discrete service was calculated as the budget included in the rent model divided by the total gross square feet of the campus buildings.

Fiscal Year   Total Campus Gross SF   Total Budget   Cost of DS-1/GSF
2001          9,405,937               $713,000       $0.076
2002          9,405,937               $743,000       $0.079
27
Unit Cost Measures (cont.)
  • Unit cost for DS2: Ensure Integrity of Building Utilities Infrastructure
  • The per-square-foot cost of providing this discrete service was calculated as the budget included in the rent model divided by the total gross square feet of the campus buildings.

Fiscal Year   Total Campus Gross SF   Total Budget   Cost of DS-2/GSF
2001          9,405,937               $3,124,000     $0.33
2002          9,405,937               $3,200,000     $0.34
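The figures in the two tables follow directly from the stated formula (rent-model budget divided by total campus gross square feet); a quick check:

```python
# Figures taken from the unit-cost tables above; the division is the
# formula the slides describe for both discrete services.
GROSS_SF = 9_405_937  # total campus gross square feet (FY01 and FY02)

def unit_cost(budget, gsf=GROSS_SF):
    """Cost per gross square foot of a discrete service."""
    return budget / gsf

print(round(unit_cost(713_000), 3))    # DS-1, FY01: 0.076
print(round(unit_cost(3_200_000), 2))  # DS-2, FY02: 0.34
```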
28
Asset Utilization Measures
  • There are two separate ORS accounts for the Campus and Building Integrity services, as follows:
  • Campus Utility Infrastructure (HQF10017)
  • Building Utility Infrastructure (HQF10000)
  • Asset utilization was evaluated by taking the time cards from DCAB and comparing them with the allotted budget for each Discrete Service.

29
Budget increased by 4% in FY02 vs. FY01; DCAB charges increased by 40% in FY02 vs. FY01.
30
Budget increased by 2.5% in FY02 vs. FY01; DCAB charges increased by 7% in FY02 vs. FY01.
31
(No Transcript)
32
Over 30% of POs are not charging to the ORS account for Building Integrity, although they provide the service.
33
Conclusion for Asset Utilization
  • Based on the study of the actual time charged against the ORS accounts for Ensure Integrity of Campus and Building Utilities, it is clear that additional training is needed on the use of the ORS accounts for these discrete services, as many POs provide the service but do not charge any time to these accounts.
  • Considering the deficient utility infrastructure of Building 10, we anticipated a higher-than-average share of PO time to be charged against the ORS account. Evaluating the time charged by DCAB Teams 1, 6, and 7 revealed that approximately 20% of the total time on the ORS building infrastructure account was charged for Building 10, whereas this building constitutes 30% of the campus's total square feet. Paul Hawver, who leads the Building 10 infrastructure group, indicated that there are services provided by consultants that have not been captured; these are being considered for inclusion in FY03 and will be charged against the Building Integrity account.

34
Conclusions and Recommendations
35
Conclusions from FY02 ASA
  • The following are observations from the Radar Charts and the survey comments (PWB) for the Ensure Campus Integrity discrete service:
  • Quality, Cost, Competence, Reliability, and Availability were rated above average, with Quality and Competence ranked highest.
  • Timeliness, Responsiveness, Handling of Problems, and Convenience were rated below average, with Timeliness ranked lowest.

36
Conclusions from FY02 ASA
  • The following are conclusions from the comments provided in the surveys:
  • What was done particularly well:
  • Communication
  • Construction of the Tunnel (reliability, safety, maintenance)
  • What needs to be improved:
  • Timeliness
  • Training
  • Coordination
  • Closeout
  • More involvement during Inspection, Commissioning, and Acceptance
  • Involvement at all levels of discussions affecting the utilities
  • PWB is the customer
  • Beneficial occupancy dates should be agreed to by the customer (PWB)

37
Recommendations
  1. Identify PWB as a customer in DCAB's ISO 9000 procedure manual. This would require PWB's involvement in all aspects of the project, specifically a sign-off requirement at 35% Design and at Construction Closeout. (Action by DCAB)

38
Recommendations (cont.)
  • Establish a tracking system to monitor projects that impact the campus or building utilities infrastructure, including the following as a minimum (Action by DCAB and PWB management):
  • For each project, identify during planning whether there is an impact on the utilities, both campus and building (existing system - PIN).
  • Provide automatic notification to the PWB central point of contact of the utility impacts and the design/construction schedule.
  • Establish a PWB and/or DCAB monitoring system to assure that projects identified for utility impact are submitted for PWB review at planning, design, construction, and closeout, and that PWB's reviews are completed in a timely manner and returned to DCAB.

39
Recommendations (cont.)
  • DCAB Project Officers need to better track the time they spend on the utilities infrastructure, in both the campus and building utility categories, on their time cards. Additional training is needed. (Action by DCAB management)

40
Recommendations (cont.)
  • The Building 10 complex utility infrastructure team (DCAB Team 7), led by Paul Hawver, currently provides review and coordination services to DCAB project officers on all projects with utility impact in Building 10. Continue this service and assure that all associated costs are charged against the ORS Building Utilities Infrastructure account. We recommend adding the same type of service for all other buildings on the campus, provided by either PWB or DCAB. (Decision by DES management as to which branch should provide this service, as it will require dedicated manpower.)
  • The two Discrete Services studied under this service group category should become part of the larger DCAB service group. (Decision by DCAB management.)

41
Appendices
42
Appendices
  • Page 2 of ASA Template
  • Customer segments graphs
  • Customer satisfaction graphs
  • Block diagram
  • Process maps
  • Process measure graphs
  • Learning and Growth graphs
  • Analysis of Readiness Information
  • Unit cost graphs
  • Asset utilization graphs

43
Page 2 of Revised Template
44
Customer Survey Results
  • DS1: Ensure Integrity of Campus Utility Infrastructure

45
Radar Chart: FY02 Product/Service Satisfaction Ratings
Note: The rating scale ranges from 1-10, where 1 represents Unsatisfactory and 10 represents Outstanding. Refer to the Data Analysis and Graphing training for advice on interpreting these results.
46
Radar Chart: FY02 Customer Service Satisfaction Ratings
Note: The rating scale ranges from 1-10, where 1 represents Unsatisfactory and 10 represents Outstanding. Refer to the Data Analysis and Graphing training for advice on interpreting these results.
47
Scatter Diagram: FY02 Customer Importance and Satisfaction Ratings
Note: The Importance rating scale ranges from 1-10, where 1 represents Unimportant and 10 represents Important. The Satisfaction rating scale ranges from 1-10, where 1 represents Unsatisfactory and 10 represents Outstanding.
48
Scatter Diagram: FY02 Customer Importance and Satisfaction Ratings: A Closer Look
Note: A smaller portion of the chart is shown so that the individual data points can be labeled.
49
Reviewing Comments
  • Realize that comments are qualitative data and are not meant to be counted and tallied
  • Comments provide a different type of information from your customers regarding their satisfaction
  • Comments are NOT representative of the perceptions of all your customers
  • Review them, but don't overreact to an individual comment
  • Comments are a great source of ideas on how to improve

50
What was done particularly well?
  • Nothing comes to mind.
  • Nothing was done particularly well; the quality of the material was not the best; they do not make the repairs to the equipment in a timely manner.
  • Communication.
  • Can say very little.
  • Can't think of anything.
  • The Construction of the Utility Tunnel Expansion
    Project improves the reliability of the
    steam/condensate pump return, chilled water and
    domestic water distribution systems. It also
    provides our operating personnel safe working
    conditions, adequate space and accessibility to
    perform maintenance and repairs of the systems.

51
What needs to be improved?
  • Operator and maintenance training, fixing deficiencies, follow-up on warranty work, customer concurrence on change orders/deletions/additions to contract.
  • They need to fix the things that are leaking and finish the punch list that they started on but stopped; it seems that they are just waiting to see if we will do the work ourselves.
  • Attention to detail, alternative solutions,
    fiscal responsibility, timeliness, bring project
    to closure.
  • Working relationship between DCAB and PWB. PWB
    is also DCAB's customer (internal).
  • All of the above.
  • Training, O&M Manuals, dust control, monitoring contractor activities, close-out of contract, warranty items, final acceptance.
  • Improvements are needed in the areas of coordination, inspection, commissioning, and acceptance of projects involving new construction or repair of utility distribution systems. Personnel from the Central Utilities Section (CUS) should be involved and included at all levels of discussion affecting the Utilities.

52
Other Comments
  • DCAB needs to get customer concurrence on changes to contract. Beneficial occupancy date should be agreed to by the customer.
  • I THINK YOU NEED TO GET MORE INPUT FROM THE
    PEOPLE IN CHARGE OF THE PLANT TO SEE WHAT THEY
    NEED IN HERE NOT JUST WHAT THE DESIGNERS WANT TO
    PUT IN HERE THE MEN THAT RUN THE PLANT KNOWS WHAT
    THEY NEED TO DO THE JOB
  • Please give me a call if this survey is for real
    then would give comments. Jim Powers 451-4478
    Asst Chief Cup
  • Central Utilities Section should have the
    signatory authority in the commissioning and
    final acceptance on all the projects relating to
    Utility Distribution Systems.

53
Customer Survey Results
  • DS2: Ensure Integrity of Campus Building Utility Infrastructure

54
Radar Chart: FY02 Product/Service Satisfaction Ratings
Note: The rating scale ranges from 1-10, where 1 represents Unsatisfactory and 10 represents Outstanding. Refer to the Data Analysis and Graphing training for advice on interpreting these results.
55
Radar Chart: FY02 Customer Service Satisfaction Ratings
Note: The rating scale ranges from 1-10, where 1 represents Unsatisfactory and 10 represents Outstanding. Refer to the Data Analysis and Graphing training for advice on interpreting these results.
56
Scatter Diagram: FY02 Customer Importance and Satisfaction Ratings
Note: The Importance rating scale ranges from 1-10, where 1 represents Unimportant and 10 represents Important. The Satisfaction rating scale ranges from 1-10, where 1 represents Unsatisfactory and 10 represents Outstanding.
57
Scatter Diagram: FY02 Customer Importance and Satisfaction Ratings: A Closer Look
Note: A smaller portion of the chart is shown so that the individual data points can be labeled.
58
What needs to be improved?
  • A differential pressure of around 10 to 12 psig should be maintained at all times. There are still many bldg. groups that don't have the tertiary pumping system installed and are dependent on the differential pressure being maintained.
  • The fee for service associated with the planning
    needs to be reduced and the time it takes to
    complete projects needs to be shortened.

59
Other Comments
  • I don't know that the high pressure air has been
    included in this project but there are some
    existing problems with this utility. Most
    buildings do not receive the 100 psig air from
    the plant as they did in the past. A minimum of
    100 psig air is needed for lab and cage wash
    equipment. Bldg 37 is having a compressor
    installed just for this equipment. Bldg 36
    researchers have also inquired about the lower
    pressure being supplied to the bldg.

60
Process Map for DS1 (page 1 of 2)
61
Process Map for DS1 (page 2 of 2)
62
Process Map for DS2 (Page 1 of 2)
63
Process Map for DS2 (Page 2 of 2)
64
FY02 Learning and Growth (L&G) Data for the Annual Self Assessments
Service Group 34: Ensure Integrity of NIH Facilities
10 October 2002
Summary prepared by the Office of Quality Management
65
Methodology
  • All data represent occurrences from Oct 2001 - June 2002
  • The data analyzed cover the period between October 1st and the end of June, to provide time to analyze and present the data
  • ORS Human Resources (HR) provided data on:
  • Turnover
  • Sick leave
  • Awards
  • HR data are stored in NIH databases by Standard Administrative Codes (SACs)
  • A cross-reference of ORS Service Groups to SACs was developed
  • Almost all SACs were assigned to Service Groups
  • Some Service Groups have identical SACs
  • In this case, the two Service Groups receive the same set of data

66
Methodology (cont.)
  • Data were also obtained from:
  • Equal Employment Opportunity (EEO): number of EEO complaints
  • Employee Relations (ER): number of ER cases
  • Alternative Dispute Resolution (ADR): number of ADR cases

67
Interpreting Your Data
  • FY02 is the first time L&G data were collected and analyzed
  • Compare your Service Group with the other ORS Service Groups
  • What are all the L&G indicators telling you?
  • In the future, your group should compare itself with its own Service Group data over time
  • Interpret the data in terms of other ASA data:
  • Customer satisfaction ratings
  • Process measures
  • Financial measures
  • Do the L&G data, when compared with data in the other perspectives, show a potential relationship (could L&G be contributing to customer satisfaction results)?
  • From reviewing your Service Group's L&G data, what could be done to improve Quality of Work Life (QOWL)?

68
Service Group Turnover Rate
  • Calculated as the number of separations for a Service Group / population of the Service Group
  • Separations are defined as:
  • Retirements (separation codes 3010, 3020, 3022)
  • Resignations (separation codes 3120, 3170)
  • Removals (separation code 3300)
  • Terminations (separation codes 3520, 3550, 3570)
  • Promotions to a new organization (separation code 7020)
  • Reassignments (separation code 7210)
  • Note that transfers/promotions within ORS Divisions/Offices are not captured by the NIH database

69
Service Group Turnover Rate (cont.)
  • Calculation of the Service Group population was needed, since the number of employees changes over time
  • The population for each Service Group was estimated as the average of the employee counts at three snapshots in time (Nov 2001, Feb 2002, June 2002)
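The turnover-rate calculation described above can be sketched as follows. The head counts and separation count here are hypothetical (the deck reports only the resulting rates, not the raw counts per group):

```python
# Hypothetical snapshot head counts for one Service Group; the deck
# estimates the population as the average over three snapshots.
snapshots = {"Nov 2001": 112, "Feb 2002": 110, "Jun 2002": 111}
separations = 3  # hypothetical count of retirements, resignations, etc.

population = sum(snapshots.values()) / len(snapshots)  # 111.0
turnover_rate = separations / population

print(f"{turnover_rate:.1%}")  # 2.7%
```

The same population estimate is reused as the denominator for the sick-leave, awards, EEO, ER, and ADR averages on the following slides.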

70
Service Group Turnover Rate (Oct 2001 - June 2002)
[Chart: Turnover Rate by Service Group Number]
71
Average Hours of Sick Leave Used
  • Calculated as the total number of sick leave
    hours used for a Service Group / Population of
    Service Group

72
Average Hours of Sick Leave Used (Oct 2001 - June 2002)
[Chart: Average Hours by Service Group Number]
73
Average Number of Awards Received
  • Calculated as the total number of awards received
    / Population of Service Group
  • Includes both monetary and non-monetary awards
  • Cash awards
  • QSIs
  • Time-off
  • Honorary
  • Customer Service

74
Average Number of Awards Received (Oct 2001 - June 2002)
[Chart: Average Number by Service Group Number]
75
Average Number of EEO Complaints
  • Calculated as the total number of EEO complaints for a Service Group / population of the Service Group

76
Average Number of EEO Complaints (Oct 2001 - June 2002)
[Chart: Average Number by Service Group Number]
77
Average Number of ER Cases
  • Calculated as the total number of ER cases for a Service Group / population of the Service Group
  • A case is defined as any contact with the ER Office where an action occurs (e.g., a letter is prepared)

78
Average Number of ER Cases (Oct 2001 - June 2002)
[Chart: Average Number by Service Group Number]
79
Average Number of ADR Cases
  • Calculated as the number of ADR cases for a Service Group / population of the Service Group
  • A case is initiated when a person contacts the ADR office

80
Average Number of ADR Cases (Oct 2001 - June 2002)
[Chart: Average Number by Service Group Number]
81
Learning and Growth Data Table
  • 3% employee turnover
  • About 4 days of sick leave per employee
  • Over 1 award per employee
  • 0 EEO complaints, 2 ER cases, and 5 ADR cases out of 111 employees

82
Summary of Service Group 34 Learning and Growth
Data
  • 3% employee turnover
  • Over 1 award per employee
  • About 4 days of sick leave per employee
  • 0 EEO complaints, 2 ER cases, and 5 ADR cases out of 111 employees