Title: The Late Night Top 10 - Energy Saving Measures for your Data Center
1. The Late Night Top 10: Energy Saving Measures for your Data Center
2. Energy Efficiency in Data Centers
- LBNL benchmarking and case studies
- California Energy Commission
- Pacific Gas and Electric Company
- NYSERDA
- DOE
- DOE energy assessments
- EPA report to Congress
- Resources
- Letterman Top 10 opportunities
3. Benchmarking energy end use
4. Data Center Research Roadmap
A research roadmap was developed for the California Energy Commission; it outlined key areas for energy-efficiency research, development, and demonstration.
5. IT equipment load intensity
6. Overall power use in Data Centers
Courtesy of Michael Patterson, Intel Corporation
7. Data center performance differences
8. Performance varies
The relative percentage of energy actually used for computing varied considerably.
9. IT equipment power as a percentage of the total
Average: 0.49
Higher is better
10. HVAC system effectiveness
We observed a wide variation in HVAC performance.
11. Benchmarking leads to best practices
- The ratio of IT equipment power to the total is an indicator of relative overall efficiency.
- Examination of individual systems and components in the centers that performed well helped to identify best practices.
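The benchmark metric can be sketched in a few lines of Python; the loads below are illustrative assumptions, not measured data from the study:

```python
def it_to_total_ratio(it_kw, hvac_kw, ups_loss_kw, lighting_kw):
    """Ratio of IT equipment power to total facility power.

    Higher is better; the benchmarked centers averaged about 0.49.
    """
    total_kw = it_kw + hvac_kw + ups_loss_kw + lighting_kw
    return it_kw / total_kw

# Illustrative (assumed) loads for a small data center, in kW:
ratio = it_to_total_ratio(it_kw=500, hvac_kw=350, ups_loss_kw=100, lighting_kw=50)
print(round(ratio, 2))  # 0.5
```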
12. Best practice topics
13. Design guidelines for 10 best practices were developed in collaboration with PG&E
Guides are available through PG&E's Energy Design Resources website.
14. Design guidance
A web-based training resource is available:
http://hightech.lbl.gov/dctraining/TOP.html
15. Late night top 10 opportunities
- Staying up late worrying about your data center's energy use?
- Let's see what David Letterman would come up with.
16. Number 10: Efficient Lighting / Lighting Controls
- Lighting represents a small percentage of power use in a data center
- Lights are typically on 24x7
- Staff are rarely present in data centers
- Reducing lighting intensity also saves HVAC energy
- Lighting controls are well proven
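As a rough sketch of why 24x7 lighting is worth controlling, the load, occupancy fraction, and HVAC credit below are all assumed, illustrative values:

```python
HOURS_PER_YEAR = 8760

def lighting_savings_kwh(lighting_kw, occupied_fraction, hvac_factor=1.3):
    """Annual kWh saved by occupancy controls that switch lights off
    when no one is present. hvac_factor > 1 credits the cooling energy
    no longer needed to remove lamp heat (assumed value)."""
    baseline_kwh = lighting_kw * HOURS_PER_YEAR
    controlled_kwh = lighting_kw * HOURS_PER_YEAR * occupied_fraction
    return (baseline_kwh - controlled_kwh) * hvac_factor

# Assumed: 20 kW of lighting, staff present only 5% of hours
print(round(lighting_savings_kwh(20, 0.05)))
```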
17. Punch line
- Computers don't need lights!
18. Number 9: Use Premium Efficiency Motors
- Data center systems operate 24x7
- Premium efficiency motors can save several percent over less efficient models
- Newer motors are even more efficient than premium motors of several years ago
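A back-of-envelope savings estimate for a motor swap; the motor size and efficiencies below are assumed for illustration (MotorMaster supplies real catalog values):

```python
HOURS_PER_YEAR = 8760

def annual_motor_savings_kwh(shaft_kw, eff_old, eff_new, load_hours=HOURS_PER_YEAR):
    """kWh/year saved by replacing a standard-efficiency motor.
    Input power = shaft power / efficiency; data center loads run 24x7."""
    return (shaft_kw / eff_old - shaft_kw / eff_new) * load_hours

# Assumed: a 30 kW fan motor, 89% efficient, replaced with a 94% premium model
print(round(annual_motor_savings_kwh(30, 0.89, 0.94)))
```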
19. Check motor efficiency
MotorMaster is a good resource
20. Punch line
- A penny saved is a penny earned!
21. Number 8: Variable Speed Everything
- Fan and pump power and flow are related by a cube law
- Small reductions in flow result in large energy savings
- No data center needs full capability all the time
- Operating multiple fans or pumps at slower speeds is often a good strategy
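The cube law is easy to see numerically; a minimal sketch:

```python
def fan_power_fraction(flow_fraction):
    """Fan affinity (cube) law: fan or pump power scales with the
    cube of speed, and flow scales linearly with speed."""
    return flow_fraction ** 3

# Slowing to 80% flow needs only ~51% of full power:
print(round(fan_power_fraction(0.80), 2))  # 0.51
# Slowing to 63% flow saves ~75% of fan energy:
print(round(1 - fan_power_fraction(0.63), 2))  # 0.75
```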
22. Punch line
23. Number 7: Efficient chilled water plant
- Chiller efficiency dominates, but pumps, fans, and system effects also contribute
- kW/ton is the key metric
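The kW/ton metric can be computed from plant measurements. This sketch uses the common IP-units relation tons = gpm x deltaT(F) / 24 for chilled water; the plant loads are assumed, illustrative values:

```python
def plant_kw_per_ton(chiller_kw, pump_kw, tower_fan_kw, gpm, delta_t_f):
    """Chilled water plant efficiency in kW per ton of cooling delivered.
    Cooling load in tons = gpm * deltaT(F) / 24 for water. Lower is better."""
    tons = gpm * delta_t_f / 24
    return (chiller_kw + pump_kw + tower_fan_kw) / tons

# Assumed plant: 300 kW chiller + 40 kW pumps + 25 kW tower fans,
# moving 1200 gpm of chilled water with a 10 F temperature rise
print(round(plant_kw_per_ton(300, 40, 25, 1200, 10), 2))  # 0.73
```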
24. Punch line
25. Number 6: Electrical power conversion
- Many power conversions take place in a data center from end to end
- Each conversion loses some power, converted to heat
- Optimizing or eliminating conversions has a short payback
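Cascaded conversion losses multiply through the chain. The stage efficiencies below are assumed for illustration, not measured values from the study:

```python
def end_to_end_efficiency(stage_efficiencies):
    """Overall efficiency of chained power conversions (UPS, PDU/transformer,
    server power supply, on-board voltage regulators): losses compound."""
    overall = 1.0
    for eff in stage_efficiencies:
        overall *= eff
    return overall

# Assumed stage efficiencies: UPS 0.90, PDU 0.98, server PSU 0.80, VRs 0.85
overall = end_to_end_efficiency([0.90, 0.98, 0.80, 0.85])
print(round(overall, 2))  # 0.6 -- roughly 40% of input power is lost as heat
```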
26. Data center power conversions
PDU
27. Research into power conversion losses
- Power supplies in IT equipment
- Uninterruptible power supplies (UPS)
28. Punch line
29. Number 5: Virtualization
- Typical servers' processors are busy 10% or less on average
- Consolidating can eliminate many servers
- Older, less efficient equipment can be removed
- Immediate decrease in electrical and HVAC load
- Quick payback
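A rough consolidation estimate; the 60% target utilization is an assumed planning value, and identical, additively loaded hardware is assumed:

```python
import math

def servers_after_consolidation(n_servers, avg_utilization, target_utilization=0.6):
    """Rough count of hosts needed after virtualizing a fleet:
    total CPU load repacked onto hosts run at a target utilization."""
    total_load = n_servers * avg_utilization
    return math.ceil(total_load / target_utilization)

# 100 servers busy 10% on average could collapse to ~17 hosts:
print(servers_after_consolidation(100, 0.10))  # 17
```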
30. Punch line
- Take your IT person to lunch!
31. Number 4: Air side efficiency
- Air management is a huge opportunity
- Efficiency of air handlers varies
- W/cfm is a key metric
- ΔT is also a key metric
- Avoid mixing of cold and hot airstreams
- Eliminate simultaneous humidification/dehumidification
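Both metrics fall out of the sensible-heat relation for air (Btu/hr = 1.08 x cfm x deltaT in IP units); the 500 kW load below is an assumed example:

```python
def required_cfm(it_load_kw, delta_t_f):
    """Airflow needed to remove an IT load at a given supply/return split.
    Sensible heat for air (IP units): Btu/hr = 1.08 * cfm * deltaT(F)."""
    btu_per_hr = it_load_kw * 3412
    return btu_per_hr / (1.08 * delta_t_f)

def watts_per_cfm(fan_kw, cfm):
    """The air-handler efficiency metric from the slide: W per cfm delivered."""
    return fan_kw * 1000 / cfm

# Widening deltaT from 10 F to 20 F halves the required airflow:
print(round(required_cfm(500, 10)))  # 157963
print(round(required_cfm(500, 20)))  # 78981
```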
32. Fan energy savings of 75%
If there is no mixing of cold supply air with hot return air, fan speed can be reduced.
33. Punch line
- Just like oil and water, air shouldn't be mixed
34. Number 3: Free cooling
- In many climates, compressor-less cooling or reduced use of compressors is possible for much of the year
- Air economizers are feared because of contamination or loss of humidity control
- Some jurisdictions are requiring air economizers in data centers
- Water-side economizers (heat exchange with the cooling tower) are a good option
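A minimal water-side economizer feasibility check; the 4 C tower approach and the setpoints are assumptions for illustration:

```python
def tower_water_cooling_possible(wet_bulb_c, required_chw_c, approach_c=4.0):
    """Water-side economizer check: cooling tower water can carry the load
    without running chillers when outdoor wet-bulb plus the tower approach
    stays at or below the required chilled-water temperature."""
    return wet_bulb_c + approach_c <= required_chw_c

# A warmer chilled-water setpoint (enabled by good air management)
# unlocks more compressor-free hours at the same 10 C wet-bulb:
print(tower_water_cooling_possible(10.0, 13.0))  # False
print(tower_water_cooling_possible(10.0, 16.0))  # True
```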
35. Air economizer study
- Eight centers were monitored with particle counters
- Three were successfully using air economizers
- Measurements were taken inside and outside the centers
- Humidity was monitored
36. Air Quality Guidelines
- Limited literature connects pollutants to equipment failure
- ASHRAE: Design Considerations for Datacom Equipment Centers
  - Guidelines for particles, gases, and humidity
- Industry sources: Telcordia GR-63-CORE / IEC 60721-3-3
  - Designed for telephone switching centers
  - Based on research over 20 years old
- Primary concern: current leakage caused by particle bridging (hygroscopic salts in combination with high humidity)
37. Contamination limits
(Chart comparing the IBM standard, the EPA 24-hour health standard, and the EPA annual health standard and ASHRAE standard)
38. Outdoor measurements
(Chart of outdoor particle measurements against the IBM, EPA 24-hour, EPA annual, and ASHRAE standards)
39. Indoor measurements
(Chart of indoor particle measurements against the same standards)
40. Indoor measurements
41. Data center with economizer
42. Humidity measurements
(Chart against the ASHRAE allowable and recommended upper and lower humidity limits)
43. Findings
- A high concentration of water-soluble salts in combination with high humidity can cause failures
- ASHRAE guidance is drastically lower than one manufacturer's standard
- Particle concentration in data centers is typically an order of magnitude lower than ASHRAE limits
- Economizer use can increase particle concentration to ASHRAE limits
- A modest improvement in filtration would provide conditions equal to centers not using economizers
- Humidity control is easily attainable
44. Punch line
- What part of "free" is not appealing?
45. Number 2: Environmental conditions
- ASHRAE Thermal Guidelines define conditions at the inlet to the IT equipment
- Often, operating temperatures are much lower than recommended
- Often, humidity is more tightly controlled than recommended
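A simple screen of inlet temperatures against ASHRAE's (2008) recommended 18-27 C band; the sensor readings below are invented for illustration:

```python
RECOMMENDED_C = (18.0, 27.0)  # ASHRAE (2008) recommended inlet range, deg C

def inlet_check(temps_c, lo=RECOMMENDED_C[0], hi=RECOMMENDED_C[1]):
    """Flag server-inlet readings outside the recommended band.
    Readings *below* the band mean energy wasted on overcooling."""
    return {"overcooled": [t for t in temps_c if t < lo],
            "too_warm":   [t for t in temps_c if t > hi]}

# Assumed sensor readings across a cold aisle:
print(inlet_check([15.5, 17.0, 19.2, 22.8, 27.5]))
# {'overcooled': [15.5, 17.0], 'too_warm': [27.5]}
```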
46. Temperature guidelines at the inlet to IT equipment
(Chart of the ASHRAE allowable and recommended maximum and minimum temperatures)
47. Humidity guidelines at the inlet to IT equipment
(Chart of the ASHRAE allowable and recommended maximum and minimum humidity limits)
48. Air Isolation Concepts
- Better isolation of hot and cold aisles will improve temperature control and allow air system optimization
- Airflow (fan energy) can be reduced if air is delivered without mixing
- Air and chilled water systems operate more efficiently at higher temperature differences
- Temperatures in the entire center can be raised if mixing is eliminated; it may then be possible to cool with tower water rather than chillers
49. Isolation scheme: cold aisle isolation
50. Isolation scheme: hot aisle isolation
51. Better temperature control allows raising the temperature in the entire data center
(Chart comparing the ASHRAE recommended range with the ranges achievable when aisles are isolated)
52. Punch line
- It's not nice to fool Mother Nature
- (think about it)
53. Number 1: Liquid cooling
- Rising heat intensity will lead to liquid cooling
- Water can hold approximately 3,500 times as much heat as the same volume of air
- Liquid cooling is already taking many forms:
  - Rack-level solutions
  - Chip-level solutions
  - Building-level solutions
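The roughly 3,500x figure follows from volumetric heat capacity (density times specific heat), using standard property values at room conditions:

```python
def volumetric_heat_capacity(density_kg_m3, cp_j_per_kg_k):
    """Heat a cubic meter of fluid absorbs per degree of rise (J/m^3/K)."""
    return density_kg_m3 * cp_j_per_kg_k

water = volumetric_heat_capacity(998, 4186)  # ~4.18 MJ/m^3/K at ~20 C
air = volumetric_heat_capacity(1.2, 1005)    # ~1.2 kJ/m^3/K at ~20 C
print(round(water / air))  # 3464 -- roughly the "3500x" cited on the slide
```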
54. Moving to liquid cooling
- Hybrid solutions as an interim step
- Integrating liquid solutions all the way out of the building
- Liquid doesn't have to mean water
- A 35-50% reduction in HVAC energy is possible by going liquid
55. Punch line
- Sometimes it pays to be all wet!
56. Recapping the Top 10
- 10 Lighting controls
- 9 Premium efficiency motors
- 8 Variable speed everything
- 7 Efficient chilled water plant
- 6 Power conversion efficiency
- 5 Virtualization
- 4 Air side efficiency
- 3 Free cooling
- 2 Appropriate environmental conditions
- 1 Liquid cooling
57. Conclusion