The Late Night Top 10 - Energy Saving Measures for your Data Center

1
The Late Night Top 10 -Energy Saving Measures
for your Data Center
2
Energy Efficiency in Data Centers
  • LBNL benchmarking and case studies
  • California Energy Commission
  • Pacific Gas and Electric Company
  • NYSERDA
  • DOE
  • DOE energy assessments
  • EPA report to Congress
  • Resources
  • Letterman Top 10 opportunities

3
Benchmarking energy end use
4
Data Center Research Roadmap
A research roadmap was developed for the
California Energy Commission and outlined key
areas for energy efficiency research,
development, and demonstration
5
IT equipment load intensity
6
Overall power use in Data Centers
Courtesy of Michael Patterson, Intel Corporation
7
Data Center performance differences
8
Performance varies
The percentage of energy actually used for
computing varied considerably.
9
IT equipment percentage to total
Average 0.49 (higher is better)
10
HVAC system effectiveness
We observed a wide variation in HVAC performance
11
Benchmarking leads to best practices
  • The ratio of IT equipment power to the total is
    an indicator of relative overall efficiency.
  • Examination of individual systems and components
    in the centers that performed well helped to
    identify best practices.
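The IT-to-total ratio above (the inverse of PUE, sometimes called DCiE) is easy to compute from metered loads. A minimal sketch; the 490 kW / 1,000 kW figures are illustrative assumptions, not numbers from the study:

```python
def it_power_ratio(it_kw, total_kw):
    """Fraction of facility power that reaches the IT equipment.

    Higher is better; the benchmarked centers averaged about 0.49.
    """
    if total_kw <= 0:
        raise ValueError("total facility power must be positive")
    return it_kw / total_kw

# Illustrative numbers (not from the study): 490 kW of IT load
# in a facility drawing 1,000 kW total.
ratio = it_power_ratio(490.0, 1000.0)
pue = 1.0 / ratio  # Power Usage Effectiveness is the inverse
print(f"IT/total = {ratio:.2f}, PUE = {pue:.2f}")  # IT/total = 0.49, PUE = 2.04
```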

12
Best practice topics
13
Design guidelines for 10 Best Practices were
developed in collaboration with PG&E
Guides are available through PG&E's Energy Design
Resources website
14
Design guidance
A web-based training resource is available
http://hightech.lbl.gov/dctraining/TOP.html
15
Late night top 10 opportunities
  • Staying up late worrying about your data
    center's energy use?
  • Let's see what David Letterman would come up with.

16
Number 10
  • Efficient Lighting/ Lighting Controls
  • Lighting represents a small percentage of power
    use in a data center
  • Lights are typically on 24x7
  • Staff is rarely present in data centers
  • Reducing lighting intensity also saves HVAC
    energy
  • Lighting controls are well proven

17
Punch line
  • Computers don't need lights!

18
Number 9
Use Premium Efficiency Motors
  • Data center systems operate 24x7
  • Premium efficiency motors can save several
    percent over less efficient models
  • Newer motors are even more efficient than premium
    motors of several years ago

19
Check motor efficiency
MotorMaster is a good resource
20
Punch line
  • A penny saved is a penny earned!

21
Number 8
Variable Speed Everything
  • Power and flow are related by the cube law
  • Small reductions in flow result in large energy
    savings
  • No data center needs full capability all the time
  • Operating multiple fans or pumps at slower speeds
    is often a good strategy
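The cube law the slide relies on can be sketched numerically. This is a simplified affinity-law model (it ignores fixed losses and drive inefficiency), and the parallel-fan scenario is illustrative:

```python
def fan_power_fraction(speed_fraction):
    """Affinity ("cube") law: fan power scales with the cube of speed."""
    return speed_fraction ** 3

# One fan at full speed draws 100% of rated power; two fans moving the
# same total air at half speed each draw (0.5)^3 = 12.5%, or 25% combined.
two_at_half = 2 * fan_power_fraction(0.5)
print(f"two fans at 50% speed: {two_at_half:.0%} of one fan at full speed")  # 25%
```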

22
Punch line
  • Thank you, cube law!

23
Number 7
Efficient chilled water plant
  • Chiller efficiency dominates, but pumps, fans,
    and system effects contribute
  • kW/ton is the key metric
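kW/ton and COP describe the same chiller efficiency; the conversion uses the standard equivalence 1 ton of refrigeration ≈ 3.517 kW of cooling. The 0.5 kW/ton operating point below is an illustrative assumption:

```python
TONS_TO_KW = 3.517  # 1 ton of refrigeration is about 3.517 kW of cooling

def kw_per_ton_to_cop(kw_per_ton):
    """Convert chiller efficiency in kW/ton to COP (cooling out / power in)."""
    return TONS_TO_KW / kw_per_ton

# An efficient water-cooled plant near 0.5 kW/ton corresponds to a COP of ~7
print(f"COP at 0.5 kW/ton: {kw_per_ton_to_cop(0.5):.1f}")  # COP at 0.5 kW/ton: 7.0
```

Lower kW/ton (higher COP) is the "how low can you go" the punch line refers to.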

24
Punch line
  • How low can you go?

25
Number 6
Electrical power conversion
  • Many power conversions take place in a data
    center from end to end
  • Each conversion loses some power converted to
    heat
  • Optimizing or eliminating conversions has a short
    payback

26
Data Center power conversions
(Diagram: power conversion chain, including the PDU)
27
  • Research into power conversion losses
  • Power supplies in IT equipment
  • Uninterruptible Power Supplies (UPS)
28
Punch line
  • AC DC AC AC DC DC

29
Number 5
Virtualization
  • Typical server processors are busy 10% or less
    on average
  • Consolidating can eliminate many servers
  • Older less efficient equipment can be removed
  • Immediate decrease in electrical and HVAC load
  • Quick payback
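A rough sizing sketch of the consolidation argument. The 60% target utilization and the helper name are assumptions for illustration; only the ~10% average utilization comes from the slide:

```python
import math

def servers_after_consolidation(n_servers, avg_util, target_util=0.6):
    """Rough host count after consolidating underutilized servers.

    avg_util: current average utilization (the slide cites ~10%).
    target_util: utilization the remaining hosts are run at (assumed).
    """
    return math.ceil(n_servers * avg_util / target_util)

# 100 servers averaging 10% busy, consolidated onto hosts run at 60%:
remaining = servers_after_consolidation(100, 0.10)
print(f"{remaining} hosts remain; {100 - remaining} can be retired")
```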

30
Punch line
  • Take your IT person to lunch!

31
Number 4
Air side efficiency
  • Air management is a huge opportunity
  • Efficiency of air handlers varies
  • W/cfm is a key metric
  • ΔT is also a key metric
  • Avoid mixing of cold and hot airstreams
  • Eliminate simultaneous
    humidification/dehumidification

32
Fan energy savings up to 75%
If there is no mixing of cold supply air with hot
return air, fan speed can be reduced
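The roughly 75% figure is consistent with the cube law: once mixing is eliminated, cutting airflow to about 63% of design reduces fan power to about a quarter. A quick check under that simplified affinity-law assumption:

```python
def fan_savings(flow_fraction):
    """Fractional fan-power savings from reducing airflow (cube law)."""
    return 1.0 - flow_fraction ** 3

# Cutting airflow to 63% of design once mixing is eliminated:
print(f"savings: {fan_savings(0.63):.0%}")  # savings: 75%
```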
33
Punch line
  • Just like oil and water, air shouldn't be mixed

34
Number 3
Free cooling
  • In many climates, compressor-less or reduced use
    of compressors is possible for much of the year
  • Air economizers are feared because of
    contamination or loss of humidity control
  • Some jurisdictions are requiring air economizers
    in data centers
  • Water side economizers (heat exchange with the
    cooling tower) are a good option

35
Air economizer study
  • Eight centers were monitored with particle
    counters
  • Three were successfully using air economizers
  • Measurements taken inside and outside the centers
  • Humidity was monitored

36
Air Quality Guidelines
  • Limited literature connecting pollutants to
    equipment failure
  • ASHRAE
  • Design Considerations for Data/Com Equipment
    Centers
  • Guidelines for particles, gases, humidity
  • Industry sources: Telcordia GR-63-CORE / IEC
    60721-3-3
  • Designed for telephone switching centers
  • Based on research over 20 years old
  • Primary concern: current leakage caused by
    particles bridging hygroscopic salts in
    combination with high humidity

37
Contamination limits
(Chart: contamination limits per the IBM Standard, the EPA 24-Hour Health Standard, and the EPA Annual Health Standard / ASHRAE Standard)
38
Outdoor measurements
(Chart: outdoor particle measurements vs. the IBM Standard, the EPA 24-Hour Health Standard, and the EPA Annual Health Standard / ASHRAE Standard)
39
Indoor measurements
(Chart: indoor particle measurements vs. the IBM Standard, the EPA 24-Hour Health Standard, and the EPA Annual Health Standard / ASHRAE Standard)
40
Indoor measurements
41
Data center w/economizer
42
Humidity measurements
(Chart: measured humidity vs. the ASHRAE allowable and recommended upper and lower limits)
43
Findings
  • High concentration of water soluble salts in
    combination with high humidity can cause failures
  • ASHRAE guidance is drastically lower than one
    manufacturer's standard
  • Particle concentration in data centers is
    typically an order of magnitude lower than
    ASHRAE limits
  • Economizer use can increase particle
    concentration to ASHRAE limits
  • A modest improvement in filtration would provide
    conditions equal to centers not using economizers
  • Humidity control is easily attainable

44
Punch line
  • What part of free is not appealing?

45
Number 2
Environmental conditions
  • ASHRAE Thermal Guidelines define conditions at
    the inlet to the IT equipment
  • Often, operating temperatures are much lower than
    recommended
  • Often, humidity is more tightly controlled than
    recommended

46
Temperature guidelines at the inlet to IT
equipment
(Chart: inlet temperatures vs. the ASHRAE allowable and recommended maximum and minimum)
47
Humidity guidelines at the inlet to IT equipment
(Chart: inlet humidity vs. the ASHRAE allowable and recommended maximum and minimum)
48
Air Isolation Concepts
  • Better isolation of hot and cold aisles will
    improve temperature control and allow air system
    optimization
  • Airflow (fan energy) can be reduced if air is
    delivered without mixing
  • Air system and chilled water systems operate more
    efficiently at higher temperature differences
  • Temperatures in the entire center can be raised
    if mixing is eliminated; it may then be possible
    to cool with tower water rather than chillers

49
Isolation scheme: cold aisle isolation
50
Isolation scheme: hot aisle isolation
51
Better temperature control allows raising the
temperature in the entire data center
(Chart: ASHRAE recommended range vs. the wider ranges achievable with aisles isolated)
52
Punch line
  • It's not nice to fool Mother Nature
  • (think about it)

53
Number 1
  • Liquid Cooling
  • Rising heat intensity will lead to liquid
    cooling
  • Per unit volume, water can hold approximately
    3,500 times as much heat as air
  • Liquid cooling is already taking many forms
  • Rack level solutions
  • Chip level solutions
  • Building level solutions
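The ~3,500x figure can be checked from standard material properties; it is the ratio of volumetric heat capacities (density times specific heat), using textbook values for water and for air at room conditions:

```python
# Volumetric heat capacity = density x specific heat
water = 1000.0 * 4.18e3  # kg/m^3 x J/(kg*K), ~4.18 MJ/(m^3*K)
air = 1.2 * 1.005e3      # ~1.2 kJ/(m^3*K) for air at room conditions

ratio = water / air
print(f"water stores roughly {ratio:.0f}x the heat of air per unit volume")
```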

54
Moving to liquid cooling
  • Hybrid solutions as an interim step
  • Integrating liquid solutions all the way out of
    the building
  • Liquid doesn't have to mean water
  • A 35-50% reduction in HVAC energy is possible by
    going liquid

55
Punch line
  • Sometimes it pays to be all wet!

56
Recapping the Top 10
  • 10 Lighting controls
  • 9 Premium efficiency motors
  • 8 Variable speed everything
  • 7 Efficient chilled water plant
  • 6 Power conversion efficiency
  • 5 Virtualization
  • 4 Air side efficiency
  • 3 Free cooling
  • 2 Appropriate environmental conditions
  • 1 Liquid cooling

57
Conclusion
  • Thank you. Questions?