Data Centers for the 21st Century

Transcript and Presenter's Notes

1
Data Centers for the 21st Century
Working with Industry to Improve Energy
Efficiency in Data Centers
Dale Sartor, P.E., LBNL Applications Team
June 7, 2007
2
Crying Uncle!
3
LBNL Computer Systems Power
4
Why Data Centers?
5
Why Data Centers?
  • Highly energy-intensive and rapidly growing
  • Consume 10 to 100 times more energy per square
    foot than a typical office building
  • A single rack of servers can be 20 kW
  • $17k per year per rack (at $0.10/kWh)
  • Hundreds of racks per center can have significant
    impact on electricity supply and distribution
  • Used about 45 billion kWh in 2005, about 1.2% of
    all retail U.S. electricity sales.
  • At current growth, power requirements could
    double in less than 10 years.

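The $17k-per-rack figure follows directly from the stated 20 kW draw and $0.10/kWh rate. A quick sanity check, assuming continuous 24/7 operation (typical for data centers):

```python
# Annual electricity cost for one fully loaded server rack,
# assuming continuous 24/7 operation.
rack_power_kw = 20.0     # power draw of a single rack (from the slide)
hours_per_year = 8760    # 24 h x 365 days
rate_per_kwh = 0.10      # electricity price in $/kWh (from the slide)

annual_kwh = rack_power_kw * hours_per_year   # 175,200 kWh
annual_cost = annual_kwh * rate_per_kwh       # roughly $17,500

print(f"{annual_kwh:,.0f} kWh/year -> ${annual_cost:,.0f}/year per rack")
```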
6
Where Does It Go? Data Center Energy Use
[Chart: Typical Data Center Energy End Use. Of 100 units of
power entering the facility, only 33 units are delivered to
the server load/computing operations; the rest goes to
cooling equipment (about 35 units) and power conversion and
distribution losses.]
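The end-use split on this slide implies what is now called PUE (total facility power divided by IT power). A minimal sketch using the slide's figures; attributing the non-cooling overhead to power conversion and distribution is an assumption about how the chart is labeled:

```python
# Energy flow in a typical data center (figures from the slide):
# 100 units enter the facility, about 35 go to cooling, and 33 are
# delivered to the IT (server) load. The remainder is attributed
# here to power conversion and distribution losses.
total_in = 100.0
cooling = 35.0
it_load = 33.0
conversion_losses = total_in - cooling - it_load   # 32 units

pue = total_in / it_load   # total facility power / IT power
print(f"Power conversion/distribution losses: {conversion_losses:.0f} units")
print(f"PUE ~= {pue:.2f}")  # roughly 3 for this typical center
```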
7
Potential Benefits of Improved Data Center
Energy Efficiency
  • Save 20 billion kWh per year by 2015
  • Worth $2 billion and roughly equal to the annual
    electricity use of 1.8 million American homes
  • Potentially defer need to build 2,300 MW of new
    generating capacity and avoid 3.4 million metric
    tons of carbon emissions (like taking 675,000
    cars off the road)
  • Extend life and capacity of existing data center
    infrastructures

8
Overall power use in Data Centers
Courtesy of Michael Patterson, Intel Corporation
9
Computational Energy Efficiency
  • While my presentation focuses on the data center
    infrastructure, significant opportunities are
    available to increase computational energy
    efficiencies
  • Improve Utilization of Resources
  • Consolidation and virtualization
  • Grid computing
  • Enable and improve power management (dynamic
    loading)
  • Improve software efficiency (and software
    instructions to hardware)
  • Any reduction in IT equipment energy use has a
    corresponding savings in infrastructure
  • Potential first cost savings are often missed
  • Very challenging to set standards

10
Computational Energy Efficiency Standards
  • ENERGY STAR for Servers
  • EPA will release strawman proposal this year
  • EPA considering power supply efficiency and
    system energy efficiency performance
  • Need support from industry stakeholders
  • EPA also interested in other IT equipment --
    storage, networking, etc.
  • Efforts to develop a server performance
    benchmark (SPEC)
  • No metric available to compare server energy
    efficiency
  • SPEC Committee developing energy efficiency
    benchmark
  • Working prototype developed; more info on
    progress at www.spec.org/specpower

11
Efficient Data Centers - Building Knowledge Base
  • With funding from PG&E and CEC, LBNL conducted
    benchmark studies of 22 data centers
  • Found wide variation in performance
  • Identified best practices
  • CEC and PG&E continue R&D and demonstrations
  • New DOE program will expand current knowledge base

12
Benchmarking: How Do I Stack Up?
13
Energy Intensity Growth
  • Growing, but not a good performance metric

14
Infrastructure Efficiency
Data Center Cooling and Power Conversion
Performance Varies
[Charts: energy split among cooling, power conversions, and
server load/computing operations under typical practice vs.
better practice.]
15
Best Practices: Lessons from Benchmarking
  • Total data center power / IT power

16
Using benchmark results to find best practices
  • The ratio of IT equipment power to total power
    is an indicator of relative overall efficiency.
    Examining individual systems and components in
    the centers that performed well helped identify
    best practices:
  • Air management
  • Right-sizing
  • Central plant optimization
  • Efficient air handling
  • Free cooling
  • Humidity control
  • Liquid cooling
  • Improving power chain
  • UPSs and equipment power supplies
  • On-site generation
  • Design and M&O processes

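The IT-to-total power ratio used in the benchmarking studies can be computed directly from metered readings. A sketch; the center names and kW figures below are made-up illustrations, not benchmark data:

```python
# Self-benchmarking sketch of the metric from the LBNL studies:
# the ratio of IT equipment power to total facility power (higher
# is better; its inverse is what is now called PUE).
# The center names and power readings are hypothetical.
centers = {
    "Center A": {"it_kw": 500, "total_kw": 1500},  # typical practice
    "Center B": {"it_kw": 500, "total_kw": 800},   # better practice
}

ratios = {name: p["it_kw"] / p["total_kw"] for name, p in centers.items()}
for name, r in ratios.items():
    print(f"{name}: IT/total = {r:.2f} (PUE {1 / r:.2f})")
```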
17
Optimize Air Management
  • Enforce hot aisle/cold aisle arrangement
  • Eliminate bypasses and short circuits
  • Reduce air flow restrictions
  • Proper floor tile arrangement
  • Proper locations of air handlers

18
Right-Size the Design
  • Data Center HVAC often under-loaded
  • Ultimate load uncertain
  • Design for efficient part-load operation
  • modularity
  • variable-speed fans, pumps, compressors
  • Upsize fixed elements (pipes, ducts)
  • Upsize cooling towers

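The case for variable-speed fans at part load rests on the fan affinity laws: airflow scales linearly with speed, but fan power scales with roughly the cube of speed. A minimal illustration (ideal fan assumed; real-world savings are somewhat smaller):

```python
# Fan affinity laws: airflow is proportional to speed, and power
# is proportional to the cube of speed (ideal fan assumption).
def fan_power_fraction(flow_fraction: float) -> float:
    """Power draw relative to full speed for a given airflow fraction."""
    return flow_fraction ** 3

# A fan turned down to 50% airflow draws only ~12.5% of full power.
print(f"{fan_power_fraction(0.5):.3f}")
```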
19
Optimize the Central Plant
  • Have one (vs. distributed cooling)
  • Medium temperature chilled water
  • Aggressive temperature resets
  • Primary-only CHW with variable flow
  • Thermal storage
  • Monitor plant efficiency

20
Design for Efficient Central Air Handling
  • Fewer, larger fans and motors
  • VAV easier
  • Central controls eliminate fighting
  • Outside-air economizers easier

21
Use Free Cooling
  • Outside-Air Economizers
  • Can be very effective (24/7 load)
  • Controversial regarding contamination
  • Must consider humidity
  • Water-side Economizers
  • No contamination question
  • Can be in series with chiller

22
Improve Humidity Control
  • Eliminate inadvertent dehumidification
  • Computer load is sensible only
  • Medium-temperature chilled water
  • Humidity control at make-up air handler only
  • Use ASHRAE allowable RH and temperature
  • Eliminate equipment fighting
  • Coordinated controls on distributed AHUs

23
Use Liquid Cooling of Racks and Computers
  • Water is 3500x more effective than air on a
    volume basis
  • Cooling distribution is more energy efficient
  • Water-cooled racks available now; liquid-cooled
    computers are coming
  • Heat rejection at a higher temperature
  • Chiller plant more efficient
  • Water-side economizer more effective

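The 3500x figure can be sanity-checked by comparing the volumetric heat capacities (density times specific heat) of water and air. The property values below are standard textbook numbers at roomish conditions, not from the slide:

```python
# Back-of-envelope check of "water is ~3500x more effective than
# air on a volume basis": compare volumetric heat capacity,
# density [kg/m^3] x specific heat [J/(kg.K)].
water = 997 * 4186   # ~4.2e6 J/(m^3.K)
air = 1.2 * 1006     # ~1.2e3 J/(m^3.K)
ratio = water / air

print(f"water/air volumetric heat capacity ratio ~= {ratio:.0f}")
```

The result lands near 3500, consistent with the slide's claim.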
24
(No Transcript)
25
Improving the Power Chain
  • Increase distribution voltage
  • DC distribution
  • Improve equipment power supplies
  • Improve UPS

26
Specify Efficient Power Supplies and UPSs

Power supplies in IT equipment generate much of
the heat. Highly efficient supplies can reduce
IT equipment load by 15% or more.
UPS efficiency also varies a lot. (Do they need
the same environment as the IT equipment, or can
their conditions be relaxed?)
27
Consider On-Site Generation
  • Can use waste heat for cooling
  • sorption cycles
  • typically required for cost effectiveness
  • Swaps role with utility for back-up
  • Air-quality issues
  • Sell-back options
  • complex controls required

28
Improve Design and Operations Processes
  • Get IT and Facilities people to work together
  • Use life-cycle total cost of ownership analysis
  • Document design intent
  • Introduce energy optimization early
  • Benchmark existing facilities
  • Re-commission as a regular part of maintenance

29
Top best practices identified through benchmarking
30
Design guidelines are available
  • Design Guides were developed based upon the
    observed best practices
  • Guides are available through PG&E and LBNL
    websites
  • Self benchmarking protocol also available

http://hightech.lbl.gov/datacenters.html
31
What Is in the R&D Pipeline?
A research roadmap was developed for the
California Energy Commission and outlined key
areas for energy efficiency research,
development, and demonstration
32
LBNL Data Center Demonstrations
  • Air management improvements (PG&E)
  • Dramatic savings in fan energy
  • Improved cooling efficiency and output
  • Significant reduction in hot spots
  • Outside air economizers (PG&E)
  • Field test of air quality
  • Contamination and humidity control concerns
    generally unfounded
  • DC powering (CEC-PIER)
  • Facility-level savings of 10% to 20%

33
Impact of Energy Efficiency
  • Energy efficiency can slow expected growth in
    electricity use
  • Current trends will lead to a 10% reduction
  • Simple management improvements can reduce
    consumption by an additional 20%
  • Best practices can lead to a 50% reduction

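The three scenarios above can be expressed as successive reductions from a business-as-usual projection. The 100-unit baseline below is illustrative, not a figure from the presentation:

```python
# The slide's efficiency scenarios as reductions from a
# business-as-usual (BAU) projection; the 100-unit BAU figure
# is illustrative only.
bau = 100.0
current_trends = bau * (1 - 0.10)            # 10% below BAU
improved_mgmt = current_trends * (1 - 0.20)  # an additional 20%
best_practice = bau * (1 - 0.50)             # 50% below BAU

print(current_trends, improved_mgmt, best_practice)
```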
34
U.S. Opportunity Potential
[Chart: Comparison of Projected Electricity Use, All
Scenarios, 2007 to 2011. Annual energy use in billion
kWh/year, shown against the 2006 baseline of 58.7.]
35
Where can I get help?
  • CEC is providing the technical foundation to
    improve data centers
  • PG&E is the leading U.S. utility providing
    support to data centers
  • DOE is initiating a national program to provide
    assistance in identifying and evaluating savings
    opportunities

36
DOE Data Center Program
  • Program Objectives
  • Provide systems approach
  • Build tools, expertise, and strategies
  • Raise awareness and recognize industry leaders
  • Program Strategies (2007)
  • Build on successful Save Energy Now model
  • With industry input, develop appropriate tools,
    training, and qualified experts
  • Implement assessment program (solicitation in
    Fall)
  • Screen for industrial demonstrations
  • Federal government procurement specifications

www1.eere.energy.gov/industry/saveenergynow/partnering_data_centers.html
37
Sponsors and Stakeholders
  • Sponsors
  • California Energy Commission (CEC)
  • http://www.energy.ca.gov/pier/
  • U.S. Department of Energy (DOE)
  • http://www1.eere.energy.gov/industry/saveenergynow/partnering_data_centers.html
  • U.S. Environmental Protection Agency
  • http://www.energystar.gov/datacenters
  • Pacific Gas and Electric Company (PG&E)
  • http://www.pge.com/docs/pdfs/biz/rebates/hightech/06_DataCenters-PGE.pdf
  • Stakeholders
  • Industry Organizations
  • e.g., Green Grid, ASHRAE, AFCOM, 7x24, SVLG
  • Equipment suppliers
  • Research organizations
  • Consultants

38
Web-based Resources
http://hightech.lbl.gov/datacenters.html
Good starting point for those seeking efficiency
measures
Best Practices
Benchmark data
Self-benchmarking Guide
Case Studies
Other Reports (demonstrations)
Design Guidance
39
Contact Information
Dale Sartor, P.E.
Lawrence Berkeley National Laboratory
Applications Team, MS 90-3011
University of California
Berkeley, CA 94720
DASartor@LBL.gov
(510) 486-5988
http://Ateam.LBL.gov