Example DC EE Assessment
Final Report
November 20, 2015
________________________________________________________________________
Disclaimer
This document was prepared as an account of work sponsored by the United States
Government. While this document is believed to contain correct information, neither the
United States Government nor any agency thereof, nor The Regents of the University of
California, nor any of their employees, makes any warranty, express or implied, or assumes
any legal responsibility for the accuracy, completeness, or usefulness of any information,
apparatus, product, or process disclosed, or represents that its use would not infringe
privately owned rights. Reference herein to any specific commercial product, process, or
service by its trade name, trademark, manufacturer, or otherwise, does not necessarily
constitute or imply its endorsement, recommendation, or favoring by the United States
Government or any agency thereof, or The Regents of the University of California. The views
and opinions of authors expressed herein do not necessarily state or reflect those of the
United States Government or any agency thereof or The Regents of the University of
California.
This report is prepared as the result of visual observations, environmental monitoring, and
discussions with site staff. The report, by itself, is not intended as a basis for the engineering
required for adopting any of the recommendations. Its intent is to inform the site of
potential energy saving opportunities and estimated cost savings. The purpose of the
recommendations and calculations is to determine whether measures warrant further
investigation.
Acknowledgments
Authors
1
________________________________________________________________________
TABLE OF CONTENTS
1. EXECUTIVE SUMMARY ..................................................................................... 4
2. FACILITY OVERVIEW ............................................................................................. 6
3. FACILITY ENERGY USE........................................................................................... 7
4. COOLING SYSTEM DESCRIPTION ......................................................................... 9
5. ELECTRICAL SYSTEM DESCRIPTION ......................................................... 11
6. BENCHMARKING................................................................................................... 13
6.1 OVERALL ENERGY EFFICIENCY METRIC .......................................................................................13
6.2 AIR MANAGEMENT AND AIR DISTRIBUTION METRICS..................................................................13
6.3 COOLING PLANT METRICS ............................................................................................................14
6.4 ELECTRICAL POWER CHAIN METRICS ..........................................................................................16
7. RECOMMENDED ENERGY EFFICIENCY MEASURES ........................................ 19
2
________________________________________________________________________
List of Abbreviations
AC – Alternating Current
ASHRAE – American Society of Heating, Refrigerating, and Air-Conditioning Engineers
BTU/sf-y – British thermal units per square foot per year
CRAC – Computer Room Air Conditioner (with internal refrigerant compressor)
CRAH – Computer Room Air Handler (with chilled-water coil)
DC – Direct Current
EEM – Energy Efficiency Measure
ECM – Electronically Commutated Motor
°F – degree(s) Fahrenheit
GWh/yr – gigawatt-hours per year (millions of kWh/yr)
HVAC – Heating, Ventilating, and Air-Conditioning
IT – Information Technology
kV – kilovolts (thousands of volts of electrical potential)
kVA – kilovolt-amperes of apparent power
kW – kilowatts of real power
kWh – kilowatt-hour
PDU – Power Distribution Unit
PUE – Power Usage Effectiveness
RCI – Rack Cooling Index
RTI – Return Temperature Index
RH – Relative Humidity
sf – square foot
TCO – Total Cost of Ownership
UPS – Uninterruptible Power Supply
V – volt(s)
VFD – Variable Frequency Drive (for operating motors at variable speed)
W/cfm – watts (of electrical power input) per cubic foot per minute (of air flow)
W/gpm – watts (of electrical power input) per gallon per minute (of water flow)
W/sf – watts per square foot
3
________________________________________________________________________
1. Executive Summary
This energy assessment, sponsored by AGENCY X and the Federal Energy Management
Program (FEMP), focuses on an AGENCY X data center on the East Coast. AGENCY X leases the
data center facility from the General Services Administration (GSA), which in turn leases it
from a local vendor.
Lawrence Berkeley National Laboratory (LBNL) staff performed the assessment, which
established an estimate of baseline energy end use and identified potential energy-efficiency
measures (EEMs). Observation of the building physical conditions, environmental conditions,
and energy use led to the recommendations for operational and energy efficiency improvement
opportunities identified in this report.
This data center is planned to be one of the central consolidation sites for a federal agency. As
such, the center plans to increase the amount of IT equipment resulting in higher electrical load
and heat density. Efficiency measures will need to consider the changing nature of the IT
configuration and be able to adjust to maintain efficient operation over a range of electrical
loading.
It should be noted that this report is not based upon an investment-grade assessment. The
precision of the calculations used to determine the EEMs is limited because:
- The data center is embedded in a large office building.
- Only limited power and environmental measurements were obtained during operation, and
only for short durations of time.
- UPS power input readings were not observed at the meters as configured; only output
readings appeared to be available.
- A common chilled-water system serves the entire building, not just the data center.
- As-built information for the facility was not generally available.
Despite these limitations, valuable observations and recommendations have been made.
Assumptions and calculation methods are noted throughout the report.
4
________________________________________________________________________
Thanks to a proactive staff, in only four months the AGENCY X data center has
progressed toward being an effective and energy-efficient facility (from a PUE of 2.3
to 1.7). Implementing the recommendations in this report would make it truly
exemplary, with better cooling and power service to the IT equipment and a PUE of
about 1.3.
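The PUE figures quoted above follow from a simple ratio: total facility power divided by IT equipment power. The sketch below illustrates the arithmetic; the IT load used here is a hypothetical placeholder, and only the PUE milestones (2.3, 1.7, 1.3) come from the assessment.

```python
# PUE (Power Usage Effectiveness) = total facility power / IT power.
# An ideal facility, with no cooling or electrical overhead, has PUE 1.0.

def pue(total_facility_kw: float, it_kw: float) -> float:
    return total_facility_kw / it_kw

it_load_kw = 600.0  # hypothetical constant IT load, for illustration only

# Infrastructure overhead implied by each PUE milestone:
for label, p in [("as found", 2.3), ("after staff changes", 1.7),
                 ("with all EEMs", 1.3)]:
    overhead_kw = (p - 1.0) * it_load_kw
    print(f"{label}: PUE {p} -> {overhead_kw:.0f} kW cooling/electrical overhead")
```

At a constant IT load, dropping the PUE from 2.3 to 1.3 cuts the infrastructure overhead to less than a quarter of its original value.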
5
________________________________________________________________________
2. Facility Overview
The data center is embedded in a large building that currently houses other functions (e.g.,
offices and a print shop). It was built in 2007 and is now slated to be a federal agency's
consolidation site, so the quantity of IT equipment will likely increase for several years.
The data center has a 3 ft. raised floor, and the distance from the raised floor to the dropped
ceiling is just over 9 feet. The height of the plenum above the dropped ceiling is 18 inches.
The IT equipment layout in the main data center is generally configured in a
hot-aisle/cold-aisle arrangement, as shown in Figure 1 below. Note that in this report we use
the abbreviation CRAH (Computer Room Air Handler) rather than CRAC (Computer Room Air
Conditioner), which is how the units are labeled on site, because CRAH more precisely
describes these chilled-water units.
The main area of the East Coast data center is about 22,600 sf within an 81,000 sf facility.
Two separate rooms contain the uninterruptible power supplies (UPSs) that serve the IT
equipment through power distribution units (PDUs) located in the data center spaces.
Figure 1. Main Data Center (MSF; Room 167). The 12 rows of IT racks (red) are arranged
in alternating hot aisles and cold aisles (every other aisle); the CRAH units (labeled as
CRACs) are the large gray rectangles.
6
________________________________________________________________________
IT Equipment Loads
Summarized in Table 2 below is the IT equipment average power use in kW.

Data Center Areas    Area (sf)    IT equipment load (kW)    Power Density (W/sf)
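The power-density column of Table 2 is derived from the other two columns: IT load converted to watts, divided by floor area. A minimal sketch, with a hypothetical IT load (only the 22,600 sf main-room area appears earlier in this report):

```python
# Power density (W/sf) = IT load (kW) * 1000 / area (sf).

def power_density_w_per_sf(it_load_kw: float, area_sf: float) -> float:
    return it_load_kw * 1000.0 / area_sf

# Hypothetical 565 kW of IT load spread over the 22,600 sf main room:
print(power_density_w_per_sf(565.0, 22_600.0))  # -> 25.0 W/sf
```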
7
________________________________________________________________________
8
________________________________________________________________________
9
________________________________________________________________________
For much of the year, it is possible to maintain an acceptable upper humidity limit without
ever needing to actively dehumidify. At the lower end of the humidity range, ASHRAE has
recently published research results showing that low humidity has a negligible effect on
electrostatic discharge (as long as appropriate grounding is applied), and it is revising its
thermal guidelines accordingly. As a result, humidifying will not be necessary except in
extremely dry conditions (below a 14 °F dew point).
It should be noted that the data center's proactive staff adjusted temperature setpoints
upward between the June and October visits, and disabled the reheat in the CRAH units.
They also changed the chiller plant pump operation, and the combination of reduced cooling
load and a more-efficient chiller plant has resulted in very large energy savings and a
reduction in the PUE from 2.3 to 1.7. The savings associated with these changes are
captured in the CRAH retrofit measure (EEM 3) and described in Section 7 of this report.
Chilled water is supplied to the CRAH units from a central chilled-water plant that serves
the entire building; the data center, however, is the dominant load on the chilled-water
plant. Chilled water is supplied at 42 °F, which is much cooler than is required to cool
the IT equipment. The set point is driven by the print shop in the building, which requires
a low setting for humidity control. Cooling for this relatively small load adversely affects
both the efficiency of the data center and the capacity of the chiller plant. The
chilled-water plant consists of three 250-ton screw-type chillers with variable-speed
chilled-water pumps and cooling-tower fans. As found at the June 2015 visit, two chillers
and all three of the chilled-water pumps, condenser-water pumps, and tower fans had been
running all year, but the staff has since adjusted the operation to two each of the pumps
and towers. A plate-and-frame heat exchanger for free cooling (rated at 83 tons at the
design condition of 50 °F condenser water but capable of 500 tons at 40 °F condenser
water) was installed as part of the original construction, but it was not operating during
the site visits and is not currently in use even when conditions allow it.
10
________________________________________________________________________
See EEMs 4 and 5 in Section 7 for recommended changes to the chiller plant and its
operation.
UPS System:
Eight 500 kVA/450 kW Uninterruptible Power Supply (UPS) systems (MGE model EPS
7000) provide power conditioning and battery backup for the main power supplied to the IT
equipment. The topology of these units is double-conversion, meaning all of the power is
converted from AC to DC and then back to AC. The output power meters on the UPS systems
were functional and gave readings reasonably consistent with the other readings taken
during the June site visit (e.g., UPS outputs, one-time checks with portable metering, and the
PDU inputs agree to within about 3%); these meters were used again at the October site
visit. It should be noted that the A-3 output meter reads high and might be reporting the
total from the paralleling unit; this problem should be corrected.
Table 5 shows the loading of the A and B UPS systems, including output (from the built-in
meters), % load (calculated from the output meter reading and the UPS rating), and
efficiency (from the manufacturer's curve); input and losses are calculated from the output
and the efficiency. These units are lightly and uniformly loaded, with good efficiency given
their light loading; see Section 6.4 for further discussion.
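The Table 5 derivation described above can be sketched as follows. The output and efficiency values below are hypothetical, not the assessment's readings; only the 450 kW rating comes from the report.

```python
# Input and losses derived from the metered output and the efficiency curve:
#   input = output / efficiency;  losses = input - output.

def ups_metrics(output_kw: float, rating_kw: float, efficiency: float) -> dict:
    input_kw = output_kw / efficiency
    return {
        "pct_load": 100.0 * output_kw / rating_kw,  # % of rated real power
        "input_kw": input_kw,
        "loss_kw": input_kw - output_kw,
    }

# Hypothetical lightly loaded unit: 90 kW output on a 450 kW rating, 90% efficient.
m = ups_metrics(output_kw=90.0, rating_kw=450.0, efficiency=0.90)
print(m)  # 20% load, 100 kW input, 10 kW of losses
```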
11
________________________________________________________________________
Distribution transformers/PDUs:
Twenty-four Power Distribution Units (PDUs) with built-in transformers are used to
distribute electrical power to the IT equipment; 16 PDUs are rated at 150 kVA and 8 are
rated at 300 kVA. The PDUs are equipped with meters to enable reading the input and
output power (kW) and energy (kWh). These meters were used to check the UPS meter
readings and further validate the IT input power and thus the PUE. These meters also
provided data to estimate the power loss for the transformers in the PDUs. Several of the
meters were not reading correctly (specifically D4, E/CB4, S1A1, S1A5, S1B2, and S1D1:
kWh registers were blank, or kW and/or kWh outputs read greater than inputs), so some
estimating was necessary. Two of the PDUs were energized but were serving no IT load.
Lighting:
The data center contained a standard dropped ceiling with fluorescent T-8 lighting fixtures
and had no automatic lighting controls. There were 79 fixtures, each with four 32 W lamps;
with electronic ballasts, these fixtures draw about 120 watts each.
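The fixture count and per-fixture draw above give the connected lighting load directly. The annual energy shown here assumes continuous operation (8,760 h/yr), which is plausible absent automatic controls but is our assumption, not a site measurement.

```python
# Connected lighting load and annual energy from the survey counts above.

fixtures = 79
watts_per_fixture = 120  # four 32 W T-8 lamps on an electronic ballast

connected_kw = fixtures * watts_per_fixture / 1000.0
annual_kwh = connected_kw * 8760  # assumes lights run 24/7, all year

print(f"{connected_kw:.2f} kW connected, ~{annual_kwh:,.0f} kWh/yr")
```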
Standby Generation:
Three standby generators, each rated at 1890 kVA/1512 kW at 480 volts, provide back-up
power in the event of a power failure. Each generator has a thermostat-controlled electric
engine block heater that keeps the engine warm to facilitate rapid starting and the ability
to pick up load.
12
________________________________________________________________________
6. Benchmarking
The purpose of this section is to summarize the metrics that were calculated as part of the
assessment process and compare them to data from other facilities, where available.
(Benchmarking table columns: AGENCY X, June 2015; AGENCY X, October 2015)
13
________________________________________________________________________
Table 6 below summarizes the metrics, calculated from data taken in the main data center
room at the October 2015 visit, and provides interpretation.
Metric                                        Units   Value   Interpretation
Ratio of Total CRAH Flow to Total Rack Flow   None    0.33    Poor. Very low CRAH flows
                                                              relative to IT flows. Ideally
                                                              1.0, but best practice is a
                                                              bit higher than 1.
Fan motor efficiency                          %       87.5    Good but not premium
                                                              efficiency; typical of
                                                              20-ton units.
Econ Utilization Factor                       %       0       No air-side economizer.
Table 6. Air Management and Air Distribution Metrics
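The flow-ratio metric in Table 6 compares total CRAH supply airflow with total airflow through the IT racks. Rack airflow is commonly estimated from IT power using cfm ≈ 3.16 × W / ΔT(°F); the sketch below uses hypothetical load and ΔT values chosen only to reproduce the 0.33 ratio.

```python
# Air-management flow ratio: total CRAH supply cfm / total rack (IT) cfm.
# A ratio well below 1 means the IT equipment draws in recirculated hot air.

def rack_flow_cfm(it_watts: float, delta_t_f: float) -> float:
    """Estimate IT airflow from heat load: cfm ~= 3.16 * W / dT(°F)."""
    return 3.16 * it_watts / delta_t_f

def flow_ratio(total_crah_cfm: float, total_rack_cfm: float) -> float:
    return total_crah_cfm / total_rack_cfm

# Hypothetical 500 kW of IT load with a 20 °F air-side temperature rise:
it_cfm = rack_flow_cfm(it_watts=500_000, delta_t_f=20.0)  # ~79,000 cfm
print(flow_ratio(total_crah_cfm=0.33 * it_cfm, total_rack_cfm=it_cfm))
```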
14
________________________________________________________________________
15
________________________________________________________________________
16
________________________________________________________________________
Figure: UPS efficiency versus percent of rated active power load (0% to 100%) for
double-conversion and delta-conversion UPS topologies; the efficiency axis spans roughly
70% to 90%.
17
________________________________________________________________________
18
________________________________________________________________________
19
________________________________________________________________________
Many DCIM systems also provide for tracking of IT equipment, e.g., what software is
running on what hardware and the utilization of each piece of hardware.
This feature can identify “zombie” servers that take up space, power, and cooling
but contribute no computing value, as well as point to virtualization and power-
down opportunities.
The energy savings shown in the table assume 5% of the overall data center energy
can be saved: 2% in IT, 2% in cooling, and 1% in electrical losses. Note that while
the payback period is relatively long, no credit was taken for non-energy benefits for
the operation and maintenance, which can easily dominate the overall savings.
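The 5% assumption above splits cleanly into the three end uses. As a sketch, with a hypothetical annual site energy and electricity rate (neither figure is from this report):

```python
# DCIM savings assumption: 5% of overall data center energy,
# split 2% IT + 2% cooling + 1% electrical losses.

annual_site_kwh = 7_000_000  # hypothetical total data center energy use
rate_per_kwh = 0.10          # hypothetical blended electricity rate ($/kWh)

fractions = {"IT": 0.02, "cooling": 0.02, "electrical losses": 0.01}
saved_kwh = {end_use: f * annual_site_kwh for end_use, f in fractions.items()}

total_saved = sum(saved_kwh.values())
print(f"total: {total_saved:,.0f} kWh/yr, ${total_saved * rate_per_kwh:,.0f}/yr")
```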
20
________________________________________________________________________
Without blanking panels, internal recirculation of hot air will seriously compromise the
ability of the cooling system to adequately serve the IT equipment, with the result
being compromised IT reliability. There are many cable penetrations through the
floor that could use better sealing devices; cold air leaking from the underfloor past
the IT equipment effectively reduces CRAH capacity. Isolating the hot aisles with
barriers such as strip curtains, in combination with using the ceiling space as a
hot-air return plenum, would greatly reduce the hot-air recirculation in evidence.
This return plenum would include open “egg-crate” ceiling panels above the hot
aisles and, typically, sheet-metal return-air chimneys extending from the ceiling to
the CRAH inlets. The savings from air management, realized in the CRAH measure
(and indirectly in the chiller plant), are listed in Table 9 with the CRAH measure;
the cost of the air management is included in the combined measures 2 and 3.
There are two significant parts to improving the energy efficiency of the CRAHs in
the data center. The first is adjusting the control set-points so that the IT inlet
conditions conform to the ASHRAE recommendations, which allow air temperatures up to
80.6 °F and a dewpoint range of 41.9 °F (to 60% RH) to 59.0 °F. See Table 4 above,
and note that the lower humidity limit is being lowered further to a 14 °F dewpoint,
as discussed in Section 4 above. Because the CRAHs are presently controlled using
return-air conditions, the temperature set-point will need to be substantially
higher, and the humidity control should be turned off. Eliminating the humidifying,
the dehumidifying, and the CRAH units fighting one another will reduce total CRAH
energy by roughly 65% while still keeping the IT inlet conditions in the
recommended ASHRAE range.
The second part related to the CRAHs is to rebuild them with plenum fans driven by
direct-drive, electronically commutated (ECM) variable-speed motors, and to change
the control from return-air to supply-air temperature. Such fans are inherently much
more efficient than the existing squirrel-cage fans in down-flow, underfloor-plenum
applications; the inefficiency, maintenance, and particle generation of the belt drives
are eliminated, and the variable-speed motors are controlled to maintain the
necessary underfloor pressure without over-provisioning the system. Collectively,
changing the fans and controlling them using underfloor pressure will reduce fan
power by about 70%. Because the IT inlet air temperature is what matters, and the
inlet air comes from the CRAH supply air, using supply-air temperature to control
the chilled-water valve is much better than using the return-air temperature. The
overall savings of the controls and fan retrofits is about 89% of the existing CRAH
usage. As noted above, a significant fraction of this opportunity has already been
realized by the proactive staff at AGENCY X.
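The large fan-power savings cited above rest on the fan affinity laws: for a fixed system curve, fan power varies roughly with the cube of speed, so running variable-speed fans at partial flow is disproportionately cheap. A minimal sketch (the 67% speed point is illustrative, not a measured operating point):

```python
# Fan affinity law: for a fixed system curve, power scales as speed**3.

def fan_power_fraction(speed_fraction: float) -> float:
    return speed_fraction ** 3

for s in (1.0, 0.8, 0.67):
    print(f"speed {s:.0%} -> power {fan_power_fraction(s):.0%}")
# At ~67% speed, fan power drops to ~30%: roughly a 70% reduction.
```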
21
________________________________________________________________________
which in turn increases the CRAH efficiency. The print shop requires 42 °F water
for dehumidification purposes, and this roughly 20-ton load is the ‘tail wagging the
dog’ of the 500-ton chiller plant, which could otherwise supply water at 50 °F or
higher with an appropriate reset schedule. A dedicated air-cooled chiller could be
installed and operated as needed to meet the print shop’s special needs, allowing
the central plant to meet the cooling requirements of the data center and office
spaces. Not only would the plant operate more efficiently, but it would also have
more capacity at the increased chilled-water temperatures.
22
________________________________________________________________________
The savings in Table 9 include the increased usage from the small print shop chiller
as well as the savings from the temperature resets and use of the water-side
economizer. The combination of the resets, shutting off the extra pumps, and use of
the economizer improves the overall annual plant performance from 0.87 kW per
ton of cooling to 0.49 kW per ton, a savings of 44%.
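The 44% figure above can be checked directly from the two kW-per-ton values reported for the plant:

```python
# Annual chiller-plant performance before and after the changes.
before_kw_per_ton = 0.87
after_kw_per_ton = 0.49

# Lower kW per ton is better; the fractional savings is 1 - after/before.
savings_fraction = 1.0 - after_kw_per_ton / before_kw_per_ton
print(f"{savings_fraction:.0%}")  # -> 44%
```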
In addition to the UPS opportunity, we found that two of the PDUs were energized
but not supplying any IT equipment; their no-load losses are therefore wasted.
Turning off these units (D5 and S1A3) until they are needed would save energy
directly in their transformers and indirectly by reducing the cooling load.
The savings in Table 9 reflect the combination of the above-mentioned UPS and
PDU measures.
23
________________________________________________________________________
Additional Measures
The pie charts below show the current data center energy breakdown (Figure 8)
along with the projected energy breakdown (Figure 9) after implementation of the
recommended measures. Note that the estimate of the absolute number of 811
average kW in Figure 9 assumes the IT load stays constant. As the IT load grows,
the absolute total will grow, and the absolute energy use of the electrical and
cooling infrastructure will grow, but the PUE typically decreases, since the
infrastructure generally becomes more efficient as the load increases.
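The claim that PUE falls as IT load grows can be illustrated by splitting infrastructure overhead into a fixed part and a part proportional to IT load. The split and the loads below are hypothetical, chosen only so the starting point matches the projected PUE of about 1.3.

```python
# PUE under a fixed-plus-variable infrastructure overhead model.

def pue_at_load(it_kw: float, fixed_overhead_kw: float,
                variable_fraction: float) -> float:
    overhead_kw = fixed_overhead_kw + variable_fraction * it_kw
    return (it_kw + overhead_kw) / it_kw

# Hypothetical: 120 kW fixed overhead plus 10% of IT load as variable overhead.
for it_kw in (600.0, 900.0, 1200.0):
    print(f"IT {it_kw:.0f} kW -> PUE {pue_at_load(it_kw, 120.0, 0.10):.2f}")
```

As the IT load doubles, the fixed overhead is spread across more IT kW, so the modeled PUE drifts down from 1.30 toward 1.20.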
24
________________________________________________________________________
25
________________________________________________________________________
Thanks to a proactive staff, the AGENCY X data center has rapidly progressed
toward being an effective and energy-efficient facility (from a PUE of 2.3 to 1.7).
Implementing the recommendations in this report would make it truly exemplary,
with a PUE of about 1.3 and with better cooling and power service to the IT
equipment.
26