
“Thermal Management Strategies for Existing Data Centres”

© Confederation of Indian Industry


Data Centres' Main Thermal Management Needs

Capacity:
Always available, following the data centre's dynamic environment: peaks and load variations.

Efficiency:
Optimized for all conditions → minimizing the pPUE value.

Availability and Reliability:
100% cooling guaranteed even in the most extreme conditions.
50% of DC Power for Physical Infrastructure

Typical split of data centre power:
• IT Equipment: 50%
• Cooling: 25%
• Air Movement: 12%
• Transformer / UPS: 10%
• Electricity, Lighting, etc.: 3%

The Direction of Energy Efficiency

PUE = Total Facility Power / IT Equipment Power

Example: PUE = 2 (common values range from 1.6 to 3.5)

Source: EYP Mission Critical Facilities Inc., New York
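
As an illustration of the formula above, a minimal sketch of the PUE arithmetic (the power figures are hypothetical, chosen only to reproduce the slide's PUE = 2 example):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power over IT power."""
    return total_facility_kw / it_equipment_kw

# Hypothetical example: 1000 kW total facility draw, 500 kW of it at the IT load.
print(pue(1000.0, 500.0))  # -> 2.0, matching the slide's example
```
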


The Data Center World – Operating Thresholds

ASHRAE Recommended Envelope (legacy):
• Typical Application: Legacy DC with return control and high precision on humidity
• Hardware: All servers
• Temperature & Humidity: 20°C – 25°C, 40% – 55% RH

ASHRAE Recommended Envelope (current):
• Typical Application: Current DC with return or supply control and humidity control
• Hardware: All servers
• Temperature & Humidity: 18°C – 27°C, 5.5°C DP – 60% RH & 15°C DP

ASHRAE Allowable Envelope (A1):
• Typical Application: DC with focus on energy savings and larger limits on humidity
• Hardware: Enterprise servers, storage products
• Temperature & Humidity: 15°C – 32°C, 20% – 80% RH

ASHRAE Allowable Envelope (A2):
• Typical Application: Information technology space or office
• Hardware: Volume servers, storage products, PCs, workstations
• Temperature & Humidity: 10°C – 35°C, 20% – 80% RH
Everything Starts from the Server
…When IT Works, IT Makes Heat

The “Equivalent Circuit”

The server draws cold supply air and discharges hot air:
• Variable fan speed depending on inlet temperature
• Variable power input depending on the fan speed
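
The fan speed/power link matters because, by the fan affinity laws, fan power rises roughly with the cube of fan speed. A minimal sketch of that relationship (the 100 W nominal figure is a hypothetical value for illustration):

```python
# Fan affinity law: power scales roughly with the cube of fan speed.
def fan_power_w(nominal_power_w: float, speed_ratio: float) -> float:
    """Estimate fan power at a given speed ratio (1.0 = nominal speed)."""
    return nominal_power_w * speed_ratio ** 3

# Hypothetical server fan drawing 100 W at full speed:
for ratio in (1.0, 0.8, 0.6):
    print(f"{ratio:.0%} speed -> {fan_power_w(100.0, ratio):.0f} W")
# 100% -> 100 W, 80% -> 51 W, 60% -> 22 W: hotter inlet air forces the
# fans faster, and the power penalty grows cubically.
```
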


Everything Starts from the Server
…When IT Works, IT Makes Heat

If Airflow is Not Enough…

If the cooling airflow is insufficient, hot-aisle exhaust recirculates over and around the racks back into the cold aisle.
Air Distribution Concept
Pilot Project “ENERGY”

Steps to improve energy efficiency in an existing data centre:
• Focus: “ENERGY EFFICIENCY”
• Two test conditions:
  ➢ Full load: 280 kW in the room
  ➢ Partial load: 180 kW in the room
Consistent Covering…

Initial situation, to be eliminated step by step:
• Air leaks – loss of cold air
• Recirculation of warm air (> 30 °C), causing “hot spots” of ~ 30 °C in the cold aisle
• Low return temperature (22 … 24 °C) → small delta T, high airflow
• Fans at 100% speed
• Low supply temperature (~ 14 … 18 °C)
• High velocity and high pressure under the raised floor – balancing required
Pilot Project “ENERGY”
Airflow Can Vary –
The Thermal Solution Always Needs to Match It!

Servers' airflow:
q_servers = q_1 + q_2 + … + q_n

Cooling unit airflow:
q_cooling = q_servers
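
A minimal sketch of this matching rule, with hypothetical per-server airflow values in m³/h:

```python
# Total server airflow is the sum of the individual server airflows;
# the cooling units must deliver that same total.
server_airflows_m3h = [850.0, 1200.0, 640.0]  # hypothetical per-server values

q_servers = sum(server_airflows_m3h)   # q_servers = q_1 + q_2 + ... + q_n
q_cooling = q_servers                  # matching rule: q_cooling = q_servers

print(f"q_servers = {q_servers:.0f} m³/h -> deliver {q_cooling:.0f} m³/h of cooling airflow")
```
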
Server Airflow Dynamic Control

Control and Different IT Loads per Aisle –
Server Airflow Dynamic Control: A Step Forward!
Underfloor Pressure Control –
Type of control that keeps a constant pressure in the raised floor, based on a differential pressure sensor, so that a permanent pressure (e.g. 20 Pa) reaches the equipment despite losses through slots.

• Sensor readings:
  o Inside the unit body or outside in the room (room pressure)
  o In the raised floor or cold aisle
• Typical application:
  o Open architecture: ~ 50 – 70 Pa
  o Hot / cold aisle containment: ~ 10 – 30 Pa
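
As an illustration of this control type, a minimal proportional-control sketch that trims fan speed to hold an underfloor pressure setpoint (the gain, limits, and sensor values are hypothetical assumptions, not a vendor algorithm):

```python
def fan_speed_update(speed_pct: float, dp_measured_pa: float,
                     dp_setpoint_pa: float = 20.0, gain: float = 0.5) -> float:
    """Proportional control: raise fan speed if underfloor pressure is low,
    lower it if pressure is high, clamped to 20-100% speed."""
    error_pa = dp_setpoint_pa - dp_measured_pa
    new_speed = speed_pct + gain * error_pa
    return max(20.0, min(100.0, new_speed))

# Hypothetical readings from a differential pressure sensor in the raised floor:
speed = 60.0
for dp in (14.0, 18.0, 20.5):
    speed = fan_speed_update(speed, dp)
    print(f"dp = {dp} Pa -> fan speed {speed:.2f}%")
```
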
Control and Different IT Loads per Aisle – Pressure Control
May Be Critical!
Pilot Project “ENERGY”

Results at full load and at partial load.

Note: heat blowers were used; the efficiency results depend on the energy efficiency of the servers.

Hot-Cold Separation
Thermal image with a scale from 10 °C to 38 °C, showing the separation between hot and cold areas.
Conclusion – Pilot Project “ENERGY”

• Complete change of philosophy (open frames → closed racks and CoolFlex)
• CRAC units (n+1 → 2n) → redundancy and more security
• Up to 90% and more energy reduction for running the CRAC units (or, alternatively, more kW of IT load is possible)
• Regulation of the cold aisle
Power Savings in Chiller after Deploying CAC (Cold Aisle Containment)
(Air-cooled, capacity: 305 kW)

Chiller Outlet (°C)   Chiller Inlet (°C)   Power Consumption (kW)
        7                    12                     103
       10                    15                      91
       18                    24                      69
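
A quick check of the table's arithmetic, to make the saving explicit (values taken directly from the table above):

```python
# Chiller power (kW) per chilled water outlet/inlet regime, from the table:
power_kw = {(7, 12): 103.0, (10, 15): 91.0, (18, 24): 69.0}

baseline = power_kw[(7, 12)]
for (outlet, inlet), p in power_kw.items():
    saving = (baseline - p) / baseline * 100
    print(f"{outlet}/{inlet} °C: {p:.0f} kW ({saving:.0f}% saving vs 7/12 °C)")
# Raising the regime from 7/12 °C to 18/24 °C cuts chiller power from
# 103 kW to 69 kW, roughly a 33% reduction.
```
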
Concept of the Organized Cold and Hot Areas

Direct Expansion Solution
Upgrade/Replace – Direct Expansion Units
Energy Consumption of Different DX Systems
(Energy consumption per 1 kW of cooling, by year)

Market and customer requirements: the data centre technology trend anticipates higher servers' working temperatures and a lower pPUE.

IMPROVEMENT over time:
• 2008: Floor mounted DX system with fixed capacity compressor, operating at 24 °C
• 2010: Improved floor mounted DX system with fixed capacity compressor, operating at 24 °C
• 2012: Variable capacity unit with air economizer
• 2014 – 2016: Variable capacity unit with cold aisle controlled operation
• 2017: Variable capacity unit with pumped refrigerant economizer
Chilled Water Solution
How to Optimize Chilled Water Systems

High Energy Efficiency:
• Energy efficient floor mount units
• High efficiency EC fans
• Fans extended into the raised floor

Maximize Freecooling:
• Operation at high CW temperatures (20/26 °C)
• Via freecooling chillers
• Via adiabatic freecooling chillers

Additional Features:
• Options for fast restart after power outage
• Manage the level of power/current absorbed

Intelligent System Control:
• Communication between indoor units and chillers
• Chiller set point shifting in case of low heat load in the data centre
Efficiency Improvements in the CW Solutions
(Energy consumption per 1 kW of cooling, by year)

Market and customer requirements: the data centre technology trend anticipates higher servers' working temperatures and a lower pPUE, via increased water temperatures and water delta T.

IMPROVEMENT over time:
• 2008: Chiller at 15/10 °C water & floor mount unit
• 2010: Freecooling chiller at 26/20 °C water & floor mount unit
• 2012: Freecooling chiller at 26/20 °C water & improved CW unit
• 2014: Chiller with adiabatic freecooling at 26/20 °C water & improved CW unit
• 2016: Chiller with adiabatic freecooling at 26/20 °C water & improved CW unit, with variable water flow
• 2017: CW system at 20/32 °C – pPUE 1.06


Where Indirect Evaporative Freecooling / Adiabatic Works

Dry operation: the unit can cool the data centre just via the air-to-air heat exchanger, using only external cold air.

Wet operation: the unit can exploit the evaporative effect via humidification.

DX/CW integration: at 24 °C and 90% relative humidity the unit might require DX/CW integration, but at 30 °C (higher temperature) and 35% RH (lower relative humidity) it can work with evaporative cooling alone.

Assumptions:
• Data centre 36 °C → 24 °C
• 100% of full load per unit
Typical Installation

Roof configuration:
• Data centres located on top of the building

Perimeter configuration:
• Greenfield sites
• Warehouse data centres
Indirect Evaporative Solution – Operation Modes

Dry operation (winter): air-to-air heat exchange without spraying water.

Wet operation (summer): air-to-air heat exchange enhanced via the spraying of water on the external air side.

DX/CW integration (optional): when the external air temperature is too high to provide 100% cooling evaporatively, the DX module is integrated to cover the missing capacity.
MAXIMUM Optimization: System Logics of the Control

Dynamic control – TEAMWORK:
• Communication between CRAC units and chillers
• Communication between chillers (freecooling and adiabatic freecooling chillers)

• Software function embedded in the units' control
• Leverages the communication between indoor and outdoor units
→ EFFECT: increase of freecooling at part-load conditions
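
One way this teamwork pays off is the chiller set point shifting mentioned earlier: at low heat load, the chilled water set point can be raised so freecooling covers more of the demand. A minimal sketch, assuming a hypothetical shifting rule (threshold, shift size, and limits are illustrative, not a vendor algorithm):

```python
def shifted_cw_setpoint(base_setpoint_c: float, it_load_pct: float,
                        max_setpoint_c: float = 26.0) -> float:
    """At part load, raise the chilled water set point so the chiller and
    freecooling run more efficiently; full load keeps the base value."""
    # Hypothetical rule: add up to 6 K as load falls from 100% to 40%.
    headroom = max(0.0, min(1.0, (100.0 - it_load_pct) / 60.0))
    return min(max_setpoint_c, base_setpoint_c + 6.0 * headroom)

for load in (100, 70, 40):
    print(f"IT load {load}% -> CW set point {shifted_cw_setpoint(20.0, load):.1f} °C")
# 100% -> 20.0 °C, 70% -> 23.0 °C, 40% -> 26.0 °C
```
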
Steps for Existing Data Centres
• Follow the ASHRAE cold aisle / hot aisle layout; avoid air mixing → implement aisle containment
• CFD analysis
• Convert fixed capacity systems to variable capacity (fans / compressors)
• Apply supply air control
• Raise the RA (return air) temperature
• Monitor and control fan speed from remote cold aisle sensors / pressure sensors (see the sketch after this list)
• Raise the CW temperatures and raise the CW delta T
• Apply freecooling / adiabatic cooling wherever possible
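
A minimal sketch of the remote-sensor fan control step above (the sensor values, set point, and gain are hypothetical assumptions, not a vendor API):

```python
def crac_fan_speed_pct(cold_aisle_temps_c: list[float],
                       setpoint_c: float = 24.0,
                       gain_pct_per_k: float = 15.0) -> float:
    """Drive CRAC fan speed from the warmest remote cold-aisle sensor:
    speed up above set point, slow down below it, clamped to 20-100%."""
    worst_c = max(cold_aisle_temps_c)  # protect the hottest rack inlet
    speed = 50.0 + gain_pct_per_k * (worst_c - setpoint_c)
    return max(20.0, min(100.0, speed))

# Hypothetical readings from three cold-aisle sensors:
print(crac_fan_speed_pct([23.1, 24.8, 23.9]))  # warmest 24.8 °C -> 62.0
```
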
Two Steps for Data Centre Efficiency Improvement
• Increase data centre temperatures to the limit of the Recommended Envelope
• Move to the Allowable ranges A1 – A4 by means of evaporative cooling and adiabatic chilled water solutions
Questions & Answers
Thank you
Contact Details:
Raghuveer Singh
[email protected]
M: +91 99209 59205

© Confederation of Indian Industry
