Data Center - Thermal Management

Data centers consume 1.1-1.4% of global electricity, with about 40% of that energy used for cooling systems. Various cooling methods and environmental classes are defined to optimize performance and efficiency, particularly for high-density setups. Innovations like hybrid cooling and underwater data centers are explored to enhance energy efficiency and reduce operational costs.


Data Centers - Thermal Management

https://www.akcp.com/blog/the-real-amount-of-energy-a-data-center-use/

Introduction
• Data centers and data transmission networks consume 1.1-1.4% of global electricity use (240-340 TWh); cryptocurrency mining consumed around a further 0.4% (110 TWh).
• About 40% of that total power is spent on cooling systems.
• For HPC/AI data centers, performance and configurations are constrained by power consumption, power availability, and cooling capacity.
• Performance, reliability, and lifespan of electronic devices are affected by the junction temperature.

Max. junction temperature    Si-based ICs    SiC MOSFET
die level                    110˚C           200˚C
package / system level       85˚C            180˚C
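The junction-temperature limits above can be illustrated with the standard single-resistance thermal estimate, Tj = Ta + P · θJA. A minimal sketch, where the power and θJA values are illustrative assumptions (only the die-level limits come from the table):

```python
# Steady-state junction temperature from a single-resistance thermal model:
#   Tj = Ta + P * theta_JA
# The power and theta_JA values used below are illustrative assumptions.

# Die-level maximum junction temperatures from the table (degrees C).
DIE_LIMIT_C = {"Si": 110.0, "SiC": 200.0}

def junction_temp(ambient_c: float, power_w: float, theta_ja_c_per_w: float) -> float:
    """Estimate junction temperature for a given ambient, power, and theta_JA."""
    return ambient_c + power_w * theta_ja_c_per_w

def within_limit(device: str, ambient_c: float, power_w: float,
                 theta_ja_c_per_w: float) -> bool:
    """Check the estimated junction temperature against the die-level limit."""
    return junction_temp(ambient_c, power_w, theta_ja_c_per_w) <= DIE_LIMIT_C[device]

# Example: 45 C rack inlet, 30 W device, theta_JA = 2.5 C/W gives Tj = 120 C,
# over the Si die-level limit but well within the SiC limit.
```

The same estimate shows why rack inlet temperature (set by the cooling system) directly bounds how much power a device may dissipate.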

Definitions of Environmental Classes

• A1: data center with tight environmental control for enterprise servers and storage products
• A2, A3/A4: IT space, office, or lab for volume servers, storage products, PCs, workstations
• B: office, home, or transportable environment with minimal environmental control for PCs, workstations, laptops, printers
• C: point-of-sale, light industrial, or factory environment with weather protection
• H1: a zone within a data center that is cooled to lower temperatures to
accommodate high-density air-cooled products.

ASHRAE 2021 Thermal Guidelines for Data Processing Environments
Cooling tower

• Economizer: a mechanical device that reduces the amount of energy used to cool a data center or building.
• Air-side vs. liquid-side economizer

• W17/W27: data center cooled using chillers and a cooling tower, with an optional water-side economizer to improve energy efficiency
• W32/W40: operated without chillers in most locations, but may still require chillers in some locations
• W45/W+: operated without chillers to take advantage of energy efficiency and reduced capital expense. Some locations may not be suitable for dry coolers.

Cabinet or Rack Level

Air-cooled cabinets

Air-cooled cabinets

Liquid-cooled cabinets

Primary liquid loop components are housed outside the rack to permit more space within the rack for electronic components.

TEC (thermoelectric cooler)

Widely used in shipboard cabinets, where electronic components need to be isolated from the ambient environment.

Hybrid cooling approaches

CRAC (computer room air conditioning) unit
Room Level

Avoid mixing of cold supply air and hot return air:
• Hot-aisle containment
• Cold-aisle containment

S-pod layout – staggered arrangement

Direct hot-air extraction above the rack

Hot aisle air extraction

Yahoo! Chicken Coop Facility
Lockport, New York
PUE ~ 1.08

• Long, narrow design mimics a chicken coop to promote buoyancy-assisted air flow: air inlet vents at the bottom, central cupola provides top venting of hot air
• Location: cooler climate, prevailing winds, nearby low-cost hydropower
• Summer: convective air cooling is augmented with evaporative cooling

Facebook Open Compute Facility
Prineville, Oregon
PUE ~ 1.07

• Triplet rack architecture; each 42U column houses 30 servers
• Battery cabinet to provide backup power
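The PUE figures quoted for these facilities can be put in context with the 40%-of-power-on-cooling figure from the introduction. A small arithmetic sketch; the energy breakdowns below are assumed examples, only the PUE definition and the quoted values come from the slides:

```python
# PUE (Power Usage Effectiveness) = total facility energy / IT equipment energy.
# The breakdowns below are assumed examples for illustration.

def pue(it_energy_kwh: float, overhead_kwh: float) -> float:
    """PUE for a given IT load and non-IT overhead (cooling, power losses, ...)."""
    return (it_energy_kwh + overhead_kwh) / it_energy_kwh

# If cooling alone takes 40% of total facility power, IT gets at most 60%,
# so PUE >= 1 / 0.6, roughly 1.67.
conventional_pue = pue(it_energy_kwh=60.0, overhead_kwh=40.0)

# The free-cooled facilities above report PUE around 1.07-1.08, i.e. only
# about 7-8% overhead on top of the IT load.
free_cooled_pue = pue(it_energy_kwh=100.0, overhead_kwh=7.0)
```

The gap between ~1.67 and ~1.07 is essentially the cooling energy that containment, buoyancy-assisted airflow, and evaporative/free cooling eliminate.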

Microsoft Project Natick

• A shipping-container-size data center was deployed to the 35.6 m deep seafloor off Scotland's Orkney Islands.
• Tank was filled with dry nitrogen to reduce oxidation.
• Seawater is cool and nearly isothermal.
• Lower failure rate.

https://www.science.org/content/article/climate-change-threatens-supercomputers
