Airflow Management in Data Center
W.-X. Chu and C.-C. Wang
Applied Energy 240 (2019) 84–119
HIGHLIGHTS
• The airflow management technologies in data centers are classified as long and short distance in the present review.
• Hot-air recirculation, cold-air bypass, leakages, and airflow distributions are the most critical factors.
• Strategies on improving airflow uniformity and preventing bypass and hot-air recirculation are reviewed.
• Computational fluid dynamics plays an essential role but requires prior calibration before further implementation.
ARTICLE INFO

Keywords:
Data center
Airflow management
Long-distance cooling
Short-distance cooling
Energy saving

ABSTRACT

This study provides a review upon airflow management in data centers. Based on the available airflow path, cooling systems in data centers are categorized as long-distance or short-distance cooling systems. Investigations on airflow management include tests in real data centers or in simulated data centers. Besides, computational fluid dynamics (CFD) has been widely employed upon the thermal and airflow management of data centers. For the long-distance cooling system, the airflow management normally adopts a raised-floor configuration and hot/cold aisle arrangement. The major problems in airflow management include hot-air recirculation, cold-air bypass, leakages, over-provisioned and under-provisioned air supply, and airflow/temperature non-uniformity. The aforementioned effects often interact with the geometry layout of the data center. Related literature regarding the effects of plenum depth, perforated tiles, enhanced facilities such as induced bypass fans, infrastructure layout, aisle containment and leakage is discussed and compared. In addition, studies on the overhead air supply method are also examined and compared with the raised-floor ones. For the short-distance cooling system, the effects of server layout and heat exchanger layout concerning the airflow uniformity are investigated. It is found that the appropriate modification of the original design into a centralized server layout can ease the mal-distribution of airflow into the servers by 30%. This review aims to emphasize the criteria of implementing airflow management in data centers, serving as a reference guide for energy saving in data centers as far as airflow arrangement is concerned. Moreover, some recommended future research efforts are also addressed.
⁎ Corresponding author. E-mail address: [email protected] (C.-C. Wang).
https://doi.org/10.1016/j.apenergy.2019.02.041
Received 20 November 2018; Received in revised form 31 January 2019; Accepted 7 February 2019
To fulfill the cooling demand of a typical data center, normally a chilled water system and an airflow loop as shown in Fig. 1 are used. Apparently, an optimized chilled water cycle could offer appreciable energy saving; on the other hand, effective management of airflow also contributes essential energy saving and reduces emissions to ensure sustainable and reliable operation of IT equipment. In practical data centers, appropriate airflow management strategies may impose a pronounced effect on the cooling performance of data centers.

The chilled airflow path alongside the data centers can be identified either as the long-distance or the short-distance cooling system. For the long-distance cooling system, raised-floor and overhead air supply systems may be employed, in which a computer room air-conditioner (CRAC) or computer room air-handler (CRAH) delivers chilled airflow toward the computer racks at a distance. The chilled airflow supplied by the CRAC (or CRAH) enters the cold aisles from perforated tiles or ceilings, and subsequently flows through the racks for heat exchange. The heated air then gathers in the hot aisle to circulate back to the CRAC (or CRAH) accordingly. For short-distance cooling, the airflow may circulate close by the computer racks. This can be made available by placing CRAC (or CRAH) units right nearby or inside the computer racks to reduce or eliminate the hot-air recirculation. Typical examples of short-distance coolers are the in-row cooler and the rack-mounted cooling system. There are numerous studies associated with long-distance and short-distance airflow management. Table 1 lists related studies in conjunction with these two cooling systems [9–110].

In this paper, previous reviews on data center thermal management and relevant cooling performance indices are summarized. Subsequently, for the long-distance cooling system, the strategies are thoroughly reviewed upon improving the airflow uniformity and preventing the bypass and recirculation effect for raised-floor data centers, which is the most commonly implemented cooling system currently. Then, studies on the overhead air supply method are introduced as compared to the raised-floor method, and the pros and cons are also described. For the short-distance cooling method, the in-row cooler and the rack-level cooling system using a finned tube heat exchanger are discussed. Furthermore, the effects of server and heat exchanger layouts on the airflow uniformity inside the server rack are numerically examined in detail.

Fig. 1. Schematic of a typical data center cooling system: the chilled water loop (condenser and condenser water pump) and the airflow loop (CRAC system, supplied cooling air delivered through the plenum and perforated tiles into the cold aisle, racks, and returned hot air).

2. Previous reviews and cooling performance indices

2.1. Previous reviews

Table 2 [4–6,101,111–121] summarizes the previous review literature concerning the thermal management applicable for data center cooling systems. The existing review articles focused upon air-conditioning systems, energy management, free cooling, and applicable cooling technologies.

Future challenges for data center thermal management were proposed by Schmidt et al. [116,117], who also enumerated various best practices in data centers. A summary of air conditioning energy performance from more than 100 data centers was presented by Ni and Bai [4]. Based on their statistics, it was known that more than half of those data centers are inefficiently operated, wasting plenty of energy, and some currently available and developmental energy efficiency strategies such as economizer cycles, temperature and humidity control and airflow optimization are helpful in improving the energy usage efficiency. Oro et al. [5] summarized a number of energy saving strategies via integration with low-grade waste heat recovery such as district/plant/water heating, absorption cooling, direct power generation (piezoelectric and thermoelectric), indirect power generation (steam and organic Rankine cycle), biomass co-location, and desalination/clean water. Moreover, the review evaluated many developed dynamic models for properly understanding and predicting the benefits of applying advanced technologies in data centers. Fulpagare and Bhargav [112] also summarized the numerical optimization efforts on cooling performance improvement.
Table 1
Study classification of airflow management in data centers [9–110].

Long-distance cooling
• Raised-floor air supply
  – Airflow distribution uniformity
    Plenum depth: Karki et al. [9,10], Bhopte et al. [11], Nada et al. [12], Nagarathinam et al. [13].
    Perforated tiles: Patankar [14], Sorell [15], Rambo et al. [16], VanGilder et al. [17,18], Nada et al. [19], Kang et al. [20], Karki et al. [21], Abdelmaksoud et al. [22], VanGilder [23], Arghode and Joshi [24,25], Zhang et al. [26], Ling et al. [27], Khalili et al. [28].
    Enhanced facility: Khalifa and Demetriou [29,30], Erden et al. [31,32], Song [33,34], Arghode et al. [35], Athavale et al. [36,37].
  – Bypass and recirculation effect
    Infrastructure layout: Samadiani et al. [38], Nada and Said [39–42], Radmehr et al. [43], Zhang et al. [44], Rambo and Joshi [45], Kumar et al. [46,47], Arghode and Joshi [48], Fakhim et al. [49], Wang et al. [50].
    Aisle containment: Wilson et al. [51], Schmidt et al. [52], Muralidharan et al. [53], Sundaralingam et al. [54,55], Arghode et al. [56], Arghode and Joshi [57], Nada et al. [58,59], Khalaj et al. [60], Alkharabsheh et al. [61], Gondipalli et al. [62], Gao et al. [63], Zhou et al. [64], Wibron et al. [65], Onyiorah et al. [66], Martin et al. [67], VanGilder and Zhang [68], Tsuda et al. [69], Takahashi et al. [70], Nemati et al. [71], Shrivastava et al. [72], John [73], Kennedy [74].
    Effect of leakage: Pastrana et al. [75], Khankari [76], Radmehr et al. [77], Alkharabsheh et al. [78,79], Morgan et al. [80], Song et al. [81], Hamann et al. [82], Fink et al. [83].
• Overhead air supply system: Wang et al. [84,85], Srinarayana et al. [86], Udakeri et al. [87,88], Nakao et al. [89], Sorell et al. [90], Schmidt and Iyengar [91].
They also indicated that the current guidelines mainly stressed rack inlet temperatures, which could be effectively improved by further considering the energy consumption of fans, blowers, and other auxiliary equipment within the data racks. Yet physics-based, heuristic guidelines using knowledge of the thermal environment were imperative for the efficient operation of data centers. Besides, the free cooling technology applicable for data centers was reviewed by Daraghmeh and Wang [111], who summarized the applications of airside economizers, waterside economizers and heat pipe technology integrated with other systems, such as absorption, solar systems, adsorption, geothermal and evaporative cooling.

On the other hand, the progress of energy saving technologies from multiple perspectives of computer science was reviewed by Rong et al. [6]. A comprehensive set of strategies including optimizations of network, processor and server resource scheduling were proposed to maximize data center efficiency. Combining these with energy saving technologies of thermal management in the data center room, it was indicated that various aspects like cost, energy consumption and environment should be considered comprehensively in order to further improve data center efficiency.

This review focuses on airflow management in data centers with some effective strategies that serve as useful guidelines for thermal engineers or designers. For the pre-design of new-built data centers, readers or engineers can understand clearly the general concerns of airflow management, whereby some preliminary solutions can be derived easily from this review paper. For remodeling of existing data centers, readers or engineers can efficiently find the appropriate, effective and economical airflow management strategies to improve the thermal performance based on this review article to achieve appreciable energy saving.

2.2. Cooling performance indices

The power usage effectiveness (PUE) and its inverse, the data center infrastructure efficiency (DCiE), are the power metrics for computing data center efficiency proposed by the Green Grid Association in 2007 [122]. The PUE is expressed as the total energy consumption over the IT equipment consumption, and has been adopted by industry as an important index for measuring infrastructure energy efficiency in data centers. Meanwhile, the most commonly used indices that can be used to evaluate the influence of airflow upon supply, distribution and recirculation in data centers are shown in Table 3 [123–126] with estimation equations, which are regarded as criteria for the assessment and comparison of data center cooling performance.

Herrlin [123] proposed the rack cooling index (RCI), which is a measure of how effectively the equipment racks are cooled and maintained within industry temperature guidelines and standards. In the formula of RCI, Tmax-rec and Tmax-all denote the maximum recommended and allowable temperatures, and Tmin-rec and Tmin-all are the minimum recommended and allowable temperatures, respectively, which can be found in the ASHRAE specification [1]. Furthermore, the return temperature index (RTI) was also introduced by Herrlin as a measure of the energy efficiency of airflow management systems [124].

Under the ideal situation, the temperature at rack inlets should be equivalent to the supply temperature from the CRAC (or CRAH). However, the exhaust air in hot aisles is prone to recirculate back to the rack inlets, which deteriorates the performance appreciably. To assess the influence of the hot-air recirculation, the supply heat index (SHI) and return heat index (RHI) were proposed by Sharma et al. [125]. The SHI and RHI are related to the rack inlet temperature, rack outlet temperature and the supplied air temperature. The numerator in the SHI formula denotes the sensible heat gained in the cold aisle before entering the racks, and the denominator represents the total sensible heat gained by the air leaving the racks. RHI evaluates the sensible heat extracted by the CRAC (or CRAH) units relative to the total sensible heat gained by the air stream exiting the racks. For a closed system with no leakage to the external environment, SHI and RHI add to unity.

For the case of airflow returning back to the CRAC or CRAH without gaining heat from equipment, the return temperature index (RTI) was defined to evaluate the utilization of the airflow supplied to cold aisles when the open and semi-containment strategies are applied. Furthermore, the CRAC provisioning ratio (CPR) was proposed by Breen et al. [126] to evaluate the influence of flowrates. Note that the CPR is able to reflect the relation between the air supply flowrate and the rack intake flowrate. For uncontained data centers, changing the CRAC flowrate also affects the CRAC return air temperature, and therefore influences the efficiency of the chiller unit operation.

3. Methodologies in exploring the performance of data centers

3.1. Tests based on real data centers

The associated studies can be classified as experimental tests and simulations upon some real data centers [127].
Table 2
Summary of review articles for data centers [4–6,101,111–121].

Ni and Bai [4]. Major contents: (1) air conditioning summary including the indoor thermal guidelines, cooling methods and air distribution in data centers; (2) energy performance of mechanical cooling equipment, cooling distribution equipment and heat rejection equipment; (3) energy efficiency strategies by applying economizer cycles, airflow optimization and energy management. Major contributions and conclusions: Analyzed the energy consumption of the air-conditioning systems of 100 data centers. Some currently available energy efficiency strategies like economizer cycles, airflow optimization, energy management and simulation tools are reviewed and summarized. The collected data from articles and reports showed that the air-conditioning system energy usage ranged from 21% for the most efficient system to 61% for the least efficient system.

Oro et al. [5]. Major contents: (1) advanced concepts for cooling supply; (2) advanced concepts for power supply; (3) renewable energy supply classification and integration; (4) CFD models, own developed models, and building and energy simulation models. Major contributions and conclusions: The energy efficiency strategies, renewable energy integration and numerical models applied to data centers were summarized. CFD analysis often includes inaccuracies and is highly time consuming; thus the major challenge is to develop efficient models capable of dynamically estimating the energy consumption and mass flows in data centers.

Rong et al. [6]. Major contents: (1) resource scheduling and optimization; (2) energy saving of network equipment and protocols; (3) energy-aware compiler and software power optimization; (4) low power design including processor architecture optimization and disk storage system optimization. Major contributions and conclusions: Energy saving technologies were reviewed from multiple perspectives of energy consumption, including concerns of cost reduction, environmental protection and energy saving trends for future data center applications. The energy consumption can probably be reduced by 25–30% through choosing low-power servers and auxiliary energy-saving devices. Data center operators can save about 10–15% of the total energy consumption by optimizing resource scheduling algorithms and management strategies.

Schmidt and Iyengar [116]. Major contents: (1) basic cooling concepts and guidelines that need to be evaluated when building data centers; (2) placement of cabling, chilled water system and partitions for raised-floor and non-raised-floor designs; (3) effect of rack placement and aisle spacing. Major contributions and conclusions: The best practices for data center thermal and energy management were summarized and compared across 83 papers covering 2001 to 2007. When building a new data center, the designer and operator should follow the recommendations and guidelines published by ASHRAE Technical Committee 9.9.

Schmidt et al. [117]. Major contents: (1) physical design of data centers with rack layouts and air distribution configurations; (2) factors influencing rack air inlet temperature and humidity; (3) thermal profiling and numerical modeling of high power density data centers; (4) data center energy requirements and predicted future data centers. Major contributions and conclusions: The factors affecting the environmental conditions in data centers, and measurements that meet the telecommunication equipment environmental requirements, were summarized. Significant savings can be obtained even for a small improvement of 5% in energy-related expenditures. The HVAC industry as well as the server manufacturers may have to embark on liquid cooling solutions in order to resolve some of the future temperature problems that will occur within the server racks and the environment in which they reside.

Fulpagare and Bhargav [112]. Major contents: (1) rack layout with thermal analysis and power distributions; (2) energy efficiency and thermal performance metrics; (3) data center dynamic control and lifecycle analysis; (4) review of data center cooling strategies; (5) programming-based optimization of data centers. Major contributions and conclusions: Studies on rack layout, efficiency and performance metrics, dynamic control and life cycle analysis, and validation of numerical models for data centers were summarized. Current guidelines in use focus on rack inlet air temperatures. These guidelines could be made more effective if they considered the energy consumption of fans, blowers, and other auxiliary equipment. Physics-based, heuristic guidelines using knowledge of the thermal environment are imperative for the efficient operation of data centers.

Zhang et al. [120]. Major contents: (1) airside free cooling including direct and indirect airside free cooling systems; (2) waterside free cooling including direct water cooled, air cooled and cooling tower systems; (3) heat pipe systems including the independent system, integrated system and cold storage system; (4) summary of criteria of performance evaluation. Major contributions and conclusions: The advancements of data center free cooling technologies, including airside free cooling, waterside free cooling and heat pipe free cooling, were presented and compared. Among the three categories of free cooling systems, the heat pipe system has good energy efficiency and cooling capacity due to its ability to transfer heat at small temperature differences without external energy. Also, it causes no disturbance to the indoor environment and can be integrated with compression systems.

Zhang et al. [101]. Major contents: (1) thermosyphon for free cooling in data centers; (2) integrated system of vapor compression and thermosyphon with shared flow channel; (3) integrated system of vapor compression and thermosyphon with in-series or parallel heat exchangers; (4) integrated system of vapor compression and thermosyphon with three-fluid heat exchangers. Major contributions and conclusions: The state of the art for the thermosyphon and related integrated systems was discussed in detail, including features and shortcomings of existing designs. The loop thermosyphon is very suitable for application with easy installation, and it is necessary to investigate the applicability of environment-friendly working fluids like CO2.

Daraghmeh and Wang [111]. Major contents: (1) airside economizers in direct free cooling, indirect cooling and multi-stage evaporative cooling systems; (2) waterside economizers in the integrated dry cooler-chiller system and cooling tower system; (3) cold energy storage systems and the integrated system of mechanical refrigeration and thermosyphon. Major contributions and conclusions: The free cooling technologies applicable for data centers were summarized, including airside economizers, waterside economizers and heat pipe applications. Indirect airside coolers such as the air-to-air heat exchanger system and the heat wheel show very high efficiency, yet the thermosyphon system reveals even more promising features.

Ebrahimi et al. [113]. Major contents: (1) data center thermal loads and temperature limits; (2) management of waste heat sources and streams in data center cooling systems; (3) discussion of waste heat recovery technologies such as the organic Rankine cycle, piezoelectric, thermoelectric and biomass co-location. Major contributions and conclusions: The operating conditions of cooling systems and the technologies for recovering data center low-grade waste heat were introduced and summarized. Absorption refrigeration and the organic Rankine cycle were found to be among the most promising and economically beneficial technologies for data center waste heat recovery, which are of particular interest to data center operators.

Wang et al. [114]. Major contents: (1) taxonomy of thermal metrics such as data center temperature, British thermal unit, airflow performance index and cooling system efficiency metrics; (2) power and energy metrics of various scales and components for data centers in practice. Major contributions and conclusions: The operating conditions of cooling systems in data centers were summarized and discussed. The technologies for recovering data center low-grade waste heat were also addressed. Cost reduction is regarded as the final target to make data centers "green"; thus strategies are needed to show how the operating cost of a data center responds to its initial purchase cost.

Samadiani and Joshi [115]. Major contents: (1) multi-scale modeling of electronic systems such as CFD/HT and POD; (2) multi-objective design methods such as GA and CDSP; (3) example applications of multi-objective thermal design of electronic cabinets and data center facilities. Major contributions and conclusions: The possible multi-objective methodologies for energy efficient design of data centers were introduced with selected examples. Through the use of economizers, it may be possible to achieve energy usage reductions. To handle the multi-scale problems in data centers, the generation of reduced order models compared with full scale CFD/HT simulations is quite essential. Finer length scales need to be described in the form of compact models, and the approximations require experimental validation of these reduced order models.

Schmidt and Shaukatullah [118]. Major contents: (1) reviews of environmental requirements for data centers by ASHRAE; (2) reviews of energy saving schemes and forced convection room cooling; (3) reviews of experimental works on air conditioning systems. Major contributions and conclusions: The factors that critically affect the cooling design of data centers were presented. Integrating energy efficiency into the design of the rooms housing the electronic equipment and improving the room ventilation are two key areas of focus, without consideration of liquid cooling.

Beaty and Davidson [119]. Major contents: (1) design conditions and effect of mixed air; (2) rules of thumb to minimize rack inlet air temperature excursions; (3) addition of localized cooling and baffles to direct air and prevent recirculation. Major contributions and conclusions: Various aspects of constraints to airflow patterns in data centers were discussed. Many potential solutions and pitfalls exist in the goal to maintain airflow patterns that result in inlet conditions to data center equipment within the ranges recommended by both equipment manufacturers and thermal guidelines.

Lu et al. [121]. Major contents: (1) ventilation configurations in data centers; (2) effect of the underfloor plenum; (3) row- and rack-based solutions; (4) cold aisle and hot aisle containment systems; (5) other methods. Major contributions and conclusions: Geometrical effects of the ventilation designs and underfloor plenum on airflow distributions were presented. The air distribution configurations and the methods of airflow management exert a strong influence on the thermal performance of airflow in data centers, and comprehensive assessment methods for the thermal performance of data centers should be developed. Literature on the HVAC or air distribution systems remains scarce. Methods for enhancing the airflow uniformity along the height direction are also recommended in future efforts.
Table 3
Summary of airflow performance indices [123–126] (indices, expressions and application scenarios).
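For reference, the commonly cited forms of these indices, following the definitions in [122–126] and the notation described in Section 2.2 (T_ref denotes the supplied air temperature, T_x the inlet temperature of rack x, and n the number of racks), can be written as:

```latex
\mathrm{PUE} = \frac{\text{total facility energy}}{\text{IT equipment energy}},
\qquad \mathrm{DCiE} = \frac{1}{\mathrm{PUE}}

\mathrm{RCI_{HI}} = \left[\, 1 - \frac{\sum_{T_x > T_{\max\text{-}rec}} \left( T_x - T_{\max\text{-}rec} \right)}
{n \left( T_{\max\text{-}all} - T_{\max\text{-}rec} \right)} \,\right] \times 100\%

\mathrm{RTI} = \frac{T_{return} - T_{supply}}{\Delta T_{equip}} \times 100\%

\mathrm{SHI} = \frac{\sum_{i}\sum_{j} \left( T_{in}^{\,i,j} - T_{ref} \right)}
{\sum_{i}\sum_{j} \left( T_{out}^{\,i,j} - T_{ref} \right)},
\qquad \mathrm{RHI} = 1 - \mathrm{SHI}
```

RCI_LO is defined analogously with the minimum recommended/allowable limits, and the CRAC provisioning ratio (CPR) of Breen et al. [126] is the ratio of the CRAC supply flowrate to the total rack intake flowrate.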
Detailed measurements were taken in real data centers, including the electronic equipment power usage, the CRAC supplied air flowrate, the flowrate distribution from perforated tiles, and the rack inlet and outlet temperatures [128]. The experimental data of real data centers were usually collected by industrial companies. A representative real data center with a raised-floor area of physical dimensions 24 m × 20 m × 2.9 m in the IBM plant, as shown in Fig. 2, was tested by Schmidt [129]. The airflow distribution was strongly related to the number of CRAC units, and reverse flow through the perforated tiles close to the CRAC was observed. Advanced strategies and development trends of data centers were also reported upon reducing hot spots [54–56,130–132]. The cooling performance and improvement strategies in the Oracle Austin Data Center were reported by Germagian and Martin [67]. They reported that as much as 40% power saving of the CRAC units is attainable when the aisle containment strategy is applied, and the payback of the mechanical systems was within the lifetime of operation of the equipment.

3.2. Simulated data center tests

Usually, it is impractical to adjust variables like server intake flowrates and rack surface temperatures in real data centers for concerns of affecting the normal data center operation.
Fig. 3. (a) Schematic of the simulated data center test facility: a simulated computer room with a rack of four simulated servers, thermocouples and DAQ system, AC power source and voltage variac, blower, raised floor and perforated tiles, with cold air supplied at the inlet and hot exhaust at the rear. (b) Temperature profile at the front and rear of the rack for the uniform power scheme. (c) Server temperature versus server location height.
For easier understanding of the possible outcomes of airflow management, a simulated data center using distributed heaters and fans by Nada et al. [58,59], as shown in Fig. 3(a), was used to explore the thermal behavior of servers. The effect of airflow distribution on cooling performance was investigated by examination of the rack inlet and outlet temperature distributions subject to the RCI and SHI indices. A typical temperature profile along the rack height at the rack front and back for uniform power is schematically shown in Fig. 3(b). Note that, normally, a uniform temperature profile prevails at the rack front, but an appreciable temperature rise occurs on the upper part of the rack back due to mixing of the hot air in hot aisles with the cold air in cold aisles. This is a common consequence of cold airflow bypass, which may occur either when the supplied air flowrate from the perforated plate is too high or when the fans located in the racks insufficiently bring the airflow into the racks. The corresponding variation of the temperature alongside the rack is shown in Fig. 3(c). It is evident that the servers at the upper rack suffer considerably from elevated temperature. There are two main causes for the increase of the server temperature with its location height. Firstly, as explained by the authors, the buoyancy force effect at the hot aisle may impose a back pressure to resist the airflow across the rack. Secondly, the bypass cold air mixing with the hot aisle return air also reinforces a comparable back pressure that leads to a decline of the airflow into the upper servers.
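As a minimal illustration of how such thermocouple data reduce to the indices above, the following Python sketch uses hypothetical readings; the SHI follows Sharma et al. [125], the RCI follows Herrlin [123], and the 27 °C / 32 °C limits are assumed ASHRAE recommended/allowable maxima:

```python
import numpy as np

# Hypothetical thermocouple readings (deg C) along one rack, bottom to top.
t_in = np.array([18.2, 18.5, 19.4, 21.0, 24.8, 27.9])  # server inlet temps
t_out = t_in + 11.0                                     # server outlet temps
t_sup = 18.0                                            # CRAC supply temp

# SHI (Sharma et al. [125]): fraction of the sensible heat picked up in the
# cold aisle before the air enters the racks; 0 means no hot-air recirculation.
shi = np.sum(t_in - t_sup) / np.sum(t_out - t_sup)

# RCI-HI (Herrlin [123]): 100% means no inlet exceeds the recommended maximum.
# The limits below are assumed ASHRAE recommended (27 C) / allowable (32 C) maxima.
t_max_rec, t_max_all = 27.0, 32.0
over = np.clip(t_in - t_max_rec, 0.0, None)
rci_hi = (1.0 - over.sum() / (t_in.size * (t_max_all - t_max_rec))) * 100.0

print(f"SHI = {shi:.3f}, RCI-HI = {rci_hi:.1f}%")
```

With these readings the recirculation-heated top servers lift the SHI to about 0.25 and pull the RCI-HI slightly below 100%, mirroring the profile of Fig. 3(b).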
Fig. 5. Schematic of the study on plenum depth and ceiling height: (a) schematic of the airflow uniformity issue in cold aisles in raised-floor data centers (CRAC, air supply through the plenum and perforated tiles, cold and hot aisles); (b) variation of the maximum temperature subject to the raised-floor height and ceiling height [13].
In practice, the airflow distribution in the cold aisles is often non-uniform, leading to cooling performance degradation in some racks. Thus, improving airflow uniformity in cold aisles may have beneficial effects on saving energy consumption in data centers. Studies on airflow management strategies to improve the airflow uniformity are summarized in Table 4 [9–13,17–20,22,24,26–37], including the plenum depth, perforated tiles and the application of enhanced facilities.

It was indicated that a higher plenum can improve the airflow uniformity for perforated tiles, and the recommended plenum height ranged from 600 mm to 1080 mm for different data centers. Besides, perforated tiles with a porosity of 25% can also improve the uniformity; however, the pressure resistance will dramatically increase. In this regard, enhanced facilities like induced bypass fans and active tiles are implemented and recommended for further improvement.

(1) Effect of plenum depth and ceiling height

In the plenum, the airflow velocity can be divided into horizontal and vertical components. With increased plenum depth, the horizontal velocity profile weakens, which may lead to reduced gradients in the velocity and pressure distributions. The effect of plenum depth on the airflow distribution is mainly studied by the CFD method.

In terms of the pressure distribution in the plenum and the flowrates through perforated tiles, Karki et al. [9,10] provided fundamental CFD processes to predict and control the airflow distribution. It was proposed that the suggested plenum depth is radically different depending on the data center floor area; hence quantitative studies were demonstrated. Bhopte et al. [11] conducted a numerical parametric study on the effect of plenum depth on the airflow distribution in racks with 12 kW power. Results showed that a larger plenum depth could minimize the static pressure variations under the plenum, resulting in more uniform cold air distribution. In addition, the effect of hot-air recirculation at the rack tops would be alleviated as the plenum depth increased from 914 mm to 1219 mm due to the suppression of the hot-air recirculation space, and the data center cooling performance was improved eventually. Nada et al. [12] investigated the airflow uniformity in the cold aisle, and the recommended plenum depth is 600 mm. The maximum rack inlet temperature subject to the plenum depth in the range of 30 mm to 1320 mm was studied by Nagarathinam et al. [13]. They concluded that the cooling performance increases with the plenum height up to a threshold depth of 1080 mm. In summary of the foregoing studies, a larger plenum depth may improve the airflow uniformity in the plenum. This is because a higher plenum can maintain the pressure difference along the flow direction. However, the actual height containing the computer racks in a building is normally fixed; hence, increasing the plenum height also leads to a decline of the ceiling height, which also impairs the performance.
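A simplified scaling argument, not taken from the cited studies, helps to see why a deeper plenum flattens the pressure field: for a plenum of width W and depth H carrying a total flowrate Q, the horizontal velocity and the associated dynamic-pressure variation scale as

```latex
v_h \sim \frac{Q}{W\,H}, \qquad
\Delta p_{dyn} \sim \tfrac{1}{2}\,\rho\, v_h^{2} \;\propto\; \frac{1}{H^{2}},
```

so doubling the plenum depth reduces the horizontal dynamic-pressure variation under the tiles by roughly a factor of four, consistent with the more uniform tile flow reported above.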
Table 4
Summary of studies on improving the airflow uniformity in cold aisles [9–13,17–20,22,24,26–37].

Plenum depth:
• Karki et al. [9,10] (CFD; 6.06 m × 20 m; 8–20 kW·m−2; plenum depth 215–760 mm): The airflow distribution depends on the frictional resistance, which is affected effectively by the plenum depth and can be well evaluated by the developed one-dimensional model.
• Bhopte et al. [11] (CFD by FloTherm; 11.6 m × 3.1 m × 6.1 m; 12 kW; 305–1219 mm): A higher plenum may increase the flowrate uniformity, which eventually improves the rack inlet temperature. A plenum depth of 900 mm was recommended.
• Nada et al. [12] (CFD; 6.71 m × 5.49 m × 3.0 m; 42 kW; 400–800 mm): Increasing the plenum depth can improve the uniformity of the air flowrate; a plenum depth of 600 mm was recommended.
• Nagarathinam et al. [13] (CFD; 11.4 m × 3.6 m × 12 m; 16 kW; 600–1250 mm): A plenum depth of 1080 mm was recommended.

Porosity of perforated tiles:
• VanGilder et al. [17,18] (experimental study and CFD study by self-coding; 687 m2; 25% and 56%): The two-measurement passive hood correction technique was proposed to calculate the pressure loss through tiles.
• Zhang et al. [26] (experiment; 9 m2; 17–45%): The pressure drop across the perforated tiles varied from 2.5 Pa to 25 Pa while the open area of the perforated tile varied from 45% to 17%.
• Nada et al. [19] (experiment; 0.4 m × 0.3 m × 0.50 m; 0.4–1.9 kW·m−2; 25%, 50% and 75%): Perforated tiles with a 25% opening ratio have the best temperature distribution, and servers located at the bottom of the rack cabinet always have better thermal performance.
• Kang et al. [20] (CFD with the FNM method; 58 m2; 25%, 40% and 60%): A design method for perforated tiles involving CFD calculations was proposed, which can obtain the desired airflow distribution in the cold aisle.
• Abdelmaksoud et al. [22] (experimental study and CFD study by FloTherm; 83.6 m2; 25% and 56%): Several improved tile models were proposed via a combined experimental and computational investigation.
• Arghode and Joshi [24] (experimental study and CFD study by FLUENT; 56 m2; 21.1–36.7%): The reduction in pore size from 6.35 to 3.18 mm had a non-negligible effect on the flow field.
• Nagarathinam et al. [13] (CFD by FloVENT; 11.4 m × 3.6 m × 12 m; 20–100%): The lowest maximum rack inlet temperature is 33.6 °C with 25% opening tiles, accompanied by very high pressure loss.
• Khalili et al. [28] (CFD by 6SigmaRoom; 65.4 m2; tiles of 0.61 × 0.61 m): Directional grilles of a proper type can be applied to improve the rack intake flowrate near the CRAC.
• Ling et al. [27] (CFD by FLUENT; 0.78–50.3%): A correlation of the pressure loss with respect to the geometrical factors and flow parameters was proposed.

Enhanced facilities:
• Khalifa and Demetriou [29,30] (CFD; 8.8 m × 3.7 m × 7.0 m; 1024 kW; introducing bypass tiles): The energy consumption can be reduced by as much as 60% by utilizing an optimized enclosed aisle configuration instead of a traditional enclosed aisle.
• Erden et al. [31,32] (experiment; 1000 kW; introducing bypass fans): By combining the CRAC bypass fan, the cooling infrastructure power usage can be decreased by as much as 80%, and the cooling power consumption can be reduced by as much as 52%.
• Song [33,34] (CFD; 8.75 m × 6.4 m; 8.3–38.4 kW; introducing fan-assisted tiles): The application of fan-assisted perforated tiles may provide an advanced cooling solution to better manage and optimize the heat and mass transfer in data centers.
• Arghode et al. [35] (experiment; 56 m2; 64 kW; introducing active perforated tiles): Implementing active tiles can improve the flow uniformity, yet showed little improvement in PUE.
• Athavale et al. [36,37] (experimental study and CFD by 6SigmaRoom; 6.2 m × 8.7 m; 100 kW; introducing active perforated tiles): The specific power consumption is lower when operating with an aisle of active tiles as against passive tiles.
This can be seen clearly from the calculated results about the influence of plenum height and ceiling height in Fig. 5(b). The simulation was carried out for a data center with physical dimensions of 11.4 m × 3.6 m × 12 m, which contained two CRAC units (100 kW each) and 40 racks aligned in 4 rows (10 racks in each row). Normally, raising the plenum height eases the maximum temperature, but the trend is reversed when the plenum height is increased over 1050 mm. For the influence of ceiling height, the maximum temperature initially decreases with ceiling height, with a minimum at 2520 mm, and thereafter it rises again. The maximum temperature reaches a plateau at 3320 mm and maintains there afterwards.

Table 5
Computing models of the airflow pressure loss through tiles [20,22,26,132,144,145] (F denotes the tile open-area ratio, v_in the approach velocity and ρ the air density).

Porous jump (PJ) model [20]:
$\Delta p_{PJ} = \tfrac{1}{2} K_1 \rho v_{in}^{2}$, with $K_1 = \dfrac{1 + 0.5(1-F)^{0.75} + 1.414(1-F)^{0.375}}{F^{2}}$.

Body force (BF) model [22,144]:
$\Delta p_{BF} = K_2 \rho v_{in}(v_{pore} - v_{in})$, with $v_{pore} = v_{in}/F$ and $K_2 = \tfrac{1}{2}\left[\,2\left(\tfrac{1-F}{F}\right)^{1/2} + (1-F)\right]$.

Modified body force (MBF) model [132,145]:
$\Delta p_{MBF} = \tfrac{1}{2}\rho v_{neck}^{2} - \tfrac{1}{2}\rho v_{in}^{2}$, with $v_{neck} = v_{in}(K_2 + 1)^{1/2}$.

Refined MBF model [26]:
$\Delta p_{RMBF} = \varphi\, \Delta p_{MBF}$, with correction factor $\varphi$.
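As a minimal numerical illustration of the porous-jump model in Table 5, the following Python sketch (the helper function is ours, assuming an air density of 1.2 kg·m−3 and an approach velocity of 1 m·s−1) evaluates the tile pressure drop for two common porosities:

```python
def porous_jump_dp(v_in: float, F: float, rho: float = 1.2) -> float:
    """Pressure drop (Pa) across a perforated tile via the porous-jump (PJ)
    model of Table 5. v_in: approach velocity (m/s); F: open-area ratio
    (porosity, 0 < F <= 1); rho: assumed air density (kg/m^3)."""
    K1 = (1.0 + 0.5 * (1.0 - F) ** 0.75 + 1.414 * (1.0 - F) ** 0.375) / F ** 2
    return 0.5 * K1 * rho * v_in ** 2

# Contrast a restrictive 25% tile with a 56% tile at a 1 m/s approach velocity.
for F in (0.25, 0.56):
    print(f"F = {F:.2f}: dp = {porous_jump_dp(1.0, F):.1f} Pa")
```

The 25% tile yields a pressure drop of the same order as the upper end of the 2.5–25 Pa range reported by Zhang et al. [26], while the high-porosity tile offers far less resistance, consistent with the uniformity trade-off discussed in this section.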
(2) Effect of perforated tiles

In the design of a data center cooling system, the flowrate distribution through perforated tiles is usually assumed to be uniform. However, in many perforated tiles the flowrate is substantially more or less than the mean flowrate, resulting in potential equipment failure due to inadequate cooling [23].
Notice that the supplied cold air is delivered into the cold aisles through the perforated tiles, which is crucial for managing airflow in data centers. In essence, using high porosity tiles may reduce the pressure drop across the raised floor due to the lower penetrating airflow velocity, accompanied by the advantage of lower noise. However, it was found that the airflow distribution can be significantly non-uniform for higher tile porosities such as 50% or 70% [14]. Guidelines for the selection of floor grilles were provided for management of the quantity and placement of the tiles, and criteria for selecting perforated tiles in data centers were proposed for minimizing the airflow non-uniformity [15].

In a raised-floor data center, the plenum pressure and the airflow distribution in the cold aisles depend on the geometrical features of the perforated tiles, such as the size, the thickness, and especially the porosity or opening area. Experimental measurements of the individual mass flowrates in tiles were performed by Rambo et al. [16] in a 100 m2 experimental facility. Diminished flowrate, and even reversed flow, was observed for perforated tiles located adjacent to the CRAC unit. Subsequently, VanGilder et al. [17,18] calibrated a passive flow hood against a laboratory flow bench, and then the airflow distribution through perforated tiles in the data center was accurately measured. In their studies, the two-measurement passive hood correction technique was proposed to calculate the pressure loss through tiles, which can be implemented with existing commercially available flow hoods within 5% error. With accurate measurement of the flowrates from perforated tiles, the tests and predictions by experimental and CFD methods are more reliable. Nada et al. [19] also tested the effect of perforated tiles with different opening areas in a simulated data center, and the results demonstrated that the perforated tiles with a 25% opening ratio showed the best temperature distribution. The velocity profiles through perforated tiles with different porosities were presented by Abdelmaksoud et al. [22] via experimental and CFD methods. When compared under the same operating condition, a 100% perforated tile corresponded to a jet velocity of 1.25 m·s−1, while the measured velocity was 5 m·s−1 when tiles with 25% open area were applied. Arghode and Joshi [24,25] proposed that the reduction in pore size from 6.35 mm to 3.18 mm may impose a moderate influence on the flow field that cannot be overlooked, especially near the tile surface at both the rack and aisle ends. However, the flow profile was identical with increasing air flowrates due to the fully turbulent flow regime, and the results revealed that a reduction in the tile width may improve air delivery into the racks with reduced cold-air bypass. Kang et al. [20] presented experimentally validated CFD results to predict the airflow across perforated tiles based on flow network modeling (FNM). Note that the velocity over all tiles was kept constant with tiles having 25% open area. The pressure loss across the tiles is much larger than the pressure variation in the plenum; thus the pressure under the floor behaves quite uniformly, thereby providing a better flow distribution alongside the perforated grilles. However, with 60% open area, the tile resistance was significantly diminished and the effect of flow inertia takes control. Karki et al. [21] described a CFD model for calculating the air flowrates through perforated tiles in raised-floor data centers, and the relation between the velocity and pressure distributions was discussed, which showed good agreement with experimental data. It was verified that the flowrates through the perforated tiles near the CRAC units were the lowest due to the highest velocity with the lowest pressure there. This phenomenon may become even more pronounced with a low plenum height, where negative pressure may occur and incur hot-air flow reversal into the plenum [142]. Generally, the supplied pressure under the plenum increases as the distance to the CRAC increases. The variation of the maximum temperature at the rack inlet with the porosity of perforated tiles was investigated by Nagarathinam et al. [13]. It was observed that the lowest maximum rack inlet temperature is 33.6 °C with 25% opening tiles because of the much better flow uniformity. However, the pressure loss would dramatically increase.

The pressure loss through perforated tiles was quantitatively examined. Zhang et al. [26] experimentally investigated the pressure drop across the perforated tile. Their results indicate that the pressure drop varied from 2.5 Pa to 25 Pa while the opening area of the perforated tile varied from 45% to 17%. Parameters such as the tile area, porosity, diameter of the tile holes, tile thickness and layout of generally used perforated tiles were numerically studied by Ling et al. [27], who proposed a correlation of the pressure loss coefficient with respect to the geometrical factors and flow parameters.
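The factor-of-four velocity contrast reported by Abdelmaksoud et al. [22] follows directly from continuity across the tile open area (notation as in Table 5):

```latex
v_{pore} = \frac{v_{in}}{F}
\quad\Rightarrow\quad
v_{pore} = \frac{1.25\ \mathrm{m\,s^{-1}}}{0.25} = 5\ \mathrm{m\,s^{-1}}
```

for a 25% open tile carrying the same approach velocity as a fully open tile.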
Fig. 9. Schematic of cold-air bypass flow and hot-air recirculation in a raised-floor data center (CRAC, racks and hot exhaust).
(3) Application of enhanced facilities

Enhanced facilities such as induced bypass fans (IBPFs) can be installed in aisle containment data centers. A parametrical study of the IBPFs was discussed by Song [33,34], who employed a full factorial design of IBPFs to obtain and verify the quantitative characteristics of the interaction among different design parameters. On the other hand, modified IBPFs coupled with perforated tiles were applied to the data center plenum by Arghode et al. [35], named active floor tiles as shown in Fig. 8(b). The steady and transient states were experimentally studied in order to characterize the effect on the improvement of the temperature and flow distributions in comparison with common passive tiles. It was identified that the fully provisioned case with active tiles resulted in a nearly ideal temperature distribution, as with a fully contained cold aisle. Subsequently, Athavale et al. [36,37] optimized the number of active tiles by using the CFD method, and the airflow delivered to the cold aisle was verified to become more uniform. Moreover, the active tiles may improve the cooling performance effectively so that the supplied air temperature can be raised somewhat, resulting in energy savings at the chiller plant. For a given CRAC blower speed, the simulation also indicates that an increase in active tile fan speed increases the total tile air flowrate and decreases the leakage of cold air in the plenum. A typical comparative result for the intake temperature between the conventional passive tile and the active tile is shown in Fig. 8(c), which clearly shows the superior uniformity of the temperature distribution of the active tile.

4.1.2. Bypass and recirculation of airflow
The effect of bypass and recirculation flow on the cooling performance of data centers is explained in Fig. 9. For the raised-floor system, the airflow may bypass the servers at the bottom of the racks due to the high speed of the supplied air from the perforated tiles, which causes non-uniform intake flowrates in the servers along the height direction. The over-supplied cold air may also directly flow back to the air-conditioners without serving any cooling effect on the servers when no aisle containment is used. On the other hand, hot-air recirculation, which means the exhaust air from the server outlets recirculates back into the server inlets, is another concern that may reduce the cooling efficiency of data centers. Studies on preventing the bypass and recirculation flow are shown in Table 6 [38–50,54–56,58–72,75–83], and the improvement approaches can be classified as optimization of infrastructure layouts or aisle containment.

The containment at the top of the racks was recommended, since the hot air may accumulate there and recirculate back, thereby impairing the cooling performance of the data center. Besides, the performance drops with the rise of the rack gap. Results showed that the hot air may recirculate back to the cold aisle, implying that buoyancy plays a certain role at the hot aisle, since more hot air was gathered in the top part of the hot aisle.

(1) Effect of infrastructure layout

The effect of computer room and CRAC unit modeling in a representative raised-floor data center was studied by Samadiani et al. [38]. Different layouts of infrastructures including the CRAC units, racks, plenum pipes and perforated tiles were compared by using the CFD method. Nada and Said [39–42] also investigated the effect of CRAC location, layout and arrangements inside a data center on the airflow distribution experimentally and numerically. Two different CRAC layouts shown in Fig. 10(a) were compared in a data center having dimensions of 6.7 m × 5.5 m × 3.0 m, and 14 racks with a dissipating power of 49 kW were arranged in two rows. The RTI and SHI were applied as evaluation indices. The results showed that locating the CRAC unit perpendicular to the rack row can improve the cooling performance of the data center by enhancing the airflow uniformity, as also shown in Fig. 10(a), especially for racks at the end of the cold aisle, and by reducing hot-air recirculation at the first rack counting from the CRAC side.

It was indicated that the airflow may bypass the servers at the bottom with the raised-floor air supply system due to the high speed of the supplied air [43], thereby resulting in a non-uniform air supply distribution on the rack inlet plane. The heat load distribution in one rack should be considered because of the vertically distributed flowrates in racks. Zhang et al. [44] adopted detailed rack modeling with different server layouts and revealed that the heat power densities may impact the prediction of the average cooling performance. To maximize the thermal performance, different server layouts in one rack were compared by Rambo and Joshi [45]. Their study showed that high-powered servers are recommended to be sparsely arranged whenever possible rather than packed closely. On the contrary, clustering them together in the middle of the cabinet gave the worst thermal performance. This can be explained in that the airflow might bypass the servers located at the bottom, leading to regional failure of the cooling process, and the concentrated high-power server layout may lead to insufficient air supply.
Table 6
Summary of studies on airflow bypass and recirculation [38–50,54–56,58–72,75–83].

Infrastructure layouts:
• Samadiani et al. [38] (CFD by FLUENT; 104 m2; CRAC and rack layouts): Modeling the CRAC blowers, using higher porosity perforated tiles and removing the plenum pipes can increase the air flowrate.
• Nada and Said [39–42] (CFD by FLUENT; 6.71 m × 5.49 m × 3.0 m; 49 kW; CRAC layouts): Locating the CRAC units perpendicular to the rack row can enhance the airflow uniformity and the cooling performance.

Aisle containment strategies:
• Sundaralingam et al. [54,55] (experiment and CFD by FloTherm; 8.84 m × 6.25 m × 2.64 m; 124.2 kW; CAC): The fully contained aisle with over-provisioned conditions was preferred, and the top containment should be considered first if there are geometrical or cost limitations.
• Arghode et al. [56] (CFD by FLUENT; 8.84 m × 6.25 m × 2.64 m; 124.2 kW; CAC): The over-provisioned case with the cold aisle containment strategy resulted in close to perfect cold air delivery to the racks.
• VanGilder and Zhang [68] (CFD by FloVent; 120 m2; 2–12 kW per rack; HAC): Air containment can simultaneously improve the energy efficiency and reliability of data centers, and the cooling airflow is suggested to exceed the rack intake flowrate by 10–20%.
• Tsuda et al. [69] (experiment; 11.2 m × 4.8 m × 3.9 m; 45 kW; HAC and CAC): The temperature environment for CAC with an airflow ratio of 1.36 was the same as for HAC with an airflow ratio of 1.83, so that the CAC always performs better than the HAC in that it requires less energy.
• Takahashi et al. [70] (CFD; 24 m × 24 m; 46.2 kW; HAC and CAC): Both CAC and HAC are effective to improve hot spots and control the inlet temperature of IT equipment. The fan energy of the air conditioners is reduced by approximately 15% by applying CAC or HAC.
• Nemati et al. [71] (CFD by 6SigmaRoom; 215 m2; 143 kW; HAC and CAC): A correlation was derived to link the cooling unit controller and set point to the delivered airflow, which can help the data center operator estimate the required water in the chilled cycle.
• Shrivastava et al. [72] (CFD by 6SigmaRoom; 17.7 m × 30.5 m; 1583 kW; HAC and CAC): The effects of deploying various containment configurations were studied, and the HAC system provided better cooling performance but was the most challenging in structural design.

Studies of airflow leakage:
• Pastrana et al. [75], Khankari [76], Radmehr et al. [77], Alkharabsheh et al. [78,79] and Fink et al. [83] (experiment and CFD; testing domains including 8.5 m × 6.4 m, 9.14 m × 7.92 m × 3.35 m, 105.9 m2 and 1115 m2; heat loads including 60.7 kW, 150 kW and 1000 kW; specifications including cable cut-outs, leakage at the containment, and leakage between panels in racks and at the bottom of racks): Measurements quantified the leakage flow in a typical data center; servers located in the upper corners of the aisle can lose up to 70% of the cooling capacity of the supplied cooling air while the containment leakage was between 1.02–5.67%; brush-type grommets were used to seal cable cut-out holes; and the average rack inlet temperature in the cold aisle was reduced by 2 °C via reallocation of the leakage and bypass flow.
In order to study the bypass airflow and flowrate distribution in racks, a methodology for measuring the rack flowrate sensitivity was presented, in which part of the rack slots are occupied by servers, with the remaining area covered by panels. The thermal performance of a data center with semi-populated racks utilizing different server layouts was presented by Fakhim et al. [49]. The heat load per rack remained unchanged at 5.5 kW, and eleven models for different server layouts were examined. It was indicated that the rack model with double servers and double blanking-panel spaces gave the best SHI. In their studies, the necessity for specific rack layout designs for each individual data center, based on the cooling system and aisle containment strategy, was examined.

(2) Effect of aisle containment

Aisle containment strategies are applied to prevent hot-air recirculation and cold-air bypass. The methods include the cold aisle containment (CAC), hot aisle containment (HAC) and chimney cabinet. The benefits of CAC or HAC were described by Wilson [51]; the employment of containment strategies with optimizations to the operating cooling infrastructures can further save the energy consumption of the cooling system. It was indicated that as much as 59% of the energy required for the CRAC units used in a traditional open-type data center could be saved when the CAC strategy was applied [52]. Analogous studies were also presented.

With the CAC strategy, the cold air from the perforated tiles in front of the racks is confined within the cold aisle, thereby allowing the rest of the room space to become a large reservoir for the hot air return. Apparently, this management can effectively prevent the hot exhaust from mixing with the supplied cold air before reaching the servers. Large-scale air temperature field measurements were performed by Arghode et al. [56] in order to study the hot-air entrainment characteristics in the cold aisle while the open aisle strategy was applied to the data center. All measurements, including the flowrate from perforated tiles, the cooling load, the rack flowrate and the inlet temperature, were recorded during steady-state conditions.
Fig. 10. Schematic of studies on infrastructure layouts: (a) study of CRAC layouts and the corresponding mass flowrate distribution alongside the racks [39]; (c) drawer design to minimize cold-air bypass and hot-air recirculation [50].
subject to an open and contained cold aisle. Ideally, the total flowrate from the perforated tiles equals the rack flowrate, but this is often not the case in practice. In an open aisle, the condition in which the total tile flowrate is less than the total rack airflow is termed under-provisioned (UP); the reverse is regarded as over-provisioned (OP). The testing facility of Arghode et al. [56] is shown in Fig. 12(a) and (b), where two CRAC units were used to test the influence of OP and UP: the test with only one CRAC running is considered UP, while the case with both CRACs turned on is regarded as OP. The associated flowrate ratio is shown in Fig. 12(c). Compared with the open aisle condition, the utilization of containment tends to equalize the tile and rack flowrates by increasing the tile air flowrate and decreasing the rack air flowrate; hence, with containment the total tile airflow becomes appreciably higher for the UP condition, while it is moderately reduced for the OP case. Results showed that hot-air recirculation still appeared at the cold aisle entrance even for an over-provisioned air supply. For the cold aisle containment case, close to perfect cold-air delivery to the racks was observed for both under-provisioned and over-provisioned cases, resulting in a significant improvement in the temperature uniformity in the cold aisle as well as at the server inlets. Subsequently, further modified strategies concerning partial and full cold aisle containment were experimentally investigated by the same research team [54,55], and the pros and cons of the different systems were discussed against the open aisle conditions for both under-provisioned and over-provisioned air supply. Typical rack inlet temperature contours for UP and OP are shown in Fig. 12(d). Note that OP normally shows a better temperature contour than the UP case, and this is applicable to the open, partial, and full containment cases alike. For UP subject to partial containment, hot spots occur at the edges of the cold aisle with the top-panel-only arrangement, while the upper center part of the cold aisle suffers appreciable overheating with the side panel arrangement (door containment).
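To make the bookkeeping concrete, the provisioning state can be expressed as the ratio of total tile airflow to total rack airflow. The following minimal sketch classifies a cold aisle from these two quantities; the flowrates and tolerance band are hypothetical values for illustration, not data from [54,56].

```python
# Classify cold-aisle provisioning from the total perforated-tile airflow and
# the total rack intake airflow (both in m^3/s). Values are hypothetical.

def classify(tile_flow: float, rack_flow: float, tol: float = 0.05) -> str:
    """Return UP/OP/balanced from the tile-to-rack flowrate ratio."""
    ratio = tile_flow / rack_flow
    if ratio < 1.0 - tol:
        return "under-provisioned (UP)"   # tile supply short of rack demand
    if ratio > 1.0 + tol:
        return "over-provisioned (OP)"    # tile supply exceeds rack demand
    return "balanced"

rack_flow = 6.0                      # total rack airflow, m^3/s
for tile_flow in (4.2, 7.5):         # e.g., one CRAC vs. two CRACs running
    ratio = tile_flow / rack_flow
    print(f"ratio {ratio:.2f}: {classify(tile_flow, rack_flow)}")
```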
The raised-floor data center with 60% and 80% CRAC flowrate was analyzed by Gondipalli et al. [62], which accommodated 12 server racks. It was indicated that isolating the cold aisle by the CAC strategy can reduce the inlet temperature by 40% as compared to the original baseline design. A detailed description of thermal management in data centers was given by Khalaj et al. [60]. A number of undesirable hot spots near the racks were detected under the open aisle condition; by applying the CAC strategy, the SHI, RCI and coefficient of performance (COP) of the cooling system were improved by more than 0.45, 17% and 19.5%, respectively. A scaled and simulated data center was also investigated by Nada et al. [58,59], which accommodated a row of three racks with a power density of 1898 W·m−2. The intake temperature could drop from 22.5 °C to 19 °C, an enhancement of 11%, and the SHI could be improved by as much as 70% when the CAC was applied compared to the open aisle condition. In order to accurately predict the performance of data centers with a CAC system upon modeling, it is crucial to include the influences of fan curves, server internal resistances, detailed rack models, and the like, and the model should be calibrated and verified with experimental results. Hence, the relevant modeling guidelines for data centers were provided by Alkharabsheh et al. [61]. In their results, the top portions of the racks were highly susceptible to recirculation and should be considered as critical locations in the CAC system. Meanwhile, a threshold leakage value of approximately 15% of the total containment surface area was identified; leakage surpassing this threshold may jeopardize the benefits of the CAC strategy. Gao et al. [63] also compared the cooling performance of the data center with and without the CAC strategy by the CFD method. It was shown that the supplied air temperature could be increased by 3 °C using the CAC system while still maintaining the thermal environment within the recommended range.

The hot aisle containment (HAC) consists of a barrier that guides the hot aisle exhaust airflow upward and back to the CRAC return. The experimental data by Martin [67] showed that 40% blower power savings of the CRAC units were achieved by establishing a ducted hot-air return path between the IT equipment and the CRAC units; the efficiency gains are thus significant. The finite network method (FNM) was employed by VanGilder and Zhang [68] to characterize and compare the cooling effectiveness of ducted hot aisles subject to a given ceiling plenum pressure. Results indicated that the cooling airflow should exceed the rack intake flowrate by 10–20%, and the ceiling height connected to the hot aisles should be at least 0.46 m. The individual rack-level HAC applied to a 94 m² data center with 161 kW of IT load was numerically studied by Onyiorah et al. [66]. The RCI was improved to 100% everywhere, and the rack inlet temperature was 9% lower than ASHRAE's allowable temperature (32 °C) when the HAC airflow management technology was applied.

The CAC and HAC are increasingly used as data center cooling solutions due to the benefits of segregating the cold and hot streams to offer appreciable energy savings. Both CAC and HAC are effective to remove hot spots, to control the inlet temperatures of IT equipment, and to reduce the energy consumption of air conditioners by approximately 15%. However, there might exist some different concerns between these two strategies under the same operating condition. The difference between applying the CAC and HAC strategies regarding the airflow intake temperature of the equipment in cold aisles was experimentally clarified by Tsuda et al. [69]. Comparing the two strategies under the same heating load and air-conditioning environment, the maximum airflow intake temperature for CAC was 3.3 °C lower than that for HAC. Meanwhile, the fan power consumption for HAC was almost twice that for CAC when the temperature environments were made equal by adjusting the air-conditioner fan speed. Nemati et al. [71] numerically studied the transient failure scenario in which the chilled water system suddenly stops. The up-time for CAC and HAC was analyzed utilizing the flow curve and server thermal inertia modeling methods. Fig. 13(a) and (b) illustrate the temperature distribution of the data center with the CAC and HAC strategies. It can be found that the hot exhaust could be contained and directed into the ceiling plenum when applying the HAC
strategy, thereby eliminating recirculation back to the equipment inlets. Besides, it was suggested that the CAC is preferred because it offered a longer up-time in comparison with HAC under similar airflow provisioning, as can be seen in Fig. 13(c).

However, different containment strategies were also compared by Shrivastava et al. [72] based on an industry data center, yielding some different results. The energy savings associated with a CAC and a HAC were compared with an open system. The impact of back pressure on the server airflows was ignored, and the economic comparison is summarized in Table 7 [73]. Results showed that the primary benefit was that the CRAC air supply temperature can be raised, which allows a higher water supply temperature; a significant amount of cooling energy could thus be saved by operating the chiller system efficiently. The electricity cost was assumed to be 0.101 USD·kWh−1, and the annual cost would be reduced by 23.6% and 34.0% with CAC and HAC, respectively. On the other hand, in comparison with CAC, the data center incorporating the HAC strategy consumed 40% less cooling system energy in a typical data center, corresponding to a 13% reduction in annualized PUE. The majority of the energy savings can be attributed to the economizer hours while the chiller is off. The PUE of the data center applying the two technologies subject to the same working environment was 1.87 and 1.64, respectively. Apart from the aforementioned considerations, there are practical differences in implementation and operation that could impose significant consequences on working environment conditions [73]. Since people still need to work inside a data center temporarily, the environment must be maintained at a reasonable temperature, and some custom ducting to serve miscellaneous devices such as tape libraries and standalone servers is also needed. With the CAC strategy, the general working area is the hot aisle. Therefore, it is essential to maintain a reasonable temperature to avoid violating Occupational Safety and Health Administration (OSHA) regulations [150] or ISO 7243 guidelines [151] by exceeding the wet-bulb globe temperature (WBGT). Staff seated at a desk in the data center may feel very uncomfortable when the rack temperature exceeds the 32 °C suggested by ASHRAE [1].

In summary of the foregoing discussions, containment systems were verified to improve data center operating efficiency and save energy consumption; however, the choice between the CAC and HAC strategies should depend on the layout of the room, the budget, and long-term goals [74].
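The quoted electricity price and PUE values can be turned into a rough annualized cost comparison. A minimal sketch under the assumption of a steady, hypothetical 1 MW IT load (the load is not specified in [73]):

```python
# Annualized electricity cost implied by a PUE at steady load:
# facility power = PUE * IT power. The 1 MW IT load is a hypothetical
# assumption; the price and PUE values are the ones quoted in the text.

PRICE_USD_PER_KWH = 0.101
HOURS_PER_YEAR = 8760
IT_LOAD_KW = 1000.0

def annual_cost(pue: float) -> float:
    return pue * IT_LOAD_KW * HOURS_PER_YEAR * PRICE_USD_PER_KWH

cac, hac = annual_cost(1.87), annual_cost(1.64)
print(f"CAC: {cac:,.0f} USD/year")
print(f"HAC: {hac:,.0f} USD/year")
print(f"HAC advantage: {cac - hac:,.0f} USD/year at this load")
```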
Fig. 12. Schematic of experimental study concerning the influence of UP/OP upon different cold aisle configurations [54,56]: (a) schematic of the Data Center Laboratory (DCL); (c) measured tile/rack air flowrate ratio for one CRAC unit or two CRAC units, each with an open aisle or a fully contained cold aisle.

Hot aisle containment leakage areas ranging from 1.02% to 5.67% were evaluated by the CFD method [76]. The test room has a total of 20 server racks arranged in two rows separated by a contained hot aisle, as shown in Fig. 15(a), which includes hard and soft containment. The hard containment adopts hard metal panels, with the leakage coming through the gaps around the doors and hinges and through the gaps under and above the door panels. For the soft containment system, plastic curtains are employed and the air can leak through the gaps between the hanging panels. Fig. 15(b) shows the effect of leakage area on the normalized temperatures for both the hard and soft containment systems. It shows that an increase in the leakage area increases the normalized temperature, suggesting a higher localized loss in cooling capacity. It also shows that the servers located in the upper corners of the aisle can lose up to 70% of the cooling capacity of the supply air. The spaces above and below the servers and at the rails beside the racks and servers provide undesired leakage paths through which airflow may escape without heat transfer. In practical applications, some empty slots may appear in the racks; it is highly recommended to use blanking panels to block the recirculation flow. Radmehr et al. [77]
Fig. 12 (continued). (d) Rack inlet temperature contour plots for different UP/OP cold aisle configurations.
experimentally showed that the leakage flow through panels in a typical data center varied from 5% to 15% of the available cooling air, and suggested this amount should be considered in typical CFD studies. Besides, the measured quantities were generated from just about 0.35% of the floor area in a typical data center. Alkharabsheh et al. [78,79] investigated the cold aisle leakage at floor tiles, containment surfaces and racks by using an experimentally validated CFD model. The results agreed well with the experimental data, with an error of about 6.7%.
Fig. 13. Comparison of data center with CAC and HAC [71]: (a), (b) temperature distributions of the raised-floor data center (CRAH, perforated tiles); (c) comparison of CAC and HAC.
The 50 mm wide channels at the position of the rails were assumed, which may inevitably cause a negative effect on the data center performance. The containment system gains no benefit over the conventional uncontained system when the leakage ratio passes a threshold value of 15%. At typical cold aisle pressures, Tatchell-Evans et al. [80] found that as much as 20% of the supplied air may bypass the servers, thereby requiring an over-supply flowrate from the air conditioning units to meet the cooling demand; the practical measures undertaken to reduce this bypass could lower the total power consumption by up to 8.8%. Song et al. [81] investigated the under-cabinet leakage between the floor and the bottom of the racks, which can be attributed to lower values of the local pressure at the front of the cabinet due to the air stream delivered from the perforated tiles. The leakage was verified to affect the cooling performance by hot-air recirculation rather than by cold-air bypass.
Table 7. Comparison of cooling system with different aisle containment strategies [73] (characteristics such as leakage above, below and at the sides of the rack rails are compared for the open aisle, CAC, and HAC/VED systems).

Leakage at the server fronts and through bottom cable cut-outs imposes a significant effect in reducing the cooling performance. Thus, performance improvement can be achieved by using brush-type grommets to seal the cable cut-out holes, and it was predicted that the ten-year savings could be as much as $72,000 in a hypothetical 1 MW data center.
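Bypass directly inflates the required CRAC supply, since only the non-bypassed fraction of the delivered air reaches the IT intakes. A minimal sketch of that relation with illustrative numbers (not data from [80,81]):

```python
# Required CRAC supply when a fraction of the cold air bypasses the servers:
# supply = rack demand / (1 - bypass). Numbers are illustrative only.

def required_supply(rack_demand: float, bypass_fraction: float) -> float:
    if not 0.0 <= bypass_fraction < 1.0:
        raise ValueError("bypass fraction must lie in [0, 1)")
    return rack_demand / (1.0 - bypass_fraction)

rack_demand = 10.0  # m^3/s the racks must ingest
for bypass in (0.05, 0.15, 0.20):
    supply = required_supply(rack_demand, bypass)
    print(f"bypass {bypass:.0%} -> supply {supply:.2f} m^3/s "
          f"(over-supply {supply / rack_demand - 1:.1%})")
```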
Fig. 15. Data center with contained hot aisle subject to different leakage areas (racks, supply tiles and leakage paths indicated): (a) 1.02%; (b) 2.04%; (c) 4.08%; (d) 1.42%; (e) 2.83%; (f) 5.67% leakage area.

Fig. 16. Schematic of data center with overhead air supply system [84].
The cooling performance of overhead and underfloor air supply with a rear-door heat exchanger in high-density data center clusters was investigated by Udakeri [87,88]. The overhead supply configuration was found to be much more effective at a lower supply fraction, whereas at higher fractions, racks away from the CRAC displayed lower intake temperatures than the racks close to the CRAC due to the higher distributed flowrate. Meanwhile, a hybrid cooling strategy incorporating liquid cooling by a rear-door heat exchanger was proposed and analyzed under operating conditions with different air flowrates. Results showed that the hybrid solution with the overhead supply configuration is comparable to the underfloor supply even at higher fractions. Sorell et al. [90] numerically investigated the data center cooling performance for both overhead and raised-floor air supply systems with a quantitative evaluation of the ability to effectively deliver cold air into the cabinets. Meanwhile, the ability to provide efficient cooling was evaluated parametrically as a function of the ratio of system supply air to cabinet air. Note that the SHI of the data center with the raised-floor design was much lower than that of the overhead design. Numerical models of two data center sections, representing underfloor air supply and overhead air supply designs, were constructed by Schmidt and Iyengar [91]. The complex airflow patterns and temperature profiles that influence the computer equipment air intake temperature were discussed. Results showed that the overhead supply design yielded cooler rack inlet temperatures for lower chilled-air supply, and the raised-floor supply design was better for higher chilled-air supply cases. Besides, Srinarayana et al. [86] observed that hot spots occur near the top positions in the data center with a typical raised-floor supply system; comparing four airflow distribution systems by cooling efficiency with CFD optimization, the underfloor supply with ducted overhead hot-air return was found to be the best ventilation air supply system, requiring the lowest supply air flowrate among the four systems.
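The SHI referred to in these comparisons follows the dimensionless index of Sharma et al. [125]: the fraction of the rack's total enthalpy rise that occurs before the air even enters the rack. A minimal sketch with illustrative temperatures:

```python
# Supply Heat Index (SHI) after Sharma et al. [125]:
#   SHI = (T_rack_in - T_supply) / (T_rack_out - T_supply)
# SHI -> 0 means no hot-air infiltration ahead of the rack inlet.
# The temperatures below are illustrative, not measured data.

def shi(t_supply: float, t_rack_in: float, t_rack_out: float) -> float:
    return (t_rack_in - t_supply) / (t_rack_out - t_supply)

print(f"SHI = {shi(t_supply=16.0, t_rack_in=22.0, t_rack_out=36.0):.2f}")
```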
Table 8. Summary of studies on data centers with overhead air supply method [84,85,87–91,152].

The aforementioned experimental and numerical studies compared the performance between the raised-floor and overhead air supply systems based on actual data centers. However, it can be found that the results may differ from case to case. In essence, depending on the practical room construction, the overhead duct may involve some difficulty and require additional building cost, but it also offers a better airflow distribution owing to its higher supply pressure as compared to the raised-floor design.
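The pressure argument can be made tangible by treating a perforated tile as an orifice, where the deliverable flow scales with the square root of the plenum (or duct) pressurization. A minimal sketch with an assumed discharge coefficient and tile opening area:

```python
# Tile airflow from pressurization, with the perforated tile treated as an
# orifice: V_dot = Cd * A_open * sqrt(2 * dP / rho). Cd and the 25%-open
# 0.6 m x 0.6 m tile are assumed illustrative values.
from math import sqrt

RHO = 1.2  # air density, kg/m^3

def tile_flow(dp_pa: float, open_area_m2: float, cd: float = 0.65) -> float:
    return cd * open_area_m2 * sqrt(2.0 * dp_pa / RHO)

open_area = 0.25 * 0.6 * 0.6
for dp in (10.0, 40.0):  # Pa
    print(f"dP = {dp:4.0f} Pa -> {tile_flow(dp, open_area):.3f} m^3/s per tile")
```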
5. Short-distance cooling
In a typical enclosed rack-level cooling system, the heat exchanger is mounted at the side or bottom of the rack, and the hot aisle and cold aisle are separated by the rack row. The fans arranged in the hot aisle drive the hot exhaust from the servers into the heat exchanger, and the server fans then entrain the cold air into the servers so that the airflow recirculates within the rack.
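The airflow that must recirculate within the enclosure follows from a sensible-heat balance on the air stream, Q = ρ·V̇·cp·ΔT. A minimal sketch with a hypothetical rack load and air temperature rise:

```python
# Volumetric airflow needed to remove a rack heat load at a given air
# temperature rise: V_dot = Q / (rho * cp * dT). Load and dT are hypothetical.

RHO = 1.2    # air density, kg/m^3
CP = 1005.0  # specific heat of air, J/(kg K)

def airflow(heat_load_w: float, delta_t_k: float) -> float:
    return heat_load_w / (RHO * CP * delta_t_k)

for load_kw in (10.0, 30.0):
    print(f"{load_kw:.0f} kW rack, 12 K air rise -> "
          f"{airflow(load_kw * 1e3, 12.0):.2f} m^3/s")
```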
The cooling power usage and realizable energy savings of economizer-based data centers, along with the risks of server contamination and corrosion while introducing cool outdoor air, were also discussed [92]. In one innovative design, the cooling unit was located at the bottom of a customized server rack; with the PUE calculated as the evaluation criterion, the developed prototype was validated to save $385,440 per year for a typical data center. Sahini et al. [100] further compared the short-distance system to the long-distance design under the same aisle dimensions.
Table 9
Summary of studies on data centers with short-distance cooling system [92–98,100,103–110].
Contents References Methodology Rack dimension Heat load Major conclusions or benefits
Enclosure with heat exchanger cooling Sahini et al. [100] CFD by 6SigmaDC 0.6 m × 1.2 m × 2.3 m 10.93 kW The long-distance cooling system consumed three times higher airflow energy compared to the short-distance cooling with the same rack intake temperatures, and the minimum hot aisle width was
suggested as 1.8 m.
Gao et al. [93] CFD by 6SigmaDC 0.6 m × 1.2 m × 2.3 m 10 kW The heat exchanger is of paramount importance, and its minimal size was optimized that can provide
adequate cooling capacity and sufficient heat transfer area with minimum pressure drop.
Iyengar et al. [92] Experiment – 3.516 kW The risks of server contamination and corrosion with exposure of electronics to particulate and gaseous
matter in the air were discussed, and the benefits of indirect cooling approaches were highlighted.
Gao et al. [94–97] Experiment – – The procedure of transient response modeling of single-pass cross flow heat exchangers was proposed
and showed good agreement with experimental data.
Kang et al. [98] CFD 0.6 m × 1.2 m × 2.3 m 6 kW The integrated server rack solution was proposed that can provide reduced power consumption and
improved cooling efficiency, and further result in better PUE and reduced cost compared to traditional
server layouts.
Heat pipe technology Wilson et al. [103] Theoretical method – 12.6 kW A thermal bus system composed by thermosyphon was studied to transfer heat in component level, and it
was found that significant reduction in thermal resistance can be achieved.
Leonard and Phillips [104] Theoretical method – 15 kW The energy saving by using the alternative thermal bus system can reach 64–69% depending upon the
application of a water-side economizer system.
Ding et al. [105] Experiment – 18.5–32.1 kW The separated heat pipe system was able to be used in summer, winter and trans-season with the PUE of
1.58, 1.20 and 1.38, respectively, which can save energy by approximate 48.3% compared to traditional
CRAC system.
Tong et al. [106] Experiment – 1–4 kW The total thermal resistance of a thermosyphon loop with the refrigerant of R744 was calculated by
a theoretical model and compared with measured data, which was 22–25% lower than that with R22.
Zhou et al. [107] Experiment 0.75 m × 0.32 m × 0.29 m 2 kW The energy consumption of the thermosyphon heat exchanger was only 41% of that of an air conditioner,
and the annual energy consumption can be reduced by 35.4%.
Tian et al. [108] Experiment – 6 kW Compared to the CRAC system, the new cooling solution can effectively improve the thermal reliability
of data processing devices by eliminating undesired air mixing and hot spots, and also can reduce the
cooling energy cost by about 46%.
Wu et al. [109] and Singh – Two types of cold energy storage system are explained and tested in the data center in Poughkeepsie,
et al. [110] New York, with heat output capacity of 8800 kW, which can provide an optimum system size with
minimum payback period of 3.5 years by handling 60% of datacenter yearly heat load.
Fig. 17. Schematic of short-distance cooling arrangements within the rack enclosure: servers, hot and cold aisles, and heat exchanger layouts.
In the study of Sahini et al. [100], the hot aisle width was varied in the range from 0.9 m to 2.0 m. The results indicate a considerable influence of the pressure in the hot aisles, thereby affecting the intake flowrate. Under the condition of 50% fan duty cycle, it was found that the differential pressure may rise rapidly when the hot aisle width is below 1.8 m, and this value was suggested as the minimum hot aisle width whenever possible.

The heat exchanger is the key equipment in the enclosed rack-level cooling system, and the thermo-hydraulic performance of cross-flow water-to-air heat exchangers has been investigated in many application areas [154,155]. Because of the compact and sensitive features of this kind of heat exchanger, the transient effectiveness of the heat exchangers is especially important in data center rack applications. Gao et al. [94–97] studied the characterization of the transient effectiveness of the heat exchanger for rack-level cooling with numerical and experimental methods. The heat exchanger was tested in a wind tunnel system, and the effects of the heat exchanger design on the cooling performance and air-side pressure drop were modeled and analyzed quantitatively. The pressure drop across the heat exchanger is crucial, as it may lead to fan failure under inappropriate operation. Based on the fan curves, it was found that the pressure drop increased by 51.3% from the vertical layout to the 30° angled case. Furthermore, the location and containment strategy were numerically analyzed, and the results showed great agreement with the experimental data from the literature. The specific modeling study was benchmarked with an IBM rear-door heat exchanger. With increasing space in the cold aisle contained by the front door, the pressure drop in the rack system was rapidly reduced. On the other hand, it was found that sufficient space in the hot aisle contained by the back door should also be
Fig. 18. Schematic of the thermosyphon system and effect of filling ratio [156]: (a) the thermosyphon cooling system with condenser, chilled water cycle, gas and liquid pipes, fans, and the rack-mounted evaporator; (b) schematic of the pool boiling process subject to different filling ratios.
provided for airflow moving to the heat exchanger. The effects of water inlet temperature, water flowrate and fan duty cycle were also studied. When the duty cycle was between 30% and 60%, the pressure distribution along the height of the rack was nearly uniform. A linear dependency between the supply air temperature from the heat exchanger and the water inlet temperature was observed when the water inlet temperature was varied from 15 °C to 20 °C.

Recently, heat pipe technology has become increasingly popular for data center cooling owing to its passive characteristic and fast-responding feature. Fig. 18(a) illustrates the schematic of a data center rack with a thermosyphon heat exchanger cooling system in which the evaporator is placed at the back door of the rack [102]. The airflow in the data center room is drawn into the servers by the intake fans, absorbs heat from the servers, and subsequently releases the heat to the heat pipe. The inlet and outlet temperatures are close to each other, so the hot-air recirculation and cold-air bypass effects are extremely small; in this regard, the concerns regarding airflow management are minimized. One of the interesting features in employing the thermosyphon heat exchanger is the correct filling ratio. Daraghmeh et al. [156] reported that optimum energy savings, as much as 38.7%, can be achieved with a refrigerant filling ratio of 70%, while a further increase of the filling ratio leads to a reduction in energy savings. As shown in the schematic of a typical thermosyphon tube in Fig. 18(b), the under-filling situation starves the evaporator, thereby impairing the performance and causing a more uneven temperature distribution, especially at the upper part of the thermosyphon heat exchanger.
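A convenient steady-state check on such a rack heat exchanger is its effectiveness, ε = q/q_max, with q_max set by the minimum capacity-rate stream and the inlet temperature difference. A minimal sketch with illustrative flow rates and temperatures (not data from [94–97]):

```python
# Steady-state effectiveness of a water-to-air rack heat exchanger:
# eps = q / q_max, q_max = C_min * (T_air_in - T_water_in).
# All operating values below are illustrative assumptions.

def effectiveness(m_air, cp_air, t_air_in, t_air_out, m_w, cp_w, t_w_in):
    c_air = m_air * cp_air                      # air-side capacity rate, W/K
    c_w = m_w * cp_w                            # water-side capacity rate, W/K
    q = c_air * (t_air_in - t_air_out)          # heat removed from hot air, W
    q_max = min(c_air, c_w) * (t_air_in - t_w_in)
    return q / q_max

eps = effectiveness(m_air=0.8, cp_air=1005.0, t_air_in=36.0, t_air_out=24.0,
                    m_w=0.3, cp_w=4186.0, t_w_in=15.0)
print(f"effectiveness = {eps:.2f}")
```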
1. CFD plays a vital role in data center design and simulation. However, the simulation scale may span from sub-micron (chip level) to hundreds of meters (room/building). Hence, ingenious approaches encompassing multi-scale thermal analysis from the chip to the room/building level are becoming quite challenging. Yet the evaluation criteria for airflow management strategies are still not sufficient. The interactions among the airflow in the cold aisles, hot aisles, fans, the airflow space inside the cabinet, as well as the flowrate distribution in the heat sinks should be comprehensively investigated.
2. For long-distance cooling systems, the outside ambient air, whose temperature is lower than that in the data center, can be directly brought into data centers through filters. The temperature of the outside air can be further reduced via some evaporative design when the humidity of the outside air is comparatively low. Hence, strategies concerning effective management of the airflow upon the manipulation of the induced ambient airflow and the airflow circuitry in the data center deserve further investigation. In addition, the airflow distribution in economizers, which can affect the thermal efficiency, is also worthy of further study.
Fig. 21. Effect of server layout on airflow uniformity: (a) Case A; (b) Case B; (c) Case C (servers S1–S8; cold aisle, hot aisle, and ideal average flowrate of 4.2 m·s−1 indicated).
Fig. 22. Effect of heat exchanger layout on airflow uniformity: (a) Case D; (b) Case E (servers S1–S8; cold aisle, hot aisle, and ideal average flowrate of 4.2 m·s−1 indicated).
3. The short-distance cooling system employing a two-phase closed thermosyphon loop may offer better energy efficiency and cooling capacity. The design imposes very minor disturbance on the indoor environment and can be integrated with vapor-compression systems. Hence, studies on the airflow management and thermal performance improvement of the two-phase closed thermosyphon loop subject to geometrical structure, layout, power distribution, and the like are recommended for future work.
4. In practice, the work load of the data racks is not evenly distributed in data centers. This phenomenon may become accentuated by the variation of airflow caused by the supply fans, fan-trays in the racks, buoyant airflow due to the heating load, or the fans in the CRAC (or CRAH), which makes the airflow within data centers complicated and difficult to predict. Hence, introducing technologies such as artificial intelligence (AI) or deep learning, integrated with training from CFD, for smart airflow management may be quite promising in the future [168–170].

Fig. 23. Comparison of standard deviation of airflow uniformity for Cases A–E.
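The uniformity comparison of Fig. 23 can be reproduced in principle as the standard deviation of the server intake velocities about the ideal average value. A minimal sketch with hypothetical velocities (not the data behind Figs. 21–23):

```python
# Airflow non-uniformity across the servers of a rack, taken as the RMS
# deviation of server intake velocities from the ideal average flowrate.
# The velocity samples are hypothetical.

def nonuniformity(velocities, ideal):
    return (sum((v - ideal) ** 2 for v in velocities) / len(velocities)) ** 0.5

ideal = 4.2                                        # m/s, ideal average
scattered = [3.6, 3.9, 4.1, 4.4, 4.6, 4.3, 4.1]    # dispersed server layout
centralized = [4.1, 4.2, 4.2, 4.3, 4.2, 4.1, 4.2]  # centralized server layout
for name, vals in (("scattered", scattered), ("centralized", centralized)):
    print(f"{name}: sigma = {nonuniformity(vals, ideal):.3f} m/s")
```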
7. Conclusions

This paper reviews the airflow management that imposes significant impacts on the cooling performance of data centers. Based on the available airflow path management methods, airflow paths with long distance and short distance are identified, respectively. The long-distance path may suffer appreciable loss from recirculation, bypass, mal-distribution, and leakage, but is comparatively economical in installation. Conversely, the short-distance cooling system can effectively reduce or even completely eliminate the foregoing shortcomings, but could be costly in deployment; also, it may still suffer from the airflow mal-distribution problem. Normally, the CFD method has been extensively used to study the thermal and airflow management of data centers. However, the CFD model must first be calibrated with experimental data before further improvement can be made.

For the long-distance cooling systems, the airflow management technologies applicable to the most commonly used raised-floor data centers are reviewed and discussed. The major problems in airflow management include hot-air recirculation, cold-air bypass, leakages, over-provisioned and under-provisioned air supply, and airflow and temperature non-uniformity. Yet these effects often interact with the geometry layout of the data center. Hence, studies on the effects of plenum depth, perforated tiles, enhanced facilities such as induced bypass fans, infrastructure layout, aisle containment and leakage from the existing literature are discussed and compared, and some rules of thumb for manipulating the airflow are suggested based on the prior studies. In addition, studies on the overhead air supply method are also examined and compared with the raised-floor ones.

For the short-distance cooling system, studies on the enclosed cooling system with heat exchanger and heat pipe technology were addressed. Furthermore, based on the reviewed studies of enclosed rack-level cooling with a heat exchanger, the necessity for airflow management is demonstrated by CFD studies, and the effect of the server layout and heat exchanger layout on the airflow uniformity has been investigated. It is found that appropriate rearrangement of the original design into a centralized server layout can ease the mal-distribution of airflow into the servers by 30%.

Acknowledgements

The authors are indebted to the financial support from the Ministry of Science and Technology, Taiwan under contract Nos. 107-2622-E-009-002-CC2 and 107-2221-E-009-143.
References

[1] Steinbrecher RA, Schmidt R. Data center environments: ASHRAE's evolving thermal guidelines. ASHRAE J 2011;53:42–50.
[2] Shehabi A, Smith SJ, Sartor DA, Brown RE, Herrlin M, Koomey JG, et al. United States data center energy usage report; 2016.
[3] Whitney J, Delforge P. Data center efficiency assessment. New York: Natural Resources Defense Council; 2014.
[4] Ni J, Bai X. A review of air conditioning energy performance in data centers. Renew Sustain Energy Rev 2017;67:625–40.
[5] Oró E, Depoorter V, Garcia A, Salom J. Energy efficiency and renewable energy integration in data centres. Strategies and modelling review. Renew Sustain Energy Rev 2015;42:429–45.
[6] Rong H, Zhang H, Xiao S, Li C, Hu C. Optimizing energy consumption for data centers. Renew Sustain Energy Rev 2016;58:674–91.
[7] Ham SW, Kim MH, Choi BN, Jeong JW. Energy saving potential of various air-side economizers in a modular data center. Appl Energy 2015;138:258–75.
[8] Siriwardana J, Jayasekara S, Halgamuge SK. Potential of air-side economizers for data center cooling: a case study for key Australian cities. Appl Energy 2013;104:207–19.
[9] Karki KC, Patankar SV, Radmehr A. Techniques for controlling airflow distribution in raised-floor data centers. In: ASME international electronic packaging technical conference, JUL 06–11, 2003, Maui, HI.
[10] Karki KC, Patankar SV. Airflow distribution through perforated tiles in raised-floor data centers. Build Environ 2006;41:734–44.
[11] Bhopte S, Agonafer D, Schmidt R, Sammakia B. Optimization of data center room layout to minimize rack inlet air temperature. J Electron Packag 2006;128:380–7.
[12] Nada SA, Said MA. Comprehensive study on the effects of plenum depths on air flow and thermal managements in data centers. Int J Therm Sci 2017;122:302–12.
[13] Nagarathinam S, Fakhim B, Behnia M, Armfield S. A comparison of parametric and multivariable optimization techniques in a raised-floor data center. J Electron Packag 2013;135:030905.
[14] Patankar SV, Karki KC. Distribution of cooling airflow in a raised-floor data center. ASHRAE Trans 2004;110:629–34.
[15] Sorell V. The oft-forgotten component of air flow management in data center applications. ASHRAE Trans 2011;117:427–32.
[16] Rambo J, Nelson G, Joshi Y. Airflow distribution through perforated tiles in close proximity to computer room air-conditioning units. ASHRAE Trans 2007;113:124–35.
[17] VanGilder JW, Pardey ZM, Healey CM. Measurement of perforated tile airflow in data centers. ASHRAE Trans 2016;122:88–96.
[18] VanGilder JW, Sheffer ZR, Zhang X, Healey CM. Potential flow model for predicting perforated tile airflow in data centers. ASHRAE Trans 2011;117:771–86.
[19] Nada SA, Elfeky KE, Attia AMA, Alshaer WG. Experimental parametric study of servers cooling management in data centers buildings. Heat Mass Transf 2017;53:2083–97.
[20] Kang S, Schmidt RR, Kelkar KM, Radmehr A, Patankar SV. A methodology for the design of perforated tiles in raised floor data centers using computational flow analysis. IEEE Trans Compon Packag Technol 2001;24:177–83.
[21] Karki KC, Radmehr A, Patankar SV. Use of computational fluid dynamics for calculating flow rates through perforated tiles in raised-floor data centers. HVAC&R Res 2003;9:153–66.
[22] Abdelmaksoud WA, Khalifa HE, Dang TQ, Elhadidi B, Schmidt RR, Iyengar M. Experimental and computational study of perforated floor tiles in data centers. In: 12th IEEE intersociety conference on thermal and thermomechanical phenomena in electronic systems, JUN 02–05, Las Vegas, NV; 2010.
[23] VanGilder JW, Schmidt RR. Airflow uniformity through perforated tiles in a raised-floor data center. In: ASME/Pacific Rim technical conference on integration and packaging of MEMS, NEMS, and electronic systems, JUL 17–22, San Francisco, CA; 2005.
[24] Arghode VK, Joshi Y. Experimental investigation of air flow through a perforated tile in a raised floor data center. J Electron Packag 2015;137:011011.
[25] Arghode VK, Joshi Y. Modeling strategies for air flow through perforated tiles in a data center. IEEE Trans Compon Packag Manuf Technol 2013;3:800–10.
[26] Zhang K, Zhang X, Li S, Jin X. Experimental study on the characteristics of supply air for UFAD system with perforated tiles. Energy Build 2014;80:1–6.
[27] Ling Y-Z, Zhang X-S, Zhang K, Jin X. On the characteristics of airflow through the perforated tiles for raised-floor data centers. J Build Eng 2017;10:60–8.
[28] Khalili S, Tradat MI, Nemati K, Seymour M, Sammakia B. Impact of tile design on the thermal performance of open and enclosed aisles. J Electron Packag 2018;140:010907.
[29] Demetriou DW, Khalifa HE. Optimization of enclosed aisle data centers using bypass recirculation. J Electron Packag 2012;134:020904.
[30] Khalifa HE, Demetriou DW. Enclosed-aisle data center cooling system. US patent US2012/0024502; 2012.
[31] Erden HS, Koz M, Yildirim MT, Khalifa HE. Optimization of enclosed aisle data centers with induced CRAH bypass. IEEE Trans Compon Packag Manuf Technol 2017;7:1981–9.
[32] Erden HS, Koz M, Yildirim MT, Khalifa HE. Experimental demonstration and flow network model verification of induced CRAH bypass for cooling optimization of enclosed-aisle data centers. IEEE Trans Compon Packag Manuf Technol 2017;7:1795–803.
[33] Song Z. Numerical cooling performance evaluation of fan-assisted perforations in a raised-floor data center. Int J Heat Mass Transf 2016;95:833–42.
[34] Song Z. Thermal performance of a contained data center with fan-assisted perforations. Appl Therm Eng 2016;102:1175–84.
[35] Arghode VK, Sundaralingam V, Joshi Y. Airflow management in a contained cold aisle using active fan tiles for energy efficient data-center operation. Heat Transfer Eng 2016;37:246–56.
[36] Athavale J, Joshi Y, Yoda M. Experimentally validated computational fluid dynamics model for data center with active tiles. J Electron Packag 2018;140:010902.
[37] Athavale J, Joshi Y, Yoda M, Phelps W. Impact of active tiles on data center flow and temperature distribution. In: 15th IEEE intersociety conference on thermal and thermomechanical phenomena in electronic systems, MAY 31–JUN 03, Las Vegas, NV; 2016. p. 1162–71.
[38] Samadiani E, Rambo J, Joshi Y. Numerical modeling of perforated tile flow distribution in a raised-floor data center. J Electron Packag 2010;132:021002.
[39] Nada SA, Said MA. Effect of CRAC units layout on thermal management of data center. Appl Therm Eng 2017;118:339–44.
[40] Nada SA, Said MA, Rady MA. Numerical investigation and parametric study for thermal and energy management enhancements in data centers' buildings. Appl Therm Eng 2016;98:110–28.
[41] Nada SA, Said MA, Rady MA. CFD investigations of data centers' thermal performance for different configurations of CRACs units and aisles separation. Alexandria Eng J 2016;55:959–71.
[42] Nada SA, Attia AMA, Elfeky KE. Experimental study of solving thermal heterogeneity problem of data center servers. Appl Therm Eng 2016;109:466–74.
[43] Radmehr A, Karki KC, Patankar SV. Analysis of airflow distribution across a front-to-rear server rack. In: ASME InterPACK conference, Vancouver, Canada, JUL 08–12; 2007.
[44] Zhang XH, Iyengar M, VanGilder JW, Schmidt RR. Effect of rack modeling detail on the numerical results of a data center test cell. In: 11th IEEE intersociety
conference on thermal and thermomechanical phenomena in electronic systems, Orlando, FL, MAY 28–31; 2008.
[45] Rambo J, Joshi Y. Thermal performance metrics for arranging forced air cooled servers in a data processing cabinet. J Electron Packag 2005;127:452–9.
[46] Kumar P, Joshi Y, Patterson MK, Steinbrecher R, Mena M. Cold aisle air distribution in a raised floor data center with heterogeneous opposing orientation racks. In: ASME Pacific Rim technical conference and exhibition on packaging and integration of electronic and photonic systems, MEMS and NEMS, Portland, OR, JUL 06–08; 2012.
[47] Kumar P, Sundaralingam V, Joshi Y. Dynamics of cold aisle air distribution in a raised floor data center. In: Thermal issues in emerging technologies (ThETA), Cairo, Egypt, December 19–22; 2010.
[48] Arghode VK, Joshi Y. Measurement of air flow rate sensitivity to the differential pressure across a server rack in a data center. J Electron Packag 2015;137:041002.
[49] Fakhim B, Srinarayana N, Behnia M, Armfield SW. Thermal performance of data centers-rack level analysis. IEEE Trans Compon Packag Manuf Technol 2013;3:792–9.
[50] Wang IN, Tsui YY, Wang CC. Improvements of airflow distribution in a container data center. Energy Procedia 2015;75:1819–24.
[51] Wilson D. Cooling system design for data centers utilizing containment architecture. ASHRAE Trans 2012;118:415–9.
[52] Schmidt R, Vallury A, Iyengar M. Energy saving through hot and cold aisle containment configurations for air cooled servers in data centers. In: ASME Pacific Rim technical conference and exhibition on packaging and integration of electronic and photonic systems, MEMS and NEMS, Portland, OR, JUL 06–08; 2012.
[53] Muralidharan B, Shrivastava SK, Ibrahim M, Alkharabsheh SA, Sammakia BG. Impact of cold aisle containment on thermal performance of data center. In: ASME international technical conference and exhibition on packaging and integration of electronic and photonic microsystems, Burlingame, CA, JUL 16–18; 2014.
[54] Sundaralingam V, Arghode VK, Joshi Y, Phelps W. Experimental characterization of various cold aisle containment configurations for data centers. J Electron Packag 2014;137:011007.
[55] Sundaralingam V, Arghode VK, Joshi Y. Experimental characterization of cold aisle containment for data centers. In: 29th annual IEEE semiconductor thermal measurement and management symposium (SEMI-THERM), San Jose, CA, MAR 17–21; 2013.
[56] Arghode VK, Sundaralingam V, Joshi Y, Phelps W. Thermal characteristics of open and contained data center cold aisle. J Heat Transfer 2013;135:061901.
[57] Arghode VK, Joshi Y. Room level modeling of airflow in a contained data center aisle. J Electron Packag 2014;136:10.
[58] Nada SA, Elfeky KE, Attia AMA. Experimental investigations of air conditioning solutions in high power density data centers using a scaled physical model. Int J Refrig 2016;63:87–99.
[59] Nada SA, Elfeky KE. Experimental investigations of thermal managements solutions in data centers buildings for different arrangements of cold aisles containments. J Build Eng 2016;5:41–9.
[60] Khalaj AH, Scherer T, Siriwardana J, Halgamuge S. Increasing the thermal efficiency of an operational data center using cold aisle containment. In: 7th international conference on information and automation for sustainability, Beijing, China, SEP 27–29; 2014.
[61] Alkharabsheh SA, Sammakia BG, Shrivastava SK. Experimentally validated computational fluid dynamics model for a data center with cold aisle containment. J Electron Packag 2015;137:021010.
[62] Gondipalli S, Sammakia B, Iyengar MK, Schmidt R. Effect of isolating cold aisles on rack inlet temperature. In: 11th intersociety conference on thermal and thermomechanical phenomena in electronic systems, Orlando, FL, USA, MAY 28–31; 2008.
[63] Gao C, Yu Z, Wu J. Investigation of airflow pattern of a typical data center by CFD simulation. Energy Procedia 2015;78:2687–93.
[64] Zhou RL, Wang ZK, Bash CE, McReynolds A. Modeling and control for cooling management of data centers with hot aisle containment. In: ASME international mechanical engineering congress and exposition (IMECE), Denver, CO, NOV 11–17; 2012.
[65] Wibron E, Ljung A-L, Lundström T. Computational fluid dynamics modeling and validating experiments of airflow in a data center. Energies 2018;11:644.
[66] Onyiorah C, Eiland R, Agonafer D, Schmidt R. Effectiveness of rack-level containment in removing data center hot-spots. In: 14th IEEE intersociety conference on thermal and thermomechanical phenomena in electronic systems, Orlando, FL, MAY 27–30; 2014.
[67] Martin M, Khattar M, Germagian M. High density heat containment. ASHRAE J 2007;49:38–43.
[68] VanGilder JW, Zhang XH. Cooling performance of ceiling-plenum-ducted containment systems in data centers. In: 14th IEEE intersociety conference on thermal and thermomechanical phenomena in electronic systems, Orlando, FL, MAY 27–30; 2014.
[69] Tsuda A, Mino Y, Nishimura S. Comparison of ICT equipment air-intake temperatures between cold aisle containment and hot aisle containment in datacenters. In: 2017 IEEE international telecommunications energy conference, Gold Coast, Australia, OCT 22–26; 2017. p. 59–65.
[70] Takahashi M, Uekusa T, Kishita M, Kaneko H. Aisle-capping method for airflow design in data centers. In: 30th international telecommunications energy conference, San Diego, CA, SEP 14–18; 2008. p. 202–8.
[71] Nemati K, Alissa HA, Murray BT, Sammakia B. Steady-state and transient comparison of cold and hot aisle containment and chimney. In: 15th IEEE intersociety conference on thermal and thermomechanical phenomena in electronic systems, Las Vegas, MAY 31–JUN 03; 2016. p. 1435–43.
[72] Shrivastava SK, Calder AR, Ibrahim M. Quantitative comparison of air containment systems. In: Intersociety conference on thermal and thermomechanical phenomena in electronic systems, San Diego, CA, US, May 30–June 1; 2012. p. 68–77.
[73] Niemann J. Hot aisle vs. cold aisle containment. APC White Paper #135; 2008.
[74] Kennedy D. Improving data centers with aisle containment. Eng Syst 2012;29:48–53.
[75] Pastrana C, King D, Seymour M. Aisle containment – just how important is it to worry about by-pass and leakage paths? ASHRAE Trans 2015;121:1–8.
[76] Khankari K. Analysis of air leakage from hot aisle containment systems and cooling efficiency of data centers. In: ASHRAE winter conference, New York, NY; 2014. NY-14-C093.
[77] Radmehr A, Karki KC, Patankar SV, Schmidt RR. Distributed leakage flow in raised-floor data centers. New York: American Society of Mechanical Engineers; 2005.
[78] Alkharabsheh SA, Shrivastava SK, Sammakia BG. Effect of cold aisle containment leakage on flow rates and temperatures in a data center. In: ASME international technical conference and exhibition on packaging and integration of electronic and photonic microsystems, Burlingame, CA, JUL 16–18; 2013.
[79] Alkharabsheh SA, Muralidharan B, Ibrahim M, Shrivastava SK, Sammakia BG. Open and contained cold aisle experimentally validated CFD model implementing CRAC and server fan curves for a data center test laboratory. In: Conference and exhibition on packaging and integration of electronic and photonic microsystems, Burlingame, CA, JUL 16–18; 2013.
[80] Tatchell-Evans M, Kapur N, Summers J, Thompson H, Oldham D. An experimental and theoretical investigation of the extent of bypass air within data centres employing aisle containment, and its impact on power consumption. Appl Energy 2017;186:457–69.
[81] Song ZH, Murray BT, Sammakia B. Parametric analysis for thermal characterization of leakage flow in data centers. In: 14th IEEE intersociety conference on thermal and thermomechanical phenomena in electronic systems, Orlando, FL, MAY 27–30; 2014. p. 778–85.
[82] Hamann H, Iyengar M, O'Boyle M. The impact of air flow leakage on server inlet air temperature in a raised floor data center. In: 11th IEEE intersociety conference on thermal and thermomechanical phenomena in electronic systems, Orlando, FL, MAY 28–31, vols. 1–3; 2008. p. 1153–60.
[83] Fink JR. Plenum-leakage bypass airflow in raised-floor data centers. ASHRAE Trans 2015;121:422–9.
[84] Wang CH, Tsui YY, Wang CC. On cold-aisle containment of a container datacenter. Appl Therm Eng 2017;112:133–42.
[85] Wang CH, Tsui YY, Wang CC. Airflow management on the efficiency index of a container data center having overhead air supply. J Electron Packag 2017;139:10.
[86] Srinarayana N, Fakhim B, Behnia M, Armfield SW. Thermal performance of an air-cooled data center with raised-floor and non-raised-floor configurations. Heat Transfer Eng 2014;35:384–97.
[87] Udakeri R, Mulay V, Agonafer D. Comparison of overhead supply and underfloor supply with rear heat exchanger in high density data center clusters. In: 24th annual IEEE semiconductor thermal measurement and management symposium, MARCH 16–20; 2008. p. 165–72.
[88] Udakeri R. Comparison of cooling performance of overhead and underfloor supply with rear door heat exchanger in high density data center clusters. Thesis. University of Texas; 2008.
[89] Nakao M, Hayama H, Nishioka M. Which cooling air supply system is better for a high heat density room: underfloor or overhead? In: 13th international telecommunications energy conference (INTELEC '91), Kyoto, Japan, NOV 05–08; 1991. p. 393–400.
[90] Sorell V, Escalante S, Yang J. Comparison of overhead and underfloor air delivery systems in a data center environment using CFD modeling. In: ASHRAE annual meeting, Denver, CO; 2005. DE-05-11-5.
[91] Schmidt RR, Iyengar M. Comparison between underfloor supply and overhead supply ventilation designs for data center high-density clusters. ASHRAE Trans 2007;113:115–25.
[92] Iyengar M, Schmidt R, Kamath V, Singh P. Energy efficient economizer based data centers with air cooled servers. In: 13th IEEE intersociety conference on thermal and thermomechanical phenomena in electronic systems, San Diego, CA, MAY 30–JUN 01; 2012. p. 367–76.
[93] Gao TY, Kumar E, Sahini M, Ingalz C, Heydari A, Lu WD, et al. Innovative server rack design with bottom located cooling unit. In: 15th IEEE intersociety conference on thermal and thermomechanical phenomena in electronic systems, Las Vegas, NV, MAY 31–JUN 03; 2016. p. 1172–81.
[94] Gao TY, Valle M, Ortega A, Sammakia BG. Numerical and experimental characterization of the transient effectiveness of a water to air heat exchanger for data center cooling systems. In: ASME international technical conference and exhibition on packaging and integration of electronic and photonic microsystems (InterPACK), San Francisco, CA, JUL 06–09; 2015.
[95] Gao TY, Geer J, Sammakia B. Review and analysis of cross flow heat exchanger transient modeling for flow rate and temperature variations. J Therm Sci Eng Appl 2015;7:10.
[96] Gao TY, Sammakia BG, Geer JF, Ortega A, Schmidt R. Dynamic analysis of cross flow heat exchangers in data centers using transient effectiveness method. IEEE Trans Compon Packag Manuf Technol 2014;4:1925–35.
[97] Gao TY, Geer J, Sammakia B. Nonuniform temperature boundary condition effects on data center cross flow heat exchanger dynamic performance. Int J Heat Mass Transf 2014;79:1048–58.
[98] Kang S, Chen GF, Wang C, Ding RQ, Zhang JJ, Zhu PY, et al. Rack server solution in data center. In: International technical conference and exhibition on packaging
and integration of electronic and photonic microsystems, San Francisco, CA, JUL 06–09; 2015.
[99] Sahini M, Kshirsagar C, Kumar M, Agonafer D, Fernandes J, Na J, et al. Rack-level study of hybrid cooled servers using warm water cooling for distributed vs. centralized pumping systems. In: 33rd annual semiconductor thermal measurement, modeling and management symposium (SEMI-THERM), San Jose, CA; 2017. p. 155–62.
[100] Sahini M, Kumar E, Gao TY, Ingalz C, Heydari A, Sun XG. Study of air flow energy within data center room and sizing of hot aisle containment for an active vs passive cooling design. In: 15th IEEE intersociety conference on thermal and thermomechanical phenomena in electronic systems, Las Vegas, NV, MAY 31–JUN 03; 2016. p. 1453–7.
[101] Zhang HN, Shao SQ, Tian CQ, Zhang KZ. A review on thermosyphon and its integrated system with vapor compression for free cooling of data centers. Renew Sustain Energy Rev 2018;81:789–98.
[102] Qian XD, Li Z, Tian H. Application of heat pipe system in data center cooling. Sustain Energy Technol 2014;2:609–20.
[103] Wilson M, Wattelet JP, Wert KW. A thermal bus system for cooling electronic components in high density cabinets. ASHRAE Trans 2004;110:567–73.
[104] Leonard PL, Phillips AL. The thermal bus opportunity – a quantum leap in data center cooling potential. ASHRAE Trans 2005;111:732–45.
[105] Ding T, He ZG, Hao T, Li Z. Application of separated heat pipe system in data center cooling. Appl Therm Eng 2016;109:207–16.
[106] Tong Z, Ding T, Li Z, Liu XH. An experimental investigation of an R744 two-phase thermosyphon loop used to cool a data center. Appl Therm Eng 2015;90:362–5.
[107] Zhou F, Tian X, Ma GY. Investigation into the energy consumption of a data center with a thermosyphon heat exchanger. Chin Sci Bull 2011;56:2185–90.
[108] Tian H, He ZG, Li Z. A combined cooling solution for high heat density data centers using multi-stage heat pipe loops. Energy Build 2015;94:177–88.
[109] Wu XP, Mochizuki M, Mashiko K, Thang N, Wuttijumnong V, Cabsao G, et al. Energy conservation approach for data center cooling using heat pipe based cold energy storage system. In: 26th annual IEEE semiconductor thermal measurement and management symposium, Santa Clara, CA, FEB 21–25; 2010. p. 115–22.
[110] Singh R, Mochizuki M, Mashiko K, Nguyen T. Heat pipe based cold energy storage systems for datacenter energy conservation. Energy 2011;36:2802–11.
[111] Daraghmeh HM, Wang CC. A review of current status of free cooling in datacenters. Appl Therm Eng 2017;114:1224–39.
[112] Fulpagare Y, Bhargav A. Advances in data center thermal management. Renew Sustain Energy Rev 2015;43:981–96.
[113] Ebrahimi K, Jones GF, Fleischer AS. A review of data center cooling technology, operating conditions and the corresponding low-grade waste heat recovery opportunities. Renew Sustain Energy Rev 2014;31:622–38.
[114] Wang L, Khan SU. Review of performance metrics for green data centers: a taxonomy study. J Supercomput 2013;63:639–56.
[115] Samadiani E, Joshi YK. Energy efficient thermal management of data centers via open multi-scale design: a review of research questions and approaches. J Enhanced Heat Transfer 2011;18:15–30.
[116] Schmidt RR, Iyengar M. Best practices for data center thermal and energy management – review of literature. ASHRAE Trans 2007;113:206–18.
[117] Schmidt RR, Cruz EE, Iyengar M. Challenges of data center thermal management. IBM J Res Dev 2005;49:709–23.
[118] Schmidt RR, Shaukatullah H. Computer and telecommunications equipment room cooling: a review of literature. IEEE Trans Compon Packag Technol 2003;26:89–98.
[119] Beaty D, Davidson T. Datacom airflow patterns. ASHRAE J 2005;47:50–4.
[120] Zhang HN, Shao SQ, Xu HB, Zou HM, Tian CQ. Free cooling of data centers: a review. Renew Sustain Energy Rev 2014;35:171–82.
[121] Lu HJ, Zhang ZB, Yang L. A review on airflow distribution and management in data center. Energy Build 2018;179:264–77.
[122] Green Grid Association. PUE: a comprehensive examination of the metric; 2012.
[123] Herrlin MK. Rack cooling effectiveness in data centers and telecom central offices: the rack cooling index (RCI). ASHRAE Trans 2005;111:725–31.
[124] Herrlin MK. Improved data center energy efficiency and thermal performance by advanced airflow analysis. In: Digital power forum, San Francisco, CA, US, September 10–12; 2007.
[125] Sharma RK, Bash CE, Patel CD. Dimensionless parameters for evaluation of thermal design and performance of large-scale data centers. In: 8th ASME/AIAA joint thermophysics and heat transfer conference, St. Louis, Missouri, JUNE 24–26; 2002. p. 3091.
[126] Breen TJ, Walsh EJ, Punch J, Shah AJ, Bash CE, Rubenstein B, et al. From chip to cooling tower data center modeling: influence of air-stream containment on operating efficiency. J Electron Packag 2012;134:041006.
[127] Garimella SV, Persoons T, Weibel J, Yeh LT. Technological drivers in data centers and telecom systems: multiscale thermal, electrical, and energy management. Appl Energy 2013;107:66–80.
[128] Schmidt RR. Thermal profile of a high-density data center – methodology to thermally characterize a data center. ASHRAE Trans 2004;110:635–45.
[129] Schmidt R, Karki K, Patankar S. Raised-floor data center: perforated tile flow rates for various tile layouts. In: Ramakrishna K, Sammakia BG, Culham JR, Joshi YK, Pang JHL, Jonnalagadda K, et al., editors. 9th intersociety conference on thermal and thermomechanical phenomena in electronic systems, Las Vegas, NV, JUN 01–04; 2004. p. 571–8.
[130] IBM. Website. Available: <https://www.ibm.com/services/business-continuity/data-center>.
[131] Cooke J, Bailey M, Villars RL. Datacenter trends and strategies. Available: <https://www.idc.com/getdoc.jsp?containerId=IDC_P13027>.
[132] Arghode VK, Kumar P, Joshi Y, Weiss T, Meyer G. Rack level modeling of air flow through perforated tile in a data center. J Electron Packag 2013;135:030902.
[133] Depoorter V, Oro E, Salom J. The location as an energy efficiency and renewable energy supply measure for data centres in Europe. Appl Energy 2015;140:338–49.
[134] Schmidt RR, Karki KC, Kelkar KM, Radmehr A, Patankar SV. Measurements and predictions of the flow distribution through perforated tiles in raised floor data centers. In: ASME international electronic packaging technical conference and exhibition, Kauai, Hawaii, July 8–13; 2001. IPACK2001-15728.
[135] Shah A, Patel C, Bash C, Sharma R, Shih R. Impact of rack-level compaction on the data center cooling ensemble. In: 11th IEEE intersociety conference on thermal and thermomechanical phenomena in electronic systems, Orlando, FL, MAY 28–31; 2008. p. 1175–82.
[136] Shrivastava S, Iyengar M, Sammakia B, Schmidt R, VanGilder J. Experimental-numerical comparison for a high-density data center: hot spot heat fluxes in excess of 500 W/ft². IEEE Trans Compon Packag Technol 2009;1:166–72.
[137] Jian Q, Wang Q, Wang H, Zuo Z. Comparison between numerical and experimental results of airflow distribution in diffuser based data center. J Electron Packag 2012;134:011006.
[138] Hassan NMS, Khan MMK, Rasul MG. Temperature monitoring and CFD analysis of data centre. Procedia Eng 2013;56:551–9.
[139] Ni J, Jin B, Zhang B, Wang X. Simulation of thermal distribution and airflow for efficient energy consumption in small data centers. Sustainability 2017;9:664.
[140] Renner M, Seymour M. Data center operational CFD predictive models: are they accurate enough to be useful and reliable? ASHRAE Trans 2015;121:1–8.
[141] Joshi Y, Kumar P. Energy efficient thermal management of data centers. Springer Science & Business Media; 2012.
[142] Patankar SV. Airflow and cooling in a data center. J Heat Transfer 2010;132:073001.
[143] Fried E, Idelchik IE. Flow resistance, a design guide for engineers. New York: Hemisphere; 1989.
[144] Abdelmaksoud WA, Khalifa HE, Dang TQ, Schmidt RR, Iyengar M. Improved CFD modeling of a small data center test cell. In: 12th IEEE intersociety conference on thermal and thermomechanical phenomena in electronic systems (ITherm), Las Vegas, NV, JUN 02–05; 2010.
[145] Arghode VK, Joshi Y. Modified body force model for air flow through perforated floor tiles in data centers. J Electron Packag 2016;138:031002.
[146] Fulpagare Y, Mahamuni G, Bhargav A. Effect of plenum chamber obstructions on data center performance. Appl Therm Eng 2015;80:187–95.
[147] Bhopte S, Sammakia B, Iyengar MK, Schmidt R. Guidelines on managing under floor blockages for improved data center performance. In: ASME international mechanical engineering congress and exposition, Chicago, Illinois, November 5–10; 2006.
[148] Strong L. When containment is modularized. Mission Critical 2014;7:48–52.
[149] Website. Available: <http://www.42u.com/data-center-containment.htm>.
[150] Website. Available: <https://www.osha.gov/>.
[151] ISO 7243. Ergonomics of the thermal environment – assessment of heat stress using the WBGT (wet bulb globe temperature) index; 2017.
[152] Chu WX, Hsu CS, Tsui YY, Wang CC. Experimental investigation on thermal management for small container data center. J Build Eng 2019;21:317–27.
[153] Iyengar M, David M, Parida P, Kamath V, Kochuparambil B, Graybill D, et al. Server liquid cooling with chiller-less data center design to enable significant energy savings. In: 28th annual IEEE semiconductor thermal measurement and management symposium, San Jose, CA, MAR 18–22; 2012. p. 212–23.
[154] Qi ZG. Water retention and drainage on air side of heat exchangers – a review. Renew Sustain Energy Rev 2013;28:1–10.
[155] Bhuiyan AA, Islam A. Thermal and hydraulic performance of finned-tube heat exchangers under different flow ranges: a review on modeling and experiment. Int J Heat Mass Transf 2016;101:38–59.
[156] Daraghmeh H, Sulaiman M, Yang K-S, Wang C-C. Investigation of separated two-phase thermosiphon loop for relieving the air-conditioning loading in datacenter. Energies 2019;12:105.
[157] Yue C, Zhang Q, Zhai ZQ, Ling L. CFD simulation on the heat transfer and flow characteristics of a microchannel separate heat pipe under different filling ratios. Appl Therm Eng 2018;139:25–34.
[158] Alammar AA, Al-Mousawi FN, Al-Dadah RK, Mahmoud SM, Hood R. Enhancing thermal performance of a two-phase closed thermosyphon with an internal surface roughness. J Cleaner Prod 2018;185:128–36.
[159] Jafari D, Filippeschi S, Franco A, Di Marco P. Unsteady experimental and numerical analysis of a two-phase closed thermosyphon at different filling ratios. Exp Therm Fluid Sci 2017;81:164–74.
[160] Aghel B, Rahmi M, Almasi S. Experimental study on heat transfer characteristics of a modified two-phase closed thermosyphon. Therm Sci 2017;21:2481–9.
[161] Solomon AB, Mathew A, Ramachandran K, Pillai BC, Karthikeyan VK. Thermal performance of anodized two phase closed thermosyphon (TPCT). Exp Therm Fluid Sci 2013;48:49–57.
[162] Huminic G, Huminic A. Heat transfer characteristics of two-phase closed thermosyphons using nanofluids. Exp Therm Fluid Sci 2011;35:550–7.
[163] Rahimi M, Asgary K, Jesri S. Thermal characteristics of a resurfaced condenser and evaporator closed two-phase thermosyphon. Int Commun Heat Mass Transfer 2010;37:703–10.
[164] Paramatthanuwat T, Boothaisong S, Rittidech S, Booddachan K. Heat transfer characteristics of a two-phase closed thermosyphon using de-ionized water mixed with silver nano. Heat Mass Transf 2010;46:281–5.