26.1 Introduction

The rapid increase in the number of internet users in recent years has driven demand for very high heat density servers. Data centres are an energy-intensive sector, with rapid developments in 5G and Artificial Intelligence (AI). The number of internet users is projected to grow from 3.6 billion in 2018 to 5 billion by 2025 [1]. In 2018, data centre electricity consumption reached 1% of global electricity demand [2]. During the pandemic, the data centre industry saw rapid growth in its IT loads due to digitalization, and energy consumption in data centres is expected to rise by 15–20% each year [3]. Cooling of the IT equipment demands substantial power: about 40% of the total power consumption in data centres comes from the cooling equipment. In addition, the high demand for cloud-based data centres and High-Performance Computers (HPCs) is increasing the cooling demand.

Traditionally, data centres are air cooled and use hot/cold aisle containment systems to reduce air recirculation, limiting hot–cold air mixing and removing thermal inefficiencies. However, these energy-efficient design solutions alone are not enough to make a data centre highly efficient. An in-depth analysis of the thermal distribution is needed to avoid downtime and make the design robust. CFD analysis can identify flaws in the design at an early development stage. Therefore, this study investigates reducing the cooling energy demand of a data centre using external and internal CFD simulations.

Simulation of the internal and external thermal conditions of data centres using CFD has become a thriving research area in recent years. Swift development in computational power and in the accuracy of numerical models over the last decade has made CFD a more viable and reliable option for understanding flow behaviour in the data centre. 6Sigma CFD models are important in the data centre industry because of their flexibility and features designed specifically for the sector [4]. Commercial CFD solvers have proven accurate in data centre thermal modelling. Cho et al. [5] ran a series of simulations of various design options using commercial software to investigate thermal performance in data centres. Similarly, Nada and Said [6] analysed the importance of the Computer Room Air Conditioning (CRAC) unit layout for high heat density data centres and found that placing the CRACs perpendicular to the data hall containment improved the thermal performance of the data hall.

The aim of this study is therefore to develop an external and internal layout design for a data centre, focused on enhancing its thermal performance and shifting the dynamics of data centre cooling towards energy savings. The main objective is to model the data centre accurately and improve its thermal distribution, and finally to evaluate the energy performance of the data centre.

26.2 Models

Internal computational fluid dynamics (CFD) studies have been conducted for the server data hall room. These studies are intended to determine the flow direction of air and assess the intake air temperature at the electrical equipment and the average temperature of the IT cabinet data hall room. The capability of the cooling system to deliver the required cooling capacity within the temperature tolerance under both normal operation and failure scenarios has also been assessed. The CFD simulations have been conducted using 6SigmaRoom R15 software to build the 3D geometry of the IT data hall room. All structural obstructions such as columns and beams have been included in the 3D CFD model. Figure 26.1 illustrates the 3D model of the data hall.

Fig. 26.1

Internal design of data hall

In addition, an external computational fluid dynamics (CFD) study was conducted for the data centre to investigate the potential uplift in supply air temperature for the chillers on the roof. The principal concern is the risk of cooling units derating due to warm air entering from the exhaust vents of neighbouring cooling units (normal operation) and from generator exhaust flues (emergency operation). Figure 26.2 illustrates the 3D model layout for the roof of the data centre, which includes the generators and chillers. The local microclimate was evaluated based on historical meteorological data from the NOAA (National Oceanic and Atmospheric Administration) weather website [7]. The hourly wind speeds recorded over the 20-year period between 2001 and 2020 were analysed, and based on this analysis the dominant wind of 7 m/s from the SW direction was chosen for this study. Following the wind analysis, a temperature analysis was conducted for the same 20-year period (2001–2020); the maximum recorded dry-bulb temperature in the area is 39.1 °C. In addition, the key properties to consider in an energy-efficient data centre design are the cooling load and the total facility load. For this study, the total IT load is assumed to be 96 MW (a heat load of 8.33 kW per rack) and the chiller load 112.32 MW.
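As a rough consistency check on the quoted loads, the per-rack and total IT figures imply the approximate rack count and the chiller-to-IT load ratio. This is only an illustrative sketch; the rack count is inferred here and is not stated in the study.

```python
# Consistency check on the stated data-centre loads.
# The rack count is inferred from the figures in the text for
# illustration only; it is not given explicitly in the study.

RACK_HEAT_LOAD_KW = 8.33   # heat load per rack (from the study)
TOTAL_IT_LOAD_MW = 96.0    # total IT load (from the study)
CHILLER_LOAD_MW = 112.32   # chiller load (from the study)

# Implied number of racks if the IT load is spread evenly.
racks = TOTAL_IT_LOAD_MW * 1000 / RACK_HEAT_LOAD_KW

# Ratio of chiller load to IT load (cooling oversizing margin).
load_ratio = CHILLER_LOAD_MW / TOTAL_IT_LOAD_MW

print(f"Implied rack count: {racks:,.0f}")      # Implied rack count: 11,525
print(f"Chiller/IT load ratio: {load_ratio:.2f}")  # Chiller/IT load ratio: 1.17
```

The ratio of about 1.17 indicates the chiller plant is sized with a margin above the IT heat load, consistent with the failure-scenario assessments described below.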

Fig. 26.2

External design of data centre roof with chillers and generators

26.3 Theory

The airflow, temperature and pressure differences are governed by the following continuity, momentum and energy conservation equations, including buoyancy effects. The solver decomposes the Navier–Stokes equations into the Reynolds-averaged Navier–Stokes (RANS) equations to solve the simulations within a realistic time scale and with less computational power [8]. A description of the RANS model equations is given in Eqs. (26.1)–(26.4). 6Sigma uses the k-epsilon RANS turbulence model to simulate the flow characteristics.

Continuity equation:

$$div \underline{U} = 0$$
(26.1)

Momentum equation for the RANS model:

$$\frac{\partial \underline{U}}{\partial t} + \operatorname{div}\left(\underline{U}\,\underline{U}\right) = \frac{1}{\rho }\operatorname{div}\left(\underline{\sigma } - \rho \,\overline{\underline{u^{\prime}}\,\underline{u^{\prime}}}\right) + \frac{1}{\rho }\underline{S}$$
(26.2)

where

$$\underline{\sigma } = - P I + \mu \left( \operatorname{grad}\left(\underline{U}\right) + \operatorname{grad}\left(\underline{U}\right)^{T} \right)$$
(26.3)

Here \(\underline{U}\) is the average velocity vector, \(\underline{\sigma }\) the average stress tensor, \(\underline{u^{\prime}}\) the velocity fluctuation vector, and \(\underline{S}\) an additional momentum source. The energy equation for the RANS model can be expressed as:

$$\frac{\partial T}{\partial t} + \operatorname{div}\left( \underline{U}\,T \right) = \operatorname{div}\left( \left( \frac{\nu }{Pr} + \frac{\nu_{t} }{Pr_{t} } \right)\operatorname{grad}\,T \right) + \frac{1}{\rho C_{p} }S_{Q}$$
(26.4)

Here T is the temperature, \(\nu\) the kinematic viscosity, \(\nu_t\) the turbulent (eddy) viscosity, Pr the Prandtl number, \(Pr_t\) the turbulent Prandtl number, and \(S_{Q}\) the energy source.

Table 26.1 illustrates the boundary conditions and assumptions made in the 6Sigma Solver.

Table 26.1 Boundary conditions and assumptions

The simulations were steady state (not transient), and solar gain on the roof of the building was not considered.

26.4 Energy Metric

The most popular metric for energy effectiveness is PUE (Power Usage Effectiveness). This metric was initially proposed and promoted by The Green Grid, a non-profit IT organization [9]. PUE is defined as the total facility power divided by the power required to operate the IT equipment. Alongside PUE, The Green Grid also developed and promoted DCiE (Data Centre infrastructure Efficiency), which is the inverse of PUE. Equation (26.5) gives the definition of PUE.

$${\text{PUE}} = \frac{Total \, Data \, Centre \, Facility \, Energy \, Consumption}{{IT \, Equipment \, Energy \, Consumption}}$$
(26.5)
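Equation (26.5) and its inverse, DCiE, can be sketched as follows. The IT load is taken from Sect. 26.2, but the non-IT overhead figure below is an assumed value for illustration only; a real facility would sum cooling, UPS, lighting and distribution losses.

```python
def pue(total_facility_power_mw: float, it_power_mw: float) -> float:
    """Power Usage Effectiveness, Eq. (26.5): total facility power / IT power."""
    return total_facility_power_mw / it_power_mw

def dcie(total_facility_power_mw: float, it_power_mw: float) -> float:
    """Data Centre infrastructure Efficiency: the inverse of PUE."""
    return it_power_mw / total_facility_power_mw

# Illustrative figures: IT load from the study; overhead is assumed.
it_load = 96.0    # MW, IT equipment power (from Sect. 26.2)
overhead = 30.0   # MW, cooling + other facility overhead (assumed)

print(pue(it_load + overhead, it_load))   # 1.3125
print(dcie(it_load + overhead, it_load))  # 0.7619047619047619
```

A PUE of 1.0 would mean all facility power goes to IT equipment; lowering the cooling overhead, as explored in the following section, pushes PUE towards that limit.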

26.5 Results and Discussion

This section presents several results of the internal and external simulations, along with two options for the chillers. The internal and external simulations of the proposed sample data centre illustrate the vital role of CFD in enhancing the thermal performance of a data centre and increasing its efficiency.

Figure 26.3a illustrates the results for normal data hall operation at a 24.0 °C room temperature: the average room temperature at 1.0 m above floor level is 25.0 °C, the highest cold aisle temperature is 26.0 °C, the highest averaged IT cabinet inlet temperature is 25.3 °C, the maximum IT cabinet inlet temperature is 26.2 °C, the highest return temperature is 35.3 °C, and the average air velocity within the cold aisle is 1.2 m/s.

Fig. 26.3
Two plots: a the temperature contour of the data hall, with a maximum temperature of 32 °C, and b the ASHRAE temperature limits for the data hall cabinets, with average temperatures between 18 and 27 °C.

a Temperature contour of data hall at 24.0 °C, b ASHRAE temperature limit for data hall cabinets at 24.0 °C

To push the data centre closer to its thermal limits, the inlet temperature was increased to achieve the most efficient solution possible. Figure 26.4a illustrates the results for normal data hall operation at 27.0 °C: the average room temperature at 1.0 m above floor level is 27.5 °C, the highest cold aisle temperature is 31.0 °C, the highest averaged IT cabinet inlet temperature is 28.9 °C, the maximum IT cabinet inlet temperature is 29.4 °C, the highest return temperature is 38.2 °C, and the average air velocity within the cold aisle is 0.9 m/s, with a peak of 3 m/s. Figure 26.4b illustrates the ASHRAE limits for the rack inlet temperature in a failure scenario; the failure simulation showed that the rack inlet temperature increased slightly compared with the 24.0 °C CRAC case but did not exceed the ASHRAE limits. The uplift of the rack inlet temperature to 27.0 °C made the data centre more efficient than operation at a rack inlet temperature of 24.0 °C.

Fig. 26.4

a Temperature contour of data hall at 27.0 °C, b ASHRAE temperature limit for data hall cabinets at 27.0 °C, c ASHRAE temperature limit for data hall cabinets at 27.0 °C (failure scenario, where two cooling units have failed)

Figure 26.5 illustrates the temperature contour at roof level from the external CFD simulation. The following simplified boundary conditions were used to analyse the thermal performance of the roof design: wind velocity 7 m/s; wind direction NE; power-failure mode, so all generators running; all chillers on. The results showed that the given roof layout design can operate under peak thermal and dominant wind conditions together with power-failure mode, giving high confidence that the design will work under the given worst-case conditions. Chart 26.1 presents the analytical calculation of cost savings for various inlet conditions, comparing the Water-Cooled Chiller (WCC) and Air-Cooled Chiller (ACC) options, with inlet water temperatures of 20.0 °C and 24.0 °C respectively. These results indicate that WCC chillers could save around €100,000 per annum compared with ACC chillers. In addition, Table 26.2 shows that a better PUE can also be achieved by using the WCC chillers and increasing the supply water temperature to the data halls.

Fig. 26.5
External temperature contour of the data centre building roof; temperatures on the colour scale vary from 39 to 45 °C.

External temperature contour on the roof of the data centre building

Chart 26.1
A bar graph of annual energy cost in euros for the WCC and ACC options. ACC (20–30 °C) has the highest bar at about €500,000, while WCC (24–34 °C) has the smallest bar at about €400,000. Values are estimated.

Energy cost in comparison of WCC and ACC

Table 26.2 PUE comparison
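The WCC versus ACC cost comparison above can be sketched as follows. The chart reports roughly €500,000 per annum for the ACC option and €400,000 for the WCC option; the tariff and annual-energy figures below are back-calculated assumptions chosen to reproduce those totals, not data from the study.

```python
# Hypothetical sketch of the WCC vs ACC annual-cost comparison.
# Tariff and annual chiller energies are assumed values chosen to
# match the approximate totals read off Chart 26.1.

TARIFF_EUR_PER_MWH = 100.0  # assumed electricity price

annual_energy_mwh = {
    "ACC (20-30 degC)": 5_000,  # assumed annual chiller energy
    "WCC (24-34 degC)": 4_000,  # assumed annual chiller energy
}

annual_cost = {
    option: energy * TARIFF_EUR_PER_MWH
    for option, energy in annual_energy_mwh.items()
}

saving = annual_cost["ACC (20-30 degC)"] - annual_cost["WCC (24-34 degC)"]
print(f"Annual saving with WCC: EUR {saving:,.0f}")  # Annual saving with WCC: EUR 100,000
```

Under these assumptions the WCC option saves €100,000 per annum, matching the order of magnitude quoted above; the saving scales linearly with both the tariff and the energy gap between the two chiller types.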

26.6 Conclusion

The following conclusions are drawn from this article; future work will refine the CFD analysis strategies to further improve the energy efficiency of the data centre.

  • A high inlet temperature to the data hall, within the ASHRAE limits, potentially improves the efficiency.

  • The external CFD simulation concluded that the design of the data centre roof will work flawlessly under peak thermal and wind conditions.

  • The 6Sigma CFD solver proved extremely useful and reliable in simulating both internal and external data centre CFD scenarios.

  • Finally, this study also found that a water-cooled chiller solution is more efficient than air-cooled chillers.

  • As the solver is only capable of using the k-epsilon model, other RANS models could not be investigated. In addition, the solver does not account for solar gain on the roof. These limitations could be investigated further to model the roof behaviour more accurately.

  • As future work, more realistic experimental data will be recorded and compared with the CFD results to evaluate the accuracy of the solver in detail.