A Guide to Cooling Technologies in the Data Centre

Introduction

Cooling is often a misunderstood or overlooked part of many data centre designs. Some consultants believe that as long as there is some cooling in the room, it will be fine.

However, with many modern UPS systems now operating at near-unity power factor and very high efficiency, cooling is often the largest remaining lever for a room's sustainability and efficiency, and it can make or break the design. If the cooling system cannot cope with the desired rack densities, it may render the room unsuitable for its intended use.

Room Air Cooling

Room air cooling utilises Computer Room Air Conditioning (CRAC) units, also known as perimeter room cooling, and employs either under-floor or flooded-room discharge to cool the air in the room.

These units function like traditional comfort cooling systems, drawing in air from the data centre, cooling it down, and then circulating the cooled air back into the room. If using a raised floor void, vent tiles are strategically placed around the data centre to ensure that cool air is distributed evenly in front of the racks containing the IT equipment.

Without aisle containment, getting the warm air from the backs of the servers back to the cooling units is difficult, resulting in air mixing in the room. This makes the system inefficient, requiring overcooling to avoid hotspots. Room cooling is available in both water and DX variants.
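The link between air mixing and overcooling can be made concrete with the standard sensible-heat relation: the airflow a cooling system must move scales inversely with the supply-to-return temperature difference it can sustain. The sketch below uses illustrative figures (air properties at roughly room conditions, example ΔT values), not vendor data.

```python
# Estimate the airflow a room-cooling system must move for a given IT load,
# using the sensible-heat relation Q = rho * cp * V_dot * dT.
# Constants and example figures are illustrative assumptions.

AIR_DENSITY = 1.2  # kg/m^3, air at roughly room conditions
AIR_CP = 1005.0    # J/(kg*K), specific heat of air

def required_airflow_m3s(it_load_kw: float, delta_t_k: float) -> float:
    """Volumetric airflow (m^3/s) needed to absorb it_load_kw at a
    supply-to-return temperature rise of delta_t_k."""
    return (it_load_kw * 1000.0) / (AIR_DENSITY * AIR_CP * delta_t_k)

# A 100 kW room: mixing that collapses the usable dT to 6 K demands twice
# the airflow of a well-separated 12 K dT.
print(round(required_airflow_m3s(100, 6), 1))   # 13.8
print(round(required_airflow_m3s(100, 12), 1))  # 6.9
```

Halving the usable temperature difference doubles the fan airflow (and hence fan energy) needed for the same load, which is why mixing forces the overcooling described above.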

In-Row Cooling

In-row cooling, also known as close-coupled cooling, involves placing the cooling units within the rows of server racks. The front of the cooling units features fans from top to bottom, delivering cooling directly to the front of the server racks, with hot air return at the back.

In-row cooling works most efficiently with aisle containment, which prevents hot air (hot aisle containment) or cold air (cold aisle containment) from mixing with the room air, thus increasing efficiency and enabling precision cooling.

In our opinion, hot aisle containment is the most effective with in-row cooling. Warm air is captured in the contained aisle, and the in-row cooling units draw the air over the coils, cooling it before expelling it from the front. The open room holds the cool air for use by the racks. This large buffer of cool air copes better with fluctuations in cooling demand, which the units track by monitoring the exhaust air temperature of the IT equipment.

This design is also highly effective during power outages, as it can maintain cooling while the generator starts up. In-row cooling can be either DX or chilled water.
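A rough sense of that buffer can be had by treating the open room as a sealed, well-mixed air mass and asking how long it can absorb the IT load before the supply air warms past an allowable limit. This is a deliberate simplification with illustrative numbers; the air mass, load, and allowable rise are assumptions.

```python
# Rough ride-through estimate: how long the open room's cool-air buffer can
# absorb the IT load before supply air warms past an allowable limit.
# Treats the room as a sealed, well-mixed air mass -- a simplification.

AIR_DENSITY = 1.2  # kg/m^3
AIR_CP = 1005.0    # J/(kg*K)

def ride_through_seconds(room_volume_m3: float, it_load_kw: float,
                         allowable_rise_k: float) -> float:
    """Seconds until the room air warms by allowable_rise_k under it_load_kw."""
    air_mass_kg = room_volume_m3 * AIR_DENSITY
    heat_capacity_j_per_k = air_mass_kg * AIR_CP
    return heat_capacity_j_per_k * allowable_rise_k / (it_load_kw * 1000.0)

# 500 m^3 room, 100 kW load, 5 K allowable rise:
print(round(ride_through_seconds(500, 100, 5), 1))  # 30.2
```

The air buffer alone only covers the first moments of an outage; designs that must ride through a full generator start typically also keep pumps or fans on the UPS and rely on the thermal mass of the chilled-water loop.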

Direct-To-Chip Liquid Cooling

Direct-to-chip cooling, also known as direct-on-chip or direct liquid cooling (DLC), is a highly efficient cooling method used in some data centres, particularly for high-performance computing systems.

DLC involves using a liquid coolant and a cold plate with coolant tubes in direct contact with heat-generating components such as the CPU. The liquid coolant absorbs the heat generated by these components, passing back through a CDU (coolant distribution unit), often running through a plate heat exchanger connected to an existing chilled water-cooling loop.

Direct-to-chip cooling can handle high thermal loads more effectively than air-cooled systems, making it suitable for high-performance IT equipment commonly linked to AI applications. It is also very energy-efficient, as it removes heat more directly and rapidly from the source, reducing the energy expended to cool the system. Currently, DLC systems can remove approximately 60% of the heat generated from a server, with the remaining 40% managed by air cooling.
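The approximate 60/40 split mentioned above sizes the two halves of a DLC deployment: the liquid loop and the residual air cooling. The sketch below applies that ratio to an example rack; the 80 kW figure and the fixed 0.6 capture fraction are illustrative assumptions, as the real fraction varies by server design.

```python
# Split a rack's heat load between the liquid loop and residual air cooling,
# using the approximate 60/40 capture ratio cited above. The 0.6 fraction
# is an assumption for illustration; real servers vary.

def dlc_heat_split(rack_load_kw: float, liquid_fraction: float = 0.6):
    """Return (liquid_kw, air_kw) for a direct-to-chip cooled rack."""
    liquid_kw = rack_load_kw * liquid_fraction
    return liquid_kw, rack_load_kw - liquid_kw

liquid, air = dlc_heat_split(80)  # an illustrative 80 kW AI-style rack
print(liquid, air)  # 48.0 32.0
```

Even at 60% capture, an 80 kW rack still rejects 32 kW to the room air, so the surrounding air-cooling plant cannot simply be removed.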

Aisle Containment

Aisle containment is not an actual cooling solution but is designed to improve the efficiency of many cooling systems.

It prevents the mixing of hot and cold air streams, keeping the cool air cool for the servers and the warm air warm for the cooling units. In a hot aisle/cold aisle layout, servers are placed back-to-back and front-to-front, with physical barriers such as doors, roofs, or partitions used to separate the aisles and prevent air mixing.

This containment ensures that the cold supply air reaches the server intakes without being warmed by the hot exhaust air, and that the hot air is directed back to the cooling units. Aisle containment improves the predictability and efficiency of air-cooling systems, reducing energy consumption and creating a more reliable data centre environment. Blanking panels should always be used in conjunction with aisle containment to stop warm air cycling back through unused rack spaces to the IT equipment intakes.
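The effect of leakage can be sketched with a simple mixing model: a fraction of the hot exhaust recirculates and blends with the cold supply at the server intake. The temperatures and recirculation fractions below are illustrative assumptions.

```python
# Simple mixing model of recirculation at a server intake: without
# containment or blanking panels, a fraction r of the hot exhaust leaks
# back and blends with the cold supply air. Values are illustrative.

def intake_temp_c(supply_c: float, exhaust_c: float,
                  recirc_fraction: float) -> float:
    """Server intake temperature after mixing supply air with a
    recirc_fraction share of exhaust air."""
    return (1 - recirc_fraction) * supply_c + recirc_fraction * exhaust_c

print(intake_temp_c(20, 35, 0.0))   # fully contained: 20.0
print(intake_temp_c(20, 35, 0.25))  # leaky aisle: 23.75
```

To hold the intakes at their target, a leaky room must lower the supply temperature to compensate, which is exactly the overcooling that containment and blanking panels avoid.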

Immersion Cooling

Immersion cooling uses special tanks or racks to submerge bare-metal IT equipment in a bath of dielectric fluid, either fully or partially. Both approaches remove the heat generated by the equipment, with the dielectric fluid then cooled through heat exchangers and a secondary cooling system.

The dielectric fluid conducts heat but not electricity, ensuring the equipment remains cool without risk of electrical shorts. This method requires the IT equipment to be run without enclosures or fans, so the fluid can circulate freely and reach all areas.

Direct Expansion Cooling

Direct expansion (DX) cooling is a mechanical cooling technology that uses the evaporation and condensation of a refrigerant with a mechanical compressor to remove and expel heat to the outside air.

Discussions around DX cooling often focus on refrigerants with low global warming potential (GWP) and compressor technology to maximise efficiency. The compressors operate 24/7, regardless of outside temperatures, to enable continuous heat removal from the data centre.

Chilled Water Cooling

Chilled water systems use chilled water to remove heat from the data centre in a closed loop, with an external chiller cooling the water and pumping it back around to the indoor or in-row cooling units. This method is highly efficient and suitable for larger data centres.

Free Cooling Chillers

Free cooling chillers use low ambient air temperatures to reject the heat from the returning warm water. As the cold aisle temperatures permitted in modern data centres rise, the number of hours in which ambient air alone can do the job increases, and this technology can offer significant energy savings over DX cooling.

The mechanical cooling within the chiller is only used on warm days, while intelligent free cooling and partial free cooling controls maximise the use of ambient air, topped up with mechanical cooling as needed. In an N+1 or 2N solution, intelligent free cooling can spread the load across all connected chillers to maximise efficiency and energy savings.
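The mode decision described above amounts to comparing the ambient air temperature with the warm water returning from the data centre. The sketch below illustrates that logic; the approach temperature and the 8 K full-free-cooling margin are illustrative assumptions, not manufacturer set points.

```python
# Sketch of a free-cooling chiller's mode decision: compare ambient air
# temperature with the warm water returning from the data centre.
# Thresholds are illustrative assumptions, not real set points.

def cooling_mode(ambient_c: float, return_water_c: float,
                 approach_k: float = 2.0) -> str:
    """Choose between full free cooling, partial free cooling with
    compressor top-up, and full mechanical (compressor) cooling."""
    if ambient_c <= return_water_c - approach_k - 8.0:
        return "free"        # ambient alone can reject the full load
    if ambient_c < return_water_c - approach_k:
        return "partial"     # pre-cool with ambient, compressors top up
    return "mechanical"      # too warm outside; compressors do everything

print(cooling_mode(5, 24))   # free
print(cooling_mode(18, 24))  # partial
print(cooling_mode(28, 24))  # mechanical
```

Raising the return water temperature widens both the free and partial bands, which is why warmer permitted aisle temperatures translate directly into more free cooling hours.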

Conclusion

Cooling is a critical component of data centre design, and choosing the right technology can significantly impact efficiency, sustainability, and overall performance.

From room air cooling to advanced direct-to-chip liquid cooling, each method has its benefits and ideal use cases. Understanding these technologies and their applications can help you create a data centre environment that meets your needs both now and in the future.

At Datacentre UK, we are committed to providing the most efficient and sustainable cooling solutions tailored to your specific requirements.

Contact us today to learn more about how cooling technologies can enhance your data centre’s performance.