The use of liquid cooling within data centres promises improved energy efficiency and the possibility of redirecting excess heat to other purposes outside of the facility, such as commercial or residential heating. Both are compelling advantages given the rising cost of energy and environmental pressures to be more efficient.
However, as Cadence’s Distinguished Engineer Mark Seymour explains, to realise these benefits, operators need to develop a comprehensive understanding of the technology, including its limitations, where challenges could arise, and how tools — such as data centre digital twins — can help.
Liquid cooling and the impact on legacy data centres
One of the key challenges posed by liquid cooling is how it will work alongside air cooling in legacy data centres. Managers will have invested in the latter at significant cost and will want to ensure that introducing liquid cooling won’t create inefficiencies, such as disrupting the intended flow of hot exhaust air. Avoiding this requires operators to coordinate the intricate cooling networks of both systems — a far-from-simple task that cannot be achieved without detailed oversight of how the two will work in conjunction with one another.
Even if liquid cooling doesn’t need to be combined with air cooling, its introduction is complex because it requires both fluid and electrical connections to be made. This setup can seem alien and is in no way comparable to the air-cooling systems most operators are familiar with. These challenges apply whether managers opt for immersion cooling — where servers are submerged in a bath of mineral oil or an equivalent dielectric fluid — or cold plate technology, in which a fluid-filled metal plate replaces what would traditionally have been a heatsink on the CPU, GPU or other hot components. What’s more, liquid cooling brings operational challenges that limit its efficacy.
Theoretically, liquid cooling should be able to divert nearly 100% of the heat away from the electronic chips and into the coolant, but material incompatibility and the systems’ reliance on buoyancy-driven flow present obstacles.
In the immersion cooling model, the electrical components are in direct contact with the coolant, and two issues stand out. First, over time, the materials the electronics are made of, such as the plasticizers in wiring, might react with the coolant, leaving coatings brittle and degraded and shortening the equipment’s working life. Second, as data centres move to higher power densities and chips become more densely packed, immersion systems may find it difficult to remove heat via the buoyancy-driven flow mechanisms they rely upon.
Cold plate cooling has different issues. If the quality of the coolant is poor, it can cause deposits and even corrosion within the plate, reducing the efficiency of the heat transfer. As the system becomes less effective, chip temperatures rise, thermal problems arise, and chip performance falls. What’s more, not all the components within electrical equipment may be compatible with the heat removal channels in cold plate technology, meaning a percentage of the heat — up to 10% or even 20% — is still released into the surrounding air. At the higher power densities now being deployed, that residual load alone can approach the capacity of a traditional air-cooling system.
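To see why that residual fraction matters, a rough heat-budget sketch helps. The rack power and capture fraction below are illustrative assumptions, not figures from the article:

```python
def residual_air_load_kw(rack_power_kw: float, capture_fraction: float) -> float:
    """Heat rejected to the room air when cold plates capture only
    part of a rack's total heat output."""
    return rack_power_kw * (1.0 - capture_fraction)

# Hypothetical example: a 50 kW rack whose cold plates capture 80% of
# the heat still rejects 10 kW into the room -- on the order of an
# entire air-cooled rack of an earlier generation.
load = residual_air_load_kw(50.0, 0.80)
```

The point of the arithmetic is that the air-side cooling system cannot simply be decommissioned; it must be sized for whatever fraction the liquid loop does not capture.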
Given the challenges with both cold plate and immersion cooling and the fact that neither has risen to prominence yet, facility owners and operators must carefully balance the two against one another when deciding which, if either, to implement.
Digital twins are here to help
Data centre digital twins — virtual replicas of the real-life facility — can help with this decision, along with ongoing management. They do this by giving operators a detailed view of the data centre, including aspects that are traditionally hard to observe or measure. For instance, they allow operators to test out various cooling scenarios, performing steady-state and transient simulations in connected 1D-3D models of both liquid and air systems to assess and stress test different configurations of large, complex cooling networks. Engineers can then evaluate strengths, flaws and any fine-tuning that may be needed before transferring their findings to the physical site, assured that their decisions are well-informed. This reduces the risk of downtime — which, according to Cadence’s digital transformation survey of 750 data centre professionals, is the number-one concern keeping data centre operators up at night.
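The difference between steady-state and transient analysis can be illustrated with a toy model. This is not Cadence’s solver — just a minimal lumped-thermal-mass sketch, with assumed values for power, thermal resistance and capacitance, showing how a simulated chip temperature evolves over time towards its steady state:

```python
def simulate_chip_temperature(power_w, thermal_resistance_k_per_w,
                              thermal_capacitance_j_per_k,
                              ambient_c=25.0, dt=0.1, steps=6000):
    """Lumped-capacitance transient model:
        C * dT/dt = P - (T - T_ambient) / R
    Integrated with forward Euler; the trace approaches the
    steady-state value T_ambient + P * R."""
    temp_c = ambient_c
    trace = [temp_c]
    for _ in range(steps):
        temp_c += dt * (power_w - (temp_c - ambient_c)
                        / thermal_resistance_k_per_w) / thermal_capacitance_j_per_k
        trace.append(temp_c)
    return trace

# Assumed: 200 W chip, 0.1 K/W cooling path, 50 J/K thermal mass.
# Steady state is 25 + 200 * 0.1 = 45 degrees C.
trace = simulate_chip_temperature(200.0, 0.1, 50.0)
```

A real digital twin couples thousands of such elements to 3D airflow and hydraulic models, but the principle is the same: run the scenario virtually, inspect the transient, and only then commit to changes on the physical site.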
There is no cure-all for every data centre cooling challenge. Undoubtedly, immersion and cold plate systems are promising, but they aren’t yet perfect, and facilities must proceed with a measured approach. However, there are tools available, such as digital twins, that can ease the pressure on operators as they navigate the difficult decision of which to implement, to what extent, when, and how to manage it. In this way, data centre digital twin technology will enable data centres to support future business objectives and transform in line with the pace of the technology industry.