The benefits of chassis-level liquid cooling

By Jason Matteson
Jason Matteson, Director of Product Strategy at Iceotope, breaks down the benefits of liquid cooling at the chassis level for enterprise IT infrastructure.

Some may be surprised to learn that the benefits of chassis-level liquid cooling can be leveraged for deployments anywhere today. How an organisation deploys liquid cooling technologies might vary, but the key advantages are now flowing through to colocation implementations, enterprise infrastructures and legacy data centres, with a growing view towards the edge.

Several factors are driving rising demand for liquid cooling solutions - not least because tried and trusted strategies are no longer enough. 

High performance with maximum efficiency

The earliest adopters of liquid cooling, such as IBM and Lenovo, have typically been large enterprises engaged in high performance computing (HPC) or 'supercomputing' projects, targeting the highest possible performance at the lowest possible temperatures across their entire infrastructure, in some instances with an additional focus on energy efficiency.

Meanwhile, hyperscale companies such as Microsoft, Facebook and Google require data centre support for increasingly advanced analytics, monitoring, management and data handling. Consumers rely on these companies' platforms and apps for data-intensive services such as facial recognition and the analysis of behaviour and trend patterns.

Beyond these early adopters and hyperscalers, a broader range of organisations has begun targeting artificial intelligence (AI), machine learning, data analytics and other resource-hungry applications such as advanced imaging. Yet traditional air cooling strategies haven't kept pace with these developments: heat sink volume and footprint limitations, together with the inability of fans to deliver the airflow these new high performance processors require, mean air cooling is reaching its limits.

In my days of designing servers, we would allocate as much as 10% of the server power to the air-cooling circuit. Today's solutions are likely budgeting as much as 20% of server power to the air-cooling circuit.
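To put those percentages in context, the sketch below works through the arithmetic for a single rack. The rack IT load is an assumed, illustrative figure, not a measurement from any specific deployment.

```python
# Illustrative only: rough comparison of the power consumed by the cooling
# circuit itself, using the 10% vs 20% budgets discussed above.
# RACK_IT_LOAD_KW is an assumed figure, not an Iceotope measurement.

def cooling_overhead_kw(it_load_kw: float, cooling_fraction: float) -> float:
    """Power drawn by the cooling circuit for a given IT load."""
    return it_load_kw * cooling_fraction

RACK_IT_LOAD_KW = 30.0  # assumed: one densely populated rack

legacy_air = cooling_overhead_kw(RACK_IT_LOAD_KW, 0.10)  # older servers, ~10%
modern_air = cooling_overhead_kw(RACK_IT_LOAD_KW, 0.20)  # today's servers, ~20%

print(f"Air-cooling overhead at 10% budget: {legacy_air:.1f} kW per rack")
print(f"Air-cooling overhead at 20% budget: {modern_air:.1f} kW per rack")
print(f"Extra power lost to cooling alone:  {modern_air - legacy_air:.1f} kW per rack")
```

On those assumed numbers, doubling the cooling budget costs an extra 3 kW per rack before a single additional computation is done, which is exactly the kind of overhead liquid cooling aims to claw back.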

It's clear, in fact, that we've reached a point of diminishing returns when it comes to air cooling in the data centre, especially when you consider that disruptive, resource-hungry, intelligent applications are finding new deployments and use cases across almost every market segment, from enterprise and financial services to oil and gas, warehousing and logistics, healthcare and beyond.

The overarching macro trend is towards using data in more intelligent ways to generate valuable outcomes, and all this data-driven activity requires better cooling solutions. Fortunately, liquid cooling is better at removing and recovering heat: warmed liquid, for example, can be piped elsewhere and put to other uses, such as site heating.

Solving 'big data' for the emerging edge

A myriad of applications are also getting closer to the edge of the network. Software is deployed alongside smart Internet of Things (IoT) sensors that collect or process data in situ, for instance to support process automation. Edge computing means a growing need to handle, manipulate, communicate, store and retrieve data quickly, efficiently and cost-effectively, whenever required.

Organisations can no longer afford to rely completely on centralised data centres and their related latencies; it has become critical to be able to analyse and manipulate data in near real-time to be assured of improved outcomes.

Gartner has projected that by 2025, most cloud service platforms will provide at least some distributed cloud services that execute at the point of need. Distributed cloud can replace private cloud and enables edge cloud and other new use cases for cloud computing. In 2018, perhaps 10% of enterprise-generated data was processed outside a centralised data centre or cloud; that share is expected to reach as much as 75% by 2025.

So, the stage has been set. It's clear that worlds are colliding in a race for high performance computing that can meet the needs of disruptive data-hungry applications. Everyone wants these benefits; it doesn't matter whether you're an enterprise or a hyperscaler, a financial services provider or a hospital. 

Data centre professionals might argue that this sort of technology doesn't belong at the edge: that the risks are just too great because you need environmental and climate control, and you need skilled staff on hand to guarantee resiliency and uptime around the clock. That's where offerings like a chassis-level liquid cooling platform can provide a secure, high performance, environmentally sealed solution that can be monitored and managed remotely to help mitigate risks, even at the network's perimeter.

Managing risk trajectories for the future of colocation data centres

Legacy data centres looking into digital transformation stand to gain, too, because chassis-level liquid cooling can mitigate challenges around serviceability, density and restricted floor space. Colocation service providers looking to serve a wider range of customers and future-proof their offerings will similarly stand to gain. 

For colocation providers, mitigating risk is even more critical, both from their own perspective and from that of their tenants. As a result, there will be a major shift to liquid cooling in ways that help mitigate this spectrum of risks, whether the chosen solution delivers liquid direct-to-chip or is immersion-based.

We believe that chassis-level immersion technology delivers many organisations the highest-performance cooling possible, in a form factor that is both rackable and vertically scalable, unlike the 'bathtub' style solutions of the past. Because it is serviceable and sits neatly inside the rack, it still delivers many of the advantages and conveniences of the air-cooled strategies of previous years.

Chassis-level immersion cooling technology is market-agnostic. We've measured up to 95% energy and heat recovery on our systems, which means a solutions provider can implement architectures that minimise power constraint issues, or capture the heat from their servers and feed that 'waste' energy back into the grid.
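As a rough illustration of what that 95% figure could mean in practice, the following sketch estimates the heat recoverable from a single rack over a year. The rack IT load is again an assumed number used purely for illustration.

```python
# Illustrative only: estimate of recoverable 'waste' heat from one rack,
# assuming roughly 95% of the energy is captured by the liquid circuit
# (the figure cited above). RACK_IT_LOAD_KW is an assumed value.

RACK_IT_LOAD_KW = 30.0          # assumed rack IT load
HEAT_RECOVERY_FRACTION = 0.95   # recovery figure quoted in the article
HOURS_PER_YEAR = 24 * 365

recoverable_kw = RACK_IT_LOAD_KW * HEAT_RECOVERY_FRACTION
recoverable_mwh_per_year = recoverable_kw * HOURS_PER_YEAR / 1000

print(f"Recoverable heat per rack:          {recoverable_kw:.1f} kW")
print(f"Recoverable heat per rack per year: {recoverable_mwh_per_year:.0f} MWh")
```

On those assumptions, a single 30 kW rack yields roughly 28.5 kW of continuously recoverable heat, or around 250 MWh a year, which is energy that can be redirected to uses such as site heating rather than simply rejected to the atmosphere.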

Put simply, you can deploy chassis-level cooling solutions anywhere for high resiliency and efficiency, at high density, while still preparing to meet the power and cooling requirements of future CPUs and GPUs. Right now, organisations of all types are struggling to understand what sort of platform they need, which generations of technology to use, how fast their processing should be, what kind of memory and networking will suit them, and so on, so providing a clear path for cooling can ease some of their pain.

We like to say that we're responding to disruptive challenges with common-sense solutions. Iceotope offerings such as Ku:l 2 enable chassis-level dielectric liquid cooling for off-the-shelf servers, checking all the boxes for performance and energy efficiency. The industry is at a crossroads; however, you can have your cake and eat it too when it comes to increasing cooling performance, improving energy efficiency and density in your environment, and mitigating risks.
