The role of containment in mission-critical edge deployments

By Gordon Johnson
Gordon Johnson, Senior CFD Engineer at Subzero Engineering, discusses the vital role containment plays in addressing edge data centre requirements

In recent years, edge computing has become one of the most prevalent topics of discussion within our industry. In many respects, the main purpose of edge data centres is to reduce latency and delays in transmitting data and to store critical IT applications securely. In other words, edge data centres store and process data and services as close to the end-user as possible. 

Edge is a term that’s also become synonymous with some of the world’s most cutting-edge technologies. Autonomous vehicles are often cited as one of the truest examples of the edge in action, where anything less than near real-time data processing and ultra-low latency could have fatal consequences for the user. There are also many mission-critical scenarios, including in retail, logistics and healthcare, where high-density computing, with a high kW/rack load packed into a relatively small footprint, is housed within an edge environment.

Drivers at the edge  


According to Gartner, internet-capable devices worldwide surpassed 20bn by 2020, a figure expected to double by 2025. It is also estimated that, as of 2025, approximately 463 exabytes of data will be generated each day. One exabyte is equivalent to 1 billion gigabytes, or roughly 212,765,957 single-layer DVDs’ worth of data, so the daily total equates to nearly 100 billion DVDs per day.
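As a back-of-the-envelope check of those figures (assuming a standard 4.7 GB single-layer DVD and decimal units, both assumptions of this sketch rather than anything stated in the original estimate):

```python
# Sanity check of the exabyte-to-DVD comparison above,
# assuming a standard 4.7 GB single-layer DVD.
GB_PER_EXABYTE = 1_000_000_000   # 1 EB = 1 billion GB (decimal units)
GB_PER_DVD = 4.7                 # single-layer DVD capacity in GB

dvds_per_exabyte = GB_PER_EXABYTE / GB_PER_DVD
print(round(dvds_per_exabyte))           # 212765957 DVDs per exabyte

dvds_per_day = 463 * dvds_per_exabyte    # 463 EB generated daily
print(f"{dvds_per_day:.2e}")             # 9.85e+10 DVDs per day
```

The widely quoted 212,765,957 figure is the per-exabyte equivalent; the full 463 EB daily estimate works out to roughly 98.5 billion DVDs.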

While the Internet of Things (IoT) was the initial driver of edge computing, especially for smart devices, these examples have been joined by content delivery networks, video streaming and remote monitoring services, with augmented and virtual reality software expected to be another key use case. What’s more, transformational 5G connectivity has yet to have its predicted, major impact on the edge.

Clearly, there are significant benefits in decentralising computing power away from a traditional data centre and moving it closer to the point where data is generated and/or consumed. Right now, edge computing is still evolving, but one thing we can say with certainty is that the demand for local, near real-time computing represents a major shift in the types of services edge data centres will need to provide.

Efficiency and optimisation remain key


An optimised edge data centre environment is required to meet a long list of criteria, the first being reliability, as edge facilities are often remote and have no on-site maintenance capability. Secondly, they require modularity and scalability: the ability to grow with demand. Thirdly, there’s the issue of the lack of a ‘true’ definition. Customers still need to define the edge in the context of their business requirements, deploying infrastructure in line with business demands, which can of course affect the design of their environment. And finally, speed of installation. For many end-users time to market is critical, so an edge data centre often needs to be built and delivered on-site in a matter of weeks. 

There is, however, one more important factor to consider. An edge data centre should offer true flexibility, allowing the user to quickly adapt to or capitalise on new business opportunities while offering sustainable and energy-efficient performance.

Edge data centres are, in many respects, no different from traditional facilities when it comes to the twin imperatives of efficiency and sustainability. Power Usage Effectiveness (PUE) as a measure of energy efficiency applies to the edge as much as to large, centralised facilities. 
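As a reminder of the metric, PUE is total facility energy divided by the energy consumed by the IT equipment alone, so a score of 1.0 would mean zero cooling and power-distribution overhead. The figures below are hypothetical, purely for illustration:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT energy.

    1.0 is the theoretical ideal (no cooling or power-distribution
    overhead); lower values are better.
    """
    return total_facility_kwh / it_equipment_kwh

# Hypothetical edge site: 50 kWh of IT load plus 15 kWh of cooling,
# UPS losses and other overhead over the same period.
print(pue(total_facility_kwh=65.0, it_equipment_kwh=50.0))  # 1.3
```

Because the numerator includes cooling, anything that reduces cooling energy, such as containment, pushes PUE closer to 1.0.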

And sustainability, especially the drive towards Net Zero, is a major focus for the sector in its entirety. However, what will change over time is the edge’s share of the sector. By 2040, it’s predicted that 80% of total data centre energy consumption will come from edge data centres, which raises an obvious question: what will make the edge energy-efficient, environmentally responsible, reliable and sustainable all at the same time?

The role of containment


Containment is almost certainly the easiest way to increase efficiency in the data centre. It also makes a data centre environmentally conscious because, instead of consuming energy, containment saves it. This is especially true at the edge. 

Containment helps users get the most out of an edge deployment by preventing cold supply air from mixing with hot exhaust air, which allows supply temperatures at the server inlets to be raised. 

Since today’s servers are recommended to operate at inlet temperatures as high as 80.6 degrees Fahrenheit (27 degrees Celsius), containment allows for higher supply temperatures, less overall cooling, lower fan speeds, increased use of free cooling and reduced water consumption, all important factors when it comes to improving efficiency and reducing carbon footprint at the edge.
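For reference, the two temperatures quoted above are the same figure in different units; the standard conversion is:

```python
def celsius_to_fahrenheit(celsius: float) -> float:
    """Convert a temperature from Celsius to Fahrenheit: F = C * 9/5 + 32."""
    return celsius * 9 / 5 + 32

print(round(celsius_to_fahrenheit(27), 1))  # 80.6
```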

Further, a contained solution consumes less power than an uncontained equivalent, making for an environmentally friendly, cost-effective environment. It also improves reliability, delivering a longer Mean Time Between Failures (MTBF) for the IT equipment, as well as a lower PUE.

Uncertainty demands flexibility


At Subzero we believe an edge data centre needs to be flexible and both quick and easy to install. It needs to be right-sized for the here and now, but capable of incremental, scalable growth. Further, it should allow the customer to specify the key components, such as the IT, storage, power and cooling solutions, without constraining them by size or vendor selection.

Thankfully, there are edge data centre providers who now offer an enclosure built on-site in a matter of days, with ground-supported or ceiling-hung infrastructure to support ladder racks, cable trays, racks and cooling equipment. 

These architectures mean the customer can choose their own power and cooling systems, and once the IT stack is on-site and the power is connected, the data centre can be up and running in a matter of days. 

Back in 2018, Gartner predicted that, by 2023, three-quarters of all enterprise-generated data would be created and processed outside a traditional, centralised data centre. As more and more applications move from large, centralised data centres to small edge environments, we anticipate that only a flexible, containerised architecture will offer end-users the perfect balance of efficiency, sustainability and performance.  

The latest Subzero White Paper by Gordon Johnson – Making the Edge Efficient, Scalable, and Sustainable can be found here.



