How to Design and Build a Data Centre for the New AI Era

SmartMod Max is optimised to efficiently power, cool, and support next-generation data centre workloads
Data centres are now being designed and developed in a new and innovative way to accommodate demanding AI workloads and greater networking capabilities

When it comes to the data centre sector, artificial intelligence (AI) continues to dominate the headlines. From optimising workloads to improving customer satisfaction, the technology is fast being touted as integral to the next era of data centre operation.

However, AI has already led to data centres feeling the strain, particularly as far as energy consumption is concerned. As a result, businesses within the sector are being forced to design and build facilities in a new way. 

With insights from Black & White Engineering (B&W), Vertiv, atNorth and KPMG UK, we explore how operators can design and construct facilities that accommodate new and disruptive technologies moving forward.

Confronting a ‘new’ AI era

The vast majority of data centres in operation today were not designed to support the high power requirements of AI-led workloads. These workloads demand different infrastructure from traditional deployments, generating heat at levels that existing facilities cannot remove fast enough.

“The industry is now facing unprecedented demand for new infrastructure solutions to efficiently power, cool and support this next generation of compute, and as a result AI is fundamentally reshaping the architecture of IT infrastructure,” explains Rajesh Sennik, Head of Data Centre Advisory at KPMG UK.

AI workloads also require almost-instant processing of vast amounts of data, which in turn demands significant amounts of energy. As a result, data-intensive businesses will be looking for modern sites designed specifically for AI.

“A data centre configured for typical enterprise applications might require 7-10 kilowatts (kW) of power per rack. But for AI, the power requirement increases to more than 30kW per rack,” says Anna Kristín Pálsdóttir, Chief Development Officer at atNorth.

“As a result, legacy data centre campuses are having to be upgraded – not just to accommodate the digital infrastructure associated with AI workloads but to allow for significant cooling systems and power distribution units (PDUs), generators, and uninterruptible power supplies (UPS).”
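The step change Anna describes can be put into rough numbers. The per-rack figures below come from her quote; the rack count and cooling overhead are illustrative assumptions, not figures from the article:

```python
# Back-of-the-envelope comparison of an enterprise hall vs an AI hall,
# using the per-rack power quoted above (7-10 kW vs 30+ kW per rack).
# The rack count and cooling/ancillary overhead are assumptions.

def hall_power_kw(racks: int, kw_per_rack: float, cooling_overhead: float = 0.3) -> float:
    """Total hall power: IT load plus an assumed cooling/ancillary overhead."""
    it_load = racks * kw_per_rack
    return it_load * (1 + cooling_overhead)

enterprise = hall_power_kw(racks=100, kw_per_rack=8.5)  # mid-point of 7-10 kW
ai = hall_power_kw(racks=100, kw_per_rack=30)           # lower bound for AI

print(f"Enterprise hall: {enterprise:.0f} kW")
print(f"AI hall:         {ai:.0f} kW")
print(f"AI draws {ai / enterprise:.1f}x the power for the same footprint")
```

Even at the conservative 30kW-per-rack figure, the same floor space draws several times the power, which is why the supporting PDUs, generators and UPS systems Anna mentions have to be resized rather than reused.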

Some of the key differences between traditional and AI data centres lie in rack density, cooling technology and server technology. Air-cooled systems are becoming insufficient for modern workloads, leading businesses to adopt direct-to-chip liquid cooling to transfer heat more effectively and target cooling more efficiently.

“Data centres now need to accommodate increasingly dense IT loads, making optimised power and cooling management even more critical,” explains Alex Brew, Regional Director, Northern Europe at Vertiv. “With rack densities predicted to exceed 100kW per rack, power and cooling infrastructure design and deployment have become significantly more complex.”

Adam Asquith, Technical Director at Black & White Engineering, adds: “With the projected growth rates and increases in chip TDP or rack density, new methods of cooling and power distribution will need to be adopted. This may include submersed cooling methods and power distribution consisting of conductors with higher ampacity.”

Anna Kristín Pálsdóttir from atNorth explains how new infrastructure is needed to support AI workloads in data centres

Questions of place (and space)

AI is extremely computationally intensive, driving up power and cooling demands. At the same time, training and inference workloads don’t need the same levels of uptime that traditional cloud computing requires, which opens up opportunities around space, efficiency and cost.

This is why location is a strategic consideration when it comes to data centre construction. Already, data centres are having to move AI workloads closer to the network edge in order to handle large data volumes.

“Edge computing is experiencing a significant rise in popularity,” Alex says. “As the volume of data generated by connected devices continues to grow at scale, organisations are recognising the immense potential of edge capabilities to meet network demand. But as the edge of the network becomes more sophisticated, so too will the infrastructure needed to support it.”

Whilst the industry is eager to build and operate data centres that can support higher IT loads, operators are also focusing on regions and climates that offer benefits such as free cooling or renewable energy to decarbonise.

“Traditionally, data centres were situated on-prem or close to their main business location,” Anna explains. “This is still sometimes necessary for sensitive data or regulatory compliance, yet it is no longer necessary or even advantageous in many cases. Most AI workloads do not require very low-latency networks, making them location agnostic, and as a result there will be a shift away from metro sites to areas that can better meet the needs of AI workloads.”

She cites the Nordics as an example: “Regions with a cool and consistent climate can utilise more energy efficient cooling technologies such as natural air cooling or Direct Liquid Cooling (DLC). These techniques significantly lower the PUE of data centres, delivering up to a 33% improvement in energy efficiency and significant carbon reductions, particularly when combined with Iceland's use of renewable energy.

“The Nordic countries also practise circular economy principles and so it is possible to employ heat reuse technology to recycle waste heat from data centre sites.”
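The efficiency gain Anna cites can be sanity-checked with the standard PUE formula: total facility energy divided by IT equipment energy. The specific PUE values below are hypothetical, chosen only to show how a lower PUE translates into the kind of percentage saving she describes:

```python
# PUE = total facility energy / IT equipment energy.
# Illustrative comparison of an assumed legacy air-cooled site against
# an assumed Nordic free-cooled site; neither figure is from the article.

def pue(total_kwh: float, it_kwh: float) -> float:
    return total_kwh / it_kwh

legacy_pue = pue(total_kwh=1.8e6, it_kwh=1.0e6)  # assumed legacy site
nordic_pue = pue(total_kwh=1.2e6, it_kwh=1.0e6)  # assumed free-cooled site

# For the same IT load, total energy falls in proportion to PUE.
saving = 1 - nordic_pue / legacy_pue
print(f"Legacy PUE: {legacy_pue:.2f}, Nordic PUE: {nordic_pue:.2f}")
print(f"Total energy saving for the same IT load: {saving:.0%}")
```

Under these assumed values, moving from a PUE of 1.8 to 1.2 cuts total energy per unit of IT work by a third, consistent with the 33% figure quoted.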

As far as location is concerned, AI and edge computing can complement each other, meaning that ‘edge AI’ may become more commonplace. The proximity of AI-driven technology to its ‘point of use’ will therefore become more critical, as Adam explains.

“This is especially the case for real time functions where localised processing and decision making can take place,” he says. 

Adam Asquith highlights the importance of adopting new methods of cooling and power distribution to meet growing demands in data centres

Confronting the sustainability dilemma

Although AI presents myriad opportunities for innovation within the data centre sector, its demand for compute leads to the inevitable issue of excess power consumption. Because it requires more processing power, running AI models will place additional strain on infrastructure. To mitigate this, designing facilities with AI in mind is essential.

“Embracing circular economy-led design and operational principles, and utilising the by-products of data centres to support local communities, is essential,” Alex explains. “If these technologies are embraced, the net impact of these new AI workloads can be managed.”

Such high levels of energy use make it difficult to reduce carbon footprints – a key problem for data centres around the world, all of which have been tasked with becoming more sustainable. As companies increasingly adopt AI, data centres will have to devise new strategies to curb energy use.

“Businesses should start by defining a strategy to make sustainable decisions across the entirety of their IT estate to reduce their digital carbon emissions,” Rajesh says. “When it comes to AI, measuring and managing the environmental impacts of the technology is key. This can be achieved by creating a baseline using a common set of measures and then monitoring the change associated with the implemented AI models.”

In order to keep emissions down, sustainability must remain a critical focus of the design, construction and operational processes of a data centre. A successful AI data centre will need to incorporate green energy sources to better meet power requirements in an environmentally-conscious way. 

“Employing alternative building methods that utilise local materials can also reduce the carbon footprint of data centre construction, while also supporting local economies,” Adam notes. “Refurbishing and recycling old equipment should become standard practices to mitigate e-waste. By prioritising sustainability in every aspect of their operation, data centres can play a pivotal role in fostering a greener future while supporting the burgeoning demands of AI.”

Data centres are accommodating increasingly dense IT loads and advanced cooling technology

Integrating legacy infrastructure into new facilities

When building new data centres, integrating legacy infrastructure into a new facility can raise several challenges, including backup or security issues. However, with a clear strategy and assessment, this migration can be successful. 

Such an approach includes gradual upgrades and evaluating what operations can be scaled efficiently without disrupting existing services.

“A well-defined migration plan helps minimise disruptions and ensures compatibility between old and new systems,” Rajesh explains. “By updating legacy applications, organisations can take advantage of new technologies and improve overall efficiency.”

Adam adds: “Collaboration with key stakeholders, including utility providers and technology partners, is also crucial. Engaging these stakeholders during the early design stages ensures that integration plans are aligned with current and future energy requirements, regulatory frameworks and sustainability goals. 

“By taking a comprehensive, forward-thinking approach, data centre organisations can seamlessly integrate legacy systems into their new facilities, maximising both operational efficiency and sustainability.”

As data centres become more commonplace in everyday life, especially in the UK, where they have been classed as critical national infrastructure, keeping data safe and resilient is essential. Organisations looking to design and build new data centres will need to future-proof their facilities, particularly as technology evolves.

“Critical environments need to have the flexibility to handle a diverse range of requirements as well as cater for equipment that is still quite traditional in terms of its needs,” Alex explains. “Operators looking to build new facilities should embrace cutting-edge technologies and design infrastructure that can scale rapidly. Planning for future density requirements is critical to avoid costly retrofits or over-provisioning of resources, both of which can lead to stranded capacity and inefficiencies.”

Facilities should also be flexible enough to meet today’s compute needs, as Adam highlights: “Data centre organisations should engage with key project stakeholders and map this out in the early design stages to ensure effective transitions and implementation. 

“As power densities and consumption rates continue to increase, sustainability and efficiency will likely become more imposing and will remain a key challenge to overcome. Power and energy security and diversity will become more important.”



Data Centre Magazine is a BizClik brand
