Data Centres Fuelling the AI and Connectivity Revolution

Amid staggering growth for data, connectivity and AI adoption, the data centre industry faces challenges around scalability, power and sustainability

The world is on the brink of an artificial intelligence (AI) revolution that will transform industries and daily life. From drug discovery and climate modelling to self-driving cars and language assistants, AI is poised to drive massive efficiency and productivity gains. However, realising AI's potential requires building out powerful new infrastructure.

Speaking at Schneider Electric’s Innovation Summit event, Marc Garner, SVP of Schneider’s Secure Power division, highlighted the staggering growth projected for data, connectivity and AI adoption in the coming decades. “We'll have 100 million new IoT devices by 2050, nearly doubling the number installed today,” he said. “Data is growing 61% yearly, and AI could unlock US$16tn in new productivity.”

However, realising this digital economy requires expanding data centre infrastructure to support skyrocketing demand. “Everyone wants the data, but the data centres are the part in question,” Marc explained. “How do we deliver the needs and demands of the future?

“Data is ultimately going to drive efficiency, and we need to drive efficiency within our centres to be able to support that. Undoubtedly, AI is the topic that everyone is talking about at the moment. It's on the tip of everyone's tongue, and there's not a day that goes by where we don't hear those two letters.”

While the data centre industry is already growing 12% annually before factoring in AI, Marc said it must prepare for AI workload growth of around 30% per year going forward. “The total workload from data centres, we expect to grow from 54 gigawatts in 2023 to around about 90 gigawatts in 2025. The AI components of that will go from around 8% last year to around about 20% going forward. So the acceleration is significant, and the change is significant that we see.”

Cutting-edge AI models and workflows

Dion Harris from Nvidia's Accelerated Data Center Infrastructure group highlights how the AI models fuelling today's innovations, such as ChatGPT, are rapidly evolving.

“In the early days, AI focused on image classification with convolutional neural networks,” he explained. “Then transformer models enabled large language models like GPT, with billions of parameters trained on vast datasets.”

The latest frontier, according to Dion, is Mixture-of-Experts (MoE) models, which split the parameter space across multiple systems. “MoE models allow much larger models by parallelising the infrastructure,” he said. “But it requires ultra-fast communication between the parallel expert nodes.”
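The routing idea Dion describes can be sketched in a few lines of plain Python. This is a deliberately toy illustration – a gate scores the experts, only the top-k are evaluated, and their outputs are blended – not Nvidia's or any production implementation; all functions and weights here are invented for the example:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe_forward(x, experts, gate_weights, top_k=2):
    """Route input x to the top_k highest-scoring experts and
    combine their outputs, weighted by renormalised gate scores."""
    scores = [sum(w * xi for w, xi in zip(ws, x)) for ws in gate_weights]
    probs = softmax(scores)
    # Only the top_k experts are evaluated -- the rest are skipped,
    # which is what lets the parameter space scale across systems.
    top = sorted(range(len(experts)), key=lambda i: probs[i], reverse=True)[:top_k]
    norm = sum(probs[i] for i in top)
    return sum((probs[i] / norm) * experts[i](x) for i in top)

# Toy "experts": each is just a simple scalar function of the input.
experts = [
    lambda x: sum(x),           # expert 0
    lambda x: max(x),           # expert 1
    lambda x: min(x),           # expert 2
    lambda x: sum(x) / len(x),  # expert 3
]
gate_weights = [[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0], [0.0, -1.0]]

y = moe_forward([2.0, 1.0], experts, gate_weights, top_k=2)
```

In a real MoE deployment the experts are large neural sub-networks living on separate accelerators, which is why Dion stresses the need for ultra-fast interconnects between the expert nodes.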

This dramatic increase in model size and complexity is driving exponential growth in AI computing requirements, projected to surge from 4.3 petaflops today to nearly 800 petaflops within a couple of years. Moreover, AI inference – deploying trained models for real-world use cases – is expected to explode from just 5% of workloads today to over 60% by 2026.

“After deploying the AI training clusters over the next few years, inference is what will truly change our daily lives,” Dion says. “My favourite quote in the last six-plus months comes from Nvidia's CEO Jensen Huang: ‘AI is the defining technology of our time. By working with the most dynamic companies in the world, we will realise the promise of AI in every industry.’ When you think about that quote, it really encompasses everything we do. Imagine AI in every industrial sector – education, healthcare, finance, and so on.

“When you hear the excitement of everyone rushing to adopt AI, it's not just because of ChatGPT, it's because of real efficiencies being brought out, not just in the data centre, but in business operations and scientific impact.”

Deployment of AI: The data centre challenges

Rapidly scaling AI infrastructure faces significant hurdles, particularly around data centre construction, sourcing of materials and the availability of skilled labour and power.

"Vacancy rates are at an all-time high due to data demand that we can't build out facilities fast enough to satisfy," Marc says. "We can't build out data centres fast enough, and that's a challenge which is going to continue for the next couple of years.”

That brings some other challenges around infrastructure. “Do we have enough people to be able to deliver this infrastructure, to design, operate, build, and maintain going forward? Do we have the right infrastructure in place to be able to supply the right products, the concrete, and the electromechanical materials to support this accelerated growth we're seeing at the moment? There's an expectation that costs are going to go down, but if you look at reports from CBRE, you see something completely different – rental rates are going up at the moment.”

Arguably the biggest obstacle is sourcing the power these energy-hungry AI systems require. To overcome these challenges, the data centre industry must adapt quickly. “When I first started in this industry five years ago, the average size of a data centre was around about five megawatts, and now it’s significantly larger than that – today we’re seeing data centres in Europe of 40MW-plus.

"The power density requirements are going to be significantly higher, and availability will be a major constraint on where we can deploy," Marc explains. "Geographical data centre markets will have to evolve around power availability."

While AI undoubtedly drives higher energy needs initially, both experts emphasised how AI-optimised infrastructure can actually improve efficiency if implemented properly. Within data centres themselves, Dion describes how AI techniques like predictive maintenance and digital twinning enable "AI driving AI" – leveraging AI systems to optimise their own facilities for maximum efficiency and sustainability.

Marc identifies that around 60% of the energy produced today is lost or wasted; AI, he argues, can drive more automation and reduce the waste that stems from human error.

“The message that we all hear is that more AI equals more energy, but the question is: does it? And initially, yes, the answer was that we needed to deploy more power to deploy AI infrastructure. However, we are so excited about AI because of the efficiency that it's going to drive in our operational behaviours.

“If we can start to drive more automation, more AI-led infrastructure that drives more efficiency, less dependence on human behaviour and the vulnerability that goes with it, it will drive more energy efficiency, and that ultimately will start to drive more productivity in that infrastructure landscape.

“Within the data centre itself, you see AI starting to drive AI infrastructure as well. So the productivity we see in data centres can be driven by the AI infrastructure, predictive maintenance, digital twinning driving the data centre closer to a net zero carbon infrastructure going forward.”
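The “AI driving AI” loop Dion and Marc describe can be as simple as learning a baseline from a facility's own telemetry and flagging drift for inspection. The sketch below is a minimal, hypothetical illustration of that predictive-maintenance idea – the sensor values and thresholds are invented for the example, and it is not Schneider's or Nvidia's tooling:

```python
import statistics

def fit_baseline(history):
    """Learn a simple per-sensor baseline (mean and standard deviation)
    from historical readings taken during normal operation."""
    return statistics.mean(history), statistics.stdev(history)

def flag_anomalies(readings, baseline, z_threshold=3.0):
    """Return indices of readings that drift beyond the baseline,
    i.e. candidates for a maintenance inspection before they fail."""
    mean, std = baseline
    return [i for i, r in enumerate(readings)
            if abs(r - mean) / std > z_threshold]

# Hypothetical inlet-temperature readings (degrees C) from one cooling unit.
normal_ops = [21.0, 21.4, 20.8, 21.2, 21.1, 20.9, 21.3, 21.0]
baseline = fit_baseline(normal_ops)

# A drifting reading such as 24.5 C is flagged long before it becomes an outage.
flags = flag_anomalies([21.1, 24.5, 20.9], baseline)
```

Production systems replace the z-score with learned models and feed the flags into a digital twin, but the loop is the same: the facility's own data drives its upkeep and efficiency.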

******

Make sure you check out the latest edition of Data Centre Magazine and also sign up to our global conference series – Tech & AI LIVE 2024

******

Data Centre Magazine is a BizClik brand
