The future of data centres
With demand for the Internet of Things (IoT), automation and 5G continuing to grow and heavily influence businesses and supply chains over the coming years, the sheer volume of data that companies deal with will become increasingly overwhelming. Whereas five to ten years ago we'd see new data centres popping up everywhere to store and move all of that data, this is no longer the case.
Many cities, such as Amsterdam, have put a stop to any more data centres being built, as they drain power from the grid and force cities to invest more in power and cooling systems to keep them running efficiently. There is an urgent need for existing data centres to be utilised better and for businesses to become savvier in how they store and move data. Just because businesses can store data doesn't mean they should.
Sustainability should be baked into the strategy as businesses move forward and seen as a positive process rather than a burden. There is a misconception that adding more servers is the way forward. While more servers can store large volumes of data, they do nothing to reduce the power needed, they increase cooling costs in the data centre, and only a fraction of their capabilities is ever fully utilised. Smarter initiatives must be put in place.
Over the coming years, we are going to see tremendous investment in large-scale and High-Performance Computing (HPC) systems installed within organisations to support data analytics and AI. At the same time, the onus will be on data centre providers to deliver these systems without necessarily understanding the infrastructure required to run them, or the software and business outputs needed to get value from them.
There's no denying that the majority of data centres are now being asked how they provide AI solutions and how they can assist organisations on their AI journey. Whilst organisations might assume that data centres have AI all tied up, is this really the case? Yes, there is a realisation of the benefits of AI, but how it is best implemented, and by whom, to get the right results hasn't been fully decided.
Solutions for improving the performance of large-scale application systems are being created, whether through better processes, better hardware, or reduced running costs via improved cooling or heat-exchange systems. But data centre providers have to be able to combine these infrastructure elements with a deeper understanding of business processes.
When it comes to AI, there has to be an understanding of the whole strategic vision: where value can be delivered and how a return on investment (ROI) is achieved. Data centre providers need to work towards educating customers on what can be done to get quick wins.
There are some fascinating innovations already happening, and lessons to be learnt from them. In Scandinavia, for example, some are building carbon-neutral data centres that are completely air-cooled, powered sustainably by solar, with cooling drawn through the building simply by opening the windows. There are also water-cooled data centres submerged under the ocean.
As global energy costs rise, and the number of HPC clusters powering AI to drive our next-generation technologies increases, new technologies have to be found that lower the cost of running the data centre beyond standard air cooling. It's great to see people thinking outside the box, with submerged HPC systems and fully naturally aerated data centres, but more will have to be done (and fast) to keep up with global data growth.