How data ubiquity is impacting data centre density

By David Craig
David Craig, CEO of Iceotope, explains the effects of data ubiquity on data centre density
In 2020, every person on Earth was creating, on average, 1.7MB of data per second. We sent 500 million Tweets per day. And by the end of 2021, we will have conducted two trillion Google searches. It is safe to say that data ubiquity is upon us, and that is an exciting thing. 

How we access and interact with data is constantly changing, and that will have a real impact. To quote Henrique Cecci of Gartner, “the data center is no longer the center of our data.” The sheer volume of data we are creating will need to be processed and refined away from the centre itself, closer to the end user. 

That is because data on that scale is not easily moved. This is the phenomenon of data gravity: as data sets grow larger, they become harder to move, and smaller applications and other bodies of data gravitate around the larger data masses. As a result, the data stays put and the applications and processing power come to where the data resides. Much of this is being driven by artificial intelligence (AI). The closer you are to the customer, the more likely you are to be using AI applications for better speed of access, operation and performance. 

Let’s look at retail as an example. There, data gathering is all about the customer journey and how to enhance the in-store experience. Accenture found that 83% of consumers are willing to share their data to enable a personalised experience. This can be a game changer for everything from tailored advertising and product recommendations to more complex applications like real-time multilingual ordering and automation in a fast-food restaurant. 

The energy industry, like many others, is undergoing a digital transformation. PwC found that the use of digital technologies could deliver cumulative savings in capital and operating expenditure of US$100 billion to US$1 trillion by 2025. To fully realise those savings, data processing will have to take place on-site. An oil rig generates about a terabyte of data per day, yet that much data can take as long as 12 days to upload by satellite. There are no cost savings to be had if one day’s worth of data takes 12 days to transmit. 
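The arithmetic behind that 12-day figure is easy to sanity-check. Below is a minimal sketch assuming a sustained satellite uplink of 8 Mbit/s; the link rate is an illustrative assumption, not a figure from the article:

```python
# How long does one terabyte take to upload over a satellite link?
TERABYTE_BITS = 1e12 * 8      # one (decimal) terabyte expressed in bits
UPLINK_BPS = 8e6              # assumed sustained uplink of 8 Mbit/s (illustrative)

seconds = TERABYTE_BITS / UPLINK_BPS
days = seconds / 86_400       # 86,400 seconds in a day
print(f"{days:.1f} days")     # -> 11.6 days, close to the 12 days cited
```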

With these changing parameters (more data, AI applications and a move to the edge), are data centre operators prepared? Be it an enterprise operator, a colocation provider or a hyperscaler, the industry as a whole is beginning to evaluate the impact of these trends. Rack and chip densities are increasing, and they require a new way of thinking. 

Historically, enterprise racks were configured around relatively low heat loads in the 3 to 6 kW range, and some significant operators today run at 10 to 15 kW per rack. That is fine for most standard business applications. Now, as AI applications drive GPU processing, data centre operators are coming under growing pressure to deliver 30 to 60 kW racks, something most data centres were not designed for. 
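To see why those densities strain conventional cooling, consider the airflow needed to carry the heat away. Here is a rough sketch using the sensible-heat relation Q = ṁ × cp × ΔT; the temperature rise and air properties are illustrative assumptions, not measured values:

```python
# Airflow needed to remove a rack's heat load with air: Q = m_dot * cp * dT
CP_AIR = 1005        # specific heat of air, J/(kg*K)
RHO_AIR = 1.2        # density of air, kg/m^3 (approximate, sea level)
DELTA_T = 12.0       # assumed inlet-to-outlet temperature rise, K (illustrative)

def airflow_m3s(load_kw: float) -> float:
    """Volumetric airflow (m^3/s) needed to carry away load_kw of heat."""
    mass_flow = load_kw * 1000 / (CP_AIR * DELTA_T)  # kg/s of air
    return mass_flow / RHO_AIR

for load_kw in (6, 15, 60):
    print(f"{load_kw:>2} kW rack -> {airflow_m3s(load_kw):.1f} m^3/s of air")
# 6 kW -> 0.4 m^3/s; 15 kW -> 1.0 m^3/s; 60 kW -> 4.1 m^3/s
```

On these assumptions, a 60 kW rack needs roughly ten times the airflow of a 6 kW rack through the same footprint, which is the practical ceiling that air cooling keeps running into.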

The challenge is that nearly all components of the data centre environment are designed to be air-cooled. Those traditional methods are being pushed to their limits, driving the trend towards solutions like precision immersion liquid cooling. These solutions offer greater efficiency, lower operating costs, higher resilience, increased reliability and near-silent operation, despite the high power density of the GPUs being cooled.

It’s human nature to embrace change slowly. Consumers, however, will demand it faster. If Company A uses AI to understand its target audience better and deliver more personalised, innovative products than Company B, the choice is simple: consumers will vote with their feet and go to Company A. Data centres need to be prepared to support that change. 

At a time when there are many competing priorities with real-world consequences, a holistic approach to addressing these issues is needed. Leadership that is willing to be bold and embrace new technologies and new approaches to problem-solving will be rewarded. There are financial, space and emissions benefits to be had that will prepare data centres for the challenges ahead. 

 
