Written by Harry Menear
The amount of data that needs to be collected, moved around, processed, sent back, duplicated, stored, and otherwise manipulated in order to power our increasingly digitalised world is, frankly, staggering.
We’re increasingly faced with great big figures, like “the amount of data generated globally every day” (about 2.5 quintillion bytes and climbing, if you’re curious), which spiral ever upwards towards numbers the mind is fundamentally incapable of putting into context. On a more granular level, too, the amount of data generated by relatively small pieces of what will soon be commonplace technology is constantly trending upwards.
“An autonomous car’s LIDAR sensor can create over 10 Terabytes of data per day,” says Neil Stobart, VP of Global Systems Engineering at Cloudian. With the growth of data-intensive applications in industrial, commercial, and smart city settings, this surge in data traffic is already putting undue strain on existing network infrastructure. “There is a problem with all this data being generated: the time constraints of sending data back to a remote data centre or server are no longer practical as smart infrastructures require near-instant data processing to determine the required outcome,” Stobart continues. “So despite this proliferation of data, only a small fraction is being used for decision making because the data cannot be stored or transmitted efficiently.”
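To put Stobart’s figure in context, a quick back-of-the-envelope conversion shows the sustained throughput a single such sensor implies. (A rough sketch: the 10 TB/day figure comes from the quote above; the decimal-unit interpretation of “Terabyte” is my assumption.)

```python
# Sustained throughput implied by a LIDAR sensor producing 10 TB/day
# (assuming decimal units: 1 TB = 10**12 bytes).
TB = 10**12
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

daily_bytes = 10 * TB
bytes_per_second = daily_bytes / SECONDS_PER_DAY
megabytes_per_second = bytes_per_second / 10**6
megabits_per_second = bytes_per_second * 8 / 10**6

print(f"{megabytes_per_second:.0f} MB/s ({megabits_per_second:.0f} Mbit/s) sustained")
```

That works out to roughly 926 Mbit/s around the clock - close to saturating a gigabit link continuously, for one sensor on one vehicle - which is why backhauling everything to a remote data centre quickly becomes impractical.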
Over the coming decade, this friction is only going to get worse. According to Equinix’s Global Interconnection Index Volume 5, global interconnection bandwidth is forecast to reach 21,485+ terabits per second (Tbps) - equivalent to 85 zettabytes transferred per year - by 2024. Bandwidth growth is at its most pronounced in large cities - epicentres of highly digitalised economic activity, handling data both generated locally and backhauled from smaller metros nearby that lack their own digital infrastructure.
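The Tbps and zettabyte figures in the Equinix forecast are two views of the same quantity, and converting one into the other is a useful sanity check. (A minimal sketch, assuming decimal units throughout: 1 ZB = 10**21 bytes.)

```python
# Convert a sustained interconnection bandwidth (in Tbps) into an
# annual data volume in zettabytes (decimal units assumed).
TBPS = 10**12                            # bits per second in one terabit
SECONDS_PER_YEAR = 365 * 24 * 60 * 60    # 31,536,000
ZB = 10**21                              # bytes in one zettabyte

bandwidth_tbps = 21_485
bits_per_year = bandwidth_tbps * TBPS * SECONDS_PER_YEAR
zettabytes_per_year = bits_per_year / 8 / ZB

print(f"{zettabytes_per_year:.1f} ZB per year")  # ≈ 84.7 ZB
```

Rounding up, that matches the 85 zettabytes per year cited in the report.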
“London is predicted to be the epicentre of data centre development as British tech firms in sectors such as fintech, ecommerce and digital health continue to grow at an exponential rate,” says Russell Poole, Managing Director for Equinix’s UK operations. “The city is expected to grow at a 45% compound annual growth rate (CAGR) year-on-year, contributing 1,735 Tbps by 2024.” By the middle of the decade, the FLAP (Frankfurt, London, Amsterdam, and Paris) metros are forecast to account for 75% of the total interconnection bandwidth capacity in EMEA.
This presents a huge challenge for data centre and network operators, as older network architectures become increasingly unfit for the demands of a new, data-driven decade. With ever-rising demand for lower latency connections to carry larger and larger amounts of data, data centre operators are fundamentally reassessing their approach to the Edge.
The Regional Edge
While regional connectivity hubs like London are set to handle an increasing amount of data over the coming decade, growth in bandwidth is far from isolated to the world’s largest metros.
Thanks to a global spike in remote and hybrid work, Poole notes that “the COVID-19 pandemic has forced economies around the world into digital overdrive to adapt to a new way of doing business. Enterprise has responded to this challenge by deploying their digital infrastructure to multiple edge locations and integrating cloud solutions into their IT framework.” This trend, much like data growth, is only expected to grow more pronounced over the coming years.
“The past two years have seen the pandemic force many companies to digitise their business models which has further intensified the demand for digital services,” says Poole. “Companies are now making cautious preparations for a global economic recovery and as a result are implementing the expansion and deployment of digital infrastructure at a rate four times faster than pre-pandemic levels.”
Essentially, the past two years (dear lord, has it really been that long?) have seen business agility emerge as a core survival trait in successful enterprises. Ultimately, much of an enterprise’s ability to be agile is dependent on resilient, low-latency connections, and the ability to host data as close as possible to where it’s needed. “To better facilitate the escalating need for agile, low latency interconnection, regional markets are increasing localised capacity to improve routing efficiencies and reduce latency,” Poole explains. “Data is being moved from remote locations to local carrier-neutral data centres hosting private and public Internet Exchange peering ecosystems.” By doing so, service providers are able to improve performance by lowering latency between end users and the origin of the content they access.
“Latency is significantly reduced by bringing digital services to the edge, where data and information is consumed, generated and transferred between users and applications,” Poole adds. Following where the demand leads, Equinix’s UK division is investing heavily in building regional edge data centres close to business hubs other than London, including a new Internet Exchange in Manchester.
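The physics behind Poole’s point is straightforward: light in optical fibre travels at roughly two-thirds the speed of light in vacuum, so every kilometre between user and data adds propagation delay on the round trip. A minimal sketch (the example distances are illustrative, not from the article):

```python
# Best-case round-trip propagation delay over optical fibre.
# Light in fibre travels at roughly 2/3 the speed of light in vacuum.
SPEED_OF_LIGHT_KM_S = 300_000
FIBRE_SPEED_KM_S = SPEED_OF_LIGHT_KM_S * 2 / 3  # ~200,000 km/s

def round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip time in milliseconds, propagation delay only."""
    return 2 * distance_km / FIBRE_SPEED_KM_S * 1000

# Illustrative distances: a metro edge site vs. a remote data centre.
for label, km in [("metro edge site, 50 km", 50),
                  ("remote data centre, 2,000 km", 2000)]:
    print(f"{label}: {round_trip_ms(km):.1f} ms minimum")
```

Real-world latency adds routing, queueing, and processing time on top of this physical floor, so the gap between an edge site and a distant data centre is typically wider still.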
Smart Cities and the Edge
Data consumption growth is being driven by numerous factors, from vast networks of IoT sensors to automated vehicles. While these technologies are already being deployed heavily throughout industrial enterprise settings, it’s the smart city which will serve as the greatest catalyst for edge demand over the coming decade.
“Smart cities have driven the adoption of edge computing with many practical applications for real-time decision-making technology,” says Stobart. “Never before have we had so many sensors able to generate prolific amounts of data within traffic management, next-generation advertising, and innovations like smart parking.” However, legacy (and even current-generation) networks make it impossible to take full advantage of the potential of smart city technologies. The need to move vast amounts of data close to the point where it intersects with the physical world is, as a result, driving intense edge data centre construction.
“Smart cities are powered by real-time analytics engines sorting through mountains of sensor data and then taking action on said data,” Stobart continues. “Network performance has not kept up with the growth and usage of data, so the closer you can locate data to the application, the better the response time. Edge compute and storage is built upon this premise, with data sets co-located with the decision-making application that are in turn close to the sensors.”
This new “edge everywhere” world doesn’t mean the death of the large-scale data centre, however, but rather a change in perspective. The idea that data centres fall neatly into centralised hyperscalers, medium-sized colocation facilities and carrier hotels, or small edge sites is starting to feel outdated.
Going forward, it’s much more likely we’ll see a central core made up of a smaller number of much, much larger data centres. Beyond it, everything will be the edge, and the edge will be everywhere.