The Golden Age of Hyperscale
The modern data centre is an engineering marvel, a confluence of power, cooling, data transmission and storage systems, all working in perfect harmony to power the digital age. The global demand for data is swelling like a great wave, as the growth of IoT, 5G and AI adoption drives the creation of ever more information every day.
That information needs to be housed somewhere and, while they represent less than 10% of all data centres in operation by number, it’s hyperscale facilities that are attracting the most investment, hosting the most powerful innovations and forming the backbone of Industry 4.0. This month, we’re taking a closer look at the past, present and future of the hyperscale data centre market, as well as two of the biggest disruptors the industry faces: the edge and climate change.
A hyper-quick history lesson
The hyperscale data centre market is a relatively new one. Before 2016, it was rare for data centre leases to exceed 10MW of capacity. By the end of 2017, there were more than 390 data centres around the world big enough to be classified as hyperscale, the overwhelming majority owned by AWS, Facebook, Google and other Tier One hyperscale players (more about them in a moment). At the time, most of these facilities were built, wholly owned and operated by a single Tier One firm, almost exclusively in support of its own operations. Since then, the massive rise in public cloud adoption, which companies like AWS support through their own hyperscale infrastructure, has propelled hyperscale investment to new heights.
In 2018, market research by North American Data Centres identified 11 deals in excess of 10MW, including a record-breaking 72MW lease of data centre space in Ashburn, Virginia, by Facebook.
At the end of 2019, there were more than 500 hyperscale data centres in operation around the world, a number which swelled again in the first half of 2020 to more than 540. Of the new hyperscale data centres opened in the last 12 months, AWS and Google together accounted for more than half the total, with Microsoft and Oracle following slightly behind.
“There were 100 new hyperscale data centers opened in the last eight quarters, with 26 of those being in the first half of this year,” said John Dinsdale, chief analyst at Synergy Research Group at the time. “COVID-19 has caused some logistical issues but these are robust numbers, demonstrating the underlying strength of the services that are driving these investments.”
Hyperscale growth is expected to be highly resistant to the COVID-19 pandemic, as digital transformation, remote work, online education and overall internet usage have all risen sharply since March. Cisco Systems estimated before the pandemic (which may only intensify the trend) that, by the end of this year, more than half of all data traffic on Earth will pass through a hyperscale facility. That’s 53% of all internet traffic passing through fewer than 10% of the world’s data centres.
What makes a data centre hyperscale?
The term hyperscale isn’t a protected one, and the lines between hyperscale facilities and enterprise, colocation and telecom data centres can sometimes become blurred. However, there are a few widely agreed-upon characteristics that separate hyperscale facilities from their counterparts:
Hyperscale data centres are owned and operated by the companies they support, usually a Tier One operator like AWS, Apple or Facebook, as opposed to a colocation data centre, which leases its capacity to third parties.
“We are moving from a situation where data centre capacity was built out in a rather haphazard, demand-driven way to one that is more deliberate and focused on efficiency and effectiveness” - Andrew Donoghue, director of global analyst relations, Vertiv
Hyperscale facilities are big. Most experts agree that any data centre with more than 500 cabinets, or at least 10,000 square feet of floor space, qualifies as a hyperscaler. Usually, a facility with 40MW of capacity or less is considered to be an enterprise data centre.
Hyperscale facilities have the ability to get much, much bigger. Think about scale as a verb. Hyperscale data centres are built to be expanded to meet the ever-growing demands of the companies that build them.
The ability to function at hyperscale. The potential challenges of handling such huge quantities of data mean that hyperscale facilities are designed to a different standard than smaller data centres.
The expansion of the edge
One of the trends with the potential to most dramatically impact the data centre industry is the rise of edge network infrastructure. The edge data centre market exceeded $5.5bn in 2019 and, between 2020 and 2026, is expected to grow at a CAGR of 23% (enough, compounded over the period, to more than triple the size of the market), as demand for low-latency edge computing is driven by an explosion of IoT and AI technologies. For many, both the shape of edge network evolution and its relationship to the large-scale data centres at the centre of the network still hang in the balance.
“It’s still not completely clear how edge computing and edge data centres will manifest in the future,” Andrew Donoghue, director of global analyst relations at data centre infrastructure company Vertiv, explained to Data Centre Magazine in a recent interview. Donoghue added that expanded edge capacity will complement public and hybrid cloud build-outs rather than disrupt them. Both cloud and the edge, he proposed, are part of the same continuum of data centre capacity, rather than in competition with one another. “Focusing too much on either is actually a distraction,” he said. “What is really happening is we are moving from a situation where data centre capacity was built out in a rather haphazard, demand-driven way to one that is more deliberate and focused on efficiency and effectiveness.”
Donoghue believes that this more considered approach, enabled by a world where hyperscale, enterprise and edge facilities all work in tandem, will see workloads located where it makes most sense from a cost, efficiency, latency and bandwidth perspective. He added: “Latency dependent workloads will tend to reside at the edge whilst workloads that are less latency dependent and perhaps more data intensive will reside at the core in large hyperscale sites. So it’s not so much an edge build out as reorganisation of workloads to the most effective and efficient location.”
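To make that placement principle concrete, the toy sketch below applies a simple rule of thumb: latency-dependent workloads land at the edge, while latency-tolerant (and often data-intensive) workloads are consolidated in a hyperscale core. The thresholds, field names and example workloads are illustrative assumptions for the sake of the sketch, not figures from the interview.

```python
# Illustrative only: a toy workload-placement heuristic.
# Thresholds and example workloads are assumptions, not figures
# taken from Vertiv or Data Centre Magazine.
from dataclasses import dataclass


@dataclass
class Workload:
    name: str
    max_latency_ms: float  # tightest round-trip latency the workload can tolerate
    data_volume_gb: float  # data it reads/writes per run


def place(workload: Workload) -> str:
    """Return the tier where the workload most sensibly resides."""
    if workload.max_latency_ms <= 10:
        return "edge"             # latency-dependent: keep it close to users and devices
    return "hyperscale core"      # latency-tolerant: centralise for efficiency and scale


if __name__ == "__main__":
    examples = [
        Workload("AR rendering", max_latency_ms=5, data_volume_gb=2),
        Workload("ML model training", max_latency_ms=500, data_volume_gb=50_000),
    ]
    for w in examples:
        print(f"{w.name}: {place(w)}")
```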
Sustainability at (hyper)scale?
Climate change is an increasingly immediate and existential threat. Rising global temperatures, pollution, biodiversity collapse, and the unwillingness of large corporations and politicians to offer more than market-based, incremental solutions together represent the single greatest challenge humanity has ever faced. The data centre industry is a huge consumer of energy and therefore a massive contributor to carbon emissions, with a bigger environmental impact than the world’s airlines.
“Hyperscale operators have always led the way in terms of sustainability” - Andrew Donoghue, director of global analyst relations, Vertiv
Hyperscale facilities in particular can draw enough power to supply more than 50,000 homes. Whether or not the sheer amount of power these facilities demand can be reconciled with the need for drastic climate action is the greatest challenge the industry faces today.
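To put that figure in perspective: assuming an average household draw of around 1.2kW, 50,000 homes equates to roughly 60MW of continuous demand (50,000 × 1.2kW = 60,000kW), on the same order as the record-breaking 72MW Ashburn lease mentioned above.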
According to Donoghue, hyperscalers have the potential to be part of the solution, rather than the problem. “Hyperscale operators have always led the way in terms of sustainability,” he told us, noting that their more uniform workloads mean they tend to support a narrower range of applications than an enterprise facility. “Consequently, they can be designed to support those workloads in a very efficient and sustainable way,” he said. “They also benefit from large scale and well-funded owners who can invest in mechanisms like power purchase agreements for renewable energy.” Large-scale sustainability projects in the data centre industry do tend to stem from the hyperscale end of the market, with companies like Google, Apple and Microsoft all having made bold pledges towards carbon neutrality in their data centre operations this year.
Vertiv has recently partnered with Honeywell on a portfolio of solutions designed to make it easier for data centre operators to develop micro-grid technologies in support of their power needs. Data centre operators will be able to “use a greater variety of energy sources to provide power and resiliency for their sites, including hydrogen fuel cells and even on-site renewables,” Donoghue explained.
Andreas Limpak, director of solutions engineering at NetApp, is less convinced of the relationship between hyperscalers and sustainability, however. “The hyperscale market will likely not become a poster child for sustainable processes, because of the sheer amount of resources it requires to operate successfully,” he told Data Centre Magazine in a recent interview. “However, the chance to streamline, do away with overprovisioning, test and reduce resource-intensive churn, and deliver resources only when and where needed at the right level – are all factors democratised by the public cloud that help make IT more sustainable.”