Data centre industry reaches critical transition phase
The digital transformation of companies worldwide, new technological breakthroughs and ever-greater demand for high-performance, low-latency services have brought the data centre industry to the brink of a new transition.
According to a new report by BCS Consulting, changing requirements from clients, technology innovations, sustainability and edge computing are throwing the sector into a period of extreme growth and development.
The report cites the “infusion of capital, a push towards outsourcing, higher density and rising utilisation rates, the fluctuating nature of hyperscale companies and further advancement of edge computing” as just some of the dynamics forcing a change in the industry.
Customers expect services built on the latest technical innovations for both speed and security. Yet according to Uptime Institute’s 2019 Data Centre Industry Survey, one-third of data centres experienced a service outage in the past year.
Gartner also recently reported that data centre downtime cost an average of $5,600 per minute, while other studies show that 70% of all data centre and critical facility downtime is attributable to human error.
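To put that per-minute figure in perspective, a quick back-of-the-envelope calculation (using only the cost quoted above) shows how those losses compound over an hour or a working day:

```python
# Scale Gartner's average downtime cost from minutes to longer outages.
cost_per_minute = 5_600  # USD per minute, per Gartner

for minutes in (1, 60, 8 * 60):  # one minute, one hour, one working day
    print(f"{minutes:>4} min outage ≈ ${cost_per_minute * minutes:,}")
```

An hour of downtime at that rate already runs to $336,000, which is why even small reliability gains justify significant investment.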
Big data demands
Demand for data storage management has never been stronger. According to market forecasts, the sector is projected to grow at a CAGR of 6.79%, from US$155.201bn in 2019 to a total market size of US$230.169bn in 2025.
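The quoted growth rate can be sanity-checked against the start and end figures: a compound annual growth rate (CAGR) is the constant yearly rate linking the two values over the period.

```python
# Verify the forecast's CAGR from its own start/end market sizes.
start, end, years = 155.201, 230.169, 2025 - 2019  # US$bn, per the forecast

cagr = (end / start) ** (1 / years) - 1
print(f"implied CAGR: {cagr:.2%}")  # ≈ 6.79%, matching the quoted rate
```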
Bigger data requires faster networks, faster processing and on-site micro data centres to reduce latency. The solution to many of these demands has been found in AI and high-performance computing (HPC), which enable information to be stored and aggregated more efficiently. But facilities also require new software tools and hardware to deal with the influx.
The growth of edge computing and micro data centres, where servers are located on-site to process real-time data and manage smart systems, has contributed to the transition.
“In 2020, we’ll see enterprises tackle data gravity by bringing their applications closer to data sources rather than transporting resources to a central location,” said Chris Sharp, CTO of Digital Realty, one of the largest data centre providers.
“By localising data traffic, analytics and management, enterprises will more effectively control their data and scale digital business. As the data gets heavier, and denser, it gets harder to move,” Sharp added. “There were data lakes. Now there’s the data ocean.”
BCS recommends data centre management teams adopt practices that will minimise downtimes and maximise efficiency, especially during the maturation phase.
The first step would be to develop and execute real-time monitoring of the critical infrastructure. That should be followed by establishing a business intelligence ecosystem that measures KPIs and drives the efficiency, resilience and performance of the critical infrastructure.
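As a minimal sketch of the kind of KPI such an ecosystem might track, the example below computes Power Usage Effectiveness (PUE), a standard industry efficiency metric defined as total facility power divided by IT equipment power (an ideal facility scores 1.0). The readings and alert threshold are illustrative, not real telemetry.

```python
# Illustrative KPI monitoring: PUE = total facility power / IT power.
# Readings and the 1.6 alert threshold are made up for this sketch.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power over IT power."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

readings = [(1500.0, 1000.0), (1480.0, 1010.0), (1620.0, 1000.0)]  # (total, IT) kW
for total, it in readings:
    value = pue(total, it)
    flag = "ALERT" if value > 1.6 else "ok"
    print(f"PUE {value:.2f} [{flag}]")
```

A real deployment would feed this from live power meters and surface the trend on a dashboard rather than printing it.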
Finally, as data facilities turn increasingly to machine learning and artificial intelligence to manage systems more efficiently, BCS recommends data centre service providers adopt predictive safeguards rather than preventative ones.
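The distinction can be made concrete with a toy example: preventative maintenance services equipment on a fixed schedule regardless of condition, while a predictive safeguard extrapolates sensor trends and intervenes before a limit is reached. The readings, limit and 30-minute window below are invented for illustration.

```python
# Toy predictive safeguard: fit a straight line (least squares) to
# per-minute temperature readings and estimate minutes until a limit
# is crossed. All numbers here are illustrative, not real sensor data.

def minutes_until_limit(temps, limit):
    n = len(temps)
    mean_x, mean_y = (n - 1) / 2, sum(temps) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(temps)) / \
            sum((x - mean_x) ** 2 for x in range(n))
    if slope <= 0:
        return None  # not trending toward the limit
    return (limit - temps[-1]) / slope

temps = [61.0, 61.8, 62.9, 63.7, 65.1]  # degrees C, one reading per minute
eta = minutes_until_limit(temps, limit=75.0)
if eta is not None and eta < 30:
    print(f"predicted limit breach in ~{eta:.0f} min; schedule intervention")
```

A production system would use far richer models, but the principle is the same: act on where the data is heading, not on a calendar.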
The report states, “Data centre owners deserve to be fully aware of the level of care, attention and detail that goes into the operations of their critical facility. The best qualified service providers will allow owners to peek behind the curtain to gain total transparency into the people, programs, processes and technology that’s used to operate and protect their critical facilities as well as how those resources are applied to benefit the customer.”
Although ongoing research into unmanned data centres will eventually see facilities that don’t require human intervention, the technology is still some years away from mainstream application.
But data centre innovations are still moving forward at a tremendous rate, which means on-site staff require constant training updates. Since approximately 70% of data centre downtimes are caused by human error, keeping employees properly briefed and up to date on all the latest protocols can help providers avoid disaster.
Better staff training, particularly during a period of transition, states BCS, is essential. “Three-quarters of the outages that took place over the last 12 months could have been prevented, yet we’re not talking about it. We seem to accept that number. To put this in perspective, what if the commercial airline industry had a 75% human error failure rate?”
Another area of transition for data centres is the drive towards sustainability. The latest facilities are designed with renewable, clean energy in mind, along with cooling systems that function at a fraction of the cost of traditional units.
Saltwater cooling systems are being developed for off-shore data centres, while servers cooled by snow are already operational in Japan and the Nordics.
While converting a data centre from traditional systems to renewable ones is a hefty investment, the move is strategic. Off-grid green power sources such as wind and solar mean back-up power is always available in the event of unplanned outages, which according to the Ponemon Institute’s Cost of Data Centre Outages report carries an average cost of $740,357.
Renewable energy will play an increasingly important role in how data centres are powered and designed. There is a growing mandate for corporations to shift to a greener energy footprint, and the charge is being led by industry Goliaths like Google.
The multinational has pledged to ensure all the electricity it uses for its data centres and offices will be 100% renewable by 2030. Google also said it will leverage more than $5bn of investment in clean-energy projects across its supply chain over the next decade.
Sundar Pichai, Google’s chief executive officer, said of the move, “The science is clear. The world must act now if we’re going to avert the worst consequences of climate change. We are committed to doing our part.”