Q&A: Paul Nelson, strategist and director at HPE GreenLake
2021 is set to be a pivotal year for the global data centre industry. As we kick off the next 12 months, Data Centre Magazine decided to sit down with Paul Nelson, strategist and director at Hewlett Packard Enterprise's GreenLake Cloud Services, to discuss the future of the data centre space.
Paul, how can data centre operators in 2021 marry increased demand for scale and HPC with the need for sustainable practice?
The ability to marry HPC with an application profile depends heavily on the type of application and the topology it's deployed on. Banking applications, for example, have low computational requirements but high storage and retrieval requirements, as opposed to, say, large computational or CAD-style systems that demand greater computing power. In the first case, using virtualization to support many similar small server instances will utilize the HPC platform more efficiently, married with SAN and database technology to provide fast access to storage. Data centre design must still support the incremental expansion of the HPC environment in the most efficient manner, in both power and cooling facilities, while minimizing environmental impact through the selection of mechanical and electrical systems that take full advantage of site-specific conditions (including free cooling, adiabatic cooling, or alternative cooling sources).
How useful is PUE as a measurement of data centre efficiency?
PUE is a tool available to help data centre managers (and designers) determine the relative efficiency of a particular data centre over time. It is a relative measure, affected by exterior and interior factors, and it varies over time due to:
- IT utilization – both machine-level load and the distribution of processing
- Environment – the time of year (summer vs winter) and location (the UAE vs Sweden, for example)
- The technology employed – static vs dynamic UPS, free cooling, and so on
- Fit-out – initial fit-out vs fully populated
At a design level, data centre designers can specify a "target PUE", which needs to be caveated as "fully populated" at a certain rack density. In practice, it might vary from, say, 3.0 down to 1.2 over time, so expectations need to be set up front. The best use of PUE, we believe, is in managing the operation of the data centre, and in managing the effect of new technology introduced into it, since new equipment changes the overall equipment profile and thus the design factors of airflow, cooling, and power requirements.
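The metric itself is a simple ratio: total facility power divided by the power delivered to IT equipment. A minimal sketch, with hypothetical meter readings chosen to illustrate the 3.0 to 1.2 range mentioned above:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power.

    1.0 is the theoretical ideal (every watt reaches IT equipment); the value
    rises as cooling, UPS losses, and lighting consume a larger share.
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Hypothetical readings: the same hall early in fit-out vs fully populated.
print(pue(total_facility_kw=900, it_equipment_kw=300))    # 3.0 - lightly loaded
print(pue(total_facility_kw=2400, it_equipment_kw=2000))  # 1.2 - fully populated
```

Because the fixed overhead (cooling plant, UPS losses) dominates at low IT load, the same facility reads a far worse PUE when lightly populated, which is why a quoted target PUE only makes sense alongside a stated fit-out level.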
What are some of the changes you see coming to the data centre power industry in 2021?
First, due to the pandemic and a severe shortage of qualified workers in the data centre industry, automation is no longer optional. Modern automation platforms can handle most server operations, such as component inventory, registration, automatic discovery, network and storage configuration, and instant hardware replacement in the event of a malfunction. There will also be more focus on AI, or more specifically on remote, centralised management of facilities: everything from fully automated responses to facility failures, to remote management, offloading of loads, and taking equipment in and out of service. Remote monitoring is always a prime concern, and data centre operators are constantly looking at "lights-out" operations, not only from an IT perspective but also from a facilities perspective, with engineering staff capable of managing multiple sites with minimal headcount.
The industry is going to get closer to the edge. The huge increase in IoT devices and 5G has made the need for edge computing more pressing than ever. Edge computing processes data as close as possible to where it is collected, instead of on a central server, greatly increasing speed and reducing latency. With edge computing, data centres will no longer be built only in major technological and economic hubs, and we will see an increasing number of small facilities spread all around the world. Providers must adapt to this trend and scale their operations to deploy new small data centres in edge areas as quickly as possible.
Hybrid IT/cloud is where an enterprise uses both in-house and cloud-based services to complete its entire pool of IT resources. The model enables organisations to lease a portion of their required IT resources from a public or private cloud service provider while still retaining full control over certain resources that they might not want to expose to the cloud. Despite the advantages of public cloud, a high percentage of apps and data must live in data centres and colocation facilities due to issues such as data gravity, latency, application dependency, and regulatory compliance. And with the exponential growth of data promising to deliver new opportunities and insights, businesses can struggle to unlock the value of that data across their hybrid estates. Hybrid IT can also allow for an as-a-service experience that provides on-demand capacity, combining the agility and economics of public cloud with the security and performance of on-premises IT. It can accelerate digital transformation with the cloud benefits of fast deployment, scalability, and pay-per-use economics, all within the control of an organisation's own on-premises environment or a data-centre-as-a-service model.
Are there any major innovations - from battery backups to renewables - set to cause a major shift in the way data centres manage power?
Edge data centres will provide redundancy by way of multiple facilities at lower redundancy levels, spreading the load across the data centres (including edge data centres). With virtualization, load sharing or offloading (e.g. global load balancing) and cloud computing, application execution can be shifted from a site that has failed, or is failing, to another site so that processing continues.
Unlocking the next chapter of the digital revolution
As the world retreated into hybrid life in 2020, our reliance on technology took the spotlight. But it was the jazzy new social and video-calling platforms that took the encore. Behind the scenes, our servers worked overtime, keeping us connected and maintaining the drumbeat of always-on, newly digital services. Let's take a moment to pay our respects to the unsung technology heroes of the pandemic: the often-forgotten IT infrastructure keeping us connected come what may. After all, as we look ahead to more resilient futures, it will be playing a central role.
Servers could be likened to our plumbing: vital to a well-functioning home, but rarely top of mind so long as everything works. Never seen and rarely heard, our servers do all the graft with little praise. But it is essential to reflect on the incremental advances in GPU and CPU power, which have paved the way for workloads that were not previously possible. The chatbots and natural language processing that provide essential customer touchpoints for businesses across the retail and banking sectors rely on powerful servers. They also keep businesses competitive and customers happy in an always-on world.
Serving workplace transformation
But, as businesses grappled with pandemic disruptions, the focus was largely on adopting connected devices, and on awe at the rapid growth of the datasphere. As they reined in their budgets and attempted to do more with less, one aspect was perhaps overlooked: those hard-working servers.
When it came to building resilience into a newly remote workforce, the initial concern was focused on the device endpoints – keeping employees productive. Many companies did not initially consider whether they had the server infrastructure to enable the entire workforce to log in remotely at the same time. As a result, many experienced a plethora of teething problems: virtual office crashes, long waits to get on servers, and sluggish internet connectivity and application performance, often rendering the shiny new PC frustrating and useless.
Most businesses only had a few outward-facing servers that could authenticate remote workers – a vital gateway as the vector for cyber hacks and attacks increased exponentially. That's not to mention the fact that many business applications simply weren't designed to tolerate the latency that comes with people working from home. What businesses discovered at that moment was that their plumbing was out of date.
Business and IT leaders quickly realised that to stay ahead of the curve in the hybrid working world, a renewed focus on building agile, adaptable, and flexible IT infrastructures was critical. More importantly, it accelerated the inevitable digital transformation that would keep them competitive in a data-driven economy. It is now abundantly clear to businesses that they need IT infrastructure to meet the demands of diverse workloads – derive intelligent insights from data, deploy applications effectively, and enhance data management and security.
Ripe for a digital revolution
Unsurprisingly, IDC noted an increase in purchases of server infrastructure to support changing workloads. However, it also forecasts that this uptick will be sustained beyond the pandemic. As the economy begins to reopen, business leaders are looking ahead. IT will continue to play a crucial role in 2021 and beyond – and we have already set the foundations for the digital revolution with next-generation servers.
As we enter the zettabyte era, innovative new technologies are coming on stream, with 5G turbocharging IoT and putting edge computing to work. Exciting new services, improved day-to-day efficiencies, and the transformation of our digital society will all be underpinned by resilient IT infrastructure. By embracing the technological innovations of next-generation servers, businesses can keep pace with the coming data deluge.
The next generation of server architecture promises more power with less heat, thanks to improved directed airflow and direct liquid cooling, resulting in reduced operational costs and environmental impact. As we rebuild post-pandemic, manufacturers and customers alike are striving to achieve ever more challenging sustainability goals. With this in mind, a focus on environmentally responsible design is imperative for the servers of tomorrow: uniquely designed chassis for adaptive cooling and more efficient power consumption will be critical, improving energy efficiency generation over generation.
The most notable evolution is the configuration of these next-gen servers around more specific organisational needs. Unlike clunky and often unstable legacy infrastructure, the infrastructure of tomorrow will be sturdier and more modular. The next iteration is streamlined, and in this modular form, can be more easily tailored to business needs. This equates to essential cost savings as businesses only pay for what they use.
Resolving the problem of the future, today
Tomorrow's IT challenges will focus on response times and latency as Edge and 5G technologies go mainstream. As businesses develop new and innovative services that utilise supercharged connectivity and real-time analytics, staying on top of these challenges will give them a competitive edge. For example, in the world of retail, automation will power new virtual security guards and even the slightest delay in the data relay could result in financial loss.
Similarly, in the smart cities of tomorrow, the network must be responsive. With city-centre traffic lights controlled by an AI-powered camera that monitors pedestrians, delays in data transfers could cost the life of an elderly pedestrian who has fallen in the road. The stakes are far higher in a 5G-enabled world. As our reliance on technology deepens, the margins for error narrow, placing greater emphasis on the efficiency of those critical underpinning technologies.
Fully enabling the hybrid work model today is just a stepping stone towards more fluid, tech-enabled lives. A work Zoom call from an automated vehicle en route to an intelligent transport hub is a highly probable vision of our future. But it requires incredible amounts of compute and seamless data transfers to make it possible. These glossy snapshots need super servers to come to life, making next-gen innovation in that IT plumbing essential. Without exemplary server architecture, we risk stalling future tech advances and the human progression they enable.