From tech side project to return on investment
New technologies attract a lot of hype. Terms such as ‘revolutionary’ and ‘ground-breaking’ have lost their impact through overuse.
Furthermore, this culture of overpromising makes technologists and customers alike cynical when they don’t see immediate or significant impact of new tech deployments. However, there are numerous examples of technologies that have been subjected to scepticism early on but gone on to become staple parts of the digital economy.
From touchscreen interfaces to the Internet of Things (IoT), this path is so well trodden that Gartner produces its annual hype cycle, which charts how new technologies move from early adopters’ enthusiasm to inflated expectations, before a sense of disillusionment sets in.
As understanding of the technology matures, a more realistic judgement can be made of its use as more viable applications are discovered and deployed.
There are many reasons why new technologies can initially flatter to deceive. A technology can simply be executed in the wrong way – possibly because the skills do not yet exist to design solutions and troubleshoot problems. Digital transformation is one such example, where businesses feel held back by a lack of skills – with almost one in three (30%) IT decision makers citing this as a concern.
It can be that a technology is simply ahead of its time and the complementary technologies that give it a clear place in the world do not yet exist. Returning to the consumer example of touchscreen devices, the early efforts by Palm and Microsoft to launch personal tablets were flawed by their inability to connect wirelessly to the internet or sync with PCs and laptops. It was only when wireless technology and cloud computing reached maturity that smartphones and tablets came of age.
Finally, technology can work perfectly well, but not solve a big enough problem to warrant significant investment. That’s why you often hear talk of ‘killer apps’ or use cases that will give a new technology purpose and meaning. QR codes are a technology the world thought had infinite uses, yet they struggled to take off until they found their calling in mobile boarding passes and ticketing applications.
Experience therefore tells us that just because a new technology might not change the way things are done today, it doesn’t mean it won’t have a big impact in the long term. It’s fine to get excited by the potential of a new technology. But, as an industry, we must learn to temper our expectations – and those of our customers – about how quickly and how far new technologies will create radical and lasting change.
Contain your excitement
Even for those technologies which solve a real problem, are enabled by the right complementary technologies, and are generally understood enough to be successfully tested and deployed, there are other challenges. Any enterprise IT deployment requires investment, upskilling and cultural change from business leaders and employees. That means it can take years to build a compelling enough business case to convince budget holders to incorporate new deployments into their strategy.
Once a clear business case has been established, there are regulatory, cyber security and data protection requirements to throw into the mix. Given the value modern businesses rightly place on their data and the consequences of failing to manage and protect it, this is something which must be considered as early in the tech lifecycle as possible.
"In 2021, if you cannot confidently protect and manage data within an IT service or application, don’t deploy it."
An example of a technology moving through the phases of the hype cycle at a rate of knots is containers – seen by many as a natural evolution of the virtualised environment, but designed to give IT managers greater control and flexibility over their applications. Only around 18 months ago, containers had already begun their slide into Gartner’s Trough of Disillusionment – the phase when businesses have begun to act on the hype but been disappointed by the lack of immediate outcomes.
However, fast-forward the clock to 2021 and containers are already a critical component of DevOps-led infrastructure and application modernisation – with Kubernetes emerging as the dominant container orchestration platform. The business case for using containers enabled by Kubernetes is becoming well established, as microservices-based architectures have gained traction within the enterprise.
This opens up new possibilities when it comes to protecting data within containerised environments. A general rule to live and die by is that if you can’t manage data, you can’t protect it.
So, deploying Kubernetes adds the vital orchestration layer, creating a significant opportunity for a single data protection platform that spans virtual, physical, cloud and containerised environments.
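As a concrete illustration of how that orchestration layer exposes data protection, Kubernetes offers the VolumeSnapshot API, which backup tooling can drive to capture point-in-time copies of application data. The manifest below is a minimal sketch: the snapshot class and claim names are hypothetical, and it assumes a CSI storage driver with snapshot support is installed in the cluster.

```yaml
# Request a point-in-time snapshot of a persistent volume claim.
apiVersion: snapshot.storage.k8s.io/v1
kind: VolumeSnapshot
metadata:
  name: app-data-snapshot        # hypothetical snapshot name
  namespace: demo                # hypothetical namespace
spec:
  volumeSnapshotClassName: csi-snapclass   # assumed CSI snapshot class
  source:
    persistentVolumeClaimName: app-data-pvc  # assumed PVC holding app data
```

Because snapshots are declared through the same API as the workloads themselves, a data protection platform can manage containerised data alongside virtual and physical estates from one control point.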
Establishing more advanced data protection and backup credentials is one of the advancements that will help containers go from an IT side project to achieving the return on investment businesses crave.
At the edge of reason
As hyperscalers look to extend their ever-expanding data volumes and workloads to the edge, and the shift towards remote working creates a greater sense of urgency for businesses looking to transform digitally, the case for edge computing looks compelling. This confluence of events makes edge computing more relevant than ever.
However, alongside digital transformation, there are other words on CIOs’ minds: data protection, cyber security, cost optimisation and digital skills to name a few.
All these are relevant when it comes to taking edge computing from an overhyped proof of concept to a core hybrid infrastructure service. To manage and protect data at the edge, businesses must be able to identify the data they need, back it up and secure it. Not only does this require backup, recovery and replication capabilities, it also requires specific skills – so often in short supply when it comes to relatively new technologies.
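The identify–back up–verify loop described above can be sketched in a few lines. This is a simplified illustration, not a production backup tool – the function name and file layout are assumptions – but it shows the principle that a backup is only trustworthy once its integrity has been verified:

```python
import hashlib
import shutil
from pathlib import Path


def backup_with_verification(source: Path, backup_dir: Path) -> str:
    """Copy an identified file to the backup location and verify it.

    Returns the SHA-256 checksum, which doubles as an audit record.
    """
    backup_dir.mkdir(parents=True, exist_ok=True)
    dest = backup_dir / source.name
    shutil.copy2(source, dest)  # back up the data, preserving metadata
    # Verify the copy byte-for-byte via checksums before trusting it.
    src_hash = hashlib.sha256(source.read_bytes()).hexdigest()
    dst_hash = hashlib.sha256(dest.read_bytes()).hexdigest()
    if src_hash != dst_hash:
        raise IOError(f"Backup verification failed for {source}")
    return dst_hash
```

Real edge deployments add replication to a second site and encryption in transit, but the same verify-before-trust discipline applies at any scale.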
Businesses looking to capitalise on edge computing at this stage need to work with specialist partners to ensure their deployments are not just conducted successfully but also completed without putting data at risk or allowing cloud storage costs to spiral out of control. Taking the time to define your business’s Cloud Data Management strategy will provide direction and clear objectives, allowing you to measure the success of introducing edge computing to the data management mix.
Taking a strategic view of where technologies you have not successfully deployed before sit within your wider business objectives is crucial for building the business case for them and acquiring the necessary buy-in from budget holders to invest in complementary solutions and onboard the necessary skills.
For enterprises locked in a race to transform digitally, evolving customer demands along with an increased reliance on cloud and connectivity are forcing their hands.
Implementing the latest and greatest technologies to achieve the desired outcomes of digital transformation requires investment in the skills, data management and protection capabilities needed to do so successfully, cost-effectively and securely.
Unlocking the next chapter of the digital revolution
As the world retreated into remote and hybrid working in 2020, our reliance on technology took the spotlight. But it was the jazzy new social and video calling platforms that took the encore. Behind the scenes, our servers worked overtime, keeping us connected and maintaining the drumbeat of always-on, newly digital services. Let’s take a moment to pay our respects to the unsung technology heroes of the pandemic – the often-forgotten IT infrastructure keeping us connected come what may. After all, as we look ahead to more resilient futures, it will be playing a central role.
Servers could be likened to our plumbing – vital to a well-functioning home but rarely top of mind so long as everything works. Never seen, rarely heard – our servers do all the graft with little praise. But it is essential to reflect on the incremental advances in GPU and CPU power, which have paved the way for new workloads that previously were not possible. Chatbots and natural language processing that provide essential customer touchpoints for businesses across the retail and banking sectors rely on powerful servers. They also keep businesses competitive and customers happy in an always-on world.
Serving workplace transformation
But, as businesses grappled with pandemic disruptions, the focus was largely on adopting connected devices – and awe at the rapid increase in the datasphere. As they reined in their budgets and attempted to do more with less, one aspect was perhaps overlooked: those hard-working servers.
When it came to building resilience into a newly remote workforce, the initial concern was focused on the device endpoints – keeping employees productive. Many companies did not initially consider whether they had the server infrastructure to enable the entire workforce to log in remotely at the same time. As a result, many experienced a plethora of teething problems: virtual office crashes, long waits to get on servers, and sluggish internet connectivity and application performance, often rendering the shiny new PC frustrating and useless.
Most businesses only had a few outward-facing servers that could authenticate remote workers – a vital gateway as the vector for cyber attacks increased exponentially. That’s not to mention the fact that many business applications simply weren’t designed to tolerate the latency introduced by people working from home. What businesses discovered at that moment was that their plumbing was out of date.
Business and IT leaders quickly realised that to stay ahead of the curve in the hybrid working world, a renewed focus on building agile, adaptable, and flexible IT infrastructures was critical. More importantly, it accelerated the inevitable digital transformation that would keep them competitive in a data-driven economy. It is now abundantly clear to businesses that they need IT infrastructure to meet the demands of diverse workloads – derive intelligent insights from data, deploy applications effectively, and enhance data management and security.
Ripe for a digital revolution
Unsurprisingly, IDC noted that there was an increase in purchases of server infrastructure to support changing workloads. However, it also forecasts that this uptick will be sustained beyond the pandemic. As the economy begins to reopen, business leaders are looking ahead. IT will continue to play a crucial role in 2021 and beyond – and we have already set the foundations for the digital revolution with next-generation servers.
As we enter the zettabyte era, innovative new technologies are coming on stream, with 5G turbocharging IoT and putting edge computing to work. Exciting new services will improve day-to-day efficiencies, and the transformation of our digital society will be underpinned by resilient IT infrastructures. By embracing the technological innovations of next-generation servers, businesses can keep pace with the coming data deluge.
The next generation of server architecture promises more power with less heat, thanks to improved, directed airflow and direct liquid cooling, resulting in reduced operational costs and environmental impact. As we rebuild post-pandemic, manufacturers and customers alike strive to achieve ever more challenging sustainability goals. With this in mind, a focus on environmentally responsible design is imperative for the servers of tomorrow – uniquely designed chassis for adaptive cooling and more efficient power consumption will be critical, improving energy efficiency generation over generation.
The most notable evolution is the configuration of these next-gen servers around more specific organisational needs. Unlike clunky and often unstable legacy infrastructure, the infrastructure of tomorrow will be sturdier and more modular. The next iteration is streamlined, and in this modular form, can be more easily tailored to business needs. This equates to essential cost savings as businesses only pay for what they use.
Resolving the problem of the future, today
Tomorrow's IT challenges will focus on response times and latency as edge and 5G technologies go mainstream. As businesses develop new and innovative services that utilise supercharged connectivity and real-time analytics, staying on top of these challenges will give them a competitive edge. For example, in the world of retail, automation will power new virtual security guards, and even the slightest delay in the data relay could result in financial loss.
Similarly, in the smart cities of tomorrow, the network must be responsive. With city-centre traffic lights controlled by an AI-powered camera that monitors pedestrians, delays in data transfers could cost the life of an elderly pedestrian who has fallen in the road. The stakes are far higher in a 5G-enabled world. As our reliance on technology deepens, the margins for error narrow, placing greater emphasis on the efficiency of those critical underpinning technologies.
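A back-of-envelope calculation shows why physical distance matters here: light travels through optical fibre at roughly 200 km per millisecond, so round-trip distance sets a hard floor on response times before any processing even begins. The distances below are illustrative assumptions, not measurements:

```python
# Light in optical fibre propagates at roughly 200,000 km/s,
# i.e. about 200 km per millisecond.
SPEED_IN_FIBRE_KM_PER_MS = 200.0


def round_trip_ms(distance_km: float) -> float:
    """Propagation delay alone for a fibre round trip, in milliseconds."""
    return 2 * distance_km / SPEED_IN_FIBRE_KM_PER_MS


# Illustrative, assumed distances:
cloud_delay = round_trip_ms(1500)  # distant cloud region: 15.0 ms minimum
edge_delay = round_trip_ms(15)     # metro edge site: 0.15 ms minimum
```

Queuing, processing and retransmission only add to these floors, which is why safety-critical 5G applications push compute to the edge rather than backhauling every frame to a distant data centre.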
Fully enabling the hybrid work model today is just a stepping stone towards more fluid, tech-enabled lives. A work Zoom call from an automated vehicle en route to an intelligent transport hub is a highly probable vision of our future. But it requires incredible amounts of compute and seamless data transfers to make it possible. These glossy snapshots need super servers to come to life, making it essential that our IT plumbing glistens with next-gen innovation. Without exemplary server architecture, we put future tech advances – and the human progress they enable – at risk.