Nov 27, 2020

How is data hoarding contributing to global energy wastage?

Sustainability
Dark Data
Energy Management
Big Data
Mani Singh
6 min
Combatting the accumulation of 'dark data' may be an essential step towards mitigating the carbon impact of IT

Governments around the world are hoping that the behavioural changes seen as a result of the Covid pandemic will help to kick-start a widespread green economic recovery.

Businesses everywhere are making their own optimistic commitments, and reports from the United Nations Framework Convention on Climate Change (UNFCCC) substantiate this. The UNFCCC has announced that commitments to reaching net zero emissions have roughly doubled in less than a year, with 2040 a common target date. Around 22 regions, 452 cities, 1,101 businesses, 549 universities and 45 of the world's biggest investors are pushing for a green recovery. Amazon is among them and is now running TV advertising to publicise its carbon-free goals.

Much of the current discussion around achieving carbon neutrality focuses on cutting plastic consumption, reducing reliance on fossil fuels, adopting sustainable production techniques, improving the biodegradability of packaging materials and so on. These are clearly important, but so too is cutting the energy use and emissions associated with data centres – especially among the predominantly service-based 'information industries'.

There are now well over 8mn data centres globally, according to Statista, and they consume vast amounts of electricity – generating carbon emissions that rival those of the airline industry. Knowledge and information industries might not have the same obvious sustainability concerns as manufacturers and retailers, but they are no less polluting when it comes to storing their data.

The reason is that almost all organisations have one thing in common: what Gartner describes as 'dark data'. These are information assets that organisations collect, process and store during regular business activities – especially in relation to digital transformation programmes – but generally fail to use for any other purpose, such as analytics or business relationship management. It is rather like the dark matter in our universe, which CERN estimates makes up around 27% of the universe, except that dark data is far more prevalent. Within a typical organisation's universe of information assets, experts suggest upwards of 50% of what sits in the data centre is dark data. It is the organisational equivalent of hoarding, with few organisations having a strategy or automated processes in place to understand what is being stored and to manage data across its lifecycle.

Managing dark data is a problem because data centres require vast amounts of electricity. In 2020, data centres are expected to account for 3.5% of total worldwide carbon emissions, a share forecast to grow to nearly 40% by 2040. By 2025, they are expected to consume 20% of the world's electricity – more than any other sector, and equivalent to what the food, iron and steel, and paper industries of the Organisation for Economic Co-operation and Development (OECD) countries combined currently consume. Organisations are focusing on cutting their use of red diesel and plastics, but what about the environmental cost of their data? Poor data management is a serious (and entirely avoidable) waste problem that is growing at an exponential rate.

Consider this. In 2010, IDC estimated that 1.2 zettabytes (1.2trn gigabytes) of new data were created that year – over 40% more than the year before. At the time, it predicted that annual data creation would reach 35 zettabytes (35trn gigabytes) in 2020. That must have seemed a huge number, but the forecast was way off: that level was reached two years early. IDC has since revised its 2020 prediction upwards to 175 zettabytes (175trn gigabytes) – five times the original forecast.
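To put that trajectory in context, the short sketch below (a back-of-envelope calculation using only the IDC figures quoted above) works out the compound annual growth rate they imply; the variable names are purely illustrative.

```python
# Back-of-envelope: the compound annual growth rate (CAGR) implied by the
# IDC figures quoted above (1.2 ZB of new data in 2010, 175 ZB in 2020).
data_2010_zb = 1.2    # zettabytes of new data created in 2010
data_2020_zb = 175.0  # zettabytes of new data created in 2020 (revised estimate)
years = 2020 - 2010

cagr = (data_2020_zb / data_2010_zb) ** (1 / years) - 1
print(f"Implied annual growth rate: {cagr:.1%}")  # roughly 65% per year
```

At that rate, the volume of new data created each year roughly doubles every year and a half.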

It is not exactly surprising that so much data is being created. We are in the middle of the 'fourth industrial revolution', a 'big data' era in which organisations everywhere are investing in digital transformation, the Internet of Things, cryptocurrencies, AI, blockchain, automation, e-commerce, online wealth management, e-banking and telemedicine. Data lies at the centre of all these strategies, and with these new applications for data comes an increased compliance requirement.

Although industry regulations vary, large amounts of transactional data must be retained for compliance purposes, often for decades. In a world where the volume of data is quadrupling every five years, this just adds to the environmental and financial cost of managing its ongoing storage. In fact, research has shown that only a fraction of the data organisations hold in their systems – around 10 to 15% - is actually being used. The rest is legacy information, much of it totally redundant.

So, returning to those earlier figures: if data centres are on course to consume up to a fifth of the world's electricity, and if around 50% of the data they store is dark data, with only 10-15% actively used, that is a lot of wasted energy. Reducing it would make a significant contribution to organisations' net zero targets. What can be done to minimise this wastage and store only the data that is actually needed?

For an average mid-sized organisation holding 1,000TB of data, the cost of storing non-critical information is estimated at more than £550,000 annually. These estimates reflect what we are seeing among our clients, many of whom are actively focused on reducing their carbon footprint. For example, a well-known drinks manufacturer has been working with TJC to review its enterprise data and reduce unnecessary energy consumption through automated data archiving projects. So far, this work has identified 55TB of dark data that could be archived, substantially cutting the energy consumed in storing it. The financial arguments are only one aspect of the issue, but in monetary terms alone this equated to savings of over $2.3mn.
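To show how that kind of estimate is put together, here is a minimal sketch; the 50% dark-data share and the cost-per-terabyte figure are assumed, hypothetical inputs chosen to reproduce the £550,000 figure above, not published or client rates.

```python
# Rough estimate of the annual cost of storing dark data. The inputs below are
# hypothetical assumptions for illustration; real per-TB costs vary widely with
# hardware, replication, backups, power and staffing.
total_data_tb = 1000             # total data held by a mid-sized organisation
dark_data_share = 0.50           # assumed share of that data that is 'dark'
cost_per_tb_per_year_gbp = 1100  # assumed fully loaded storage cost per TB per year

dark_data_tb = total_data_tb * dark_data_share
annual_cost_gbp = dark_data_tb * cost_per_tb_per_year_gbp
print(f"Dark data held: {dark_data_tb:.0f} TB")
print(f"Estimated annual cost of storing it: £{annual_cost_gbp:,.0f}")
```

Change the dark-data share or the per-terabyte cost and the estimate scales linearly, which is why an accurate audit of what is actually being stored matters so much.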

It is not surprising that the UN is reporting such a huge increase in organisations making public announcements about their decarbonisation plans. Eighty per cent of consumers say they most admire brands that demonstrate a commitment to sustainability. But if organisations and governments really want to achieve net zero by 2040 or earlier, much greater attention should be directed at the data they are consuming energy to store, with information lifecycle management used to reduce waste at all levels.

What practical steps can organisations take now to cut the energy waste associated with data storage?

  1. Complete a full data audit to identify what data exists and its lifecycle, and consult with stakeholders to establish priorities for data volume management (DVM).
  2. Establish an Information Lifecycle Management (ILM) strategy, with clear policies for the ongoing management and retention of data that take into account regulatory compliance requirements such as GDPR.
  3. Identify a way to automate data archiving and decommissioning as an ongoing sweep (a minimal sketch of such a sweep follows this list).
  4. Implement reporting to monitor the long-term ROI of the ILM strategy, the gradual reduction in the total cost of ownership of data, and the positive impact on carbon reduction goals.
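By way of illustration only, the sketch below shows the kind of logic an automated sweep might apply: flag files that have not been accessed within an assumed retention window and total up the reclaimable capacity. The storage path and threshold are hypothetical placeholders; a real ILM tool would also cover database tables and application records, and would honour legal retention rules before archiving or deleting anything.

```python
# Minimal sketch of an automated 'dark data' sweep: walk a storage root, flag
# files not accessed within an assumed retention window, and report how much
# capacity could be archived. Path and threshold are hypothetical placeholders.
import os
import time

STORAGE_ROOT = "/mnt/enterprise-share"  # hypothetical storage location
RETENTION_DAYS = 730                    # assumed window: two years without access
cutoff = time.time() - RETENTION_DAYS * 86400

candidates = []
for dirpath, _dirnames, filenames in os.walk(STORAGE_ROOT):
    for name in filenames:
        path = os.path.join(dirpath, name)
        try:
            stat = os.stat(path)
        except OSError:
            continue                    # skip files removed mid-scan
        if stat.st_atime < cutoff:      # last access falls outside the window
            candidates.append((path, stat.st_size))

total_tb = sum(size for _, size in candidates) / 1e12
print(f"{len(candidates)} files ({total_tb:.2f} TB) are candidates for archiving")
```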

Mani Singh is an SAP consultant at TJC-Group, the SAP software and consultancy specialists.


Jun 6, 2021

Unlocking the next chapter of the digital revolution

Dell
servers
IT
Technology
Tim Loake
5 min
Tim Loake, Vice President, Infrastructure Solutions Group, UK at Dell Technologies highlights the importance of often-overlooked digital infrastructure

As the world retreated into hybrid living in 2020, our reliance on technology took the spotlight – but it was the jazzy new social and video-calling platforms that took the bow. Behind the scenes, our servers worked overtime, keeping us connected and maintaining the drumbeat of newly digital, always-on services. Let's take a moment to pay our respects to the unsung technology heroes of the pandemic – the often-forgotten IT infrastructure keeping us connected come what may. After all, as we look ahead to more resilient futures, it will play a central role.

Servers could be likened to our plumbing – vital to a well-functioning home but rarely top of mind so long as everything works. Never seen, rarely heard, our servers do all the graft with little praise. But it is worth reflecting on the incremental advances in GPU and CPU power, which have paved the way for workloads that previously were not possible. Chatbots and natural language processing, which provide essential customer touchpoints for businesses across the retail and banking sectors, rely on powerful servers. They also keep businesses competitive and customers happy in an always-on world.

Tim Loake, Vice President, Infrastructure Solutions Group, UK at Dell Technologies

Serving workplace transformation

But as businesses grappled with pandemic disruptions, the focus was largely on adopting connected devices – and on awe at the rapid growth of the datasphere. As they reined in their budgets and attempted to do more with less, one aspect was perhaps overlooked: those hard-working servers.

When it came to building resilience into a newly remote workforce, the initial concern was the device endpoints – keeping employees productive. Many companies did not initially consider whether they had the server infrastructure to let the entire workforce log in remotely at the same time. As a result, many experienced a plethora of teething problems: virtual office crashes, long waits to get onto servers, and sluggish internet connectivity and application performance, often rendering the shiny new PC frustrating and useless.

Most businesses had only a few outward-facing servers that could authenticate remote workers – a vital gateway as cyber attacks increased exponentially. That is not to mention that many business applications simply weren't designed to tolerate the latency introduced by working from home. What businesses discovered at that moment was that their plumbing was out of date.

Business and IT leaders quickly realised that to stay ahead of the curve in the hybrid working world, a renewed focus on building agile, adaptable and flexible IT infrastructure was critical. More importantly, it accelerated the inevitable digital transformation that would keep them competitive in a data-driven economy. It is now abundantly clear to businesses that they need IT infrastructure that can meet the demands of diverse workloads: deriving intelligent insights from data, deploying applications effectively, and enhancing data management and security.

Ripe for a digital revolution

Unsurprisingly, IDC noted an increase in purchases of server infrastructure to support changing workloads. It also forecasts that this uptick will be sustained beyond the pandemic. As the economy begins to reopen, business leaders are looking ahead. IT will continue to play a crucial role in 2021 and beyond – and next-generation servers have already laid the foundations for the digital revolution.

As we enter the zettabyte era, innovative new technologies are coming on stream, with 5G turbocharging IoT and putting edge computing to work. Exciting new services, improved day-to-day efficiencies and the transformation of our digital society will all be underpinned by resilient IT infrastructure. By embracing the technological innovations of next-generation servers, businesses can keep pace with the coming data deluge.

The next generation of server architecture promises more power with less heat, thanks to improved, directed airflow and direct liquid cooling, resulting in reduced operational costs and environmental impact. As we rebuild post-pandemic, manufacturers and customers alike are striving to meet ever more challenging sustainability goals. With this in mind, environmentally responsible design is imperative for the servers of tomorrow: chassis designed for adaptive cooling and more efficient power consumption will be critical, improving energy efficiency generation over generation.

The most notable evolution is the configuration of these next-generation servers around more specific organisational needs. Unlike clunky and often unstable legacy infrastructure, the infrastructure of tomorrow will be sturdier and more modular. In this streamlined, modular form it can be more easily tailored to business needs, which translates into real cost savings as businesses pay only for what they use.

Resolving the problem of the future, today

Tomorrow's IT challenges will centre on response times and latency as edge and 5G technologies go mainstream. As businesses develop new and innovative services that exploit supercharged connectivity and real-time analytics, staying on top of these challenges will give them a competitive edge. In retail, for example, automation will power new virtual security guards, and even the slightest delay in the data relay could result in financial loss.

Similarly, in the smart cities of tomorrow the network must be responsive. With city-centre traffic lights controlled by AI-powered cameras that monitor pedestrians, a delay in data transfer could cost the life of an elderly pedestrian who has fallen in the road. The stakes are far higher in a 5G-enabled world. As our reliance on technology deepens, the margins for error narrow, placing greater emphasis on the efficiency of those critical underpinning technologies.
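To make the point concrete, here is a minimal, purely illustrative latency-budget sketch comparing processing at the network edge with a round trip to a distant data centre. Every figure is an assumed placeholder, not a measurement of any real network or product.

```python
# Illustrative latency budget for an edge scenario like the traffic-light example
# above. All figures are assumed, round-number placeholders for illustration.
REACTION_BUDGET_MS = 100  # assumed time available to react safely

edge_path_ms = {
    "camera to roadside edge node": 5,
    "AI inference at the edge": 30,
    "signal controller actuation": 10,
}
cloud_path_ms = {
    "camera to distant data centre (round trip)": 80,
    "AI inference in the cloud": 30,
    "signal controller actuation": 10,
}

for label, path in (("edge", edge_path_ms), ("cloud", cloud_path_ms)):
    total = sum(path.values())
    verdict = "within" if total <= REACTION_BUDGET_MS else "over"
    print(f"{label} path: {total} ms – {verdict} the {REACTION_BUDGET_MS} ms budget")
```

Under these assumed figures the edge path comfortably fits the budget while the round trip to a distant data centre does not, which is the essence of the case for edge computing.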

Fully enabling the hybrid work model today is just a stepping stone towards more fluid, tech-enabled lives. A work Zoom call from an automated vehicle en route to an intelligent transport hub is a highly probable vision of our future, but it will require incredible amounts of compute and seamless data transfer to make it possible. These glossy snapshots need super servers to come to life, which makes IT plumbing that glistens with next-gen innovation essential. Without exemplary server architecture, we risk stalling future tech advances and the human progress they enable.
