What is data gravity?
The term data gravity was coined in 2010 by the then VP of engineering at GE Digital. Data gravity refers to the mutual attraction between data and applications: much like physical gravity, as a body of data grows in mass it pulls applications and services towards it.
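The gravity analogy can be made concrete with a toy calculation. This is only an illustrative sketch, not the coiner's actual formula: it assumes a dataset's size acts as "mass" and network latency acts as "distance", so attraction falls off with the square of the distance, mirroring Newton's law.

```python
def data_gravity(data_mass_gb: float, latency_ms: float) -> float:
    """Toy attraction score between a dataset and an application:
    larger, closer data exerts more 'pull' (mass / distance squared)."""
    return data_mass_gb / latency_ms ** 2

# A large dataset close by pulls far harder than a small remote one.
local_pull = data_gravity(10_000, 2)   # 10 TB of data, 2 ms away
remote_pull = data_gravity(100, 50)    # 100 GB of data, 50 ms away
print(local_pull > remote_pull)        # True
```

Under this toy model, applications naturally end up co-located with the biggest datasets, which is the behaviour the article describes.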
In our digitised world, data is the new currency of economies. On its own, however, raw data lacks value. When it is aggregated through an application, insights emerge and analysis becomes possible. Those analytics then inform the strategies that enterprises and organisations rely on to develop roadmaps for future decisions.
Data gravity significance
Data gravity matters to businesses and service providers because it shapes a company's ability to innovate, understand customer experiences and deliver services and products that best serve consumer interests. One industry index that measures the creation, aggregation and private exchange of enterprise data predicts that data gravity will intensify sharply through to 2024, driven by the global pandemic and the resulting push towards commercial digitisation.
Data sources and storage
Data generated by mobile phones, smart devices and the IoT has vastly increased the amount being stored. As businesses digitise and move away from traditional onsite data storage facilities to cloud and hybrid storage solutions, the processing capacity required to aggregate so much information has risen dramatically. Dell's recent Digital Transformation Index study reported that eight out of 10 enterprises brought forward their digitisation upgrades in 2020.
Aon's VP of Core Infrastructure Services explained, “Understanding data gravity and its impact on our IT infrastructure is a difference-maker for our operations and will only become more important as data continues to serve as the currency of the digital economy. As enterprises become more data-intensive, there is a compounding effect on business points of presence, regulatory oversight and increased complexity for compliance and data privacy that IT leaders are now being forced to solve.”
Data gravity challenges
Not enough insight is being generated from the vast amounts of data collected. A big data load that goes unused because of its sheer volume slows innovation, degrades customer and employee experiences and drives up costs. It also creates information silos, slow decision-making, security issues, compliance problems and more.
Essentially, the more data you have, the harder it is to process and store. A Principal Research Analyst at 451 Research, part of S&P Global Market Intelligence, says, “Data gravity is the idea that data is an anchor that is often hard to move, especially as data volumes grow. If that growth takes place in public or private clouds that are not easily accessible by the enterprise using them, the full value of that data can't be realised, and the enterprise will be trapped into spending exorbitant sums to free it.”
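The "exorbitant sums" point can be illustrated with back-of-envelope arithmetic. Cloud providers typically charge a per-gigabyte egress fee to move data out; the rate below is an assumed placeholder, not any provider's real price list.

```python
def egress_cost(dataset_tb: float, price_per_gb: float = 0.09) -> float:
    """Rough one-off cost (USD) to move a dataset out of a cloud,
    at an assumed per-GB egress rate (0.09 USD/GB is hypothetical)."""
    return dataset_tb * 1024 * price_per_gb

# Even a mid-sized 500 TB estate costs tens of thousands of dollars
# to relocate, before accounting for re-engineering work.
print(f"${egress_cost(500):,.0f}")
```

The larger the dataset grows in place, the higher this exit cost climbs, which is exactly the anchoring effect the analyst describes.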
Businesses are therefore turning to ever more powerful computing solutions to manage the heavy information loads. Data centre hubs and managed cloud services, as well as developments in AI, ML and applications, are upgrading and expanding to accommodate the sheer volume of information that needs to be stored and processed. The storage hubs themselves are also adapting to the new data-driven climate. The international research and consultancy firm Gartner predicts that by 2022, more than 50% of enterprise-generated data will be processed outside the cloud. Information exchanges will need to expand, says Gartner, predicting that 60% of IT hardware and infrastructure globally will consist of shared and privately managed data centres.
Digital Realty’s Chief Technology Officer adds, “Most enterprises and service providers are just at the beginning stages of understanding data gravity's potential impact on their innovation, customer experience, and profitability. But they need to be designing for it now. The study is designed to give CIOs, chief architects, and infrastructure leaders insight into the phenomena causing architecture constraints as well as a blueprint for addressing them.”