Are enterprises sitting on a data mismanagement time bomb?

Data management, backup and recovery still rely on traditional systems which have, over time, become inadequate for the complexity of handling vast quantities of data

Ashish Gupta
Published 3 Aug 2017, 05:02 PM IST

With the digitization of traditional businesses across the world and the growth of new-age companies, every business is a digital business today. As a result, the volume of data each organization produces every hour is phenomenal, and the need to store and use this data is critical. However, data management, backup and recovery still rely on traditional systems which have, over time, become inadequate for the complexity of handling such vast quantities of data.

Data, the biggest asset of an organization, also becomes its biggest challenge if left inadequately managed. So, are enterprises sitting on a data mismanagement time bomb?

In the past, enterprise data and applications operated only in the enterprise data centre. Everything was neatly contained and could be managed within a single on-premises facility. Today, enterprise applications run across multiple infrastructure and technology platforms, which may vary even within the same data centre, as well as across different types of cloud computing facilities. Effectively, this fragments enterprise data across public, private and hybrid clouds.

This kind of data fragmentation is reminiscent of what happened with consumer data in the 2000s, the early days of the Internet. A powerful organizing principle emerged in the form of search engine technology. Does enterprise data need a similar organizing principle? The answer is yes.

If we were to rethink enterprise data management today, what would be the most important challenges that it should solve?

Security: Without doubt, such a data management solution needs to be highly secure. Data should never be lost regardless of hardware failures in servers, storage or network. The business data should be immune to malicious threats such as ransomware.

Scalability: The solution should be scalable. As digitization grows—with the adoption of technologies such as the Internet of Things (IoT) for tapping into data generated at every step of an organization’s operations/customer journey—the business data footprint grows exponentially. Any enterprise data management solution needs to scale up rapidly and infinitely to meet this growth.

Such a solution should also move at the furious pace at which the business grows. An organization's speed of decision-making depends on how quickly its data is available and, as such, data operations need to happen in seconds, not in minutes or hours.

Flexibility: Data management needs to be hyper-flexible: it should be possible to make data available across the entire organization, irrespective of where it is generated. Enterprises hate being locked in to a particular solution or technology by their vendors. They want the freedom to experiment, moving applications from one infrastructure stack to another without limitations.

In private clouds, this means being able to switch infrastructure vendors, such as moving from one hypervisor or one storage vendor to another. In public clouds, it means being able to change the cloud provider itself and move between the likes of Amazon Web Services (AWS), Microsoft Azure and Google Cloud.

Ease of use: Contemporary data management should be user-friendly. It should give chief information officers (CIOs) and their teams a single interface, offering a view of their data footprints and growth patterns, across the entire enterprise technology infrastructure.

These are fundamental needs that an enterprise's data plumbing must satisfy all at the same time as large production workloads move to the cloud. However, in the rush to meet their cloud and big data goals, CIOs often underestimate, if not overlook, the complexity of building out a robust data management fabric.

Over the past decade, the consumer Internet space has witnessed deep innovation. For cloud-based companies such as Google, Facebook and Amazon, technologies such as distributed file systems provide virtually unlimited, horizontally scalable compute, network and storage simply by adding more commodity hardware. Fault tolerance is built into the software layer, so individual hardware failures do not translate into data loss or downtime.

However, these technological advances have largely bypassed the enterprise data management space. Surprisingly, current enterprise data products remain unable to cater to the needs of large-scale data generation, management and recovery: they still rely on client-server architectures and vertically scaled, specialized hardware.

In summary, as digitization becomes more pervasive, there is an urgent need for organizations and governments to wake up to the serious challenge of large-scale data management. The current data management infrastructure is past its use-by date. Organizations need to seriously evaluate their data storage, management and recovery practices, and ask themselves how long it would take to get back on their feet if all their data were wiped out today.

Ashish Gupta is vice president of engineering and head of India R&D at Rubrik Inc.
