What does the death of Hadoop mean for big data?
WHEN businesses, organizations, and government agencies embraced big data, they needed robust infrastructure and software utilities to support the storage and processing of the massive amounts of data they would collect.
The Hadoop software framework, which facilitated distributed storage and processing of big data using the MapReduce programming model, served these ambitions well.
The modules in Hadoop were developed for computer clusters built from commodity hardware and eventually also found use on clusters of higher-end hardware.
Broader adoption of the open-source distributed storage technology, which grew out of research published by Google, did not come to pass, however, as enterprises began opting to move to the cloud and explore AI, including machine learning and deep learning, as part of their big data initiatives.
Worse, several big Hadoop-based solution providers that had been unprofitable for years were forced to merge to minimize losses, and one may be forced to shut down altogether.
However, the question remains whether the fate of these vendors is indicative only of the demise of Hadoop-powered solutions and other open-source data platforms, or of the death of big data as a whole. Was big data merely a fad or a passing interest of industries?
The short answer is: Not quite.
AI is transforming big data
Just last month, Google agreed to acquire Looker, a startup offering a unified platform for business intelligence, data applications, and embedded analytics. Once the acquisition is complete, Looker will be integrated into Google Cloud.
“A fundamental requirement for organizations wanting to transform themselves digitally is the need to store, manage, and analyze large quantities of data from a variety of sources,” said Thomas Kurian, CEO of Google Cloud, in a Google blog post.
“The addition of Looker to Google Cloud will help us offer customers a more complete analytics solution from ingesting data to visualizing results and integrating data and insights into their daily workflows,” he added.
In a separate press release, Google’s parent company Alphabet said, “The addition of Looker to Google Cloud will provide customers with a more comprehensive analytics solution — from ingesting and integrating data to gain insights, to embedded analytics and visualizations — enabling enterprises to leverage the power of analytics, machine learning, and AI.”
Beyond that, Salesforce also announced its intent to acquire data visualization and analytics leader Tableau, which, according to the popular CRM provider, will help them play a more significant role in driving digital transformation across industries.
“Companies of every size and industry are transforming how they do business in the digital age—customers and data are at the heart of those transformations,” explained Salesforce CEO Marc Benioff.
Regardless of which platform is being used, a significant aspect of digital transformation is figuring out how businesses can benefit from data and use it to drive decisions and actions.
While Google and Salesforce have been doing this since their inceptions, many other businesses, regardless of size and scale, are embarking on their digital journey.
And in doing so, they are also leveraging more sophisticated data science tools in the cloud and exploring advanced AI technologies such as machine learning and deep learning.
In other words, big data is not a passing fad that will die along with the Hadoop platforms. Far from it.
Big data has evolved and now overlaps with AI, thanks in part to maturing technology and a greater need to unlock its true potential to solve real business problems.