What you should expect from big data in 2016
Big data has truly progressed from being just a buzzword to being an essential component of many companies' IT infrastructure and business plans. How we store, analyze, and process big data is changing the way we do business, and the industry is in the midst of the biggest transformation in enterprise computing in years.
Organizations can now look for patterns that indicate current, or even future, behavior. And the acceleration in big data deployments is pointing to where the really big advances will be made in the near future.
But in a space that is constantly changing, with many innovative new technologies on the horizon, not to mention a particularly noisy marketplace, differentiating between what is hype and what is genuinely just around the corner can be challenging.
As such, I have outlined what I believe will be the five biggest trends in big data that will consistently permeate the news agenda this year.
A Converged Approach Enters the Mainstream
It has long been accepted best practice to keep operational and analytic systems separate in business applications, so that analytic workloads do not disrupt operational processing. Hybrid Transaction/Analytical Processing (HTAP) is a term Gartner coined in 2014 to describe a new generation of in-memory data platforms that can perform both online transaction processing (OLTP) and online analytical processing (OLAP) without requiring data duplication.
Gartner was giving a name to something already happening in the marketplace. In 2016, converged approaches will become more mainstream as leading organizations reap the benefits of combining production workloads with analytics in response to changing customer preferences, competitive pressures, and business conditions. This convergence speeds up the data-to-action cycle and removes much of the latency between analytical processing and tangible business impact.
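To make the access pattern concrete, here is a minimal sketch of what "no data duplication" means in practice: one store serves both the transactional write and the analytical query, with no ETL step in between. SQLite is used purely as a stand-in for an HTAP platform (a real system would be a distributed in-memory engine); the table and values are invented for illustration.

```python
import sqlite3

# Stand-in for an HTAP platform: a single store serves both workloads.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)"
)

# OLTP: record a transaction as it happens.
conn.execute(
    "INSERT INTO orders (customer, amount) VALUES (?, ?)", ("acme", 125.00)
)
conn.commit()

# OLAP: run an analytical aggregate against the same live data,
# with no ETL job or duplicate analytics copy in between.
for customer, total in conn.execute(
    "SELECT customer, SUM(amount) FROM orders GROUP BY customer"
):
    print(customer, total)
```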
Momentum Shifts From Centralized to Distributed Data
There has been plenty of back and forth between centralized and distributed workload models; like most things in tech, they tend to go in cycles. For a long time, big data solutions have been built around centralized data lakes that reduce data duplication, simplify management, and support a variety of applications, including 360-degree customer analysis.
In 2016, though, large organizations will increasingly move to distributed processing for big data to address the challenges of managing multiple devices, data centers, and global use cases across countless locations. That's not to mention potential changes to overseas data-security rules, with Safe Harbor 2.0 on the horizon. The continued growth of Internet of Things connected devices, fast networks, and edge processing will further dictate the deployment of distributed processing frameworks, supporting an increasingly diverse set of data sources feeding the network simultaneously.
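A minimal sketch of the edge-processing idea: each site aggregates its own raw data locally and ships only a compact summary to the center, rather than streaming everything to one data lake. The site names, readings, and summary fields are all invented for illustration.

```python
def summarize_at_edge(readings):
    """Aggregate raw readings locally at an edge site, so only a
    compact summary, not the raw stream, crosses the network."""
    return {
        "count": len(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

# Hypothetical sites and sensor values.
raw_by_site = {
    "frankfurt": [21.0, 22.4, 19.8],
    "singapore": [30.1, 29.7, 31.2],
}

# Each site processes its own data in place...
summaries = {site: summarize_at_edge(r) for site, r in raw_by_site.items()}

# ...and the central view is assembled from summaries, not raw data.
total_readings = sum(s["count"] for s in summaries.values())
print(summaries)
print("readings represented centrally:", total_readings)
```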
Storage (Particularly Flash) Available in Abundance
Advances in flash memory are driving new designs for storage products in the consumer, computer, and enterprise markets. As consumer demand for flash goes up, and costs inevitably come down, flash will be deployed more and more widely in big data systems. The optimal solutions, though, will use both flash and disk storage to support fast and dense configurations alike. Ultimately, organizations will no longer need to choose between the two: in 2016, a new generation of software-based storage that enables multi-temperature solutions will proliferate, giving them access to both.
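A toy sketch of what "multi-temperature" placement means: hot, frequently accessed data lands on flash, while cold, rarely touched data settles onto dense disk. The thresholds and tier names below are invented for illustration, not taken from any particular product.

```python
import time

# Made-up policy thresholds for illustration.
HOT_ACCESSES_PER_DAY = 100      # above this, data is "hot"
COLD_AGE_SECONDS = 30 * 86400   # untouched for 30 days means "cold"

def choose_tier(accesses_per_day, last_access_ts, now=None):
    """Pick a storage tier from the data's 'temperature'."""
    now = now or time.time()
    if accesses_per_day >= HOT_ACCESSES_PER_DAY:
        return "flash"  # fast tier for hot data
    if now - last_access_ts > COLD_AGE_SECONDS:
        return "disk"   # dense, cheap tier for cold data
    return "flash" if accesses_per_day > 10 else "disk"

print(choose_tier(500, time.time()))             # hot  -> flash
print(choose_tier(1, time.time() - 90 * 86400))  # cold -> disk
```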
"Shiny Object Syndrome" Gives Way to Increased Focus on Value
In 2016, the market will focus considerably less on the bells and whistles of the latest software product, and more on established solutions that have proven fundamental business value. New community-driven open source innovations will continue to turn heads, but this year, organizations will recognize the attraction of products that deliver tangible business impact over raw big data technologies that, while promising an exciting new way of working, really just cloud the issues at hand.
Quality Wins
Investors and organizations will turn away from more volatile big data technology providers that regularly change their model, unable to find one they can turn into a viable business. Instead, they will do business with more secure options: companies with both a proven business model and technological innovations that lead to improved operational efficiency and business outcomes.
Now more than ever, an organization's competitive stance relies on its ability to leverage data to drive business results. That's easier said than done when data is pouring in from every origin imaginable. Organizations with a converged data platform can take advantage of the widest variety of data services and processing tools in one place, and harness real-time insight from their streaming data. This translates into real-time views of their customers, products, and operations.
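As a final sketch, here is the shape of that "real-time view": a per-customer aggregate updated as each event arrives, rather than recomputed by a nightly batch job. The event generator stands in for a real streaming source such as a message bus, and the events themselves are invented.

```python
from collections import defaultdict

def event_stream():
    """Stand-in for a real streaming source; events are illustrative."""
    yield {"customer": "acme", "event": "purchase", "amount": 40.0}
    yield {"customer": "globex", "event": "view", "amount": 0.0}
    yield {"customer": "acme", "event": "purchase", "amount": 60.0}

# Maintain a continuously updated per-customer view as events arrive.
spend_by_customer = defaultdict(float)
for event in event_stream():
    if event["event"] == "purchase":
        spend_by_customer[event["customer"]] += event["amount"]
    # The "real-time view" after each event, not at end of day.
    print(dict(spend_by_customer))
```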
John Schroeder, CEO and Cofounder, MapR
Published under license from ITProPortal.com, a Net Communities Ltd Publication. All rights reserved.