Big Data may finally have arrived. Not “arrived” in the sense that everyone is swimming in data lakes and discovering actionable insights and other buzzwords. After all, most companies, including yours, are still baffled by Big Data and how to derive value from it.
But for the first time in years, the Big Data fence-sitters have decided to get into the action. According to recent Gartner data, 73% of enterprises are now investing in or planning Big Data projects, suggesting that too much is at stake to sit it out.
The trick now is to learn how to optimize those Big Data projects so they can fail, fail and fail again—and yet produce useful lessons for improvement with each iteration.
More Companies Jumping Into Big Data
While some signs, like a recent tweet from Gartner analyst Nick Heudecker, point to a slowdown in Big Data-land, other data suggests the opposite. Some of it, ironically, comes from Gartner itself.
For example, for the last few years Gartner has been asking survey respondents, "Which of the five stages best describes your organization's stage of Big Data adoption?" From 2012 to 2013, the number of naysayers remained roughly constant:
Source: Gartner, 2013
But this week Gartner released its newest survey data, and the percentage of respondents declaring they have “No plans at this time” to embrace Big Data declined considerably:
Source: Gartner, 2014
That's a seven-point drop in the "no plans" contingent, swelling the ranks of those investing in or planning Big Data projects to 73%, up from 64% in 2013.
That’s a big deal.
Learning To Try
Of course, many organizations continue to struggle to put their data to good use, which is why a mere 13% of organizations have actually rolled out Big Data projects. That’s a nice leap from 2013, but still indicates that technology vendors haven’t done nearly enough to simplify their products and that many organizations have the wrong approach to Big Data to begin with.
The gap between “want to work with Big Data” and “actually work with Big Data” is also captured in this 451 Research chart:
Source: 451 Research
Part of the problem is that we've confused what Big Data actually means (volume is rarely the most important problem to solve; variety of data is) and that we treat it as a discrete project rather than as a core component of a company's culture.
Cloudera co-founder Mike Olson nails this in a recent interview with Bosch’s Internet of Things group:
We talk to a lot of people who are fascinated by the technology of [Big Data]. They are excited about Big Data as Big Data. Those are bad people for us to work with, because they are not fundamentally driven by a business problem. It’s important when you start thinking about [Big Data] to think about why it matters…. The “shiny object syndrome” of engineers who want to play with new technology—I totally get that, I am one of those guys, but those projects generally fail because they don’t have clear success criteria.
The key, as I’ve written, is to set up an architecture of experimentation. This involves a heavy reliance on open-source software, cloud-based hardware and a multi-faceted team that understands your business and the right questions to ask of your data.
It’s clear that many organizations don’t follow this practice, or we wouldn’t see nearly half of CIOs surveyed by Deloitte saying they have inadequate budget to fund innovation. Innovation isn’t a matter of big budgets; it’s a matter of little iterations.
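The "little iterations" idea can be made concrete. A minimal Python sketch of an experimentation loop follows; the `Experiment` class, its fields, and the example hypotheses are all hypothetical illustrations, not any vendor's API. The point it demonstrates is Olson's: each experiment declares its success criteria up front, and even a failed run produces a recorded lesson for the next iteration.

```python
from dataclasses import dataclass, field
from typing import Callable, List, Tuple

@dataclass
class Experiment:
    """One small, cheap Big Data experiment with success criteria set in advance."""
    hypothesis: str
    run: Callable[[], float]   # executes the experiment, returns the measured metric
    success_threshold: float   # decided *before* the experiment runs
    lessons: List[str] = field(default_factory=list)

def iterate(experiments: List[Experiment]) -> List[Tuple[str, bool, List[str]]]:
    """Run each experiment; failures still yield lessons for the next round."""
    results = []
    for exp in experiments:
        metric = exp.run()
        succeeded = metric >= exp.success_threshold
        if not succeeded:
            exp.lessons.append(
                f"'{exp.hypothesis}' missed target: {metric:.2f} < {exp.success_threshold}"
            )
        results.append((exp.hypothesis, succeeded, exp.lessons))
    return results

# Two cheap iterations: one fails but leaves a lesson behind, one succeeds.
experiments = [
    Experiment("Joining clickstream to CRM data lifts targeting", lambda: 0.4, 0.6),
    Experiment("Data variety predicts churn better than volume", lambda: 0.7, 0.6),
]
outcomes = iterate(experiments)
```

The design choice worth noting is that the threshold lives on the experiment object itself, so "did it work?" is never decided after the fact, which is exactly the failure mode Olson describes with shiny-object projects.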
By embracing this more agile approach to Big Data innovation, more organizations will discover how to turn Big Data tire-kicking into Big Data success.
Lead image courtesy of Shutterstock