As someone with a degree in English, nothing delights me more than people who can effortlessly apply literary canon to technology.
Susan Etlinger, an industry analyst at Altimeter Group, a consulting firm focusing on digital disruption, spoke late last year at TED in San Francisco about “the implications of a data-rich world” and how we can “use it to our best advantage.” Citing cultural critic Neil Postman’s book Amusing Ourselves to Death (1985), she referenced the issues of irrelevance and narcissism in Aldous Huxley’s Brave New World and surveillance and power in George Orwell’s 1984. Bravo!
The report adapted from her speech addresses two key questions, one rather prosaic and the other less so: how we extract insight from data and, more important, how we use data in such a way as to earn and protect trust (italics mine). The latter encompasses multiple constituencies: customers, constituents, patients, and partners.
As you might remember from college (or high school), Huxley and Orwell wrote about worlds gone awry in terms of trust and control. Looking at big data in the context of what’s negatively possible – even though such outcomes were posited more than a half-century ago – should never get old. A lot is at stake, after all. If people don’t trust big data – and by people I mean the technologists doing the tracking and the consumers being tracked – all the investments we’re making become useless.
Etlinger offers several terrific insights in her talk and paper, including a list of crucial processes: identifying appropriate sources, extracting the data, detecting and interpreting the language used, and filtering for spam. Not the least of these involves linguistic variation – determining relevance (e.g., the chasm between “trade gap” and “Gap store”), contextual analysis, and classification.
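To make the relevance problem concrete, here is a minimal sketch – my own illustration, not anything from Etlinger's report – of the kind of contextual filtering involved: deciding whether a sentence mentioning “gap” refers to the economics term or the retailer, based on surrounding cue words. The cue lists and function name are hypothetical.

```python
# Hypothetical illustration of context-based relevance filtering:
# classify a mention of "gap" as economics-related or retail-related
# by counting cue words that appear around it.

ECON_CUES = {"trade", "deficit", "export", "import", "economy"}
RETAIL_CUES = {"store", "jeans", "mall", "shopping", "sale"}

def classify_gap_mention(text: str) -> str:
    """Return 'economics', 'retail', or 'unknown' for a sentence mentioning 'gap'."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    econ_hits = len(words & ECON_CUES)
    retail_hits = len(words & RETAIL_CUES)
    if econ_hits > retail_hits:
        return "economics"
    if retail_hits > econ_hits:
        return "retail"
    return "unknown"

print(classify_gap_mention("The trade gap widened last quarter."))    # economics
print(classify_gap_mention("The Gap store at the mall has a sale."))  # retail
```

Real sentiment and social-analytics pipelines use far richer models, of course; the point is only that even this toy version needs human judgment about context, which is exactly the linguistic nuance Etlinger highlights.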
Most important, she lays out five key tactics for anyone tackling big data. The first two are fundamental for any new technology: defining the data strategy and operating model, and updating the analytics methodology to reflect new data realities.
But the last three are the ones I hope data scientists, CEOs, and anyone else in big data take to heart:
Seek out critical thinking and diverse skill sets. “To avoid misinterpretation means valuing not only math and engineering but also social sciences and humanities. … Without a balance of critical thinking, business knowledge and smart analytics tools, we’re in danger of making the wrong decision much more efficiently, quickly, and with far greater impact.”
Insist on ethical data use and transparent disclosure. “As organizations become more data centric, for their own benefit as well as their customers’, they must also look closely at the affirmative and passive decisions they make about [how they use and analyze data] and how transparently they disclose these actions.”
Reward and reinforce humility and learning. “The world is just starting to come to terms with the impact of data ubiquity, [the most difficult impact of which] is that it radically undermines traditional methods of analysis and laughs at our desire for certainty. [Enterprises must develop] an appetite for continuous learning, whether the goal is to sell a pair of shoes or to help prevent cancer.”
These are the ideas, after all, that would most resonate with the social critics Orwell and Huxley, and make them feel that their visions of an inhumane future might yet be forestalled.
This article was written by Howard Baldwin from Forbes and was legally licensed through the NewsCred publisher network.