As recently as two years ago, one of the central challenges of Big Data was storage, especially given the exponential growth of unstructured data that does not fit into traditional databases (e.g., video feeds, PowerPoint presentations). Indeed, petabytes and exabytes of data exist in science, technology, commerce, national defense, telecommunications, and other fields. Today, however, the competitive landscape is very different. Proper storage is merely a precondition to finding the real jewels in Big Data: turning massive data streams into knowledge, and thereby into actionable intelligence, in real time as events unfold.
The following five steps are imperative to master Big Data and drive business growth.
1. Infer, Infer, Infer. Understand that not all Big Data is useful data. As the renowned AT&T Bell Telephone Laboratories statistician John Tukey put it, “The data may not contain the answer. The combination of some data and an aching desire for an answer does not ensure that a reasonable answer can be extracted from a given body of data.”
In an era of data-centric science, advanced analytics now permit inferences from granular data. Inferences transform data into knowledge, which in turn yields greater process transparency and improvement. When evaluating whether to make analytics part of your data strategy, remember that actionable knowledge is not inherent in data per se; it must be extracted according to established rules and algorithms. While extraction and inference require the type of core platform described below, this should not be overly complicated. As the National Research Council of the National Academies states, even “naive users” should be able to “carry out massive data analysis without a full understanding of systems and statistical uses.” See Frontiers in Massive Data Analysis (National Academy of Sciences 2013). Data scientists play an indispensable role in today’s corporation, but business line executives should not have to rely on them to run analytics and make the inferences on which decisions rest. As McKinsey puts it, “sophisticated analytics solutions . . . must be embedded in frontline tools so simple and engaging that managers and frontline employees will be eager to use them daily.” Mobilizing Your C-Suite For Big Data Analytics (McKinsey & Company 2013).
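To make the idea of rule-based extraction concrete, here is a minimal, hypothetical sketch (not drawn from any vendor's product) of how business rules can live in code, so that a frontline manager sees actionable flags rather than raw records. The shipment fields and thresholds are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class Shipment:
    shipment_id: str
    hours_late: float      # delay versus the promised delivery window
    temperature_c: float   # latest in-transit sensor reading

# Hypothetical business rules: each one maps a raw record to an
# actionable flag, so the inference is encoded once, not re-derived
# by an analyst for every decision.
RULES = [
    ("ESCALATE_DELAY", lambda s: s.hours_late > 24),
    ("COLD_CHAIN_RISK", lambda s: s.temperature_c > 8.0),
]

def infer_actions(shipment):
    """Return the list of action flags a shipment triggers."""
    return [name for name, rule in RULES if rule(shipment)]

late_and_warm = Shipment("SH-1042", hours_late=36.0, temperature_c=9.5)
print(infer_actions(late_and_warm))  # -> ['ESCALATE_DELAY', 'COLD_CHAIN_RISK']
```

The point of the sketch is the separation of concerns: the rules (the "established rules and algorithms" above) are maintained by analysts, while the output is plain enough for a "naive user" to act on.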
2. Empower a C-Level Data and Predictive Analytics Champion. With big data analytics changing rapidly and straining information structures, corporations and governments need what McKinsey calls “executive horsepower” or “top-management muscle” behind their data initiatives. Id. Accordingly, a C-level officer (e.g., a Chief Data Officer, CTO, or Chief Analytics Officer) with both a supply chain and an analytics background must have the mandate to lead model analytics centers. To succeed, analysts with deep data experience need a clear strategy with defined initiatives to achieve business results. A forward-thinking analytics strategy thus needs to take shape at the business unit level. Why? First, priorities will differ by business unit; the treatment of data in one business unit may have little utility in another. Second, management priorities have to reinforce functional-level goals with targets and metrics. A C-level executive who can work with business line managers and still champion analytics in the C-suite is a must.
3. Assess And Modify Your Supply Chain In A Multidimensional Global Context. Do not examine your supply chain without first considering logistics at a macro level. According to the World Economic Forum’s Outlook on the Logistics & Supply Chain Industry (2013), “ratios of trade to GDP for the world as a whole have increased from 39% in 1990 to 59% in 2012.” This change is in large part the result of a “targeted and concerted effort by industry and governments to increase economic growth and jobs.”
How does your supply chain fit into this larger context? Consider this: a corporation that fails to maximize the knowledge in its data, and thereby contributes to unnecessary logistics costs, imposes upon itself and others (as well as on international trade) what amounts to an inefficiency-based tax. How are corporations faring on other fronts? They spend an astonishing average of eight percent of net sales on transportation, warehousing, customer service, administration, and inventory carrying costs. Yet many do not have a comprehensive view of their data, let alone of their upstream or downstream logistics functions. See Supply Chain Logistics As A Driver of Business Strategy and Profitability (C.H. Robinson 2013).
Flying blind has transformative impacts, and not the kind you want. What effect is your logistics framework having on your company? On customer loyalty? How can you most effectively leverage your data, both past and present, and what technology do you need to do so? And while an analysis of your supply chain will ultimately include your relationships with parties such as customers, manufacturers, providers, and retailers, it should begin with an inward-facing assessment of key assets.
4. Give Your Data Time-Critical Situational Awareness. In most organizations, data must be pulled from disparate and distributed sources and then processed to yield actionable intelligence. Analytics help a business line identify potential points of improvement. Corporations need to make changes not only in real time as events unfold, but also within the constraints posed by the increasingly distributed nature of modern data sets.
Current supply chain management must contend with multi-dimensional data that includes temporal and geospatial elements. Temporal data arrives from sources such as the Internet, speech and video feeds, real-time satellite imaging, and ground-based sensors. Such inputs can be difficult to analyze because the sources that make up the data stream have different latencies. Moreover, the amount of temporal data is growing exponentially. Geospatial data, on the other hand, tracks location, whether that of a storm, a car, or a tornado that may render certain highways impassable to your trucks, demanding quick redirection to avoid lost time and equipment damage. For shippers, both elements come into play: it is useful to know the location of ships, containers, and even packages in real time and/or two days prior, in order to see whether interim movement is unusual and requires action. Coupled with temporal data, a logistics analyst can make informed decisions as events in the supply chain unfold.
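The "interim movement" check described above can be sketched in a few lines. This is a simplified, hypothetical illustration: it assumes a container reports timestamped GPS fixes, uses the standard haversine formula for great-circle distance, and flags a shipment whose average speed over the window is implausibly low (the 100 km/day threshold is an invented example, not an industry figure):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def movement_is_unusual(track, min_km_per_day=100.0):
    """Flag a shipment whose interim movement is implausibly small
    (e.g., a container stalled in port) given its timestamps.

    `track` is a list of (unix_seconds, lat, lon) fixes, oldest first.
    """
    (t0, lat0, lon0) = track[0]
    (t1, lat1, lon1) = track[-1]
    days = (t1 - t0) / 86_400
    if days <= 0:
        return False
    km_per_day = haversine_km(lat0, lon0, lat1, lon1) / days
    return km_per_day < min_km_per_day

# A container that has barely moved in two days near Rotterdam:
stalled = [(0, 51.95, 4.14), (172_800, 51.96, 4.15)]
print(movement_is_unusual(stalled))  # -> True
```

Comparing each fix against the position "two days prior," as the text suggests, is exactly this kind of temporal-plus-geospatial join; a production system would also net out planned dwell times at ports and warehouses.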
5. Rely On A Core Platform That Creates Derivative Intelligence And Knowledge In Real Time.
Building a robust supply chain management platform from scratch or by combining point solutions is nearly impossible. From the perspective of cost alone, it is much more effective to partner with a third-party cloud-based solution provider. Your criteria when you choose a platform should be stringent. According to Tim Fleischer, CEO of TransVoyant, a technology and services company that enables sub-second operational decisions and support, criteria should include:
First, no data latency: you need to see your assets in real time, in motion, as events unfold. This is critical to actionable intelligence. Second, the platform must be multi-dimensional. Failure to capture temporal and geospatial data will leave even the savviest company flat-footed. Third, your core platform must allow you to visualize your data assets in multiple dimensions so that you can see what is happening in real time. Fourth, it must be flexible enough to accommodate the different supply chains in your organization; shipping auto parts may bear little resemblance to shipping Apple computers. Finally, the platform must yield derivative intelligence that will become your company’s intellectual property and thus its comparative advantage. The same applies to risk, such as controlling your days sales outstanding (DSOs), another nugget to be gleaned from your knowledge-filled data.
Supply chain management should take place on a platform that resides in a cloud such as Amazon Web Services (AWS), recently predicted to be worth $50 billion by 2015. See Tiernan Ray, Amazon’s AWS A $50 Billion Value (Barron’s Nov. 18, 2013). Cloud computing is not going anywhere, and the security of such massive vendors is appropriately robust. The cloud offers the extraordinary processing power made possible by distributed computing; statistical inferences can now turn data into actionable intelligence that supports reasoned decisions. Moreover, companies can scale as they wish and absorb only the marginal costs of their expansion.