Nobody needs to be told that data is becoming more complex. Every user realises that all forms of information are becoming richer, faster and more fine-grained. This, of course, means that information management itself becomes more complex – which is why we use data management platforms (such as databases) today.
Today’s sharpest unified data management platforms have a job to do. They have to act as a consolidated computing system responsible for accumulating, assembling, integrating and then subsequently managing and analyzing increasingly large (often unstructured) data sets, usually from disparate, disconnected sources. What this all means, if these systems work efficiently (and if we buy into the world according to the big data vendors), is that information processes run outside of a data management platform ultimately have little or no context today.
The birth of complexity
Oracle, IBM, SAP, Salesforce and others are all busy using the fall convention season to refine and augment their key offerings in this space. VP and global head of developer relations at SAP Thomas Grassl has explained that data complexity and the development of predictive analytics often come about when trying to develop around the information bottlenecks that we come up against. This means that there’s a lot of programmer alchemy involved and, essentially, that the algorithms themselves are becoming more complex.
Why data is getting more complicated
Using a data management platform to look after names, phone numbers and address information isn’t good enough anymore – we need to be able to bring additional analytical layers to bear upon our data including:
- Geospatial analysis – we need to know where users are on the planet and factor that into our data analysis if location access privileges are open.
- Voice analysis – voice data is increasingly captured, and natural language processing technology from companies like Nuance is helping to translate voice data into “written” data for onward analysis such as…
- Text analysis – not as simple as it sounds, textual information analysis might cover a) plain old text, b) text from voice, c) optical character recognition or d) other more granular analysis such as…
- Sentiment analysis – data management platforms are being constructed with built-in libraries (in different human languages) that can detect sentiment. I can Tweet that I hate a particular hamburger chain, but if I Tweet that “I wish Five Guys had more A1 sauce” then I am expressing a hope and a specific product-related desire. At an aggregated level, if many users say the same kind of thing, then this starts to matter.
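To make the sentiment point concrete, here is a minimal, rule-based sketch of the distinction between a negative opinion and a product-related wish. The keyword lists and function name are illustrative placeholders, not any vendor’s actual sentiment library, which would use far richer language models:

```python
# Hypothetical rule-based classifier: distinguishes a complaint from a wish.
# Keyword lists are illustrative only, not a real sentiment lexicon.
NEGATIVE_WORDS = {"hate", "awful", "terrible"}
DESIRE_CUES = {"i wish", "i hope", "if only"}

def classify(tweet: str) -> str:
    text = tweet.lower()
    # A desire cue signals a specific, actionable product wish.
    if any(cue in text for cue in DESIRE_CUES):
        return "desire"
    # Otherwise fall back to simple negative-word matching.
    if any(word in text.split() for word in NEGATIVE_WORDS):
        return "negative"
    return "neutral"

print(classify("I wish Five Guys had more A1 sauce"))  # desire
print(classify("I hate that hamburger chain"))         # negative
```

Aggregating the “desire” bucket across many users is what turns individual tweets into product intelligence.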
More industrial reasons
Industry has plenty of reasons and use cases for making data analytics more complex. Data is no longer just data; we have hot data, cold data and archived data. Hot data is needed now, often in real time. Cold data is needed a little bit later, and it may not matter how much later, as long as it comes after hot data. Archived data is neither hot nor cold, but it was once at least both. This is where SAP will talk about ‘dynamic tiering’ in the SAP HANA service pack 9 (SPS9) release this month. If we can tier data then we can put hot in one place, cold in another and so on. The firm says this is a means to cost-effectively manage very large data sets and provide streamlined and transparent access to critical data.
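The tiering idea can be sketched as a simple routing rule based on how recently data was touched. The thresholds and tier names below are assumptions for illustration; they are not SAP HANA’s actual dynamic tiering mechanism, which operates inside the database itself:

```python
# Hedged sketch: route records to storage tiers by access age.
# Window sizes are illustrative assumptions, not HANA defaults.
from datetime import datetime, timedelta

HOT_WINDOW = timedelta(days=7)      # recent data stays in fast (in-memory) storage
COLD_WINDOW = timedelta(days=365)   # older data moves to cheaper disk storage

def tier_for(last_accessed: datetime, now: datetime) -> str:
    age = now - last_accessed
    if age <= HOT_WINDOW:
        return "hot"      # needed now, often in real time
    if age <= COLD_WINDOW:
        return "cold"     # needed later, after hot data
    return "archive"      # was once hot and cold, now long-term store

now = datetime(2014, 11, 1)
print(tier_for(datetime(2014, 10, 30), now))  # hot
print(tier_for(datetime(2014, 6, 1), now))    # cold
print(tier_for(datetime(2010, 1, 1), now))    # archive
```

The payoff is cost: only the hot tier needs expensive fast storage, while cold and archived data sit on progressively cheaper media with transparent access.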
SAP’s Grassl has also pointed to other ‘complexity reasons’ for developments in HANA SPS9, “Multi-tenancy aims to simplify provisioning and management of multiple database workloads on premise and in the cloud. [An] integrated smart data streaming option to process and analyze massive amounts of data in real time, allowing customers to make critical business decisions on the fly. High-performance, ACID-compliant graph storage and engine capabilities, which can be used to process enterprise data or highly interconnected social network and supply chain data.”
Machine learning and tomorrowland
These technologies will lead us onwards to looking at machine learning. Artificial Intelligence (AI) is starting to sound a lot less artificial if we start to understand the heuristics involved behind the way we deal with information. Heuristics are methods for problem solving based on learned experiences and logical guesswork rather than pre-established mathematical formulas. Using many of the methods, channels and tools discussed here, we are reaching a point where computers can start to make decisions that look after us. Yes, we already have aircraft autopilots that operate within a comparatively (to our brains) defined physical world, but we are talking about a deeper and more complex level of human understanding through analytics. The world is getting better, probably.