Someone made a lot of new friends over the holidays. And by a lot, we’re talking about millions.
Her name is Alexa, and along with her acquaintances (perhaps frenemies?) Cortana and Siri, she’s about to become as much a part of the daily life of American households as Alfred the butler is at Wayne Manor. Except that Alfred, for all his talents, can’t tell Batman exactly what time the cable repairman will arrive this afternoon, or the current balance in his bank account (not that billionaire Bruce Wayne would care).
The implications of 2017, the year virtual assistants really arrived, will be immense for business. There isn’t a minute to lose; at least a dozen industries need to be looking at how these technologies will change their value propositions, their business models, and their customer relationships.
But more about that in a minute. Why has 2017 become so important in the development of AI-enabled devices?
Three hurdles virtual assistants just vaulted
First, as the Amazon Echo shortage over the holidays proved, people are finally getting used to the idea of virtual assistants. As a source of information, music, home automation, shopping, and more, these technologies are making inroads into daily life. Industry experts are predicting that sales of AI-enabled devices will quadruple this year. That’s a lot of assistants — and the sales pace will surely accelerate.
Second, our exposure to AI personal assistants is getting broader. Homes, smartphones, and PCs set the phenomenon in motion, but soon our automobiles, schools, public venues, and even work spaces will be equipped with assistants that allow us to complete tasks faster and more conveniently. Just as we don’t talk to only one person (and not every virtual assistant can be expert in everything), we will have different helpers, with different names and personalities, in different places.
What’s more, virtual assistants will soon begin talking to each other. For example, there’ll be no need to call for an autonomous Uber or Lyft vehicle using our home device, followed by directions once we get into the car. Instead, we’ll simply tell Alexa or Cortana where we want to go. When the car arrives, the destination will be set, along with our favorite route and other personal preferences.
Third, and perhaps most important, a tipping point is occurring in the development of AI personal assistant technology. Natural language processing, or NLP, is at the very heart of artificial intelligence — and it’s getting more sophisticated. Since the early days of computational linguistics, NLP has focused on syntax and other formal language rules that can be described and codified. This meant that our voice interactions had to be formal as well. Informal language, with its slang, jargon, and imprecise structure, was not well understood.
A lot of progress has been made in this area. For the first time, NLP is being integrated with neural networks that learn over time. The implications are huge; now, queries no longer need to be expressed exclusively with complete sentences or keywords. After trillions of interactions, machine learning technologies are creating a new world in which our spoken interactions with computing devices can be totally human, with all their quirks and inconsistencies.
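The shift described above can be illustrated with a deliberately simplified sketch. The code below is not how any production assistant works; real systems use neural models trained on enormous datasets, and the intent names and example phrases here are invented for illustration. It only contrasts an old-style rigid rule (the query must contain an exact key phrase) with a flexible similarity-based matcher that still recognizes an informal, slangy query.

```python
# Toy contrast between rigid keyword matching and flexible,
# similarity-based intent matching. Illustrative only: real assistants
# use neural networks trained on vast interaction data, and the intents
# and phrases below are hypothetical.

INTENTS = {
    "check_balance": "what is my bank account balance",
    "play_music": "play some music",
}

def rigid_match(query):
    """Old-style rule: the query must contain the exact key phrase."""
    for intent, phrase in INTENTS.items():
        if phrase in query.lower():
            return intent
    return None

def similarity_match(query):
    """Pick the intent whose example shares the most words (Jaccard overlap)."""
    tokens = set(query.lower().split())
    def score(phrase):
        ref = set(phrase.split())
        return len(tokens & ref) / len(tokens | ref)
    return max(INTENTS, key=lambda intent: score(INTENTS[intent]))

informal = "hey how much money is in my account"
print(rigid_match(informal))       # None: the rigid rule misses the slang
print(similarity_match(informal))  # check_balance
```

The rigid matcher fails the moment phrasing drifts from the scripted formula, while the similarity-based matcher degrades gracefully; a trained neural model extends that same idea across millions of phrasings, which is what makes fully informal spoken queries workable.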
This means our ability to communicate informally with devices equipped with virtual assistants will improve dramatically. As our devices get to know us, they will not only anticipate our needs, but also develop something close to personality. Furthermore, our ability to communicate quickly and with nuance will improve. Speaking, rather than typing, will become the primary means by which we interact with all our devices, from smartphones and iPads to our PCs and even directly with televisions, automobiles, appliances, and much more. The use of keyboards and mice will decline significantly, if not vanish altogether. Screens will be used primarily to display the results of continual interactions with our assistants.
What this means for 2017 and beyond
All of this brings us back to where, and why, enterprises need to be on the move this year. If they haven’t already, customer-centric companies need to understand how to transfer their interactions to the world of AI-enabled devices.
Customer support and online sales are two categories currently on the front lines of the virtual assistant revolution. Enterprises that fulfill these functions via huge call centers have two options: maintain their expensive infrastructure or invest in the next generation of voice-enabled technologies. With price parity a fact of life in many industries, the winners will be those that can offer superior customer service in real time and at the lowest possible cost. AI-enabled computing, from virtual assistants to chatbots, is the answer.
Airlines, insurance companies, and e-tailers have already embraced the importance, value, and, yes, urgency inherent in virtual assistants. It’s time for utilities, financial institutions, health care providers, security firms, delivery services, and others to do the same.
If 2015 and 2016 cracked the door on AI-enabled devices, 2017 has kicked it wide open. A new age of computing has begun. Surpassing even the much-anticipated Internet of Things is a new concept: the AI of Things. Forward-thinking enterprises are already addressing the imperatives, and the potential, of the changes brought about by this exciting and fast-evolving technology.
Cortana, Alexa, and Siri are only the first in a long line of assistants who will be applying for work in this new era. They’re smart, focused, and ambitious — and to stay ahead, shrewd companies will want to hire them.
Jordi Torras is CEO and founder of Inbenta, an artificial intelligence technology company.