Bots. It’s all about the bots. And AI.
It seems that ever since Microsoft //Build this year, I can’t read a tech publication without seeing some combination of bots, AI, and machine learning. There’s certainly no denying that it’s a hot topic, and lots of developers are working on it. Those developers, and more importantly the consumers, all have strong opinions about the future of their experiences with these bots.
As Satya Nadella mentioned in his //Build keynote, the OS that powers these experiences becomes less important. Apps become less important. It’s really about “conversations as platform” driving new experiences. But what will make our future of intelligent bots successful? If we look at the challenges we have today with well-known products, it paints a clear picture of what should happen.
Undoubtedly, the closest things we have to the bot-centric future today are Siri, Cortana, and Alexa: the digital assistants that power the devices we interact with most. All have their strengths and weaknesses (I’m not even going to mention the frustration that happens when they don’t understand what you are asking for), but there are two glaring problems they all share:
As Satya mentioned, the experience needs to shift out of apps and into “conversations” instead. For this to happen, there needs to be tightly woven integration between the specific apps that developers write and the “conversation.” A great example is music apps. I use an iPhone (recently converted from many years of Windows Phone use), but I don’t use iTunes or Apple Music. There are a multitude of great streaming music providers, such as Spotify and Deezer. Personally, I use Deezer because I love the lossless audio it can provide, which is great when played over my Sonos system. But when I ask Siri to play me some music, she knows nothing about this app or subscription (Cortana was no better).
Similarly, I don’t use the default iOS mail or calendar applications, because Outlook for iOS is so amazing. But again, it falls apart with Siri integration: she knows nothing about my calendar or appointments. That said, Siri is aware of my Facebook events, so that integration has clearly been done right. Likewise, Cortana is starting to get deep insight into apps such as Uber.
On the desktop, Cortana is well aware of my calendar… and, unfortunately, everyone else’s. We use Gmail at Stackify, and when Cortana peeks into my calendar, she sees every calendar shared with me. I commonly get alerts about appointments that aren’t mine, which caused a lot of confusion one day last month when I showed up at my chiropractor’s office on the wrong day and time!
Clearly, both Apple and Microsoft need developers to buy into this vision.
One of the greatest things about Cortana is its presence on both phone and desktop. However, for users like me who have an iPhone in their pocket and a PC on the desk, it creates odd experiences, especially when combined with some of the app integration woes I described above.
Just like “real world” assistants, our digital assistants need to be able to have conversations not just with us, but with each other! An easy hand-off from one to the other doesn’t seem unreasonable. If Siri is aware of something that Cortana isn’t, they should talk, so that when I move from my phone to my PC I can continue the same conversations.
Where do we go from here?
If “conversation as a platform” is to drive our future experiences, then it’s quite clear that we need a universal language in which we can all have that conversation. Fortunately, we’re talking about software, which has a tendency to evolve into open standards and APIs, left in the capable hands of developers.
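As a thought experiment, such a shared standard might be nothing more than an interchange document that any assistant can emit and consume when handing a conversation off. The sketch below is purely hypothetical: every field name and the `build_handoff` helper are invented for illustration; no such cross-assistant format exists today.

```python
import json

def build_handoff(source, target, utterances, pending_intent):
    """Package shared conversation state as a JSON document (hypothetical schema)."""
    return json.dumps({
        "version": "0.1",                  # schema version of this made-up format
        "source_assistant": source,        # e.g. "siri"
        "target_assistant": target,        # e.g. "cortana"
        "transcript": utterances,          # prior turns, oldest first
        "pending_intent": pending_intent,  # what the user still wants done
    })

# Example: Siri hands an unfinished calendar query to Cortana on the desktop.
message = build_handoff(
    source="siri",
    target="cortana",
    utterances=["When is my next meeting?"],
    pending_intent={"action": "query_calendar", "range": "next"},
)
restored = json.loads(message)
print(restored["pending_intent"]["action"])  # query_calendar
```

Because the payload is plain JSON, either side could validate it against a published schema and simply ignore fields it doesn’t understand, which is how open formats usually stay interoperable across vendors.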
And I, for one, welcome our new Bot Overlords – as long as they can tell me when my next meeting is.
CTO / Stackify