Siri may be the first mainstream digital assistant, but it’s still the butt of many jokes now that it’s been bested by just about all of Apple’s biggest competitors.
It’s fair criticism, too. Siri’s intelligence and responsiveness often lag behind Amazon’s Alexa or Google Now. And the competition shows no sign of giving up its lead. Google made a bunch of splashy AI-related announcements at its I/O conference last month. And Amazon CEO Jeff Bezos said at the Recode conference this week that more than 1,000 employees are working on Alexa and the Echo speaker.
So, how can Apple take Siri to the next level?
One way is to open it up to third-party developers, the way Amazon did with the Echo/Alexa. (“Siri, what’s my Bank of America balance?” or “Siri, call an Uber.” And so on.) It sounds like that will likely happen this year, as Amir Efrati of The Information reported last week. We’ll know for sure on June 13 during Apple’s WWDC event.
The other way is boosting Siri’s intelligence and making it better able to understand context. Right now, interacting with Siri happens one session at a time, and you often have to look at your iPhone to confirm it’s doing what you want it to.
For example, this is what it looks like when you want to send a text message using Siri:
Even though I just told Siri I want to send my colleague Cadie a text message that says “Hi, how are you,” it still makes me confirm the action by looking at the screen and either tapping “Send” or saying “yes.” That’s not much easier than texting the old-fashioned way. It’s also a tacit admission from Apple that Siri isn’t reliable enough to consistently get it right the first time.
It’s the same case for a lot of things you ask Siri to do. Even though Apple added the ability to activate Siri with just your voice, you still need to look at your phone a lot for confirmation that it’s doing what you want. On the other hand, assistants baked into speakers like the Echo or the upcoming Google Home don’t have a screen, so they have to be smart enough to complete a task without asking for confirmation. That’s the biggest area where Siri needs to improve.
So, how does Apple fix it?
Luckily, Apple acquired an AI company last year called VocalIQ that’s really good at understanding context and completing tasks without requiring you to look at a screen. A source familiar with VocalIQ’s technology told me the product gives you complete control of a task entirely by voice. In one test, VocalIQ’s team was able to get the AI to manage someone’s email while the phone stayed in the user’s pocket the whole time. Yes, just like you see in the movie “Her.”
VocalIQ is also able to remember context “just like a human,” according to the source, which means you never have to remind it what you asked for when you start a new session. No other digital assistant can do that. That alone could give Siri the boost it needs to surpass its AI competition.
Now the question is when and how Apple will implement VocalIQ’s technology into Siri. (Apple declined to comment.) But there’s no doubt the company is working with some impressive AI technology.