Alexa Is About To Get Much Better At Avoiding Skill Overload
Alexa is about to get better at handling more complicated, multi-step conversations, as well as acting more like a useful personal assistant. The new functionality was revealed by Amazon's director of applied science, Ruhi Sarikaya, who leads the Alexa Brain initiative, an effort to make the assistant more natural to engage with.
Amazon certainly hasn't been slow in rolling out new abilities for Alexa. Indeed, it's arguably trampling all over rival artificial intelligences like Google's Assistant and Apple's Siri when it comes to adding third-party talents – which Amazon refers to as "Skills" – to Alexa. That, Sarikaya concedes, can make it tricky to figure out exactly what Alexa is capable of doing.
At the moment, users with an Echo smart speaker or another Alexa-enabled device must first locate the Skill they're looking for, activate it, and then interact with it. That puts the onus of discovering the Skill on the user. However, in the coming weeks that's going to change, at least in the US.
Alexa will use machine learning to identify an appropriate Skill – even if it's not activated – based on the question or comment she's asked. In Sarikaya's example, he asked "Alexa, how do I remove an oil stain from my shirt?" and the assistant suggested "Here is Tide Stain Remover." That's despite him never having interacted with Tide's Skill before.
"This beta experience was friction-free; the skill just walked me through the process of removing an oil stain from my shirt," Sarikaya explains. "Previously, I would have had to discover the skill on my own to use it."
It's not the only example of Alexa getting smarter at conversations, however. Another change coming soon is support for multi-turn utterances, or "context carryover," building on the existing ability to ask a single follow-up question. Launching initially in the US, UK, and Germany, it will allow follow-up utterances to be understood even when they don't spell out what's being referred to.
For example, you'll be able to ask "Alexa, how's the weather in San Francisco," and then follow up straight away with "How about this weekend?" Alexa won't need to ask again which location you mean, or be triggered with her wake word a second time.
Beyond that, there'll also be support for context carryover across domains. Whereas the last example stays within a single topic – the weather – Alexa will also be able to carry context between different types of question. You could ask "Alexa, how's the weather in San Francisco," then follow up with "How long does it take to get there?" and Alexa will understand that "there" means San Francisco.
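The mechanics of that carryover can be sketched with a toy dialogue state: a slot filled in one turn (here, a made-up "city" slot) is reused to resolve a follow-up that never repeats it. The slot names, extraction logic, and the two handlers are illustrative assumptions, not Amazon's schema.

```python
# Toy illustration of cross-domain context carryover.
# The "city" slot and both handlers are hypothetical examples.
context: dict[str, str] = {}

def handle_weather(utterance: str) -> str:
    # Naive slot extraction: treat the words after "in" as the city.
    if " in " in utterance:
        context["city"] = utterance.rsplit(" in ", 1)[1].rstrip("?")
    city = context.get("city", "your location")
    return f"Here's the weather for {city}."

def handle_travel_time(utterance: str) -> str:
    # "there" is resolved from the city carried over from the previous turn.
    city = context.get("city", "an unknown destination")
    return f"Estimating travel time to {city}."

print(handle_weather("how's the weather in San Francisco"))
print(handle_travel_time("how long does it take to get there?"))
```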
Finally, there'll be a new memory feature. Launching initially in the US, it'll turn Alexa into a PA that can keep track of arbitrary information. The trigger will be "Alexa, remember that XYZ," with Alexa keeping a log of all the items.
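At its simplest, that amounts to a timestamped log keyed off the "remember that" trigger. The recall phrasing and storage below are assumptions made for illustration, not Amazon's implementation.

```python
from datetime import datetime

# Hypothetical sketch of the "Alexa, remember that ..." memory log.
memories: list[tuple[datetime, str]] = []

def handle_utterance(utterance: str) -> str:
    trigger = "remember that "
    lowered = utterance.lower()
    if trigger in lowered:
        fact = utterance[lowered.index(trigger) + len(trigger):]
        memories.append((datetime.now(), fact))
        return f"Okay, I'll remember that {fact}"
    if "what did i ask you to remember" in lowered:
        return " / ".join(fact for _, fact in memories) or "Nothing yet."
    return "Sorry, I didn't catch that."

print(handle_utterance("Alexa, remember that Sean's birthday is June 20th"))
print(handle_utterance("What did I ask you to remember?"))
```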
The new features are all intended to address some of the lingering criticisms of smart assistants: that they have plenty of abilities but make them tricky to discover, and that they demand an awkward conversational style for interactions to succeed. That will be increasingly important as services like Alexa spread to new places, such as the car, where on-screen prompts or lengthy guided tutorials could be too distracting or time-consuming.