Amazon's Alexa. (Reuters)
Amazon's Alexa. (Reuters)
wsj

Alexa has a new skill: Asking when it doesn’t know

‘Interactive teaching’ enables Amazon’s voice assistant to solicit information when it encounters an unknown phrase. (Right now, it only works for lighting and temperature.)

Amazon.com Inc. said this week that after years of research its Alexa voice assistant can now figure out the meaning of requests it has never heard before. The upgrade, which the company calls interactive teaching, could represent a significant advance in the way AI-powered voice assistants interpret and learn from everyday conversation, experts say.

Interactive teaching is powered by deep-learning models, and it works by having Alexa ask questions about a task-relevant phrase it is encountering for the first time.

For instance, if a user asks Alexa to set the lights to “reading mode” and the device hasn’t heard that phrase before, it will ask what it means. If the user says it means to set the lights at 50% brightness, Alexa will remember that for the next time.
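
To make that flow concrete, here is a minimal sketch of a taught-phrase store in Python. It is purely illustrative and assumes a simple in-memory mapping; the class and method names are hypothetical and do not reflect Amazon’s implementation.

```python
# Illustrative sketch only: a toy "teach and recall" store for user-defined
# phrases. All names here are hypothetical, not Amazon's implementation.

class TaughtPhrases:
    def __init__(self):
        # Maps a phrase like "reading mode" to a concrete device setting.
        self.definitions = {}

    def knows(self, phrase: str) -> bool:
        return phrase.lower() in self.definitions

    def teach(self, phrase: str, setting: dict) -> None:
        # Remember the user's explanation for next time.
        self.definitions[phrase.lower()] = setting

    def recall(self, phrase: str) -> dict:
        return self.definitions[phrase.lower()]


store = TaughtPhrases()

# First request: "set the lights to reading mode" is unknown, so ask.
if not store.knows("reading mode"):
    # Assistant: "What do you mean by 'reading mode'?"
    # User: "Set the lights at 50% brightness."
    store.teach("reading mode", {"device": "lights", "brightness": 50})

# Next request: the taught meaning is applied directly.
print(store.recall("reading mode"))  # {'device': 'lights', 'brightness': 50}
```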

Before this, teaching Alexa new terms required manual steps, such as having engineers program updates, or having users type in verbal triggers for actions in the Alexa app.

But with the new capability, Alexa is able to recognize gaps in its understanding and initiate the learning process, said Rohit Prasad, vice president and head scientist for Alexa AI at Amazon. The upgrade is available on all U.S. Alexa devices.

AI services like Alexa “are pushing us into more generalized intelligence, and one aspect of generalized intelligence is how does the AI react to novel situations,” Mr. Prasad said. “As humans, it is exactly what we do when we encounter a novel situation—we’ll come back with a clarifying question.”

Adam Wright, senior analyst for IDC’s smart home research, said Amazon appears to be alone in having a system that can elicit more information on words or terms it doesn’t know. “This feature of training it to identify specific commands and remember those specific commands, that seems to be quite unique to Alexa for now,” he said.

Today’s conversational AI systems are “still fairly brittle,” said Satya Nitta, former global head of AI solutions for learning at IBM Research, meaning they break down easily when they encounter situations they weren’t explicitly trained to handle. A capability like interactive teaching can help make these systems, ranging from smart speakers to AI call-center agents, more robust, said Mr. Nitta, who is now leading his own AI startup.

Mr. Nitta, who spent nearly a decade developing conversational AI systems, based his assessment of interactive teaching on details released by Amazon.

“This is a reasonably significant advance in the capability of conversational systems,” he said. “It is not a seismic leap, but it is definitely advancing the state of the art and moving conversational systems towards a more natural conversational style.”

Alexa uses multiple deep-learning models to manage interactive teaching. The first can identify an unfamiliar word or phrase—such as “reading mode”—and flag it as a learning opportunity. After Alexa asks for an explanation, another model extracts a definition from the user’s response.

A third model then steers the conversation to prompt a specific enough user response. For instance, if Alexa asks, “What do you mean by ‘reading mode’?” and the user says, “It means set the lights at a good level,” the model wouldn’t accept that as sufficiently clear. In such a case, Alexa would follow up by asking: “Can you provide me with a value for brightness or color?”
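
Those three stages can be sketched, very loosely, as a pipeline of plain functions. The Python below is an illustrative toy, not Amazon’s system: in practice each stage is a deep-learning model, and the function names, regular expressions, and data structures are all hypothetical stand-ins.

```python
# Illustrative sketch only, not Amazon's pipeline: the three stages described
# above modeled as plain functions with hypothetical names.
import re

KNOWN_PHRASES = {"movie mode": {"brightness": 20}}

def flag_unknown_phrase(request: str):
    """Stage 1: spot a phrase the assistant has not seen before."""
    match = re.search(r"set the lights to (.+)", request)
    phrase = match.group(1) if match else None
    return phrase if phrase and phrase not in KNOWN_PHRASES else None

def extract_definition(reply: str):
    """Stage 2: pull a concrete setting out of the user's explanation."""
    match = re.search(r"(\d+)\s*%?", reply)
    return {"brightness": int(match.group(1))} if match else None

def clarify(phrase: str, reply: str):
    """Stage 3: if the reply is too vague, ask a more specific question."""
    definition = extract_definition(reply)
    if definition is None:
        return f"Can you provide me with a value for brightness or color for '{phrase}'?"
    KNOWN_PHRASES[phrase] = definition
    return f"Got it. '{phrase}' means {definition}."

phrase = flag_unknown_phrase("set the lights to reading mode")
if phrase:
    print(clarify(phrase, "It means set the lights at a good level"))  # too vague -> follow-up
    print(clarify(phrase, "Set them at 50%"))                          # specific -> learned
```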

The feature could make it easier for AI systems to conduct self-supervised learning, said Deon Nicholas, chief executive and co-founder of Forethought.ai, which offers natural-language-understanding software primarily for call centers. Many AI models require supervised learning, where humans label data to train systems, he said. But a capability such as interactive teaching could make it possible for “AI to label data itself by asking the user,” he said.

The new feature has limitations, including the scope of its learning. For now, interactive teaching is available only for tasks related to lights and climate-control devices, Amazon said, though the company plans to expand it over time.

Kjell Carlsson, a principal analyst at Forrester Research Inc., said the capability could pave the way for AI systems that can learn without manually coded interventions. For instance, it could allow executives to ask a virtual agent for sales data about a specific product and teach it verbally what kinds of insights they want.

“This is the biggest step forward I’ve seen so far in the democratization of the ability to create new AI applications,” he said.

This story has been published from a wire agency feed without modifications to the text.
