Daily Updates, Future Ear Radio

Alexa Conversations (Future Ear Daily Update 6-6-19)

Another Major Leap Forward for Alexa


Amazon held its brand new re:MARS conference yesterday to showcase all types of innovation, from inside Amazon and from other companies, around machine learning, automation, robotics and space (MARS). It looked like a totally different type of conference than what tech companies typically host: more of a top-notch science fair showcasing breakthroughs in each of the four fields than a conference around products and services.

That being said, there were still Amazon-related product announcements, such as new robots headed for Amazon’s many fulfillment centers, new drone delivery systems, the news that Amazon has gained FAA clearance to begin testing drone deliveries in a few months, and, of course, announcements around Jeff Bezos’ much-adored Alexa. Amazon announced a new tool called Alexa Conversations, which might be one of the most significant developments around Alexa in years.

According to Rohit Prasad, Alexa VP and head scientist, “Our objective is to shift the cognitive burden from the customer to Alexa.” There’s a reduction in cognitive load for both the user and the developer. This is feasible thanks to deep neural networks that provide a level of automation to help developers build natural dialogues faster and more easily with less training data, which ultimately translates into fewer interactions and invocations required of the user.
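To make that “reduced cognitive burden” idea concrete, here’s a minimal toy sketch, not the actual Alexa Conversations API, of the difference between a rigid one-shot invocation and a dialogue manager that collects missing details across turns, so the user can speak naturally and the system only asks for what’s still missing. The slot names and the booking scenario are hypothetical.

```python
# Toy illustration of multi-turn slot filling -- NOT the Alexa Conversations API.
# Instead of requiring one rigid utterance ("ask RideBooker to book a car from
# home to the airport at 5pm"), the dialogue manager tracks which slots are
# still missing and prompts only for those, reducing what the user must recall.

REQUIRED_SLOTS = ["pickup", "destination", "time"]  # hypothetical skill slots


def next_action(collected: dict) -> str:
    """Return the system's next prompt, or a confirmation once all slots are filled."""
    for slot in REQUIRED_SLOTS:
        if slot not in collected:
            return f"Ask user for '{slot}'"
    return f"Confirm booking: {collected}"


# Simulated conversation: each turn supplies whatever the user happened to say.
turns = [
    {"destination": "airport"},  # "I need to get to the airport"
    {"pickup": "home"},          # "from home"
    {"time": "5pm"},             # "around 5"
]

state: dict = {}
for turn in turns:
    state.update(turn)
    print(next_action(state))
```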

Image: cross-skill predictor (Alexa Developers Blog)

The best analysis I have read around Alexa Conversations was Bret Kinsella’s Voicebot breakdown. Bret interviewed a number of Alexa developers at re:MARS (interviews that will likely air on future Voicebot podcast episodes), and the common theme is that this is one of the most important developments for skill discovery, which has been one of the core issues with Alexa from the start.

Users can’t remember all the various invocations associated with skills, so by shifting that “discovery” element to Alexa, based on the context of the request and your learned behavior, Alexa can begin to surface suggested skills that you’d otherwise not have known about or had forgotten how to invoke. As Bret points out at the end of his analysis, “If successful, however, it will likely usher in the most significant change in Alexa skill development since the introduction of Alexa Skills Kit in 2015.”
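As a rough mental model of that discovery shift, here’s a hypothetical sketch of a cross-skill predictor: score candidate skills against the user’s utterance and weight them by the user’s past usage, then surface the best match instead of making the user recall an invocation name. The skill names, keywords, and scoring are illustrative only, not how Amazon actually ranks skills.

```python
# Hypothetical sketch of cross-skill suggestion: rank skills by keyword overlap
# with the utterance, weighted by how often the user has used each skill before.
# A conceptual illustration only, not Amazon's actual skill-arbitration model.

from collections import Counter

SKILL_KEYWORDS = {                      # hypothetical skill catalog
    "RideBooker": {"ride", "car", "taxi", "airport"},
    "RecipeHelper": {"recipe", "cook", "dinner"},
    "FlashBriefing": {"news", "briefing", "headlines"},
}


def suggest_skill(utterance: str, usage_counts: Counter) -> str:
    """Pick the skill whose keywords best match the utterance, boosted by past usage."""
    words = set(utterance.lower().split())

    def score(skill: str) -> float:
        overlap = len(words & SKILL_KEYWORDS[skill])
        prior = 1 + usage_counts[skill]  # "learned behavior" weight
        return overlap * prior

    return max(SKILL_KEYWORDS, key=score)


history = Counter({"RideBooker": 5, "RecipeHelper": 1})
print(suggest_skill("I need a car to the airport", history))  # -> RideBooker
```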

We’ll have to keep an eye on how well this new tool is received by the developer community and how users respond to skills built with Alexa Conversations. It certainly seems like a major change and a potential step forward in making Alexa more conversational.

-Thanks for Reading-

Dave

To listen to the broadcast on your Alexa device, enable the skill here

To add to your flash briefing, click here

To listen on your Google Assistant device, enable the skill here 

and then say, “Alexa/Ok Google, launch Future Ear Radio.”
