Voicebot reported yesterday that a select number of Walmart customers can now order items by voice through Walmart’s Google Assistant action. As Bret Kinsella pointed out in the article, Walmart had partnered with Google in a similar capacity before, allowing customers to order by voice through Google Express starting in 2018, but ended that collaboration in January 2019. The key difference is that Google Express was controlled by Google and exposed consumers to other retailers, while the new Google Action is controlled solely by Walmart.
This is a perfect representation of where things appear to be headed with smart assistants more broadly. Alexa, Siri, Google Assistant, and Samsung’s Bixby (I may be neglecting others) sit at the top echelon as “master assistants” (I saw Ben Basche use that term once and thought it very eloquent). These master assistants facilitate the exchange between the user and the brand’s, retailer’s, or company’s own smart assistant. The middleman, so to speak.
So in this particular example, Google Assistant is facilitating the interaction between Walmart and the user. As more and more companies become voice-enabled, they’ll be tasked with two objectives. First, they’ll need to extend their brand into a voice experience, to the point of having their own assistant. Second, they’ll need to work with the companies behind the master assistants to ensure that their voice experience is accessible through all of the master assistants brokering exchanges between the company’s assistant and the user.
Meanwhile, as everything on the internet becomes voice-enabled, the master assistants will be tasked with interconnecting disparate actions or skills to allow for more complex queries. For example, “Alexa, tell me when would be a good time to go to Montreal for vacation.” This sends Alexa off in different directions to aggregate information for a response that factors in data from my work calendar (Outlook), airline prices (Southwest/Expedia), historical weather patterns (Accuweather), lodging prices (AirBnB, Hotels.com), bands or sports teams playing near the city during that time (Seatgeek), etc.
What would take me 10-15 minutes of bouncing around apps and the web, Alexa handles in five seconds: “Ok, Dave, it looks like the first week of May would be ideal, or the last week of August, based on your schedule, budget, the weather, and things to do. Would you like to learn more?” As queries become more complex, the reduction in friction becomes more pronounced.
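To make the idea concrete, here’s a minimal sketch of that fan-out-and-aggregate pattern. Everything in it is hypothetical: the three functions stand in for the calendar, airfare, and weather services named above (in reality each would be a skill or action invocation over the network, not a local call), and the "intersect the results" logic is just one illustrative way an assistant could reconcile the answers.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-ins for the data sources a master assistant
# might query. Each returns the set of weeks that source approves of.
def calendar_free_weeks():
    return {"May 1-7", "Jun 10-16", "Aug 25-31"}

def cheap_airfare_weeks():
    return {"May 1-7", "Aug 25-31"}

def good_weather_weeks():
    return {"May 1-7", "Jun 10-16", "Aug 25-31"}

def plan_trip():
    """Fan the query out to every source in parallel, then combine."""
    sources = [calendar_free_weeks, cheap_airfare_weeks, good_weather_weeks]
    with ThreadPoolExecutor() as pool:
        results = list(pool.map(lambda fetch: fetch(), sources))
    # A week only qualifies if every data source agrees on it.
    candidates = set.intersection(*results)
    return sorted(candidates)

print(plan_trip())  # the weeks acceptable to every source
```

The friction reduction lives in that parallel fan-out: the user asks once, and the broker queries every source concurrently instead of the user visiting each one in turn.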
There’s a massive arms race going on right now among the top tech titans, because they all want to own the dominant master assistant: the master assistant appears set to become the master broker for all voice exchanges between users and companies. This is why we’re seeing Jeff Bezos go all in on voice and dedicate a 10,000-person team to Alexa. Why Google tripled its floor space at this year’s CES with one of the biggest exhibits of all time, dedicated strictly to Google Assistant. Why Apple has…. wait, never mind.
It’s still early and things can definitely change, but signs are pointing toward a two-class, smart assistant future.
-Thanks for Reading-