One of the core themes of Future Ear has revolved around the idea of an impending marriage between voice assistants and hearing aids (and all hearables). This marriage of the two technologies became feasible five years ago with the introduction of Bluetooth hearing aids, which made it possible to wirelessly tether one’s ears to the internet. Linking hearing aids to the smartphone opened the door for developers to build apps specifically for hearing aids, as evidenced by the hearing aid manufacturers themselves creating companion apps for their devices.
Right as the first Bluetooth hearing aids were entering the market in 2014, Amazon introduced us all to Alexa. Although voice assistants had been commercially available on mobile devices for more than 10 years and were popularized by Apple’s Siri in 2011, technological progress had largely stalled until Amazon’s Echo devices embedded with Alexa entered the market. In a blog post issued by Amazon to celebrate Alexa’s 5th birthday, the company cited the following four AI tasks as the reasons why Alexa has been so successful in reigniting progress around voice assistants:
- Wake word detection: On the device, detect the keyword “Alexa” to get the AI’s attention;
- Automatic speech recognition (ASR): Upon detecting the wake word, convert audio streamed to the Amazon Web Services (AWS) cloud into words;
- Natural-language understanding (NLU): Extract the meaning of the recognized words so that Alexa can take the appropriate action in response to the customer’s request; and
- Text-to-speech synthesis (TTS): Convert Alexa’s textual response to the customer’s request into spoken audio.
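The four-stage flow above can be sketched as a toy pipeline. To be clear, everything below is illustrative: the function bodies and the string-based “audio” are stand-ins for real on-device keyword spotting, cloud ASR, NLU, and TTS — none of these are Amazon’s actual APIs.

```python
# A minimal sketch of the four-stage voice assistant pipeline described above.
# All stages are toy stand-ins; the "audio" is just a string for illustration.

def detect_wake_word(audio_frame):
    # Stage 1 (on device): only frames containing the wake word
    # trigger the rest of the pipeline (here, a trivial substring check).
    return "alexa" in audio_frame.lower()

def speech_to_text(audio_frame):
    # Stage 2 (ASR): the real system streams audio to the cloud and
    # returns a transcript; here we just strip the wake word out.
    return audio_frame.lower().replace("alexa", "").strip(" ,")

def understand(utterance):
    # Stage 3 (NLU): map the transcript to an intent (toy keyword rules).
    if "weather" in utterance:
        return {"intent": "GetWeather"}
    if "timer" in utterance:
        return {"intent": "SetTimer"}
    return {"intent": "Fallback"}

def synthesize(response_text):
    # Stage 4 (TTS): convert the textual response into "audio",
    # represented here as a tagged string.
    return f"<audio:{response_text}>"

def handle(audio_frame):
    # Full pipeline: wake word -> ASR -> NLU -> response -> TTS.
    if not detect_wake_word(audio_frame):
        return None  # no wake word: device stays idle
    utterance = speech_to_text(audio_frame)
    intent = understand(utterance)["intent"]
    responses = {
        "GetWeather": "Here is the forecast.",
        "SetTimer": "Timer set.",
        "Fallback": "Sorry, I didn't catch that.",
    }
    return synthesize(responses[intent])

print(handle("Alexa, what's the weather?"))  # pipeline runs end to end
print(handle("background chatter"))          # no wake word, stays idle
```

The key design point the sketch captures is the first stage: wake word detection runs locally, so nothing is sent to the cloud until the keyword is heard.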
As it became clear that these four fundamental AI tasks were the winning combo, Google and others followed Amazon’s lead by introducing their own lines of smart speakers. In the five years since Alexa debuted, smart speakers have become one of the most rapidly adopted consumer technology products of all time. Meanwhile, each of the four AI tasks outlined above has seen a good deal of innovation, making for more capable and robust voice assistants.
Through Amazon’s Alexa Voice Service and the Alexa Skills Kit, 9,500 third-party manufacturers have been able to integrate Alexa into their products. One very notable development kit that Amazon created was its Alexa Mobile Accessory Kit (AMAK), which allowed for Alexa integration into Bluetooth devices, including hearing aids.
Along with the ascendance of hearables, Bluetooth-connected hearing aids have grown to become the norm, comprising 94% of the hearing aids sold this year. Five years after Bluetooth hearing aids and Alexa debuted, the union of hearing aids and voice assistants has never been more feasible.
Alexa in the Ear
Yesterday, Heidi Culbertson, founder of Marvee, gathered a group of 22 adults aged 60–70 to introduce them to Echo Buds and the concept of Alexa in one’s ear. As a designer specializing in voice assistants and conversational AI, Heidi has a lot of experience designing experiences specifically for our aging population, and her findings from yesterday were very enlightening.
First of all, it’s important to point out that the 22 people she gathered were all daily Alexa users. Second, as she notes in her tweets above, there are specific design challenges that need to be considered with both the hardware and voice assistants as they relate to the older demographic. That said, all 22 agreed that they liked the idea of having Alexa reside right in their ear. In a conversation I had with Heidi, she shared a few responses from the group:
“Just very cool to have Alexa in my ear!”
“Can I talk to Alexa while on my bike?”
“This will be great out in the backyard.”
Although these folks were introduced to Alexa-in-the-ear via Amazon’s Echo Buds, one can imagine a similar outcome had the device housing Alexa been an open-fit Bluetooth hearing aid.
Why this Matters
While Heidi was conducting her research yesterday, Voicebot published an article about a study being conducted in the UK by Abbeyfield, a non-profit, along with digital agency Greenwood Campbell and the University of Reading, to understand the impact voice assistants can have on loneliness among older adults. A recent New York Times study found that loneliness has the same level of negative impact on one’s health as smoking 15 cigarettes per day, so it can’t be overstated how important this potential aspect of voice assistants might be.
Video from the #VoiceForLoneliness Campaign
To compound the issue, one of the most oft-cited comorbidities linked to untreated hearing loss is loneliness. Since one of the leading indicators of hearing loss is age (our sense of hearing tends to gradually deteriorate), even folks living around others are susceptible to loneliness. As it becomes harder to converse, it’s understandable that one might start to socially withdraw, which can lead to loneliness and social isolation.
That’s why research like Heidi’s makes me smile from ear to ear, as it supports the idea that maybe the way to attack the growing loneliness problem is via ear-worn devices that house voice assistants. Not only would this provide a conversational partner, but it would also empower the user to utilize their voice assistant for many of the same types of tasks that we use our smartphones for today.
For folks who might be suffering from hearing loss, a Bluetooth hearing aid with Alexa embedded might kill two birds with one stone: encourage folks to wear hearing aids (hey, it’s got Alexa built-in!) while restoring their ability to engage in conversations with those around them.
-Thanks for Reading-