
Signia’s Acoustic-Motion Sensors (Future Ear Daily Update 9-18-19)


Much of what I am excited about right now in the world of consumer technology broadly, and in wearables/hearables/hearing aids more narrowly, is the innovation happening at the component level inside the devices. I’m still reeling a bit from Apple’s U1 chip embedded in the iPhone 11 and the implications of it that I wrote about here. New chips, new wireless technologies, new sensors, new ways to do cool things. Now, we can add another one to the list – acoustic-motion sensors – which will be included in Signia’s new line of hearing aids, Xperience.

Whereas video and camera systems rely on optical motion detection, Signia’s hearing aids will use their mics and sensors to assess changes in the acoustic environment. For example, if you move from sitting at a table speaking face-to-face with one person to standing around a bar in a group, the idea is that the motion sensors will react to the new acoustic setting and then automatically adjust the mics accordingly – from directional to omni-directional settings, with balances in between.
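To make that idea concrete, here is a minimal sketch of how a scene-driven mic switch might look. This is purely illustrative – the thresholds, the talker/noise estimates, and the mode names are my own assumptions, not Signia’s actual algorithm.

```python
# Hypothetical sketch (not Signia's actual algorithm): pick a microphone
# mode from simple estimates of the current acoustic scene.

def choose_mic_mode(num_talkers: int, background_noise_db: float) -> str:
    """Choose a mic mode; all thresholds are illustrative assumptions."""
    if num_talkers <= 1 and background_noise_db < 55:
        # Quiet one-on-one conversation at a table: focus forward.
        return "directional"
    if num_talkers > 1 and background_noise_db >= 55:
        # Noisy group setting, e.g. standing around a bar: open up.
        return "omnidirectional"
    # Anything in between gets a blended setting.
    return "balanced"

print(choose_mic_mode(1, 40.0))  # directional
print(choose_mic_mode(4, 70.0))  # omnidirectional
```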

These acoustic-motion sensors are part of a broader platform that simultaneously runs two processors: Dynamic Soundscape Processing and Own Voice Processing. The Own Voice processor is really clever. It’s “trained” for a few seconds to identify the user’s voice and differentiate it from other people’s voices that will inevitably be picked up through the hearing aid. This is important, as multiple hearing aid studies have found that a large share of hearing aid wearers are dissatisfied with the way their own voice sounds through their hearing aids. Signia’s Own Voice processor was designed specifically to alleviate that effect.
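One simple way to picture that few-second training step: capture a feature profile of the wearer’s voice, then compare incoming audio frames against it. The sketch below is a generic similarity check of my own devising – the feature vectors and threshold are made-up stand-ins, not Signia’s method.

```python
# Hypothetical sketch of own-voice detection: compare an incoming frame's
# feature vector against a profile captured during a short training step.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def is_own_voice(frame_features, own_voice_profile, threshold=0.9):
    """Return True if the frame closely matches the wearer's voice profile."""
    return cosine_similarity(frame_features, own_voice_profile) >= threshold

# Profile "trained" from the wearer's voice (made-up numbers):
profile = [0.82, 0.10, 0.55]
print(is_own_voice([0.80, 0.12, 0.54], profile))  # True: close match
print(is_own_voice([0.10, 0.90, 0.05], profile))  # False: a different talker
```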

Now, with acoustic-motion sensors constantly monitoring changes in the acoustic setting, the sensors will alert the Dynamic Soundscape processor to adjust on the fly and provide a more natural-sounding experience. The hearing aid’s two processors will then communicate with one another to determine which processor each sound should feed into. If you ask me, that’s a lot of really impressive functionality and moving pieces for a device as small as a hearing aid to handle, but it’s a testament to how sophisticated hearing aids are rapidly becoming.

I’ve written extensively about the innovation happening inside the devices, and what’s most exciting is that the more I learn about what’s happening, the more I realize that we’re really only getting started. A quote that still stands out to me from Brian Roemmele’s U1 chip write-up is this:

“The accelerometer systems, GPS systems and IR proximity sensors of the first iPhone helped define the last generation of products. The Apple U1 Chip will be a material part of defining the next generation of Apple products.” – Brian Roemmele

To build on Brian’s point, it’s not just the U1 chip – it’s all of the fundamental building blocks being introduced that are enabling this new generation of functionality. Wearable devices in particular are poised to explode in capability, because the core pieces required for all of the really exciting stuff that’s starting to surface are maturing to the point where it’s feasible to implement them in devices as small as hearing aids. There is so much more to come as the components inside wearable devices continue to be innovated on, which will manifest in cool new capabilities, better products, and ultimately, better experiences.

-Thanks for Reading-


To listen to the broadcast on your Alexa device, enable the skill here

To add to your flash briefing, click here

To listen on your Google Assistant device, enable the skill here 

and then say, “Alexa/Ok Google, launch Future Ear Radio.”
