
The next episode of the Future Ear Radio podcast features first-time guest and audiologist, Kat Penno, and recurring guest, Andy Bellavia. Kat's based in Perth, Australia, and started the Hearing Collective in 2018, which offers personalized online hearing health services. She's a big proponent of utilizing digital offerings within the hearing health setting, which is why I wanted to have her, along with my trusty hearables sidekick, Andy, on to chat about some of the AirPods-specific updates that are included in Apple's next version of its operating system (iOS 14).
In this new version of iOS, AirPods will see a slew of new features heading their way. I'll be exploring some of the additional features in later episodes, but for this week, the three of us focused on the hearing accessibility features. The update reads as follows:
Headphone Accommodations
This new accessibility feature is designed to amplify soft sounds and adjust certain frequencies for an individual's hearing, to help music, movies, phone calls, and podcasts sound more crisp and clear. Headphone Accommodations also supports Transparency mode on AirPods Pro, making quiet voices more audible and tuning the sounds of your environment to your hearing needs.

So, through a simple update to one's iPhone, AirPods Pro owners will have access to new settings to custom-configure the audio of their AirPods. The update allows users to boost the digital sounds coming from their phone (i.e. phone calls, music, podcasts, etc.) and ambient sounds through Transparency mode (i.e. other people's voices). Users will have the option to simply adjust a slider, or they'll be able to take a basic hearing test administered through their AirPods from the phone, allowing more tailored customization for those who might have a slight hearing loss. You can even upload your audiogram.

As we talk about throughout the episode, there are other devices on the market that provide these types of features. Some devices, such as Nuheara's new IQbuds2 Max, probably do a better job of customizing and amplifying ambient sounds than AirPods Pro will be able to do. There are apps, such as SonicCloud, that provide far more depth and breadth in the customization of digital sounds than AirPods will. But to me, that's not really the point.
The reason that AirPods Pro's new amplification features are so important is the sheer scale that Apple is operating at. Apple is the wearables king. It completely dominates the market share of our wrists and our ears. Just last year, Apple sold about 60 million pairs of AirPods, owning about 50% of the global hearables market share between AirPods & Beats. Samsung is the only non-Chinese company that cracked double digits (10%) of the market share, and it did so by bundling its Galaxy Buds with the sale of each Galaxy S10.
And keep in mind, AirPods Pro have only been on the market since October 2019… there is a LOT of runway here for Apple’s hearables.
The point is that tens of millions of people (and in a few years, hundreds of millions) will have access to this amplification feature overnight. I'm not sure that people will go out and buy AirPods Pro for these amplification features; with numbers this big, I'm sure that a sliver of people will. However, the real impact will be felt by the people who bought AirPods Pro for all the other cool features, and then get exposed to this new functionality as a byproduct of a software update.

Since I started at Oaktree in 2016, I’ve seen countless presentations and articles that cite the pyramid above. The vast majority of patients that hearing care professionals see fall into the top two tiers as those are the patients who really need hearing aids. As the industry looks for growth, the perpetual question has been, “how do we tap into this mild-moderate market?”
The fact of the matter is that we can't fit a square peg (hearing aids) into a round hole (mild hearing loss). Hearing aids don't align with the needs found within this tier of the pyramid, from either a lifestyle or a fiscal standpoint. AirPods, on the other hand, present a totally different approach: rather than positioning amplification as the primary feature, AirPods Pro treat amplification as a secondary feature.
Again, I don't know how many people will go buy AirPods Pro specifically for the amplification feature. Some people may, but I believe that most are buying AirPods Pro because they're slick, do a lot of really cool things, and work quite beautifully in tandem with other Apple devices (much more on this in later posts/podcasts). By inverting the prioritization of the feature set, AirPods users can walk around augmenting & amplifying their digital and ambient environments, while using their devices primarily for streaming. It might seem like a subtle point, but I think it's actually pretty profound.
Finally, the professional has a big opportunity here. As Kat points out, hearing professionals can embrace AirPods Pro and all these new consumer devices (of which there will be many more to come), and view them for what they are: gateways to professional guidance. Due to the sheer scale of AirPods' presence in the market, it's safe to say that they're going to account for a gigantic portion of "aha!" moments from people realizing they have a hearing loss who otherwise would have had no clue.
These are people primed for long-term relationships, even if the professional is not selling any type of device to them at the moment, but rather offering expertise to help preserve their hearing and eventually guide them into more robust devices that match their needs along the way. With hearing loss being as pervasive and challenging to solve as it is, we need to totally rethink the way people perceive their hearing health, and that starts with rethinking the types of solutions that realistically match their needs.
-Thanks for Reading-
Dave