Today’s episode of the podcast features Ryan Kraudel, VP of Marketing at Valencell. Valencell manufactures optical PPG (photoplethysmography) sensors designed to be integrated into a variety of wearables, consumer hearables, and hearing aids. If you’ve ever used an Apple Watch and noticed the blinking green light on the underside of the device, that’s the device’s PPG sensor, which is responsible for capturing various biometric data points, such as your heart rate.
These PPG sensors work by shining light into the body and measuring how much of that light is reflected back, which varies with blood flow. When you’re in the hospital wearing a finger monitor, that monitor is using a PPG sensor and the same principle to capture your vitals.
As we talk about throughout the episode, PPG sensors have been around for some time, but what’s changing is the miniaturization of the sensors, allowing them to be embedded in something as small as the receiver of a receiver-in-the-canal hearing aid (the part that actually resides inside the ear canal). Along with becoming more conducive to consumer wearables, PPG sensors are big beneficiaries of the recent improvements in machine learning. As Ryan points out, accuracy levels continue to climb because improved machine learning algorithms can parse the data more intelligently.
One of the most exciting aspects of these sensors that we talk about is the fact that they capture a very rich data set, and until recently we’ve only been tapping into a small amount of the insight available in it. Examples of new metrics beginning to be gleaned include heart rate variability, which measures the variation in the time between successive heartbeats. This is great for athletes who are hyper-focused on their training metrics, but heart rate variability can also be used to identify abnormalities such as atrial fibrillation.
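To make the idea of heart rate variability concrete, here is a minimal sketch of one common time-domain HRV metric, RMSSD (root mean square of successive differences), computed from inter-beat intervals. This is an illustrative example, not Valencell’s actual algorithm; the function name and sample values are assumptions for the sake of the demo.

```python
from math import sqrt

def rmssd(ibi_ms):
    """Root mean square of successive differences (RMSSD), a common
    time-domain HRV metric, from inter-beat intervals in milliseconds."""
    diffs = [b - a for a, b in zip(ibi_ms, ibi_ms[1:])]
    return sqrt(sum(d * d for d in diffs) / len(diffs))

# Perfectly regular beats produce zero variability...
print(rmssd([800, 800, 800, 800]))  # 0.0
# ...while irregular inter-beat intervals produce a higher score.
print(rmssd([790, 815, 770, 820]))
```

A perfectly steady pulse scores zero; the more the gaps between beats vary, the higher the number. Clinically meaningful use (e.g., flagging atrial fibrillation) would of course require far more sophisticated signal processing than this.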
In addition, the latest metric that Valencell is working to capture is blood pressure. According to Ryan, this would be a huge boon for the billion people living with high blood pressure, especially the roughly one-third who are unaware that they have it. Consumer wearables can serve as tools to identify and alert users to conditions they don’t yet know they’re living with.
One of the biggest shifts that these biometric-laden wearables will usher in is the ability for people to start assembling their own individualized, longitudinal health data sets. Previously, metrics such as heart rate and blood pressure were captured only during the few doctor visits one makes in a year. AirPods, hearing aids, Apple Watches, and so forth might soon be able to collect these metrics every minute that you’re wearing the device. So, rather than having two or three data points for the year, the user would have tens of thousands, painting a far more robust picture of one’s health and creating individual benchmarks that machine learning algorithms can work from to detect abnormalities.
For decades, we’ve largely treated our health in a reactive manner: you go see your doctor when you’re sick. Now, we’re entering a phase that offers much deeper biometric insights from massively proliferated consumer wearables, allowing for a more proactive approach. Each individual would have their own baseline metrics, established through constant use of consumer wearables outfitted with biometric sensors. The user would then be signaled whenever there’s a deviation from that baseline.
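The baseline-and-deviation idea described above can be sketched very simply: keep a rolling window of recent readings and flag any new value that falls far outside it. This is a hypothetical illustration, assuming daily resting heart rate readings and an arbitrary z-score threshold; real products would use far richer models.

```python
from statistics import mean, stdev

def deviation_alerts(readings, window=7, z_thresh=2.0):
    """Flag readings that deviate from a personal rolling baseline.

    `readings` is a chronological list of values (e.g. daily resting
    heart rate). Each value is compared against the mean and standard
    deviation of the previous `window` readings; values more than
    `z_thresh` standard deviations away are flagged by index.
    """
    alerts = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) / sigma > z_thresh:
            alerts.append(i)
    return alerts

# A week of stable resting heart rate, then a sudden jump on day 8.
history = [62, 61, 63, 62, 60, 61, 62, 75]
print(deviation_alerts(history))  # [7]
```

The key point is that the threshold is personal: a reading that is normal for one user can be an anomaly for another, because each user’s own history defines the baseline.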
This type of proactive approach represents a massive opportunity for voice assistants to play the role of nurse, health coach, nutritionist, personal trainer, and so on. Many of the hearable or wearable devices through which we’ll communicate with our voice assistants (AirPods, Alexa Ring, AR glasses) will be the same devices capturing our bio-data and establishing our baselines for each health metric. There certainly won’t be a lack of data for voice assistants to assess and communicate insightful findings from; the real question will be which assistants get access to which data. To me, that will come down to which companies we trust with such sensitive data, but combining these technologies has tremendous upside.
-Thanks for Reading-