
Three Things I Learned from Dr. Eric Topol (Future Ear Daily Update 6-19-19)

AI and Your Doctor, Today and Tomorrow

A16z recently published a fantastic podcast titled “AI and Your Doctor, Today and Tomorrow,” in which A16z general partner Vijay Pande interviews Dr. Eric Topol about how AI will impact healthcare. Dr. Topol is a longtime cardiologist and chair of innovative medicine at Scripps Research, and he provides a fascinating perspective on how AI is already transforming healthcare and how it’s bound to shape the field over the next decade. Here are three things I learned from the conversation.

  1. “Natural Language Processing can liberate from keyboards” – this is one of the most promising near-term use cases for voice in healthcare: doctors can leverage the recent improvements in NLP to shift the note-taking part of their job from manual, typed input to spoken input (a minimal sketch of what that might look like follows this list).

    “The fact that voice recognition is moving so fast in terms of accuracy and speed is really encouraging.” – Speed is key, as doctors will only move toward a voice-based note-taking system if it delivers a real reduction in the time they spend documenting. Accuracy of the transcription is even more critical, because clean data is what allows machine learning to be applied on top of it. Today’s clinical data is so error-ridden that layering machine learning applications on top of it is far more challenging.

  2. “Multi-modal data processing – we are not doing it yet” – this was one of the most interesting portions of the discussion. Eric illustrates the concept with someone with diabetes who only has access to their glucose levels, so all the patient can see is whether those levels are going up or down. Eric posits, “why aren’t we factoring in everything they eat and drink, sleep and activity and the whole works.” Wearables are central to the multi-modal data inputs Eric is describing, and they will be key to adding more types of data that can be continuously monitored and factored into the total equation (the second sketch below this list shows what stitching those streams together might look like). Rather than going to the doctor and saying “I felt my heart flutter,” which isn’t all that helpful, patients can point to exactly when it happened on their Apple Watch.
  3. “How do you overcome the fact that not everything is quantitative, like cholesterol levels?” – Eric answers this question by pointing out that things that were previously subjective, like state of mind or mood, are becoming more objective. That’s happening through new ways of measuring voice biometrics, breathing patterns, and facial recognition, which can start to establish objective metrics. A sophisticated voice biometric system could identify whether you’re depressed from the rich data points it collects, such as tone of voice, compared against the catalog of data you’re constantly sharing.
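
To make the note-taking idea in point 1 a bit more concrete, here is a minimal sketch of dictation-to-text in Python. The file name, the open-source SpeechRecognition package, and the free Google Web Speech backend are all assumptions for illustration – a real clinical workflow would run on a dedicated, medical-grade speech engine.

```python
# Minimal sketch: transcribe a dictated visit note from an audio file.
# Assumes the open-source SpeechRecognition package (pip install SpeechRecognition)
# and a hypothetical recording named "visit_note.wav".
import speech_recognition as sr

recognizer = sr.Recognizer()

with sr.AudioFile("visit_note.wav") as source:
    audio = recognizer.record(source)  # capture the entire dictated note

try:
    # The free Google Web Speech backend is used here purely for illustration.
    note_text = recognizer.recognize_google(audio)
    print(note_text)
except sr.UnknownValueError:
    print("Could not understand the dictation.")
```

The point of the sketch is simply that the transcription step itself is now a few lines of code; the hard parts Dr. Topol is describing are the accuracy and speed that make doctors actually want to use it.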
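And for point 2, here is an equally small sketch of what “multi-modal” could mean in practice: lining up a patient’s glucose readings with wearable data on one timeline so trends can be read in context. The numbers and column names are made up for illustration.

```python
# Minimal sketch: join glucose readings with wearable data on a shared timeline.
# All values and column names below are invented for illustration.
import pandas as pd

glucose = pd.DataFrame({
    "time": pd.to_datetime(["2019-06-19 08:00", "2019-06-19 12:00", "2019-06-19 18:00"]),
    "glucose_mg_dl": [95, 140, 110],
})

wearable = pd.DataFrame({
    "time": pd.to_datetime(["2019-06-19 07:55", "2019-06-19 11:58", "2019-06-19 17:50"]),
    "steps_last_hour": [400, 2500, 300],
    "hours_slept_last_night": [6.5, 6.5, 6.5],
})

# merge_asof pairs each glucose reading with the most recent wearable sample,
# producing one table that a model (or a clinician) can reason over.
combined = pd.merge_asof(glucose.sort_values("time"),
                         wearable.sort_values("time"),
                         on="time")
print(combined)
```

Today that kind of joining, at best, happens after the fact; the multi-modal future Eric describes is this kind of fusion happening continuously and automatically.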

My big takeaway from this conversation is that voice computing and wearables will have a significant impact on the future of healthcare, from both the doctor’s and the patient’s standpoint. As Dr. Topol mentions multiple times throughout the podcast, the hope is that AI and machine learning will offload much of the non-personal work to computers and free up doctors’ time to get back to providing “the human touch” that has been lacking in the doctor’s office as physicians get saddled with more and more clerical work and drudgery.

-Thanks for Reading-

Dave

To listen to the broadcast on your Alexa device, enable the skill here

To add to your flash briefing, click here

To listen on your Google Assistant device, enable the skill here 

and then say, “Alexa/Ok Google, launch Future Ear Radio.”
