Apple, Daily Updates, Future Ear Radio, VoiceFirst

Apple’s U1 Chip (Future Ear Daily Update 9-13-19)

Apple U1 Chip

Perhaps the most interesting revelation from Apple’s product event Tuesday was something that was never mentioned once during the presentation: the U1 chip that will be embedded in the iPhone 11 and iPhone 11 Pro. The only reference to the chip was an image of it in Phil Schiller’s slides. For today’s update, I want to share what I’ve learned over the past few days about the implications of the U1 chip from people far more knowledgeable about this technology than me.

Image from Brian Roemmele’s Quora post regarding Apple’s September 10th, 2019 event and the introduction of the U1 chip

First of all, I’ll be the first to admit that I was unaware that Apple was going to be bringing this type of chip to market. The person who really turned me on to understanding what’s going on here was none other than Brian Roemmele. Brian wrote an incredible Quora post that completely breaks down what’s going on with this chip. Much of his analysis stems from the 35+ years of patent analysis he’s been doing, largely centered around Apple’s patents. His Quora post now serves as the foundation for my understanding of the U1 chip, much like how his post on the Apple Card helped to shape my thinking around the trajectory of Apple’s finance goals. Both are well worth your time and offer far more depth than what you’ll find here.

The U1 chip is powered by ultra-wideband (UWB) radio technology, which uses very low energy levels for short-range, high-bandwidth communication. This type of wireless technology is perfect for spatial awareness and precise positioning. It’s essentially a competitor to Bluetooth, but more accurate: it can locate an object to within about 10 centimeters, as opposed to the roughly one-meter accuracy of current Bluetooth. It’s also about four times faster than Bluetooth today. Brian suggests that UWB’s power requirements are so low that a UWB device could likely run for upwards of a year on a hearing aid-sized coin cell battery.
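That centimeter-level accuracy comes from UWB measuring the *flight time* of radio pulses rather than signal strength. A minimal sketch of the two-way ranging arithmetic makes the point; this is illustrative only (Apple hasn’t published the U1’s actual protocol, and real UWB ranging adds clock-drift corrections):

```python
# Illustrative two-way UWB ranging: estimate distance from the round-trip
# time of a radio pulse. Real protocols add clock-drift compensation.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def estimate_distance(round_trip_s: float, reply_delay_s: float) -> float:
    """Distance from a two-way ranging exchange.

    round_trip_s  -- time from sending a pulse to receiving the reply
    reply_delay_s -- known processing delay at the responding device
    """
    time_of_flight = (round_trip_s - reply_delay_s) / 2.0
    return SPEED_OF_LIGHT * time_of_flight
```

A one-way flight time of one nanosecond corresponds to about 30 centimeters, so resolving position to ~10 cm means timing pulses to a fraction of a nanosecond — precision that UWB’s very wide bandwidth makes possible and that narrowband Bluetooth signal-strength estimates can’t match.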

Although UWB has been around for decades, Bluetooth has ultimately won out to this point because it has historically been cheaper to implement. As a result, we have all types of legacy infrastructure and systems built around Bluetooth, so even though UWB might be more technologically advanced and capable, the incumbent and current standard, Bluetooth, is going to be tough to unseat. There are only a handful of companies that can truly influence something such as the preferred method of connectivity and wireless communication, and Apple is one of them.

The first application of this chip will be the ability to point your phone at a fellow U1-equipped iPhone and quickly AirDrop files to its owner. However, there’s much, much more to this. Brian, as he is in so many instances, was way out ahead of this announcement. As the event approached, he was cryptically tweeting about “following the balloons.” I now know that Brian was referring to Apple using the U1 chip in an upcoming Tile-like tracker, which would allow the user to hold up their iPhone camera and see an AR overlay with a red balloon signifying the location of said product.

So, an obvious application of this chip would be to locate missing iOS devices embedded with the chip, including the Tile-like product, or to pinpoint friends’ and family members’ precise locations via the Find My app (so long as they’re carrying an iOS device with the U1 chip on their person). Brian takes it a step further by suggesting that we’ll eventually be able to have Siri field questions based on data from the U1 chip too.

If we’re able to precisely locate objects using the U1 chip and create AR overlays to display their locations, then it’s conceivable that we’ll see this expand considerably. Here’s how Brian suggests this might evolve:

“The use case will allow for you to find a product like you would on a website with a whimsical Balloon, also used in the Find My app, to direct you to the precise location of the Apple product. With FaceID and Apple Pay you just look at your phone and confirm and leave. It is not hard to imagine many retail businesses adopting the system. It is also not hard to imagine AppleLocate used in industrial locations and medical locations.” – Brian Roemmele

Many astute folks on Twitter (like the ones whose tweets I’ve embedded in this post) are pointing out ways in which this type of chip might impact the trajectory of these emerging technologies. Beyond geo-location, it would seem that UWB will be foundational to many upcoming technologies expected in the 2020s, from AR/MR to crypto to medical biometrics to autonomous vehicles.

So, while the Apple event might have been a bit underwhelming, I think what we’re really witnessing is an interim period where Apple is rolling out all of the building blocks required for the products it will be releasing over the next decade. Brian very eloquently describes this period:

“The accelerometer systems, GPS systems and IR proximity sensors of the first iPhone helped define the last generation of products. The Apple U1 Chip will be a material part of defining the next generation of Apple products.” – Brian Roemmele

-Thanks for Reading-

Dave

To listen to the broadcast on your Alexa device, enable the skill here

To add it to your flash briefing, click here

To listen on your Google Assistant device, enable the skill here 

and then say, “Alexa/Ok Google, launch Future Ear Radio.”


Apple Hearing Study (Future Ear Daily Update 9-11-19)


Yesterday, Apple hosted its annual September event to show off the new products that will go on sale as we enter the last quarter of the year. Although the bulk of the announcements focused on the new iPhones and all the upgrades to their cameras and processors, there were a few other announcements that I thought were interesting and worth writing updates about. Today, I am writing about the Apple Hearing Study that was announced.

In the upcoming watchOS 6 update, due out September 19th, there will be a new sound level feature that the user can configure to appear as one of the readouts on the watch’s display. Apple will use the microphones on the watch in a low-power mode to continuously sample the decibel level of your environment, which will then be visualized on the watch in green, yellow, or red based on the volume of the noise. The knee-jerk reaction might be to say, “wait, they’re always recording me?” but no: Apple has stated that it will not save any of the audio, only the sound levels.
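The green/yellow/red readout is essentially a threshold mapping on the measured decibel level. A hypothetical sketch of that mapping, using commonly cited occupational-health cutoffs (Apple’s exact thresholds haven’t been published, so these numbers are assumptions):

```python
# Hypothetical traffic-light mapping from measured sound level (dBA) to a
# display category. The 80/90 dBA cutoffs are illustrative, drawn from
# common hearing-health guidance; Apple's actual thresholds may differ.

def sound_level_category(db: float) -> str:
    if db < 80:
        return "green"   # generally safe for any duration
    elif db < 90:
        return "yellow"  # caution: prolonged exposure can damage hearing
    else:
        return "red"     # dangerous: limit exposure time
```

For context, normal conversation sits around 60 dB (green), heavy city traffic around 85 dB (yellow), and a loud concert can exceed 100 dB (red).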

Image from Apple’s September 10th, 2019 keynote

The Apple Watch Series 5 that was announced features an “always-on display,” implying that future generations will feature one as well. Therefore, users who have configured their Apple Watch’s display to show the sound level meter will constantly be able to assess how dangerous the sound levels in their environment are.

In my opinion, this is a bigger deal than it might appear, because people tend to lose their hearing gradually. One of the big reasons why is that they’re completely unaware they’re exposing their ears to dangerous levels of sound for prolonged periods of time. As an Apple Watch user myself, the ability to quickly glance at my watch to assess how loud my environment is really appeals to me. An always-on display will just make this effect more pronounced, hopefully leading more people to consider keeping hearing protection, like high-quality earplugs, on them at all times. It can’t be overstated how powerful the effect on peoples’ psyche will be of constantly seeing that sound level bar flicker or linger in the red.
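The interplay between level and duration is what makes unnoticed exposure so damaging. NIOSH’s recommended exposure limit is 85 dBA over 8 hours, with the allowable time halving for every 3 dB increase — a sketch of that formula shows how quickly “a bit louder” eats into safe listening time:

```python
# NIOSH recommended exposure limit: 85 dBA for 8 hours, with allowable
# exposure time halving for every 3 dB increase (the 3-dB exchange rate).

def max_exposure_hours(level_dba: float) -> float:
    """Approximate safe daily exposure time at a given sound level."""
    return 8.0 / (2.0 ** ((level_dba - 85.0) / 3.0))
```

By this rule, 85 dBA is safe for 8 hours, 94 dBA for only about 1 hour — which is why a glanceable meter that flags sustained yellow or red levels could genuinely change behavior.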

So, as this new feature becomes available to all Apple Watches running watchOS 6, Apple will overnight have an army of users who can gather data on its behalf. This brings us to the Hearing Study that Apple will be conducting in conjunction with the University of Michigan and the World Health Organization. Here’s Michigan professor Rick Neitzel, who will lead the study, describing its purpose:

“This unique dataset will allow us to create something the United States has never had—national-level estimates of exposures to music and environmental sound. Collectively, this information will help give us a clearer picture of hearing health in America and will increase our knowledge about the impacts of our daily exposures to music and noise. We’ve never had a good tool to measure these exposures. It’s largely been guesswork, so to take that guesswork out of the equation is a huge step forward.”

Users will be able to opt into this study, or the other two studies announced at the event, through a new Apple Research app. As I wrote about in August, Apple is inserting itself further and further into the healthcare space by becoming the ultimate health data collector and facilitator. This is just another example of Apple leveraging its massive user base to gather data at scale from the various sensors embedded in its devices and offer it to researchers. Creating a dedicated app to facilitate this data transfer, with explicit user opt-in, will shield Apple from scrutiny around the privacy and security of sensitive data.

Apple’s wearables are increasingly shaping up to be preventative health tools, or as Apple has described them, “guardians of health.” The introduction of a decibel level readout on the Watch’s display is another incremental step toward becoming said “guardian of health,” as it will help proactively notify users of another danger to their health: gradual hearing loss. It’s not hard to imagine future generations of AirPods supporting the same feature, using their mics to sense sound levels, but instead of a notification, perhaps they’ll activate noise cancellation to protect one’s ears. One can hope!

-Thanks for Reading-

Dave



Follow the Integration (Future Ear Daily Update 8-28-19)

8-28-19 - Follow the Integration

There’s been a lot of chatter, analysis, and overall hot takes online about Apple’s new credit card, the Apple Card, over the past few weeks. One of the best pieces I have read on the subject came from longtime Apple analyst Horace Dediu (@asymco). Horace begins by tearing down the fallacies in the tired old cliché, “we were promised flying cars and we got x,” that is often used to trivialize much of the innovation that’s occurred in the past few decades. While flying cars represent “extrapolated technologies,” innovations like the Apple Card represent “market-creating” technologies that are often much more ubiquitous, popular, and behavior-changing (read his piece to fully understand this point).

There was one point he made in the article that I really want to home in on today, as I think there’s a direct parallel that pertains to FuturEar:

“Here’s the thing: follow the integration. First, Apple Card comes after Apple Pay, more than 4 years ago. Apple Card builds on the ability to transact using a phone, watch and has the support of over 5000 banks. Over 10 billion transactions have been made with Apple Cash. Over 40 countries are represented.”

“Follow the integration.” That’s the best way to really understand where Apple is headed. As I have written about before, Apple tends to work its way incrementally into new verticals and offerings, and if you follow the acquisitions and the product development – the integration – you start to get a sense of what’s to come with future products and service offerings.

A good example of this would be to look at the burgeoning Apple Health ecosystem. There are two separate areas to focus on: the software and services, and the hardware. In 2014, Apple began forming said ecosystem by introducing the Apple Health app and its HealthKit software development kit (SDK), a year before the Apple Watch. This might have been cause for some head scratching, as there wasn’t a whole lot of hardware on the market prior to the Apple Watch that could feed data into Apple Health (aside from the basic inertial data from the phone).

A year later, in 2015, the Apple Watch came out and would become the main source of data populating the Apple Health app. Flash forward to today, and Apple has rolled out two more SDKs and iterated on the Apple Watch four times to create a much more sophisticated biometric data collector. On the SDK front, CareKit allows third-party developers to create consumer-focused applications around data collected by the Apple Watch or within the third-party apps themselves, such as apps centered around Parkinson’s, diabetes, and depression. ResearchKit helps facilitate large-scale studies for researchers, all centered around Apple’s health ecosystem.

Five years after the kickoff of its health ecosystem, Apple has laid the groundwork to move deeper and deeper into healthcare. In 2018, the company announced AC Wellness, a set of medical clinics designed to “deliver the world’s best health care experience to its employees.” It’s not hard to imagine Apple using these clinics as a testing ground and then rolling the model out beyond its own employees. In August of 2018, Apple added its 75th health institution supporting personal health records on the iPhone.

Just as there were years of innovation and incremental pieces of progress leading to the Apple Card, the same can be said for Apple Health. Follow the integration and you’ll start to get a sense of where Apple is headed.

-Thanks for Reading-

Dave
