June 2018 Business Insider Australia
Apple will reportedly add Live Listen technology to its AirPods later this year.
Live Listen lets you use your iPhone and AirPods to hear better in crowded situations.
It could represent the beginning of the long-awaited era of in-ear computing.
When iOS 12 comes to iPhones everywhere later this year, Apple’s very popular $US159 AirPods will get Live Listen, a nifty feature that makes it easier to hear conversations in noisy places. Live Listen has been around since 2014, but only on select Apple-certified hearing aids. Essentially, Live Listen turns your iPhone into a microphone: If you’re in a crowded bar, point your iPhone’s microphones at the person across the table from you, or even slide it over, and you’ll hear what they have to say in your hearing aid – or, soon, your Apple AirPods.
The AirPods, which have been hailed as one of Apple’s greatest inventions in recent memory, will expand the reach of Live Listen, and let far more people take advantage of a potentially very handy feature. That said, people with hearing loss should still get an actual medical device, and not rely on a pair of consumer earbuds like the AirPods.
The really exciting part is what this could mean for the future of the AirPods, and for Apple itself. When Apple first launched the AirPods, they were hailed as “Apple’s first ear computer.”
(Image: Apple’s Live Listen feature, as it exists today.)
Indeed, the sky seemed to be the limit. Because AirPods give users one-touch access to the Siri virtual assistant, and because they link up with the iPhone’s tremendous galaxy of apps, pundits were hopeful that the AirPods could enable all kinds of superpowers beyond what any other headphones could do. Almost two years later, though, those superpowers have yet to materialize, and the AirPods are still best suited for music and maybe phone calls.
Still, we’ve gotten a glimpse of what the future could look like, thanks to some of Apple’s competitors. Doppler Labs, a startup, released the Here One, a pair of earbuds that could boost the bass at a concert or quiet the sounds of a crying baby. Google, for its part, recently launched the Pixel Buds, which feature real-time language translation. Those products may have been too far ahead of the curve: Doppler Labs went out of business in 2017, after its cool technologies couldn’t overcome the inherent challenges of the hardware market. Google’s Pixel Buds, meanwhile, received lukewarm reviews and haven’t become nearly as ubiquitous among gadgetheads as the Apple AirPods.
So it’s no wonder that Apple, which famously prefers being right to being first, has been slow to push nontraditional uses of the AirPods. The addition of Live Listen, though, suggests Apple is still on track to bring so-called audible computing to the masses, even if it’s happening more slowly than some would like. Once it gets going, things could get wild. It’s not hard to imagine Apple’s App Store gaining apps built specifically for the AirPods – language translation is an obvious one, but what about putting Apple’s Shazam acquisition to work by automatically cataloging every song you hear in a day? Or prank apps that make it sound like your boss has inhaled helium during your big weekly meeting?
So yes, Live Listen is one little feature, but it’s one that points to a bold new future for Apple, where your headphones actually help you do things you couldn’t before. You may just have to wait a little while for it to fully come to pass.