November 23, 2024
Apple's new AirPods won't have to be taken out of your ears as often, thanks to sophisticated AI
A slew of software features launching with the new AirPods enable users to leave their earbuds in all day while navigating cities or talking to co-workers.

Kif Leswing/CNBC

Alongside new iPhones and Apple Watches, Apple is releasing a new version of its AirPods Pro this month.

The 2nd Generation AirPods Pro with USB-C (a mouthful of a model name) don’t have any radical hardware changes. Apple replaced the proprietary Lightning port with a USB-C charging port to match the rest of its lineup.

But a slew of software features launching alongside the new AirPods significantly change how noise-canceling on the wireless buds works in practice, and will make it much easier for AirPods Pro users to leave their earbuds in all day while navigating cities or talking to co-workers.

Apple has given the new features various names (Adaptive Audio, Conversation Awareness and Personalized Volume), but taken together, using the default settings on a review unit of the new $249 AirPods, the upshot is that the earbuds use machine learning and artificial intelligence to turn down your music when you’re in a conversation and to let necessary nearby sounds through.

Instead of taking out their AirPods or turning off noise-canceling entirely when navigating a treacherous street or having a conversation with a co-worker, users can now leave their earbuds in and rely on Apple’s software to decide intelligently what they need to hear.

Overall, the improvements are subtle but nice. They’re not a reason to upgrade if you have an older pair of AirPods that’s working perfectly, but they’re worth considering if you’re shopping for new wireless headphones and know you don’t like constantly taking them out and putting them back in.

However, from a technological perspective, the new AirPods are exciting. Apple is using cutting-edge technology and its own custom chips to filter the world of sound through its hardware, augmenting or muting individual sounds to make your daily experience better, all powered by AI. Apple’s headphones go far beyond the simple on-or-off noise canceling found on competing devices.

The concept is not that far from the “spatial computing” Apple introduced with the Vision Pro headset, which uses machine learning to integrate the real and digital worlds. Apple calls the AirPods a “wearable” and reports their sales in the same revenue category as the Apple Watch. With the new adaptive features, the AirPods are more wearable than ever, and they remain one of the company’s most intriguing product lines as a glimpse of the future of computing, even if they don’t get the same attention as the iPhone.

How it worked

While the adaptive technology isn’t quite seamless yet, it is a nice improvement over the blunter, muffling noise-cancellation setting that used to be the default on AirPods Pro. And it’s not limited to the latest hardware: anyone with the second-generation AirPods Pro introduced last September can download software updates for their headphones and iPhone to enable the new features.

The new Adaptive mode ultimately blends chaotic street noise with the artificial quiet of active noise cancellation. Apple frames Adaptive Audio as a safety feature, so users don’t miss honks or disturbances when walking around cities. It’s subtle. You definitely feel like you’re still in a cocoon of quiet, but you don’t feel as if the whole world is muffled around you.

There’s a little chime when users turn it on, either through the Settings app when the earbuds are connected or through a shortcut: long-pressing the volume slider in the iPhone’s Control Center.


In practice, Adaptive Audio wasn’t perfect, but it’s an improvement over active noise canceling, which can be very isolating, and over Apple’s transparency mode, which often amplifies extraneous noise (like the AirPods case clicking against car keys in my pocket). If I were to walk around cities, which I try to avoid for safety reasons, I would use Apple’s Adaptive mode.

But Bay Area BART station announcements made over a central speaker were still muffled, especially when I was listening to music, and that’s the sort of information I would like to hear. I still needed to turn off the headphones or take them out if I wanted to understand what was being announced, such as which train was coming into the station.

When I was walking in a dog park separated from a highway by a sound wall, Adaptive Audio let in more highway noise than active noise cancellation did, which wasn’t optimal. Later, when another person in the park was arguing about something and making a scene, I didn’t catch it by ear in Adaptive mode; I saw the dispute first. Many people use noise-canceling headphones to tune out that kind of disturbance, but from a safety perspective, it’s the sort of thing urban dwellers may want to stay aware of.

Another key setting for noise-canceling headphones is the workplace, where workers heading back to the office increasingly use them to simulate the privacy of a home office or to signal to co-workers that they can’t talk.

This is where the Conversation Awareness feature shines, allowing office grinders to hold quick conversations without taking out their AirPods. The feature effectively turns down your music or audio when it senses you’re taking part in a conversation. Instead of fumbling in settings to turn off noise canceling or pause the music, or taking the earbuds out of your ears, the software does it for you, and it even amplifies the conversation a little bit.

When it works, it’s great. I had a couple of conversations with my wife with the AirPods in and Conversation Awareness on. We spoke as if I didn’t have $250 of technology in my ears, and when I returned to what I was doing before, my music automatically came back up to its normal volume.

But there’s one big catch to Conversation Awareness: it doesn’t engage when someone talks to you; it only kicks in when you open your mouth and say something yourself. So I found myself missing the first thing said in several conversations, such as when a neighbor greeted me, or what the cashier said when I approached my favorite taco truck.

At the taco truck, I found myself regretting not taking out the AirPods. I felt like I missed a bit of context in the short exchange, and I felt rude for keeping my headphones in. I heard and understood the key bits, such as the total price, but it didn’t feel like the same real-time conversation I would have had without headphones.

Conversation Awareness also did not turn down my music five minutes later, when the cashier called out my order for pickup. My order ended up being wrong, too, probably because I was distracted. But it’s easy to see how people will use the feature to order a cold brew without pausing their music.

There are other little quirks, too. I like to sing along to music when I’m alone. With Conversation Awareness on, the music gets turned down, leaving me to hear my own flat singing. Once, when I was working at my computer, I laughed, and the AirPods algorithm thought I was trying to speak. I also never realized how much I mutter to myself when I’m writing.

Personalized Volume uses machine learning to adjust the overall audio level, taking into account your past listening preferences (for me, louder than is healthy) and the noise around you. I only noticed it once, when it turned down the volume after I had jacked it up.

Taking all this into account, the new AirPods features might not be a reason to rush out and get the latest model, but they clearly show that Apple’s headphones are evolving to become something more sophisticated than small speakers.