June 2020 Medium
Screenshots of Group FaceTime, notifications of Sound Recognition, Custom Audio Setup for Headphone Accommodations on an iPhone 11 Pro Max running iOS 14 dev beta 1
The all-new, online-only Worldwide Developers Conference (WWDC) 2020 brought some of the biggest updates yet to Apple’s popular platforms, most notably the new features coming to iOS 14, iPadOS 14, watchOS 7 and macOS Big Sur.
Accessibility is one of Apple’s core values, and they take designing for accessibility very seriously. Without fail, they keep the spotlight on making their software and hardware products as accessible and inclusive to as many users as possible. Last year, iOS 13 introduced some of the most significant accessibility features ever on their products, such as Voice Control, which lets you use the device entirely with your voice, a big win for users with motor disabilities.
In this article, we’re looking at the new accessibility features Apple has added, particularly the ones that help the Deaf community and Hard-of-Hearing users all over the world.
Sign Language Detection in Group FaceTime Calls
Prior to iOS 14, during a Group FaceTime call, a caller’s tile would expand when they spoke, increasing their prominence. This is useful: when someone says something, attention is drawn to them on everyone’s devices, especially since Group FaceTime supports up to 32 people. But… you can see how this goes wrong for users who communicate in sign language on FaceTime calls instead. Automatic speaker prominence isn’t a deaf-friendly feature.
Deaf and Hard of Hearing users have been using FaceTime and other video calling services for a long time, connecting with others in a way that didn’t exist just a few decades ago. Now, they can simply pick up their phone or laptop and start chatting away. You can see how significant FaceTime is to a lot of people, so this brand new feature will positively impact many users of Apple products: on iOS 14, when a participant starts signing, their tile becomes larger, grabbing the attention during a Group FaceTime call.
Sound Recognition
This new feature is really cool, and it does exactly what it says:
“Your iPhone will continuously listen for certain sounds, and using on-device intelligence, will notify you when sounds may be recognised.”
Screenshots of the Sound Recognition Settings page and sample notifications on an iPhone 11 Pro Max running iOS 14 dev beta 1
Still respecting users’ privacy with on-device analysis, it listens for ambient sounds like fire alarms, sirens, smoke detectors, cats, dogs, appliance bells, car horns, doorbells, door knocks, running water, crying babies and people shouting. When it detects one, it sends a notification, and you can temporarily mute notifications for that specific sound for 5 minutes, 30 minutes or 2 hours.
It is also important to note that Apple says “Sound Recognition should not be relied upon in circumstances where you may be harmed or injured, in high-risk or emergency situations, or for navigation.”
Headphone Accommodations
“This new accessibility feature is designed to amplify soft sounds and adjust certain frequencies for an individual’s hearing, to help music, movies, phone calls, and podcasts sound more crisp and clear.”
Screenshots of Headphone Accommodations settings and Custom Audio Setup for Headphone Accommodations on an iPhone 11 Pro Max running iOS 14 dev beta 1.
There’s also a cool Custom Audio Setup that plays pairs of audio samples with subtly different frequency-boosting settings and asks you which version sounds best, helping you tune the Headphone Accommodations settings. Apple mentioned that it also works with Transparency mode on AirPods Pro, where it will make “quiet voices more audible” and tune “the sounds of your environment to your hearing needs”. Headphone Accommodations is available on Apple and Beats headphones featuring the H1 headphone chip, such as the second-generation AirPods, AirPods Pro, Powerbeats Pro and more, as well as EarPods.
Great job, Apple!
There are more accessibility features in iOS 14 that aren’t particularly targeted at users with hearing loss. One of my favourites is Back Tap, where you double- or triple-tap the back of your iPhone to perform an action like locking the screen, invoking Spotlight or running a shortcut. I have mine open the camera when I triple-tap my iPhone. You can get more information on the rest of iOS 14 and everything else announced at WWDC this year at Apple’s Newsroom. We’re gonna love iOS 14 when it comes out later this year, and I’m sure Apple worked harder than ever, refining the user experience as they do year after year.