Enhanced sign language detection, sound recognition, and VoiceOver are among a bevy of accessibility features expected in iOS 14’s fall release. Apple unveiled the updated platform at its Worldwide Developers Conference (WWDC20). While many of the improvements focus on hearing accessibility, iOS 14 also includes updates to vision, cognitive, and physical accessibility features, some of which are usable with iPadOS 14 and watchOS 7.
Here’s a rundown of what to expect, based on the information announced so far rather than a guarantee of what will ship.
Hearing accessibility
- Group FaceTime will now recognize sign language in a video call using motion detection. Historically, video calls have switched screens to feature the person speaking, failing to recognize silent forms of communication like sign language. The updated version will switch screens to feature the person signing instead.
- Sound recognition will listen for 14 different sounds such as door knocks, doorbells, sirens, car horns, appliance beeps, water running, smoke alarms, dogs barking, and more. Users will receive notifications on their device when sounds are detected. In the future, Apple intends to include sound detection for people and animals.
- Headphone Accommodations will enable users to customize the audio streamed through AirPods Pro devices as well as some Beats headphones and EarPods models. The feature will amplify soft sounds and create clearer audio with three amplification tunings and strengths.
- The Noise app will now tell users how loud the audio in their headphones is and notify them once the weekly safe listening amount is reached. The app’s guidelines are based on the World Health Organization’s recommended noise limit of 80 decibels for about 40 hours a week.
- Real-Time Text (RTT) will allow people with hearing or speech disabilities to communicate by typing text during phone calls in real time. The feature will let users handle a call and incoming RTT messages simultaneously and will surface notifications even when the device is not in use.
Vision accessibility
- VoiceOver screen reader improvements, built on machine learning, will recognize more visual details on screen, even in apps that lack accessibility support. Image description enhancements will let VoiceOver speak in complete sentences and analyze a photo’s contents to try to relay additional details of what’s onscreen.
- With enhanced magnification, users will be able to magnify larger areas and improve image clarity with filtering and brightening options.
- Expanded Braille support will auto-pan through longer passages of Braille text without requiring users to press a pan button on an external refreshable display.
- Coding tools Xcode Playgrounds and Live Previews will provide enhanced accessibility to coders who are blind, with the hope of encouraging people with visual impairments to become coders.
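The Noise app’s weekly sound budget described above follows an equal-energy rule common in hearing-safety guidelines: each 3 dB increase in level halves the safe listening time. A minimal sketch of that arithmetic (the 3 dB exchange rate and the helper name are illustrative assumptions, not Apple’s implementation):

```python
def weekly_safe_hours(level_db, reference_db=80.0, reference_hours=40.0,
                      exchange_rate_db=3.0):
    """Safe weekly listening time at level_db, assuming the equal-energy
    rule: every exchange_rate_db increase in level halves the allowed hours."""
    return reference_hours / 2 ** ((level_db - reference_db) / exchange_rate_db)

print(weekly_safe_hours(80))  # 40.0 hours at the WHO reference level
print(weekly_safe_hours(83))  # 20.0 -- 3 dB louder halves the weekly budget
print(round(weekly_safe_hours(89), 1))  # 5.0
```

At 83 dB the 40-hour allowance drops to 20 hours, which is why the app tracks exposure over time rather than flagging a single loudness threshold.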
Cognitive and physical accessibility
- Back Tap will enable users to double- or triple-tap the back of their phone to trigger shortcuts, turn on accessibility features, take screenshots, or switch apps.
- Voice Control, working alongside VoiceOver, will initiate common functions like "read all" or "activate" a display control, and improvements will make voice navigation more consistent. Voice Control helps people with physical disabilities browse and operate iOS devices via voice commands that can request emoji for emails, reproduce screen taps or mouse clicks with grid overlays, and more.
- iOS will begin supporting Microsoft's Xbox Adaptive Controller, a device designed to make gaming more accessible. The controller can function the same way a standard controller does, using plug-in switches, buttons, pressure-sensitive tubes, and other tools. (Related: 2019 furthered the prioritization of video game accessibility.)
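Conceptually, the Back Tap feature above is a small gesture-to-action dispatch table: each recognized tap pattern maps to a user-chosen action. A hypothetical sketch of that mapping (the gesture names, action names, and function are illustrative assumptions, not Apple's settings or API):

```python
# Hypothetical user-configured mapping of Back Tap gestures to actions.
BACK_TAP_ACTIONS = {
    "double_tap": "take_screenshot",
    "triple_tap": "toggle_voiceover",
}

def handle_back_tap(gesture):
    """Return the action assigned to a Back Tap gesture, or a no-op
    when the gesture has no assignment."""
    return BACK_TAP_ACTIONS.get(gesture, "no_action")

print(handle_back_tap("double_tap"))  # take_screenshot
print(handle_back_tap("single_tap"))  # no_action (unassigned gesture)
```

The dictionary-with-default pattern mirrors how such a settings screen behaves: unassigned gestures simply do nothing rather than raising an error.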