As Global Accessibility Awareness Day approaches on May 18th, Apple is making a bold statement in support of users with disabilities. The tech giant has officially announced a series of new accessibility features set to launch later this year, likely alongside iOS 17. These tools are designed to assist users with cognitive, language, and visual impairments, making technology more accessible and enjoyable for all.
Simplifying Interfaces for Users with Cognitive Impairments
For users with cognitive disabilities, Apple has reimagined the iOS interface with a simplified experience called Assistive Access, designed to reduce cognitive load. By stripping away extraneous options and focusing on the most commonly used functions, the company aims to make navigating apps a breeze.
Take the Photos app, for example: album navigation is gone in favor of larger, full-screen images. Similarly, the Camera app is pared down to a single shutter button, while the Phone app surfaces the user's most frequently used contacts.
Text-based communication has been streamlined as well: in Messages, users can reply with just emojis, eliminating the need to type on a keyboard at all.
Assisting Users with Language Disabilities
For individuals with language disabilities, Apple's new suite of tools is a game-changer. With Live Speech, users can type text during phone and video calls and have the device convert it into speech that is read aloud to the other party, allowing for seamless communication without the need for spoken language.
Additionally, users can save frequently used phrases for quick access and playback during conversations. For those at risk of losing their ability to speak, such as individuals with ALS, Apple offers Personal Voice, a voice-preservation feature: after the user records about 15 minutes of speech, the iPhone uses on-device machine learning to learn that voice and can then read typed text aloud in it.
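To give a sense of the kind of text-to-speech capability these features build on, here is a minimal Swift sketch that turns typed text and saved phrases into audible speech using the public AVSpeechSynthesizer API. It is only an illustration of the underlying technique, not Apple's Live Speech or Personal Voice implementation; the TypedSpeech class and its phrase store are hypothetical.

```swift
import AVFoundation

// A minimal "type to speak" sketch: typed text (or a saved phrase) is
// converted to audible speech with the system text-to-speech engine.
// Illustration only; this is not Apple's Live Speech or Personal Voice.
final class TypedSpeech {
    private let synthesizer = AVSpeechSynthesizer()
    private(set) var savedPhrases: [String] = []

    /// Stores a frequently used phrase for quick playback later.
    func savePhrase(_ phrase: String) {
        savedPhrases.append(phrase)
    }

    /// Converts typed text into speech and plays it aloud.
    func speak(_ text: String, language: String = "en-US") {
        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = AVSpeechSynthesisVoice(language: language)
        utterance.rate = AVSpeechUtteranceDefaultSpeechRate
        synthesizer.speak(utterance)
    }
}

// Example: save a phrase, then have the device read it to the other party.
let speech = TypedSpeech()
speech.savePhrase("Give me a moment, I'm typing my reply.")
speech.speak(speech.savedPhrases[0])
```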
Empowering Visually Impaired Users with Machine Learning
Apple’s commitment to accessibility extends to visually impaired users as well. A new Point and Speak capability in the Magnifier app combines the camera with on-device machine learning to identify and describe the buttons on household devices: as a user moves a finger across a keypad or control panel, the iPhone reads the label under their fingertip aloud.
Point and Speak joins Magnifier’s existing machine-learning-powered tools, such as People Detection and Door Detection, making everyday life more manageable for those with visual impairments.
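For a rough sense of how a camera and on-device machine learning can be combined to read physical labels aloud, the following Swift sketch recognizes text in an image with the Vision framework and speaks the results. It is an illustration under assumptions, not Apple's Point and Speak implementation; the LabelReader class is hypothetical.

```swift
import AVFoundation
import CoreGraphics
import Vision

// A rough sketch: recognize text in a camera frame with the Vision
// framework, then read each detected label aloud. Illustration only,
// not Apple's Point and Speak implementation.
final class LabelReader {
    private let synthesizer = AVSpeechSynthesizer()

    /// Recognizes printed text in a still image and speaks each result.
    func readLabels(in image: CGImage) {
        let request = VNRecognizeTextRequest { request, error in
            guard error == nil,
                  let observations = request.results as? [VNRecognizedTextObservation] else { return }
            // Take the top candidate string from each detected text region.
            let labels = observations.compactMap { $0.topCandidates(1).first?.string }
            for label in labels {
                self.synthesizer.speak(AVSpeechUtterance(string: label))
            }
        }
        request.recognitionLevel = .accurate  // favor accuracy over speed

        let handler = VNImageRequestHandler(cgImage: image, options: [:])
        try? handler.perform([request])
    }
}
```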
With these powerful new accessibility features, Apple is taking a significant step toward a more inclusive user experience, one built around the needs of individuals with cognitive, language, and visual impairments.