Apple Announces Inclusive Design with New Accessibility Features

As Global Accessibility Awareness Day approaches on May 18th, Apple is making a bold statement in support of users with disabilities. The tech giant has officially previewed a series of new accessibility features set to launch later this year, likely alongside iOS 17. These tools are designed to assist users with cognitive, speech, and visual impairments, making technology more accessible and enjoyable for all.

Simplifying Interfaces for Users with Cognitive Impairments

For users with cognitive disabilities, Apple has reimagined the iOS interface with a new mode called Assistive Access, designed to reduce cognitive load. By stripping away extraneous features and surfacing only the most-used functions, the company aims to make navigating apps far simpler.

Take the Photos app, for example: Apple has replaced the album-level navigation with a full-screen display of larger images. Similarly, the Camera app is pared down to a single shutter button, while the Phone app highlights the user’s most frequently used contacts.

Text-based communication has also been streamlined. Users can now reply solely with emojis, eliminating the need for keyboard typing altogether.

Assisting Users with Speech Disabilities

For individuals with speech disabilities, Apple’s new suite of tools is a game-changer. During phone and video calls, users can type what they want to say and have the device convert it to speech and read it aloud to the other party. This allows for full participation in a conversation without spoken language.

Additionally, users can save frequently used phrases for quick access and playback during conversations. For those at risk of losing the ability to speak, such as individuals with ALS, Apple offers a voice-preservation feature called Personal Voice. After the user records about 15 minutes of guided speech, on-device machine learning builds a synthetic voice that sounds like them, which the iPhone can then use to read typed text aloud.
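The typed-text-to-speech flow described above can be sketched with Apple’s public AVSpeechSynthesizer API. This is an illustration of the underlying concept only, assuming nothing about how Apple actually built the feature; the `TypedSpeech` class and its saved-phrase list are hypothetical.

```swift
import AVFoundation

// A minimal sketch of typed-text-to-speech: the user types, the device speaks.
// AVSpeechSynthesizer is Apple's public text-to-speech API; Apple has not
// said this is how its calling feature is implemented.
final class TypedSpeech {
    private let synthesizer = AVSpeechSynthesizer()

    // Saved phrases for one-tap playback during a conversation (hypothetical data).
    var savedPhrases: [String] = ["Be right there", "Thank you"]

    func speak(_ text: String) {
        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
        utterance.rate = AVSpeechUtteranceDefaultSpeechRate
        synthesizer.speak(utterance)
    }

    func speakSavedPhrase(at index: Int) {
        guard savedPhrases.indices.contains(index) else { return }
        speak(savedPhrases[index])
    }
}
```

In a real app, the same `AVSpeechSynthesisVoice` selection point is where a custom or system voice configured by the user would be plugged in.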


Empowering Visually Impaired Users with Machine Learning

Apple’s commitment to accessibility extends to visually impaired users as well. A new Magnifier feature called Point and Speak uses the camera and on-device machine learning to identify and describe buttons on electronic devices: as a user moves a finger across an appliance’s controls, the iPhone reads the label under the fingertip aloud.

Point and Speak joins Magnifier’s existing machine-learning detection tools, such as People Detection and Door Detection, making everyday navigation more manageable for those with visual impairments.
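The recognize-then-speak idea behind this feature can be sketched with Apple’s public Vision and AVFoundation APIs. This is a simplified conceptual version, not Apple’s implementation: the real feature also uses the LiDAR scanner and finger tracking, while this sketch only recognizes and speaks text found in a still image.

```swift
import Vision
import AVFoundation

// Kept outside the function so speech is not cut off by deallocation.
let synthesizer = AVSpeechSynthesizer()

// A conceptual "read the text the camera sees" pipeline: run on-device OCR
// with Vision, then speak the recognized strings with AVSpeechSynthesizer.
func speakText(in image: CGImage) {
    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        // Keep the best candidate string for each piece of detected text.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        guard !lines.isEmpty else { return }
        synthesizer.speak(AVSpeechUtterance(string: lines.joined(separator: ". ")))
    }
    request.recognitionLevel = .accurate
    try? VNImageRequestHandler(cgImage: image, options: [:]).perform([request])
}
```

A production version would run this on live camera frames and restrict recognition to the region near the user’s fingertip.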

With these powerful new accessibility features, Apple is taking a significant step toward a more inclusive user experience, one focused squarely on the needs of individuals with cognitive, speech, and visual impairments.

Max Hyland, long-form contributor: Apple iPhone, iPad, and Watch reviews, opinion, and editorials.