Apple Reveals A Bunch Of Accessibility Features Coming To iPhone, iPad

For the past several years, Apple has worked to make its products accessible to a broad spectrum of users, including people with disabilities. Its most recent major push came in 2022, when it introduced a batch of accessibility-focused features for its devices, following similar initiatives in 2021.


The accessibility features Apple previously announced include SignTime (which lets users communicate with AppleCare and customer support representatives in sign language) and AssistiveTouch for watchOS. Now, a year after its last major accessibility-oriented update, the company is reiterating its commitment to the space by announcing new accessibility initiatives.

New features in Apple's latest round of accessibility updates include Assistive Access (a feature tailor-made for people with cognitive disabilities), Live Speech (which lets users type what they want to say and have the phone read it out loud), and Personal Voice (an extension of Live Speech that lets users create a voice that sounds like them). In addition, the company announced a Detection Mode within the Magnifier tool, which uses a function called Point and Speak to identify and read aloud text in the user's physical surroundings.


Apple asserts that the new features previewed today will improve the overall experience for people with disabilities. Notably, most of them are software-based updates that aim to improve cognitive, vision, hearing, and mobility accessibility for iPhone and iPad users across the globe.

What is Apple's Assistive Access feature?

Apple's Assistive Access feature is an umbrella term for a set of accessibility-focused features aimed at simplifying the use of its products. Its core purpose is to let users with cognitive disabilities enjoy the iPhone and iPad without having to navigate complex menus and sub-menus.


The features enabled by Assistive Access include a new, easy-to-use interface with large text labels, high-contrast buttons, and one-touch access to call and communicate with close family members. Take the new "Calls" app, which lets users make standard voice calls as well as FaceTime calls without browsing through various menu items.

Assistive Access also offers various ways to customize the experience on Apple devices. For example, users can choose between a grid-based or a row-based layout for the home screen and apps. For users who have trouble typing out or saying long words, the Messages app offers an emoji-only keyboard option so they can communicate their emotions using emojis alone. Users can also easily record a video message and quickly share it with family and friends.


Apple's Live Speech and Personal Voice features

The soon-to-be-introduced Live Speech feature lets users type what they want to say and have the phone say it out loud during calls. While the feature works with voice calls and FaceTime video calls, an even more interesting use case is using Live Speech to make in-person conversations more engaging. The feature also lets users save their most commonly used phrases and call them up quickly when they are needed mid-conversation.
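Apple hasn't published implementation details or a developer API for Live Speech, but the underlying type-to-speech idea is easy to sketch with AVFoundation's built-in speech synthesizer. The snippet below is an illustrative approximation, not Apple's code:

```swift
import AVFoundation

// Keep a reference to the synthesizer so speech isn't cut off mid-utterance.
let synthesizer = AVSpeechSynthesizer()

// Speak whatever the user typed, in a standard system voice.
func speak(_ typedText: String) {
    let utterance = AVSpeechUtterance(string: typedText)
    utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
    utterance.rate = AVSpeechUtteranceDefaultSpeechRate
    synthesizer.speak(utterance)
}

// Saved phrases could live in a simple list for one-tap reuse during a call.
let savedPhrases = ["Be right back", "Can you hear me?", "Thanks for calling"]
speak(savedPhrases[1])
```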


Moving on to the Personal Voice feature: it is best thought of as an extension of Live Speech, except this time, it lets users create a voice that sounds like them. The feature could be of great importance to people who are unable to speak, have lost their voice, or have conditions that could progressively limit their ability to speak over time.

The Personal Voice feature would let these users regain a voice of their own and speak to friends and family in a familiar voice, instead of an impersonal, robotic-sounding one. To teach the iPhone or iPad to create a personal voice, users read a randomized set of text prompts and record around 15 minutes of audio on the device. Data from these recordings is then used to generate a personal voice that is unique to each user.
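Apple's announcement doesn't describe how apps would access a Personal Voice, so the following is a hypothetical sketch: the authorization call and the isPersonalVoice trait are assumed AVFoundation additions, not API confirmed by the announcement.

```swift
import AVFoundation

// Hypothetical sketch: request access to the user's Personal Voice, then
// speak with it. The authorization API and the voice trait are assumptions.
let synthesizer = AVSpeechSynthesizer()

AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
    guard status == .authorized else { return } // denied, undetermined, or unsupported

    // A personal voice would appear alongside system voices, flagged by a trait.
    let personalVoice = AVSpeechSynthesisVoice.speechVoices()
        .first { $0.voiceTraits.contains(.isPersonalVoice) }

    let utterance = AVSpeechUtterance(string: "Hi, it's really me.")
    utterance.voice = personalVoice // nil falls back to the default system voice
    synthesizer.speak(utterance)
}
```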


Apple's Point and Speak feature for the visually impaired

In addition to the new features listed above, Apple is also making several changes to its existing accessibility features. Take the Magnifier, which uses the camera to enlarge small text and objects in the user's surroundings. It has now been enhanced with a "Point and Speak" feature, which combines the iPhone and iPad's camera, the LiDAR scanner, and on-device machine learning to help visually impaired users better navigate the physical world.


A standout capability made possible by Point and Speak is reading instructions and small labels. Users can, for example, point at the buttons on a microwave oven or a washing machine and have the text on each read aloud. The feature also works alongside other Magnifier features, including People Detection, Door Detection, and Image Descriptions.
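Apple hasn't detailed how Point and Speak is built (the LiDAR-assisted finger tracking, in particular, has no public counterpart), but the basic recognize-then-speak pipeline can be approximated with the Vision and AVFoundation frameworks. A rough sketch, assuming a camera frame is already available as a CGImage:

```swift
import Vision
import AVFoundation

// Rough sketch of a "recognize text, then speak it" pipeline, approximating
// the idea behind Point and Speak (not Apple's actual implementation).
let synthesizer = AVSpeechSynthesizer()

func readAloud(from cameraFrame: CGImage) {
    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        // Keep the best candidate string from each detected text region.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        guard !lines.isEmpty else { return }
        synthesizer.speak(AVSpeechUtterance(string: lines.joined(separator: ", ")))
    }
    request.recognitionLevel = .accurate // favor accuracy over speed for small labels

    let handler = VNImageRequestHandler(cgImage: cameraFrame, options: [:])
    try? handler.perform([request])
}
```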

Apple says these accessibility features will roll out to iPhones and iPads later this year, but the company has yet to provide a firm release date.
