Apple iOS 17 preview: New accessibility features revealed ahead of WWDC 2023
Apple has announced a range of new accessibility features aimed at improving the lives of users with disabilities. These features are expected to be part of iOS 17, which could be unveiled at WWDC 2023.
Apple has announced a host of new accessibility features for its iPhone, iPad and Mac devices. The features will be released later this year and are expected to be part of iOS 17, which could be previewed at this year's WWDC.
Here are the new accessibility features announced by Apple:
Assistive Access:
The new Assistive Access feature distills Apple's applications and experiences down to their most essential features, and combines Phone and FaceTime into a single app. Aimed at people with cognitive disabilities, Assistive Access is designed to reduce the "cognitive load" on users while tailoring the iPad and iPhone experience to their needs.
While introducing the new feature, Apple said, "The feature offers a distinct interface with high contrast buttons and large text labels, as well as tools to help trusted supporters tailor the experience for the individual they support."
Live Speech and Personal Voice Advance Speech Accessibility:
Live Speech will allow users to type what they want to say and have it spoken aloud during a phone call or FaceTime conversation. The feature will also let users save commonly used phrases and have them spoken out loud during a conversation.
Interestingly, users can create a voice that sounds just like them using the Personal Voice feature. It will be particularly helpful for users with a recent diagnosis of ALS (amyotrophic lateral sclerosis) or other conditions that can progressively affect the ability to speak.
Describing the Personal Voice feature, Apple said, "Users can create a Personal Voice by reading along with a randomized set of text prompts to record 15 minutes of audio on iPhone or iPad."
Point and Speak:
Aimed at users with visual disabilities, Point and Speak will make it easier to interact with physical objects that have text labels. Built into the Magnifier app, Point and Speak uses input from the camera and the LiDAR Scanner, along with machine learning, to help users use day-to-day physical objects like microwaves or refrigerators.