- Kyle Chua
Apple Previews New Software Features For Users With Disabilities
Updated: Dec 19, 2023
Apple has previewed new software features designed for users with cognitive, vision, hearing and mobility disabilities in celebration of Global Accessibility Awareness Day on Thursday, 18 May.
"At Apple, we’ve always believed that the best technology is technology built for everyone," said Apple CEO Tim Cook. "Today, we’re excited to share incredible new features that build on our long history of making technology accessible, so that everyone has the opportunity to create, communicate, and do what they love."
One of the new features is 'Assistive Access', which simplifies the interface to lighten the cognitive load for users with cognitive disabilities. The feature includes a customised experience for the iPhone and iPad's most used apps, including 'Messages', 'Camera', 'Photos' and 'Music'.
For example, Messages includes an emoji-only keyboard and an option to record a video message for those who prefer communicating visually. Phone and FaceTime have also been combined into a single 'Calls' app. As for the interface itself, it'll feature high-contrast buttons and large text labels, along with tools to help supporters tailor the experience for the users they support.
Another new feature is 'Live Speech', which allows users to type what they want to say and have it spoken aloud by their device during calls or in-person conversations. It also includes the option to save commonly used phrases, so users don't always have to type when they want to chime in quickly. The feature has been designed to support users who are unable to speak or have lost their ability to speak.
Meanwhile, users who may be losing their ability to speak, perhaps through a diagnosis of ALS (amyotrophic lateral sclerosis), can create a voice that sounds like them to use for communication in the future. They do so by reading along with randomly generated text prompts to record 15 minutes of audio on their iPhone or iPad. The recorded audio is synthesised by an on-device machine learning system, which then integrates with 'Live Speech'.
Apple notes that users' data are kept private and secure.
Last is 'Point and Speak', a feature in Magnifier that allows users with vision disabilities to better interact with physical objects that have text labels. For instance, users can point their iPhone at the physical buttons of a microwave, and the device will combine input from the Camera app, the LiDAR Scanner and on-device machine learning to announce the text on each button. The feature can be used with other Magnifier features like 'People Detection', 'Door Detection' and 'Image Descriptions'. Apple, however, notes that 'Point and Speak' is only supported on iPhone and iPad models with a LiDAR Scanner.
Other accessibility features that Apple plans to introduce soon include the ability to pause images with moving elements, such as GIFs, in 'Messages' and 'Safari', as well as new phonetic suggestions for text editing in 'Voice Control'.
Apple plans to roll out the new accessibility features later in the year.
"Accessibility is part of everything we do at Apple," said Sarah Herrlinger, Apple’s Senior Director of Global Accessibility Policy and Initiatives. "These groundbreaking features were designed with feedback from members of disability communities every step of the way, to support a diverse set of users and help people connect in new ways."