
Apple reveals game-changing accessibility features including Live Speech and Personal Voice

Accessibility on Apple devices

Apple has announced several new accessibility features ahead of Global Accessibility Awareness Day, which takes place later this week. The new features include Live Speech, Personal Voice, and Point and Speak in Magnifier. They can be used by anybody, but will mainly benefit those who can’t speak, who are at risk of losing their ability to speak, or who are blind or have a vision impairment.

With Live Speech, iPhone, iPad, and Mac users who can’t speak can type what they want to say during a phone or FaceTime call and have it spoken aloud. As phone calls tend to be fast-moving, Apple has also built in the ability to save commonly used phrases that can be sent quickly to keep the conversation flowing. Apple said it worked with many people who can’t speak, or who have lost their speech over time, to build and refine the feature.
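Apple hasn’t said how Live Speech works under the hood, but for readers curious about the building blocks, here is a minimal sketch of app-level text-to-speech using Apple’s existing AVSpeechSynthesizer API. The TypedSpeech class and its saved phrases are illustrative only and are not part of Live Speech itself.

```swift
import AVFoundation

// A minimal sketch of on-device text-to-speech with AVFoundation.
// Live Speech is a system feature; this only illustrates how an app
// can speak typed text today. The saved phrases are hypothetical.
final class TypedSpeech {
    private let synthesizer = AVSpeechSynthesizer()
    private let savedPhrases = ["Hello!", "I'll call you back shortly."] // example "quick phrases"

    func speak(_ text: String) {
        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
        utterance.rate = AVSpeechUtteranceDefaultSpeechRate
        synthesizer.speak(utterance)
    }

    func speakSavedPhrase(at index: Int) {
        guard savedPhrases.indices.contains(index) else { return }
        speak(savedPhrases[index])
    }
}
```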

Live Speech on iPhone

Next up is Personal Voice. With this, users read along with a set of randomized text prompts to record 15 minutes of audio on their iPhone or iPad. Personal Voice then uses that data to synthesize new speech in the user’s own voice. Apple designed the feature for people who know they are going to lose the ability to speak, and it integrates with Live Speech so they can continue to have calls in their own voice even after losing the ability to speak. The voice synthesis is done using on-device machine learning to keep data private and secure.
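If Apple exposes Personal Voice to third-party apps through its existing speech-synthesis framework, usage might look roughly like the sketch below. The requestPersonalVoiceAuthorization call and the isPersonalVoice voice trait are assumptions about the developer-facing surface, not anything documented in this announcement.

```swift
import AVFoundation

// Sketch (assumption): requesting access to a user's Personal Voice and speaking with it.
// Neither the authorization call nor the .isPersonalVoice trait is confirmed by the announcement.
func speakWithPersonalVoice(_ text: String, using synthesizer: AVSpeechSynthesizer) {
    AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
        guard status == .authorized else { return }

        // Personal Voices would appear alongside the system voices, flagged by a trait.
        let personalVoice = AVSpeechSynthesisVoice.speechVoices()
            .first { $0.voiceTraits.contains(.isPersonalVoice) }

        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = personalVoice // falls back to the default voice if nil
        synthesizer.speak(utterance)
    }
}
```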

Finally, Apple has introduced Point and Speak in Magnifier, which makes it easier for users with low or no vision to interact with physical objects that have text labels. Explaining how this feature works, Apple said:

“For example, while using a household appliance — such as a microwave — Point and Speak combines input from the Camera app, the LiDAR Scanner, and on-device machine learning to announce the text on each button as users move their finger across the keypad. Point and Speak is built into the Magnifier app on iPhone and iPad, works great with VoiceOver, and can be used with other Magnifier features such as People Detection, Door Detection, and Image Descriptions to help users navigate their physical environment.”
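Apple hasn’t published how Point and Speak is implemented, but the text-reading half maps onto its existing on-device Vision framework. The sketch below shows how an app could pull label text out of a camera frame with VNRecognizeTextRequest; the LiDAR-assisted finger tracking and Magnifier integration are Apple’s own and aren’t reproduced here.

```swift
import Foundation
import CoreGraphics
import Vision

// Sketch: on-device text recognition, the kind of building block a
// Point and Speak-style feature relies on. This only reads text out of
// a still camera frame; Apple's actual implementation is not public.
func recognizeLabels(in image: CGImage, completion: @escaping ([String]) -> Void) {
    let request = VNRecognizeTextRequest { request, error in
        guard error == nil,
              let observations = request.results as? [VNRecognizedTextObservation] else {
            completion([])
            return
        }
        // Keep the top candidate string for each piece of text found in the frame.
        completion(observations.compactMap { $0.topCandidates(1).first?.string })
    }
    request.recognitionLevel = .accurate // favor accuracy over speed for small button labels

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```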

Other new accessibility features that Apple is working on include:

  • Deaf or hard-of-hearing users can pair Made for iPhone hearing devices directly to Mac and customize them for their hearing comfort.
  • Voice Control adds phonetic suggestions for text editing so users who type with their voice can choose the right word out of several that might sound alike, like “do,” “due,” and “dew.” Additionally, with Voice Control Guide, users can learn tips and tricks about using voice commands as an alternative to touch and typing across iPhone, iPad, and Mac.
  • Users with physical and motor disabilities who use Switch Control can turn any switch into a virtual game controller to play their favorite games on iPhone and iPad.
  • For users with low vision, Text Size is now easier to adjust across Mac apps such as Finder, Messages, Mail, Calendar, and Notes.
  • Users who are sensitive to rapid animations can automatically pause images with moving elements, such as GIFs, in Messages and Safari.
  • For VoiceOver users, Siri voices sound natural and expressive even at high rates of speech feedback; users can also customize the rate at which Siri speaks to them, with options ranging from 0.8x to 2x.

Apple said that these features will land across the Apple ecosystem later this year, so keep an eye out for them if you think you could benefit.
