Apple is gearing up to revolutionize the accessibility landscape with a lineup of groundbreaking features for its devices, including iPhones, iPads, and Mac computers. Set to be released later this year, these additions aim to empower individuals facing challenges in speech clarity and confidence. Among the remarkable new features is Personal Voice, a game-changer in the field of speech assistance.
In a move that will transform communication for users, Personal Voice enables individuals to effortlessly express themselves through their devices. By means of a simple software update, iPhone users will be able to type messages or commands and have their words spoken aloud in their personalized voice or a similar one, making verbal communication a breeze.
Gone are the days of juggling multiple apps and accounts, as Personal Voice streamlines the process. Users can now save commonly used sentences or phrases as shortcuts, allowing for quick playback whenever needed. Whether it’s face-to-face conversations or integrating spoken audio into phone calls and FaceTime, this feature provides seamless assistance.
To achieve a truly unique and personalized experience, users have the option to create their own Personal Voice model. By providing roughly 15 minutes of spoken samples, conveniently completed at their own pace, individuals can enjoy typing messages and hearing them in their very own voice. It’s important to note that the Personal Voice model is device-specific by default. However, with explicit permission, users can share their model across devices without the need for repetitive training.
In addition to Personal Voice, Apple’s commitment to inclusivity extends to the cognitive impairments community. Introducing Assistive Access, a remarkable feature designed to simplify the user interface, enabling straightforward interaction with iPhones and iPads. By removing unnecessary visual elements, users with cognitive disabilities can navigate their devices effortlessly. For instance, setting up favorite contacts for quick access to voice or video calls streamlines the calling process, while a simplified messaging experience enhances communication.
But that’s not all. Apple’s Magnifier app is also set to undergo a transformative upgrade. Introducing Point and Speak, a cutting-edge feature that utilizes the device’s built-in lidar sensor. By simply pointing a finger at text using the app’s camera, users can have it read aloud. Whether it’s deciphering the small text on microwave buttons or accessing vital information in a flash, Point and Speak harnesses the power of lidar. It’s important to note that this feature will only be available on Apple devices equipped with a lidar sensor, currently exclusive to the company’s Pro iPhones and iPads.
While the official launch date for these remarkable accessibility features is yet to be announced, they typically coincide with the release of new iOS, iPadOS, and macOS versions in the fall. With a mission to enhance the accessibility and inclusivity of their devices, Apple’s latest innovations empower individuals with speech impairments or cognitive disabilities, making the world more accessible and communication more effective for all.