Apple Announces New Accessibility Features Including Eye Tracking

Apple announces new accessibility features, including eye tracking for iPad, VoiceOver enhancements, Live Captions, and more, available later this year.

In a recent announcement, Apple introduced a range of new accessibility features aimed at enhancing the user experience for people with disabilities. These updates, which leverage advancements in hardware, software, and machine learning, will roll out across Apple devices later this year.

Eye Tracking for iPad

One of the standout features is the introduction of eye-tracking technology for iPad. This feature allows users to control their iPad using third-party eye-tracking devices. It is designed to assist individuals with severe motor impairments, enabling them to navigate their device and interact with apps through eye movements alone. This development underscores Apple’s commitment to making technology accessible to all users, regardless of their physical abilities.

Enhanced Vision and Hearing Support

Apple is also expanding its support for users with vision and hearing impairments. The new updates include improvements to the VoiceOver screen reader, which will now support over 20 additional languages and locales. Additionally, a new Point and Speak feature allows users with low vision to identify and read text on household objects, such as buttons on a microwave, simply by pointing their device’s camera at the object.

For the Deaf and hard-of-hearing community, Apple is introducing Live Captions on iPhone, iPad, and Mac. This feature provides real-time transcription of audio content, making it easier for users to follow along with conversations, phone calls, and media content. Moreover, Apple is expanding the compatibility of hearing devices designed for iPhone to include Mac, ensuring seamless integration across its ecosystem.

Cognitive and Speech Accessibility

In addition to visual and auditory enhancements, Apple is rolling out features to support cognitive and speech accessibility. Live Speech allows users to type what they want to say and have it spoken aloud during phone or FaceTime calls. This feature is particularly beneficial for individuals with conditions such as ALS, who may lose their ability to speak over time. Users can also create a Personal Voice by recording 15 minutes of audio while reading a randomized set of phrases; that voice can then be used with Live Speech and other assistive technologies.

Fitness and Health Integration

Apple continues to integrate accessibility into its health and fitness offerings. The Apple Fitness+ service now includes features like Audio Hints, which provide descriptive verbal cues for users who are blind or have low vision. Additionally, all workout videos incorporate American Sign Language (ASL), and the trainers demonstrate modifications to accommodate different fitness levels. The new Time to Walk or Push and Time to Run or Push workouts are tailored for wheelchair users, promoting inclusivity in fitness activities.

Broader Impact and Availability

These accessibility updates reflect Apple’s long-standing dedication to creating products that are inclusive and user-friendly for everyone. By incorporating feedback from the accessibility community and leveraging cutting-edge technology, Apple aims to provide meaningful solutions that enhance the daily lives of individuals with disabilities.

The new features will be available through software updates later this year, spanning iOS, iPadOS, macOS, and watchOS. Users can look forward to a more inclusive and accessible experience across all their Apple devices.


About the author

Swayam Malhotra
