Just a glance: Apple accessibility for a barrier-free world

Apple announced significant new accessibility features in observance of Global Accessibility Awareness Day, including Eye Tracking for controlling iPhone and iPad with your eyes, Music Haptics for experiencing music through vibrations, and Vocal Shortcuts for performing actions with a custom spoken phrase.

From the Apple world

May 16, 2024

Imagine controlling your iPhone or iPad with just your eyes. Eye Tracking, Apple's new accessibility feature coming later this year, will make that possible.

This is one of the major innovations presented by Apple in conjunction with Global Accessibility Awareness Day, along with Music Haptics, which will offer users who are deaf or hard of hearing a new way to experience sound, using the iPhone's Taptic Engine to translate music into vibrations.

In addition, Vocal Shortcuts will make it possible to perform actions simply by speaking a custom voice command.


These new features will have a major impact on the lives of many users, providing new ways to communicate, control their devices, and move around the world. Designed to make Apple devices more accessible to everyone, the new solutions combine the power of hardware and software, machine learning and artificial intelligence, opening up new horizons in terms of accessibility.



Eye Tracking comes to iPhone and iPad

The first and most striking announcement is Eye Tracking, powered by artificial intelligence, which will give users a built-in option to navigate iPhone and iPad with just their eyes.

Designed for users with physical disabilities, Eye Tracking uses the front-facing camera for setup and calibration in seconds.

Users can then navigate between apps simply by moving their gaze across interface elements, confirming a selection by holding their gaze, with no additional hardware or accessories required.



Music Haptics makes songs more accessible

The Music Haptics feature will make the iPhone vibrate in time with the music, allowing people who are deaf or hard of hearing to experience songs on iPhone.

The feature will be available for millions of songs in the Apple Music catalog, and as an API so that developers can make the music in their own apps more accessible.



New features for a wide range of speech

The accessibility update package also includes several new features to improve the experience for users with speech and language disabilities.

The Vocal Shortcuts feature lets users assign custom voice commands to quick actions and complex tasks, while Listen for Atypical Speech improves voice recognition for a wide range of atypical speech patterns.

These features have been specifically designed for users with conditions that affect speech, offering them greater personalization and control over device use through voice commands.



Vehicle Motion Cues against motion sickness

Vehicle Motion Cues is a new experience for iPhone and iPad that can help reduce motion sickness for passengers in moving vehicles. It displays animated dots around the edges of the screen that mirror the vehicle's movements, reducing the conflict between what a passenger sees and what they feel, the sensory mismatch that underlies motion sickness.

Using the motion sensors built into iPhone and iPad, the feature automatically detects when the user is in a moving vehicle and activates the animated dots accordingly, without interfering with the main content on screen. It is an intelligent system that operates completely transparently to the user.

The new generation of CarPlay also includes voice control for navigating and operating apps with just your voice, color filters to make the interface easier to use for colorblind users, and Sound Recognition, which lets drivers or passengers who are deaf or hard of hearing enable notifications that alert them to horns and sirens.



Augmented Reality Subtitles

This year, accessibility features coming to visionOS will include real-time, system-wide subtitles to help everyone, including users who are deaf or hard of hearing, follow spoken dialogue in live conversations and in audio from apps. With real-time subtitles for FaceTime in visionOS, more users can easily enjoy the unique experience of connecting and collaborating. In addition, users will be able to adjust brightness, flashing lights, and colors to make the environment more comfortable.


The best technology is the one that works for everyone. That's why Apple products and services are designed from the ground up to be inclusive, with built-in accessibility features to help people communicate, create, and share in the way that works best for each of us.
