Apple adds new accessibility features across the senses, including eye tracking


Apple’s accessibility offerings continue to expand, as new features like device eye tracking, touch-enhanced music listening, and settings for those with atypical speech come to on-the-go devices.

Announced in the midst of a month-long recognition of Global Accessibility Awareness Day (May 16), the lineup of customizability options helps users with physical disabilities better control and interact with their iPad or iPhone.


“These new features will make an impact in the lives of a wide range of users, providing new ways to communicate, control their devices, and move through the world,” wrote Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives.

Eye Tracking capitalizes on machine learning

Apple’s new eye tracking controls are, unsurprisingly, powered by AI, which uses the device’s front-facing camera to calibrate, scan, and track facial movements. “With Eye Tracking, users can navigate through the elements of an app and use Dwell Control to activate each element, accessing additional functions such as physical buttons, swipes, and other gestures solely with their eyes,” Apple explains.

While eye-tracking systems for computers are a long-established technology, mobile devices are slowly catching up. Apple — like other tech companies cashing in on quickly evolving AI technologies — leverages on-device machine learning to process facial movements, turning eye tracking into a hardware- and accessory-free offering.

Music Haptics adds touch to songs

A feature that feels long overdue for the technically advanced Apple Music streaming service, Music Haptics allows users who are Deaf or hard of hearing to experience music on their device via touch, turning the iPhone’s Taptic Engine into a conveyor of beats and vibrations. When turned on, the setting adds “taps, textures, and refined vibrations” to the music.

The feature will only be available on Apple Music’s catalogue of songs, for now.

Vocal Shortcuts lets more people simplify their lives

Acknowledging the range of speech abilities and atypical speech patterns among people with disabilities, Vocal Shortcuts allows users to assign actions to custom utterances, not just phrases. The setting is paired with the new Listen for Atypical Speech setting, which uses on-device machine learning to recognize a user’s unique speech; it is targeted to those with conditions that affect speech, such as cerebral palsy, amyotrophic lateral sclerosis (ALS), or stroke, Apple explains.

VisionOS, CarPlay, and more get upgrades

Apple also introduced improvements to its range of accessibility tools, including a Reader Mode for its vision-assistance app Magnifier, a new Hover Typing option for those with low vision, a Virtual Trackpad for those using AssistiveTouch with limited range of motion, and new customizations for VoiceOver and Voice Control.

The company will be adding systemwide Live Captions to visionOS, as well as tools like Reduce Transparency, Smart Invert, and Dim Flashing Lights “for users who have low vision, or those who want to avoid bright lights and frequent flashing.”

And, rounding out the additions, CarPlay users can now access Voice Control, Color Filters, and Sound Recognition, helping individuals operate controls with just their voice, view color-blind-friendly screens, and be alerted to outside sounds.