Eye Tracking, Music Haptics, and Vocal Shortcuts: Apple's New Accessibility Features

For decades, Apple has been at the forefront of inclusive design and assistive technology for users with disabilities. This year, the tech giant is doubling down on its commitment to accessibility with a suite of powerful new features built on artificial intelligence and machine learning.

Headlining the announcements is Eye Tracking for iPad and iPhone, an eye gaze interaction system that allows users with physical disabilities to operate their devices using only their eyes. Harnessing on-device neural engines and the front-facing camera, Eye Tracking sets up and calibrates in seconds, letting users navigate apps, activate controls, and perform gestures through gaze alone.



Another feature announced is Music Haptics, which translates audio into nuanced vibrational patterns using the iPhone's advanced Taptic Engine. This inclusive innovation gives deaf and hard of hearing users an entirely new way to experience the rhythms and dynamics of their favourite songs through the sense of touch.



Apple is also enhancing accessibility for users with speech disabilities. Vocal Shortcuts lets users create custom voice commands that trigger complex Shortcuts routines. Listen for Atypical Speech leverages on-device machine learning to improve voice recognition accuracy for those with conditions affecting speech patterns, such as ALS or stroke. And Personal Voice expands to Mandarin Chinese, helping more users create a synthesised voice that sounds like their own. “Artificial intelligence has the potential to improve speech recognition for millions of people with atypical speech, so we are thrilled that Apple is bringing these new accessibility features to consumers,” said Mark Hasegawa-Johnson, principal investigator of the Speech Accessibility Project at the Beckman Institute for Advanced Science and Technology at the University of Illinois Urbana-Champaign.



For users prone to motion sickness, Vehicle Motion Cues displays animated visual cues that sync with a vehicle's movement, reducing the sensory conflict that can cause nausea when using iPhone or iPad during travel.



Looking ahead, Apple's upcoming visionOS for its Vision Pro spatial computing device will integrate powerful new accessibility tools as well. These include system-wide Live Captions for enhanced audio comprehension, extensive customisation options for vision, colour, transparency, and motion filters, and flexible multimodal controls using eye gaze, hand gestures, voice commands, and more.



Other key updates span VoiceOver navigation enhancements, a Magnifier document reading mode, fully customisable braille input and output tables, virtual trackpad controls, and much more.

Apple is commemorating Global Accessibility Awareness Day by spotlighting these innovations along with curated App Store collections, virtual events, and more to celebrate the disability community.


As Sarah Herrlinger, Apple's Senior Director of Global Accessibility Policy and Initiatives, stated: "These new features will make an impact in the lives of a wide range of users, providing new ways to communicate, control their devices, and move through the world."


Apple continues to set the standard for accessible mainstream technology through its relentless pursuit of inclusive, barrier-breaking design for all people, regardless of ability. This latest cycle of accessibility innovations will empower and enrich the lives of countless users worldwide.


Images and video credits: Apple.

