Apple has announced a suite of new accessibility features for iPhones and iPads, set to be released “later this year.” These updates are expected to be part of iOS 18 and iPadOS 18, which are anticipated to be officially unveiled next month.
The standout feature is eye tracking, which lets users navigate and control their devices using only their eyes. The technology will be available on both iPhones and iPads.
While eye tracking is primarily designed to assist users with physical disabilities, it will be accessible to all users. Powered by on-device machine learning, the feature requires only a brief calibration using the front-facing camera. All processing happens on the device itself, so eye-tracking data never leaves it.
No additional hardware or accessories are needed, and eye tracking will work across all apps. Users can navigate app elements with their eyes and activate them using Dwell Control, which selects an element once the user's gaze rests on it.
Another upcoming feature, Music Haptics, uses the iPhone’s Taptic Engine to play taps, textures, and refined vibrations synchronized with the music you’re listening to. Initially, Music Haptics will be exclusive to Apple Music, but developers will get access to an API to bring the feature to their own apps.
Apple is also introducing Vehicle Motion Cues, which displays animated dots at the edges of the screen that shift with the vehicle’s movement, helping alleviate motion sickness for passengers using their iPhones or iPads in moving vehicles.
CarPlay will receive several new features, including Voice Control for hands-free navigation and app management, Sound Recognition to alert deaf or hard-of-hearing users to car horns and sirens, and Color Filters to assist colorblind users.