Apple devices, including MacBooks, have supported eye-tracking technology for some time.
However, it has always required external hardware.
Thanks to advancements in AI, iPhone and iPad owners can now control their devices without any peripherals.

Apple Eye Tracking uses the front-facing camera to calibrate and then track eye movement.
As users look at different parts of the screen, interactive elements are highlighted.
Dwelling on an element with the eyes can also trigger actions that mimic physical button presses and swipe gestures.
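The dwell interaction is easy to picture in code. The sketch below is a purely hypothetical dwell-selection helper; the class, its threshold, and the gaze-point updates are illustrative assumptions, not Apple's Eye Tracking API, which ships as a system-level accessibility setting rather than a developer framework.

```swift
import Foundation
import CoreGraphics

/// Hypothetical dwell-selection helper: if the estimated gaze point stays inside
/// an element's frame long enough, treat it as a tap. Purely illustrative.
final class DwellSelector {
    private let dwellThreshold: TimeInterval
    private var dwellStart: Date?
    private var currentTarget: CGRect?

    init(dwellThreshold: TimeInterval = 1.0) {
        self.dwellThreshold = dwellThreshold
    }

    /// Feed each new gaze estimate; returns true when a dwell "tap" should fire.
    func update(gazePoint: CGPoint, over elementFrame: CGRect, at time: Date = Date()) -> Bool {
        guard elementFrame.contains(gazePoint) else {
            // Gaze left the element: reset the dwell timer.
            dwellStart = nil
            currentTarget = nil
            return false
        }
        if currentTarget != elementFrame {
            // Gaze moved onto a new element: start timing the dwell.
            currentTarget = elementFrame
            dwellStart = time
            return false
        }
        guard let start = dwellStart else { return false }
        if time.timeIntervalSince(start) >= dwellThreshold {
            dwellStart = nil   // fire at most once per dwell
            return true
        }
        return false
    }
}
```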
Vocal Shortcuts are another way for users to get hands-free control.
Apple didn’t provide a detailed explanation, but the feature looks easier to use than the existing Shortcuts system, which automates everything from simple to complex tasks.
A wide selection of pre-made shortcuts from third-party providers would make it even more appealing.
Another addition to the suite of voice-based assistive technology is Listen for Atypical Speech.
This setting lets Apple’s voice recognition learn and adapt to an individual user’s speech patterns.
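Apple hasn’t published an API for Listen for Atypical Speech, but the existing Speech framework already has a switch for keeping recognition entirely on the device, which gives a rough idea of the kind of local processing involved. The snippet below is a minimal sketch using that public option; the file URL and locale are placeholder assumptions.

```swift
import Speech

// Minimal sketch of on-device speech recognition with the existing Speech framework.
// Shown only to illustrate local processing; not the new Listen for Atypical Speech feature.
func transcribeLocally(audioFileURL: URL) {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized,
              let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
              recognizer.supportsOnDeviceRecognition else { return }

        let request = SFSpeechURLRecognitionRequest(url: audioFileURL)
        request.requiresOnDeviceRecognition = true   // keep audio and transcription on the device

        _ = recognizer.recognitionTask(with: request) { result, error in
            if let error = error { print("Recognition failed: \(error)"); return }
            if let result = result, result.isFinal {
                print(result.bestTranscription.formattedString)
            }
        }
    }
}
```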
All of these new AI-powered features rely on on-device machine learning.
Biometric data is securely stored and processed locally and is never sent to Apple or iCloud.
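For developers, the closest public analogue to this local-only approach is Core ML, which loads and runs models on the device without any network round trip. The sketch below assumes a hypothetical bundled model named GazeEstimator.mlmodelc purely for illustration.

```swift
import CoreML

// Minimal sketch of on-device inference with Core ML. "GazeEstimator" is a
// hypothetical model name used only for illustration; all computation stays local.
func loadLocalModel() throws -> MLModel {
    guard let url = Bundle.main.url(forResource: "GazeEstimator", withExtension: "mlmodelc") else {
        throw CocoaError(.fileNoSuchFile)
    }
    let config = MLModelConfiguration()
    config.computeUnits = .all   // CPU, GPU, and Neural Engine, still entirely on-device
    return try MLModel(contentsOf: url, configuration: config)
}
```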
VoiceOver, Magnifier, Personal Voice, Live Speech, and other existing accessibility features are also getting improvements.
Apple didn’t give a specific timeline for the rollout other than “before the end of the year.”
However, the company celebrates accessibility throughout May, so launching the new features sooner rather than later makes sense.
The developers are probably in the final stretch of working out the kinks.