How to enable eye tracking on iPhone in iOS 18

iOS 18 comes packed with new accessibility features, such as Vehicle Motion Indicators. Today we'll look at one of the most striking ones to date: eye tracking. No joke: we can control the iPhone with our eyes. This opens new possibilities both for those who need it and for those of us who are simply curious to try it out. And yes, it works really well.

Eye tracking in iOS 18: A feature designed for everyone

One of Apple's top priorities is making its products accessible to everyone, and eye tracking is a significant step in that direction. It allows us to operate the iPhone without touching the screen: basic actions like opening apps, scrolling, or returning to the home screen can all be done using only our eyes.

Setting it up is simple, but it’s worth preparing the environment to ensure everything works as well as possible. Apple recommends placing the iPhone on a stable surface, about 45 centimeters from our face. This way, the TrueDepth camera can calibrate our eye movements more precisely and respond naturally.

Step-by-step guide to enable eye tracking on the iPhone

The process couldn’t be simpler. To enable it, we just need to follow these steps:

  1. Check that we have iOS 18 installed.
  2. Open Settings.
  3. Go to Accessibility.
  4. Scroll down to Eye Tracking (under the Physical and Motor section).
  5. Enable the option.

At this point, calibration begins: we’ll see a colored circle moving across the screen, and we have to follow it with our eyes. In a few seconds, the iPhone learns how we look and how it should interpret our movements.

When we finish, the system automatically enables Dwell Control. This means that if we fix our gaze on a point for the amount of time set in the settings, the iPhone performs the corresponding action. For example, holding our gaze on an app icon until the dwell indicator completes will tap it and open the app.

To make the most of it, eye tracking works together with AssistiveTouch, which allows us to perform gestures, scroll, or lock the device without touching it. The system even highlights with a white outline whatever we’re focusing on, so we know exactly what we’re about to select.

If at any point the response doesn't feel as smooth as it did at first, the best thing to do is repeat the calibration. Placing the iPhone on a stable stand and keeping roughly the recommended distance greatly improves results.

Eye tracking on the iPhone is a genuinely surprising and powerful accessibility tool, as well as a novel way of interacting with our device. And the great thing is that it doesn't replace normal touchscreen use: we can still tap and swipe as always, with the option to switch to visual control whenever we need or feel like it, giving us complete versatility.

On Hanaringo | Apple will put an end to spam with iOS 26: this is the company’s plan to protect us