Google AI's Iris Software Tracks Eye Movement and Distance
Current technology can already estimate eye movement by tracking a user's iris — something that's handy for augmented reality headsets, for example.
Google AI has taken this one step further: its new model not only tracks the iris but also estimates the distance between the user and the camera, all without a dedicated depth sensor.
This will prove useful for a variety of applications, including computational photography, virtual try-on of glasses and hats, and automatically adjusting font size based on the user's distance from the device.
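How can a plain camera judge distance? Google's blog post points to a quirk of human anatomy: the horizontal diameter of the iris stays roughly constant, at about 11.7 mm, across almost the entire population. Once the iris is located in an image, the standard pinhole camera model turns its apparent size in pixels into a metric distance. Here's a minimal sketch of that geometry in Python; the function name and example numbers are illustrative, not part of MediaPipe's API:

```python
def estimate_distance_mm(focal_length_px: float, iris_diameter_px: float,
                         real_iris_diameter_mm: float = 11.7) -> float:
    """Estimate camera-to-eye distance with the pinhole camera model.

    The human iris is ~11.7 mm wide for nearly everyone, so its apparent
    size in pixels encodes depth: distance = f * real_size / pixel_size.
    """
    return focal_length_px * real_iris_diameter_mm / iris_diameter_px


# Example: a camera with a focal length of ~1000 px sees an iris
# spanning 24 px, so the eye is roughly half a meter away.
print(estimate_distance_mm(1000.0, 24.0))  # ~487.5 mm
```

The focal length in pixels is device-specific; in practice it comes from the camera's metadata or a one-off calibration.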
Tricky tracking
As per Google AI's blog post, tracking the iris on mobile devices is no easy feat. Just think of the times when sunlight is hitting your face as you look down at your screen, when you're squinting to read smaller writing, or when a strand of hair is in the way. That's why the task has typically required specialized hardware.

Enter MediaPipe Iris, Google AI's new machine learning model that estimates both iris position and the user's distance from the camera.
MediaPipe Iris needs no specialized hardware: using only an RGB camera, it tracks landmarks of the iris, pupil, and eye contours in real time, and estimates distance with a relative error of under 10%.

What's more, MediaPipe runs on most mobile devices, laptops, and desktops.
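For readers who want to experiment, iris tracking has since been exposed in the open-source MediaPipe Python package through the Face Mesh solution's refine_landmarks option, which appends iris landmarks to the standard face mesh. A hedged sketch of reading them from a webcam, assuming the landmark indices documented for that package:

```python
import cv2
import mediapipe as mp

# refine_landmarks=True runs the iris model on top of Face Mesh,
# appending 10 iris landmarks (indices 468-477) to the 468 face points.
face_mesh = mp.solutions.face_mesh.FaceMesh(
    max_num_faces=1,
    refine_landmarks=True,
    min_detection_confidence=0.5,
    min_tracking_confidence=0.5,
)

cap = cv2.VideoCapture(0)  # default webcam
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB input; OpenCV captures frames in BGR.
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        lm = results.multi_face_landmarks[0].landmark
        h, w = frame.shape[:2]
        # Landmarks 468 and 473 are the two iris centers.
        for idx in (468, 473):
            cx, cy = int(lm[idx].x * w), int(lm[idx].y * h)
            cv2.circle(frame, (cx, cy), 3, (0, 255, 0), -1)
    cv2.imshow("MediaPipe Iris", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```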
The Google AI team trained its model by manually annotating over 50,000 images covering a range of lighting conditions, head poses, and backdrops. To verify the accuracy of its distance estimates, the team also collected front-facing, synchronized video and depth images from over 200 participants.

