Google can now automatically adapt text size to your distance from the screen

To enable playful applications, Google is improving its iris-tracking artificial intelligence, a technique that also makes it possible to adapt text size in real time to the user's distance from the screen.

Tracking the user's iris in real time, whether to apply augmented-reality effects or to automatically adapt the display to the viewer's distance from the screen, is one of the projects Google is pursuing around artificial intelligence. The system is called MediaPipe Iris, and the results are published on the company's AI blog.

In addition to tracking the position of the iris in real time, the developers working on the project explain that their method can determine the distance between the camera and the user's eye without relying on any depth sensor. This may not sound like much, since social networks' messaging services already offer plenty of augmented-reality animated stickers overlaid in real time. But while the playful effect is impressive, in almost all cases the proportions are wrong and there is noticeable lag.

With MediaPipe Iris, Google is aiming for precision with minimal hardware: a plain RGB camera sensor and no depth-sensing system. The model places landmarks on the iris, which it then tracks and analyzes. To estimate scale, the AI relies on the diameter of the human iris, which is roughly 11.7 mm, plus or minus 0.5 mm, for most of the population.
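The idea of recovering distance from the iris's apparent size can be sketched with a simple pinhole-camera model. The function name, focal length, and pixel measurements below are illustrative assumptions, not Google's actual implementation; only the 11.7 mm average iris diameter comes from the article.

```python
# Sketch of depth-from-iris under a pinhole camera model.
# Assumption: all names and sample values here are illustrative.

AVG_IRIS_DIAMETER_MM = 11.7  # typical human iris diameter cited by Google


def estimate_distance_mm(focal_length_px: float, iris_diameter_px: float) -> float:
    """Estimate camera-to-eye distance from the iris's apparent size.

    Pinhole model: apparent_px = focal_px * real_mm / distance_mm,
    hence distance_mm = focal_px * real_mm / apparent_px.
    """
    if iris_diameter_px <= 0:
        raise ValueError("iris diameter in pixels must be positive")
    return focal_length_px * AVG_IRIS_DIAMETER_MM / iris_diameter_px


# Example: with a 1000 px focal length, an iris spanning 23.4 px
# in the image puts the eye about 500 mm from the camera.
print(round(estimate_distance_mm(1000.0, 23.4)))  # → 500
```

The +/- 0.5 mm spread in real iris diameters is one reason a method like this has an irreducible error of a few percent even before tracking noise is considered.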

No need for a depth sensor

This method makes it possible to determine the metric distance between the subject and the camera with a relative error of less than 10%. For now, the error is higher for eyeglass wearers.

This work extends what Google had already achieved with MediaPipe Face Mesh, a system capable of modeling the face in 3D in real time with no special equipment beyond a single RGB sensor.

In its post, Google is keen to point out that iris tracking does not allow inferring where people are looking, and that it plays no part in identity recognition. The system is designed above all for playful or practical applications, such as automatically adapting character size to the viewer's distance from the screen. It works on most mobile phones and computers, and even in a web browser.
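The headline feature, adapting text size to viewing distance, amounts to keeping the text's angular size roughly constant at the eye. A minimal sketch, assuming a hypothetical base size and reference distance of our own choosing (nothing here is from Google's code):

```python
# Hedged sketch: scale on-screen text linearly with viewing distance so it
# subtends a roughly constant angle at the eye. The base size and reference
# distance below are illustrative assumptions.

REFERENCE_DISTANCE_MM = 500.0  # distance at which the base font size looks right
BASE_FONT_PX = 16.0


def adapt_font_size(distance_mm: float,
                    base_px: float = BASE_FONT_PX,
                    reference_mm: float = REFERENCE_DISTANCE_MM) -> float:
    """Return a font size that grows linearly with viewing distance.

    Doubling the distance doubles the on-screen size, so the text keeps
    the same apparent (angular) size for the user.
    """
    return base_px * (distance_mm / reference_mm)


print(adapt_font_size(1000.0))  # twice as far → 32.0 px
```

In practice an application would feed the distance estimated from the iris into a function like this on every frame, likely with some smoothing to avoid the text size jittering.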