SEAT explores infrared and iris sensors to drive road safety
Infrared light sensors, high-resolution images and a sophisticated algorithm. All this technology is used to find out exactly where people are looking. As we drive, the road must obviously be the main focus.
That's why it's key to safety to be able to locate everything we're looking for on the central console of the infotainment system at a glance, from the navigation system to the air conditioning or the radio. "We must guarantee the minimum interaction time with the screen, and to do this the information must be where users intuitively and naturally look for it," says Rubén Martínez, head of SEAT's Smart Quality department. To accomplish this, they now have an innovative system.
What is it? Eye-Tracking is a technology that enables a computer to know where a person is looking. It does so through glasses with infrared sensors in the lenses and a camera in the centre of the frame. "The sensors detect the exact position of the iris at every moment, while everything the user sees is recorded," Rubén explains. A complex 3D eye model algorithm interprets all this data and obtains the exact viewing point.
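The core idea, combining the iris position from the eye-facing sensors with the scene recorded by the forward camera, can be illustrated with a much simpler model than SEAT's 3D eye algorithm. The sketch below assumes an affine mapping from pupil coordinates to scene-camera coordinates, fitted from three hypothetical calibration fixations; the coordinate values are invented for illustration.

```python
def solve3(m, v):
    """Solve a 3x3 linear system m * x = v by Cramer's rule."""
    def det(a):
        return (a[0][0] * (a[1][1] * a[2][2] - a[1][2] * a[2][1])
                - a[0][1] * (a[1][0] * a[2][2] - a[1][2] * a[2][0])
                + a[0][2] * (a[1][0] * a[2][1] - a[1][1] * a[2][0]))
    d = det(m)
    result = []
    for i in range(3):
        mi = [row[:] for row in m]
        for r in range(3):
            mi[r][i] = v[r]
        result.append(det(mi) / d)
    return result

def calibrate(pupil_pts, scene_pts):
    """Fit x' = a*x + b*y + c (and likewise for y') from 3 correspondences."""
    m = [[x, y, 1.0] for x, y in pupil_pts]
    coeffs_x = solve3(m, [sx for sx, _ in scene_pts])
    coeffs_y = solve3(m, [sy for _, sy in scene_pts])
    return coeffs_x, coeffs_y

def gaze_point(pupil, calib):
    """Map a pupil position to a viewing point in the scene video."""
    (a, b, c), (d, e, f) = calib
    x, y = pupil
    return (a * x + b * y + c, d * x + e * y + f)

# Calibration: the wearer fixates three known markers in the scene video.
calib = calibrate([(0.2, 0.3), (0.8, 0.3), (0.5, 0.7)],
                  [(100, 50), (900, 50), (500, 600)])
print(gaze_point((0.2, 0.3), calib))  # approximately the first marker, (100, 50)
```

A real tracker fits a far richer 3D eyeball model and recalibrates continuously, but the principle is the same: known fixations anchor a mapping that is then applied to every new iris measurement.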

What does it do? This technology makes it possible to carry out very precise studies of human interaction with all kinds of devices. For example, it will serve to analyse the usability of mobility apps. "We can know where users expect to find information such as battery level or range in kilometres," says the head of Smart Quality.
How is it used? The team is now working on a pilot test to introduce the Eye-Tracker glasses into the testing of new models. They select users with different profiles who, while wearing them, get behind the wheel of the SEAT Leon. "We'll ask them, for example, to turn up the temperature or change the radio station, and we'll analyse which part of the screen they've directed their gaze at first, how long it takes them to do so and how many times they look at the road while interacting with the device," says Rubén. Previously, these tests were done by asking people questions, but "the brain often misleads, and where you think you're looking is not where you're actually looking," he adds. Now they will have accurate data.
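The metrics Rubén describes can be sketched from a gaze log. The snippet below assumes, purely for illustration, that the glasses produce a stream of (timestamp, region) samples; the region labels ("road", "screen_top", and so on) are invented, not SEAT's actual taxonomy.

```python
def first_screen_fixation(log):
    """Time and region of the first gaze sample that lands on the screen."""
    for t, region in log:
        if region.startswith("screen"):
            return t, region
    return None

def road_glances(log):
    """Count separate glances at the road (transitions into the 'road' region)."""
    glances, on_road = 0, False
    for _, region in log:
        if region == "road" and not on_road:
            glances += 1
        on_road = (region == "road")
    return glances

# Hypothetical log while the driver changes the radio station.
log = [(0.0, "road"), (0.4, "road"), (0.9, "screen_top"),
       (1.3, "screen_top"), (1.6, "road"), (2.1, "screen_bottom"),
       (2.5, "road")]
print(first_screen_fixation(log))  # (0.9, 'screen_top')
print(road_glances(log))           # 3
```

This gives exactly the kind of answers the pilot test is after: where the gaze lands first, how long it takes, and how often attention returns to the road, without relying on what drivers report afterwards.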
How do you interpret the data? At the Smart Quality department's facilities, the complex algorithm derives the behavioural patterns of each driver's gaze through different indicators. One of them is the heat zone indicator, which shows the intensity of each focus of attention. "The red spot, which indicates the greatest number of impacts, should always be on the road," Rubén points out. It is the guarantee that users continue to pay attention to the road, even when interacting with the screen.
Another indicator is the order in which they look, key to knowing where each driver expects to find a function. "We may think, for example, that the lower part of the screen is the most accessible, but with the Eye-Tracker glasses we can discover that, for whatever reason, they look first at the upper part," he says.
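A heat-zone indicator like the one described can be sketched in a few lines. This minimal version assumes gaze points arrive as (x, y) coordinates normalised to [0, 1) over the driver's field of view and simply counts hits per grid cell; the "red spot" is the cell with the most hits. The sample points are invented for illustration.

```python
from collections import Counter

def heat_map(points, grid=4):
    """Count gaze hits per cell of a grid x grid map of the visual field."""
    hits = Counter()
    for x, y in points:
        hits[(int(x * grid), int(y * grid))] += 1
    return hits

points = [(0.50, 0.20), (0.52, 0.22), (0.55, 0.18),  # road, straight ahead
          (0.60, 0.80), (0.62, 0.81)]                # infotainment screen
hits = heat_map(points)
hottest = max(hits, key=hits.get)
print(hottest, hits[hottest])  # (2, 0) 3  -- the "red spot" stays on the road
```

Real heat maps weight fixations by duration and smooth them into a continuous surface, but the underlying measure is the same count of where attention accumulates.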
What future does it have? All of these usability patterns will be key in developing the central consoles of tomorrow's cars, determining the location, size and distribution of information that is most comfortable for users. "This technology will help us humanise the interfaces, improving the user experience. With it we'll certainly go a step further in the quality of the infotainment console of the future," he concludes.

















