SEAT explores infrared and iris sensors to drive road safety
Infrared light sensors, high-resolution images and a sophisticated algorithm: all this technology is used to find out exactly where people are looking. When we drive, the road must obviously be the main focus.
That’s why it’s key to safety to be able to locate everything we’re looking for on the central console of the infotainment system at a glance, from the navigation system to the air conditioning or the radio. “We must guarantee the minimum interaction time with the screen, and to do this the information must be where users intuitively and naturally look for it,” says Rubén Martínez, head of SEAT’s Smart Quality department. To accomplish this, his team now has an innovative system.
What is it? Eye-Tracking is a technology that enables a computer to know where a person is looking. It does so through glasses with infrared sensors in the lenses and a camera in the centre of the frame. “The sensors detect the exact position of the iris at every moment, while everything the user sees is recorded,” Rubén explains. A complex 3D eye model algorithm interprets all this data to obtain the exact point of gaze.
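As an illustration only, the sketch below shows the general idea of turning a detected pupil position into a point in the scene camera’s image. SEAT’s system uses a 3D eye model; this sketch assumes a much simpler polynomial calibration, and the function names (fit_calibration, estimate_gaze) and data layout are hypothetical.

```python
import numpy as np

def fit_calibration(pupil_xy, target_xy):
    """Fit a 2nd-order polynomial map from pupil coordinates to scene-image coordinates.

    pupil_xy:  (N, 2) pupil positions recorded while the wearer looks at known targets.
    target_xy: (N, 2) positions of those targets in the scene camera's image.
    """
    px, py = pupil_xy[:, 0], pupil_xy[:, 1]
    # Polynomial feature matrix built from the measured pupil positions.
    A = np.column_stack([np.ones_like(px), px, py, px * py, px ** 2, py ** 2])
    coeffs, *_ = np.linalg.lstsq(A, target_xy, rcond=None)
    return coeffs

def estimate_gaze(pupil_point, coeffs):
    """Project a single pupil measurement to an estimated gaze point in the scene image."""
    x, y = pupil_point
    features = np.array([1.0, x, y, x * y, x ** 2, y ** 2])
    return features @ coeffs
```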
What does it do? This technology makes it possible to carry out very precise studies of human interaction with all kinds of devices. For example, it can be used to analyse the usability of mobility apps. “We can find out where users expect to find information such as battery level or remaining range in kilometres,” says the head of Smart Quality.
How is it used? The team is now working on a pilot test to introduce the Eye-Tracker glasses into the testing of new models. They select users with different profiles who, wearing the glasses, get behind the wheel of a SEAT Leon. “We’ll ask them, for example, to turn up the temperature or change the radio station, and we’ll analyse which part of the screen they direct their gaze to first, how long it takes them to do so and how many times they look at the road while interacting with the device,” says Rubén. Previously, these tests were done by asking people questions, but “the brain often misleads you, and where you think you’re looking is not always where you’re actually looking,” he adds. Now the team will have accurate data.
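For illustration, here is a minimal Python sketch of the kind of metrics Rubén describes, assuming each gaze sample has already been labelled with the area it falls on. The (timestamp, area) format and the area names are hypothetical, not SEAT’s actual data format.

```python
def time_to_first_screen_glance(samples, task_start):
    """Seconds from the start of the task until the gaze first hits the screen."""
    for t, area in samples:
        if t >= task_start and area.startswith("screen"):
            return t - task_start
    return None  # the driver never looked at the screen during the task

def count_road_glances(samples):
    """Number of separate glances back at the road during the task."""
    glances, on_road = 0, False
    for _, area in samples:
        if area == "road" and not on_road:
            glances += 1
        on_road = (area == "road")
    return glances

# Toy recording: timestamps in seconds, labelled with where the gaze landed.
samples = [(0.0, "road"), (0.8, "road"), (1.1, "screen_upper"),
           (1.6, "screen_lower"), (2.0, "road"), (2.4, "screen_lower")]
print(time_to_first_screen_glance(samples, task_start=0.0))  # 1.1
print(count_road_glances(samples))                           # 2
```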
How do you interpret the data? At the Smart Quality department’s facilities, the algorithm extracts the behavioural patterns of each driver’s gaze through different indicators. One of them is the heat zone indicator, which shows the intensity of each focus of attention. “The red spot, which indicates the greatest number of gaze hits, should always be on the road,” Rubén points out. This is the guarantee that users continue to pay attention to the road, even when interacting with the screen.
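A heat zone indicator of this kind could be approximated as below, assuming fixation points in normalised scene coordinates. This is a simple histogram sketch for illustration, not SEAT’s algorithm.

```python
import numpy as np

def gaze_heatmap(fixations, bins=32):
    """Accumulate fixation points (normalised 0-1 coordinates) into a bins x bins grid."""
    xs = [x for x, _ in fixations]
    ys = [y for _, y in fixations]
    heat, _, _ = np.histogram2d(ys, xs, bins=bins, range=[[0, 1], [0, 1]])
    return heat / heat.max()  # normalise so the hottest cell (the "red spot") is 1.0

# Toy data: most fixations on one point, a few on another.
heat = gaze_heatmap([(0.50, 0.40)] * 10 + [(0.90, 0.80)] * 3)
row, col = np.unravel_index(np.argmax(heat), heat.shape)  # location of the "red spot"
```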
Another indicator is the order in which they look, which is key to knowing where each driver expects to find a function. “We may think, for example, that the lower part of the screen is the most accessible, but with the Eye-Tracker glasses we can discover that, for whatever reason, drivers look at the upper part first,” he says.
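The order indicator could be sketched along these lines, reusing the same hypothetical labelled-sample format as above.

```python
def first_visit_order(samples):
    """Screen areas in the order the driver first looked at them."""
    order = []
    for _, area in samples:
        if area.startswith("screen") and area not in order:
            order.append(area)
    return order

samples = [(0.0, "road"), (1.1, "screen_upper"), (1.6, "screen_lower"),
           (2.0, "road"), (2.4, "screen_lower")]
print(first_visit_order(samples))  # ['screen_upper', 'screen_lower']
```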
What future does it have? All of these usability patterns will be key to developing the central consoles of tomorrow’s cars, determining the location, size and distribution of information that is most comfortable for users. “This technology will help us humanise the interfaces, improving the user experience. With it, we’ll certainly go a step further in the quality of the infotainment console of the future,” he concludes.