RealEye webcam eye-tracking tool vs. AI-generated results - a comparison.

Adam Cellary
6 min read · May 18, 2020


Photo by Alex Knight on Unsplash

There is more and more talk about replacing traditional eye-tracking with heatmaps generated by AI. In RealEye, we use the computing power of a regular PC/laptop to run AI (a deep neural network) that analyzes images coming from a webcam. The AI detects the panelist's face and pupils and predicts a gaze point. We decided to compare the results produced by our technology with results generated by AI alone. And we were really surprised!
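
To make the comparison concrete, the sketch below shows the general shape of such a webcam gaze-estimation loop. It is only our illustration: the Haar cascades bundled with OpenCV stand in for a real deep-learning face/eye detector, and predict_gaze is a hypothetical placeholder, not RealEye's actual model or API.

    # Hypothetical sketch of a webcam gaze-estimation loop (not RealEye's code).
    import cv2

    # Haar cascades shipped with OpenCV stand in for a deep-learning detector.
    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    eye_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_eye.xml")

    def predict_gaze(eye_crops):
        """Placeholder for a deep-neural-network gaze predictor.
        A real model maps eye/face crops (plus head pose) to an (x, y)
        point on the screen; here we simply return None."""
        return None

    cap = cv2.VideoCapture(0)                     # default webcam
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
            roi = gray[y:y + h, x:x + w]
            eyes = eye_cascade.detectMultiScale(roi)
            crops = [roi[ey:ey + eh, ex:ex + ew] for (ex, ey, ew, eh) in eyes]
            gaze_point = predict_gaze(crops)      # (x, y) in screen coordinates
        if cv2.waitKey(1) == 27:                  # Esc stops the loop
            break
    cap.release()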

First, we needed some results to compare ours to. We were inspired by two companies: Attention Insights, with their study about COVID-19 ads, and the examples published by Feng-GUI. There were 13 pictures in total (10 + 3).

All images were shown to 48 people for 5 seconds each (the first 0.5 seconds of each recording were cut out because of the “fixational bias” phenomenon).
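
As a minimal sketch (our own, assuming gaze samples exported as timestamped points), trimming that onset bias is just a filter on the sample timestamps:

    # Drop the first 0.5 s of each recording to remove the initial onset bias.
    def trim_onset(samples, cutoff_s=0.5):
        """samples: list of (timestamp_s, x, y) tuples, timestamps starting at 0."""
        return [s for s in samples if s[0] >= cutoff_s]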

Images from Attention Insights had AOIs with attention data expressed in percentages, so it was easy to compare the results by drawing identical (in terms of size and position) AOIs on the RealEye heatmaps - both gaze- and fixation-based. We tried to make the heatmaps look as similar to the AI-generated ones as possible, but you are welcome to try your own heatmap settings (point size, shadow, and opacity) on our results here. It's just a matter of visualization and won't change the numerical data from the AOIs.

Fixation heatmap filters were all left at their default values (an illustrative sketch of such filtering follows the list):

  • minimum fixation duration: 100 ms,
  • noise reduction: 21 frames,
  • gaze velocity threshold: 0 px/s,
  • point size: 100 px,
  • shadow: 0 px,
  • opacity: adjustable.
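
RealEye's exact fixation-detection algorithm is not public, so the sketch below is only our assumption of how a velocity-threshold (I-VT) style filter with these parameters might look. The 100 ms minimum duration and 21-frame smoothing mirror the defaults above; the 100 px/s velocity threshold is merely a common illustrative value from the eye-tracking literature, since the tool's listed 0 px/s default is tool-specific.

    # Illustrative I-VT-style fixation filter (our assumption, not RealEye's code).
    import numpy as np

    def detect_fixations(t, x, y, min_dur=0.100, smooth_frames=21, vel_thresh=100.0):
        """t: timestamps in seconds; x, y: gaze coordinates in pixels.
        Returns (start_s, end_s, centroid_x, centroid_y) tuples."""
        t, x, y = map(np.asarray, (t, x, y))
        # Noise reduction: simple moving average over `smooth_frames` samples.
        k = np.ones(smooth_frames) / smooth_frames
        xs, ys = np.convolve(x, k, "same"), np.convolve(y, k, "same")
        # Sample-to-sample gaze velocity in px/s; slow samples belong to fixations.
        vel = np.hypot(np.diff(xs), np.diff(ys)) / np.maximum(np.diff(t), 1e-6)
        slow = np.concatenate([vel < vel_thresh, [False]])
        fixations, start = [], None
        for i, is_slow in enumerate(slow):
            if is_slow and start is None:
                start = i
            elif not is_slow and start is not None:
                if t[i - 1] - t[start] >= min_dur:     # keep only long-enough runs
                    fixations.append((float(t[start]), float(t[i - 1]),
                                      float(xs[start:i].mean()),
                                      float(ys[start:i].mean())))
                start = None
        return fixations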

Let’s get to the results

The first two heatmaps are generated by the RealEye tool (real-people results): the first one is based on gazes (the AOI score is the total time spent looking at the given area), the second on fixations (fixations within the AOI are divided by all fixations on the image to calculate the score). The last heatmap is based on AI prediction only.
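
As a small sketch (ours, following the definitions above; the data structures are assumptions, not RealEye's export format), the two AOI scores can be computed like this:

    # Sketch of the two AOI scores described above (our own illustration).
    def in_aoi(px, py, aoi):
        """aoi = (left, top, width, height) in pixels."""
        left, top, w, h = aoi
        return left <= px <= left + w and top <= py <= top + h

    def gaze_aoi_time(samples, aoi):
        """Gaze-based score: total time (s) spent looking inside the AOI.
        samples: list of (timestamp_s, x, y), assumed roughly evenly spaced."""
        if len(samples) < 2:
            return 0.0
        dt = (samples[-1][0] - samples[0][0]) / (len(samples) - 1)  # mean sample interval
        return sum(dt for _, x, y in samples if in_aoi(x, y, aoi))

    def fixation_aoi_share(fixations, aoi):
        """Fixation-based score: fixations inside the AOI divided by all fixations.
        fixations: (start_s, end_s, cx, cy) tuples as in the earlier sketch."""
        if not fixations:
            return 0.0
        hits = sum(1 for (_, _, cx, cy) in fixations if in_aoi(cx, cy, aoi))
        return hits / len(fixations)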

Ad: #stayhomicillin (BRID)

Our heatmap settings:

  • point size: 18 px,
  • shadow: 36 px,
  • opacity: 0.2.

As you can see, the major difference is the headline (10%/6% vs. 47%) and the top-right flat (14%/16% vs. 2%). Other areas have similar results. Real people's gazes were spread more evenly across the entire picture, and the headline received about the same level of attention as the top flats, while, according to the AI data, the headline gained almost half of the attention.

But let’s take a look at other examples.

Ad: Stay Home (IKEA)

Our heatmap settings:

  • point size: 19 px,
  • shadow: 45 px,
  • opacity: 0.3.

Ad: #ArtOfQuarantine

Our heatmap settings:

  • point size: 21 px,
  • shadow: 60 px,
  • opacity: 0.2.

Ad: #FightBackCorona (Velocita Brand)

Our heatmap settings:

  • point size: 17 px,
  • shadow: 48 px,
  • opacity: 0.2.

That's quite a difference, isn't it? It's also worth noting that if the text AOIs had been drawn a bit bigger, the two results would have been more similar.

Ad: #ArtOfQuarantine (MCPU)

Our heatmap settings:

  • point size: 18 px,
  • shadow: 50 px,
  • opacity: 0.2.

Here the results are pretty similar. Isn't it amazing? So far, one could conclude that AI often favors text over images, though not always, as the previous example showed. But let's keep analyzing.

Ad: Keep Distance (Audi)

Our heatmap settings:

  • point size: 19 px,
  • shadow: 67 px,
  • opacity: 0.2.

Ad: Wash Your Hands (Association of Communication Agencies of Georgia)

Our heatmap settings:

  • point size: 22 px,
  • shadow: 72 px,
  • opacity: 0.2.

Ad: Keep Your Distance (United Nations)

Our heatmap settings:

  • point size: 27 px,
  • shadow: 78 px,
  • opacity: 0.2.

Ad: Be a Warrior (Sr Franco)

Our heatmap settings:

  • point size: 20 px,
  • shadow: 78 px,
  • opacity: 0.2.

And yet again, AI favors text, while people’s attention seems to be attracted more by visuals.

Ad: COVID-19 — Now is your chance (Nike)

Our heatmap settings:

  • point size: 23 px,
  • shadow: 69 px,
  • opacity: 0.2.

Now, let's take a look at the images from Feng-GUI. There wasn't any AOI data available, but we can still draw some interesting conclusions.

Ad: Sisley

Our heatmap settings:

  • point size: 24 px,
  • shadow: 79 px,
  • opacity: 0.2.

The results seem similar: most of the attention was caught by the face at the bottom, but it seems that the AI did not take the text part of the image into account.

Ad: Moschino

Our heatmap settings:

  • point size: 29 px,
  • shadow: 65 px,
  • opacity: 0.1.

Here we can see that the AI provides not only heatmaps but also gaze plots, i.e. a path of the user's gaze. These paths give a somewhat better insight into the testers' attention, because they show the order in which elements were looked at.
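
A gaze plot is straightforward to reproduce from ordered fixations. A minimal sketch (ours, reusing the fixation tuples from the earlier sketch) with matplotlib:

    # Minimal gaze-plot sketch: fixation centroids in viewing order over the stimulus.
    import matplotlib.pyplot as plt
    import matplotlib.image as mpimg

    def plot_gaze_path(image_path, fixations):
        """fixations: ordered (start_s, end_s, x, y) tuples."""
        img = mpimg.imread(image_path)
        xs = [f[2] for f in fixations]
        ys = [f[3] for f in fixations]
        plt.imshow(img)
        plt.plot(xs, ys, "-o", color="red", alpha=0.7)
        for i, (x, y) in enumerate(zip(xs, ys), start=1):
            plt.annotate(str(i), (x, y), color="white")   # order of looking
        plt.axis("off")
        plt.show()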

Ad: Nissan Pathfinder

Our heatmap settings:

  • point size: 26 px,
  • shadow: 75 px,
  • opacity: 0.2.

Instead of a gaze plot, RealEye provides gaze recordings (a video instead of a static map), covering all items in the study.
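
Conceptually, such a recording is just the gaze point overlaid on the stimulus frame by frame. A rough sketch of the idea (ours, not RealEye's exporter) using OpenCV:

    # Sketch of a gaze "recording": overlay the gaze point on each frame and save a video.
    import cv2

    def render_gaze_video(frames, gaze_points, out_path="gaze_recording.mp4", fps=30):
        """frames: list of BGR images (numpy arrays); gaze_points: one (x, y) per frame."""
        h, w = frames[0].shape[:2]
        writer = cv2.VideoWriter(out_path, cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))
        for frame, (x, y) in zip(frames, gaze_points):
            overlay = frame.copy()
            cv2.circle(overlay, (int(x), int(y)), 25, (0, 0, 255), -1)   # gaze marker
            blended = cv2.addWeighted(overlay, 0.4, frame, 0.6, 0)       # translucent dot
            writer.write(blended)
        writer.release()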

To sum things up.

We were surprised at how accurate the AI results can be. It's surely a great solution, but we believe it can't replace a real human. People react dynamically to environmental changes, emotions, new situations, etc. Moreover, AI shows general data, while our experience shows that different groups of respondents perceive the same designs differently. Take a medical journal, for example: doctors and patients look at the same pages differently. Once we ran a study about a job offer in Poland. From the general data, it was hard to draw any useful conclusions, but after dividing testers by gender, we found out that women mainly read the requirements, while men were mostly interested in the benefits. The insights may differ significantly between groups, which is pretty important in the world of marketing and neuroscience research.
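
That kind of segmentation is easy to run on exported results. A minimal sketch (ours; the file and column names are hypothetical) with pandas:

    # Sketch of a segment comparison: mean AOI attention per respondent group.
    import pandas as pd

    # One row per respondent x AOI: respondent_id, gender, aoi, attention_share.
    df = pd.read_csv("aoi_results.csv")          # hypothetical export file

    # Group-level means can reveal differences (e.g. by gender) that the
    # pooled average across all respondents would hide.
    segmented = df.groupby(["gender", "aoi"])["attention_share"].mean().unstack()
    print(segmented)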

Of course, AI has many advantages. It's cheaper and faster than traditional eye-tracking: results are generated in seconds. For comparison, with the RealEye webcam solution you can see the results as soon as the testers finish their job, so anywhere from a few minutes to a few hours. Both give you the opportunity to track similar formats (ads, digital content, etc.).

But, FYI, RealEye now has video eye-tracking, and a live-website feature is coming soon. We also give you the opportunity to connect an eye-tracking study with external surveys to gain even more insights. But of course, you have to decide how you want to conduct your studies.
