“Beauty is in the eye of the beholder”

The combined power of affective computing and eye tracking.

 


 

In the middle of a broad digital transformation journey, many companies have now realized the importance of incorporating user experience (UX) knowledge to create more relevant and personalized products and services. Incorporating UX insights at different stages of the product development process builds a solid understanding of how users and customers really interact with products and helps uncover specific pain points and opportunity areas, increasing the chance of arriving at innovative solutions. Beyond the traditional tools used by UX researchers, such as interviews and self-report surveys, new technological tools are becoming more widely available, deepening researchers’ knowledge of how users interact with products and helping them take product and service development to the next level.

 

Digital empathy is the key concept here. In a recent series of studies, Prof. Alexander Hahn, Katharina Klug and Prof. Florian Riedmüller demonstrate the power that new technologies can bring to the UX field.

According to them, user activity in a digital interface can be tracked, for example, via mouse movements or click log files. These measures support functionality and usability evaluations. They can answer the questions “What do users do/see with the digital product?” and “How do users use/buy the product?”. However, analyzing the user experience goes beyond mere clicking behavior. What lies underneath – the why – is a difficult question to answer from this perspective alone.
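To make this concrete, here is a minimal sketch (not taken from the studies) of the kind of click-log analysis such measures enable; the log format with timestamp, user_id and element_id columns, and the file name, are hypothetical. It can tell us which elements users click and how quickly, but nothing about how they feel while doing so.

```python
# Minimal click-log analysis sketch: answers "what did users do?" but not "why?".
# The CSV columns (timestamp, user_id, element_id) are a hypothetical example format.
import csv
from collections import Counter

def summarize_click_log(path):
    clicks_per_element = Counter()   # how often each UI element was clicked
    first_click_time = {}            # earliest click timestamp per user
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            clicks_per_element[row["element_id"]] += 1
            user, t = row["user_id"], float(row["timestamp"])
            if user not in first_click_time or t < first_click_time[user]:
                first_click_time[user] = t
    return clicks_per_element, first_click_time

if __name__ == "__main__":
    counts, first_clicks = summarize_click_log("click_log.csv")  # hypothetical file
    print("Most-clicked elements:", counts.most_common(5))
```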

 

Understanding the why can make the difference between creating merely functional products/services and emotionally resonant ones. Digital empathy can become the key to answering these questions and building a more holistic understanding of user interactions and relationships with products. Digital empathy aims to improve human-computer interaction (HCI) by focusing on the user’s current mood and emotions and responding accordingly. A combined approach of two different research methods – eye tracking and Affective Computing (AC) – can capture not only the moments where users focus their attention, but also how they feel in those moments, setting the ground for products based on digital empathy.

On the one hand, eye-tracking systems are increasingly used by UX researchers to collect data on visual attention patterns. Expensive and complex hardware and software setups are a thing of the past. With more high-quality solutions available for both mobile and fixed research settings, researchers have more access than ever to qualitative and quantitative attention performance indicators, such as heatmaps and eye-fixation measures. These allow researchers to map precise visual triggers and locate specific areas of interest and pain points. On the other hand, to boost the insights gathered from eye tracking, AC can provide real-time, reliable and valid measurements of user emotions in a scalable manner and at low cost. AC utilizes sensors to capture physiological data – such as facial expressions, voice frequency, heart rate and gestures – which are evaluated and classified by Machine Learning (ML) algorithms.
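As an illustration of how the two data streams could be combined, the following sketch (an assumption on our part, not the authors’ pipeline) joins eye-tracking fixations labelled with areas of interest (AOIs) to time-aligned valence scores from a facial-expression classifier; all field names, data formats and example values are hypothetical.

```python
# Sketch: aggregate emotion (valence) per area of interest (AOI) by matching
# emotion samples to the fixation windows during which they were recorded.
from dataclasses import dataclass
from statistics import mean
from collections import defaultdict

@dataclass
class Fixation:
    start: float      # seconds from session start
    duration: float   # seconds
    aoi: str          # area of interest the gaze landed on

@dataclass
class EmotionSample:
    time: float       # seconds from session start
    valence: float    # -1 (negative) .. +1 (positive), e.g. from facial expression analysis

def emotion_per_aoi(fixations, emotions):
    """For each AOI: number of fixations with emotion data and their mean valence."""
    by_aoi = defaultdict(list)
    for fix in fixations:
        window = [e.valence for e in emotions
                  if fix.start <= e.time <= fix.start + fix.duration]
        if window:
            by_aoi[fix.aoi].append(mean(window))
    return {aoi: (len(vals), mean(vals)) for aoi, vals in by_aoi.items()}

# Toy example data (hypothetical):
fixations = [Fixation(1.0, 0.4, "hero_image"), Fixation(2.0, 0.6, "checkout_button")]
emotions = [EmotionSample(1.2, 0.5), EmotionSample(2.3, -0.4)]
print(emotion_per_aoi(fixations, emotions))
# {'hero_image': (1, 0.5), 'checkout_button': (1, -0.4)}
```

In a setup like this, an AOI that attracts long fixations but consistently negative valence would flag a candidate pain point worth a closer qualitative look.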

From their series of experiments, Hahn, Klug and Riedmüller conclude that emotion and eye-tracking data can provide more valid and objective results than classical surveys or interviews. After all, people are not always aware of their specific reactions and feelings; the two methods complement each other and reveal deeper and more accurate patterns and insights into what users see and feel.

 

For more information, see here.
