Real-time Emotion Measurement in Consumption Decisions

Emotions play a decisive role in almost all areas of the customer journey. They can trigger impulse purchases, foster product or brand relationships, or determine the reaction to and the perception of advertising. In general, emotions are relevance detectors, which is why the automatic detection of emotional reactions to stimuli like TV commercials has become a focus topic in Affective Computing. But which emotional reactions do consumers show, e.g., in response to TV commercials, and how can they be measured?

Scientists agree that emotional processes unfold at different levels, such as subjective experience, physiology, and behavioral expression. However, while measurement approaches exist at each of these levels, there is a sparsity of research trying to identify response patterns across different processing levels.

This project addresses this issue by using indicators from physiological and behavioral-expressive measures to identify the emotional states of valence and arousal, with the aim of helping to choose the best measurement approach.

To address this research gap, we experimentally induced emotional states, namely positive and negative valence as well as high and low arousal, and measured their interaction with physiological responses (specifically heart rate), vocal expression, and facial expression.
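To make the target construct concrete: valence and arousal span a two-dimensional space, and the induced states correspond to its four quadrants. The following sketch is purely illustrative and is not the project's actual analysis pipeline; the function name, inputs (a continuous valence score, e.g. from facial expression, and a continuous arousal score, e.g. from heart rate), and zero-centered cutoffs are all assumptions.

```python
# Illustrative sketch (not the project's pipeline): mapping two hypothetical,
# zero-centered channel scores onto the four quadrants of the valence-arousal
# space. Cutoff values are assumptions for the example.

def classify_quadrant(valence_score: float, arousal_score: float,
                      valence_cutoff: float = 0.0,
                      arousal_cutoff: float = 0.0) -> str:
    """Map continuous valence/arousal scores to a quadrant label."""
    valence = "positive" if valence_score >= valence_cutoff else "negative"
    arousal = "high" if arousal_score >= arousal_cutoff else "low"
    return f"{valence} valence / {arousal} arousal"


# Example: a mildly positive facial-valence score combined with a
# below-baseline heart-rate arousal score.
print(classify_quadrant(0.4, -0.2))  # positive valence / low arousal
```

In practice, each score would be derived from its measurement channel (e.g., heart-rate change for arousal, facial-expression coding for valence) and calibrated against a baseline rather than a fixed cutoff.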

Besides its implications for academic research, this topic is also of importance for practical applications. Emotions are among the main drivers in many decision situations and determine, for example, product or brand loyalty and word-of-mouth. Thus, their measurement is relevant in many different areas. However, the measurement approaches differ with respect to their costs, applicability, reliability, etc. For practical applications, there are contexts where webcams can be used, such as determining the emotions of customers in service calls conducted via video chat. With webcams, not only voice and video but also heart rate could be measured easily; yet such heart rate measurement might not be very accurate. In other situations, video and voice might not be available, but physiological measurement is, as in medical applications. Moreover, voice and video could be used in situations where the emotions of whole groups need to be analyzed, whereas measuring the physiology of groups of people might be very effortful and hard to realize. Hence, there are trade-offs to be made when selecting measurement approaches.

This project answers the question of which methods are most appropriate for which emotional reactions and how these methods can be applied in marketing contexts.


Prof. Dr. Jella Pfeiffer, University of Gießen

Prof. Dr. Klaus Scherer, University of Geneva



Full manuscript in preparation

Related Projects

What the voice reveals: emotional arousal drives sharing of experiences

Related Publications

Seuss, D., Hassan, T., Dieckmann, A., Unfried, M., Scherer, K. R., Mortillaro, M., & Garbas, J. (2021). Automatic estimation of action unit intensities and inference of emotional appraisals. IEEE Transactions on Affective Computing.

Scherer, K. R., Mortillaro, M., Dieckmann, A., Unfried, M., & Ellgring, H. (2021). Investigating appraisal-driven facial expression and inference in emotion communication. Emotion, 21(1), 73-95.

Dieckmann, A., & Unfried, M. (2020). Thrilled or upset: What drives people to share and review product experiences? NIM Marketing Intelligence Review, 12(2), 56-61.

Seuss, D., Dieckmann, A., Hassan, T., Garbas, J.U., Ellgring, J.H., Mortillaro, M., & Scherer, K. (2019). Emotion expression from different angles: A video database for facial expressions of actors shot by a camera array. Proceedings of the 8th International Conference on Affective Computing and Intelligent Interaction (ACII 2019) (pp. 35-41). Cambridge, United Kingdom.

Scherer, K. R., Ellgring, H., Dieckmann, A., Unfried, M., & Mortillaro, M. (2019). Dynamic facial expression of emotion and observer inference. Frontiers in Psychology, 10, 508.

Dieckmann, A., Unfried, M., Garbas, J., & Mortillaro, M. (2017). Automatic analysis of facial expressions in an advertising test with Chinese respondents. GfK Verein Working Paper Series, 5.

Eyben, F., Unfried, M., Hagerer, G., & Schuller, B. (2017). Automatic multi-lingual arousal detection from voice applied to real product testing applications. Proceedings of the 42nd IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP). New Orleans, LA, USA.

Garbas, J. U., Ruf, T., Unfried, M., & Dieckmann, A. (2013). Towards robust real-time valence recognition from facial expressions for market research applications. Proceedings of the Humaine Association Conference on Affective Computing and Intelligent Interaction, 570-575.