Human-Computer Interaction – INTERACT 2011
Conference Papers · Year: 2011

Towards Emotional Interaction: Using Movies to Automatically Learn Users’ Emotional States

Abstract

The HCI community is actively seeking novel methodologies to gain insight into the user’s experience during interaction with both the application and the content. We propose an emotion recognition engine capable of automatically recognizing a set of human emotional states using psychophysiological measures of the autonomic nervous system, including galvanic skin response, respiration, and heart rate. A novel pattern recognition system, based on discriminant analysis and support vector machine classifiers, is trained using movie scenes selected to induce emotions spanning the valence dimension from positive to negative, including happiness, anger, disgust, sadness, and fear. In this paper we introduce the emotion recognition system and evaluate its accuracy by presenting the results of an experiment conducted with three physiological sensors.
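The abstract describes a two-stage pattern recognition pipeline: discriminant analysis followed by support vector machine classification over features derived from galvanic skin response, respiration, and heart rate. The sketch below illustrates that kind of pipeline with scikit-learn; the feature dimensionality, placeholder data, and hyperparameters are illustrative assumptions, not the authors' actual implementation.

```python
# Minimal sketch of a discriminant-analysis + SVM pipeline for emotion
# recognition from psychophysiological features. Data, feature counts,
# and hyperparameters are placeholders, not the paper's implementation.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical feature matrix: one row per movie-scene trial, with
# statistics derived from GSR, respiration, and heart rate signals.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 12))      # 100 trials x 12 features (placeholder data)
y = rng.integers(0, 5, size=100)    # 5 emotion classes: happiness, anger, disgust, sadness, fear

# Standardize features, project with LDA, then classify with an RBF-kernel SVM.
clf = make_pipeline(
    StandardScaler(),
    LinearDiscriminantAnalysis(n_components=4),  # at most n_classes - 1 components
    SVC(kernel="rbf", C=1.0),
)

# Cross-validated accuracy as a rough recognition-rate estimate.
scores = cross_val_score(clf, X, y, cv=5)
print(f"Mean accuracy: {scores.mean():.2f}")
```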
Main file: 978-3-642-23774-4_15_Chapter.pdf (394.65 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-01590568, version 1 (19-09-2017)

Licence

Attribution

Identifiers

HAL Id: hal-01590568
DOI: 10.1007/978-3-642-23774-4_15

Cite

Eva Oliveira, Mitchel Benovoy, Nuno Ribeiro, Teresa Chambel. Towards Emotional Interaction: Using Movies to Automatically Learn Users’ Emotional States. 13th International Conference on Human-Computer Interaction (INTERACT), Sep 2011, Lisbon, Portugal. pp.152-161, ⟨10.1007/978-3-642-23774-4_15⟩. ⟨hal-01590568⟩