To recognize emotions using less obtrusive wearable sensors, we present a novel emotion recognition method that uses only pupil diameter (PD) and skin conductance (SC). Psychological studies show that these two signals are related to the attention level of humans exposed to visual stimuli. Based on this, we propose a feature extraction algorithm that extracts correlation-based features across participants watching the same video clip. To boost performance given limited data, we implement a learning system without a deep architecture to classify arousal and valence. Our method outperforms not only state-of-the-art approaches but also widely used traditional and deep learning methods.
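The correlation-based feature idea can be sketched as follows: since attention-related signals such as PD and SC should co-vary across participants watching the same clip, one participant's signal can be correlated against the others' and the resulting correlations summarized as features. This is a hypothetical illustration under that assumption, not the authors' exact algorithm; the function name and the choice of summary statistics are invented for the example.

```python
import numpy as np

def correlation_features(signal, group_signals):
    """Hypothetical sketch: correlate one participant's physiological
    signal (e.g. pupil diameter or skin conductance) with each other
    participant's signal for the same video clip, then summarize the
    pairwise correlations as a small feature vector."""
    corrs = np.array(
        [np.corrcoef(signal, other)[0, 1] for other in group_signals]
    )
    # Summary statistics of the correlations serve as features for a
    # shallow (non-deep) classifier of arousal and valence.
    return np.array([corrs.mean(), corrs.std(), corrs.max(), corrs.min()])
```

A shallow model trained on such low-dimensional features is one plausible way to cope with the limited data the abstract mentions.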

Original language: English
Title of host publication: ICMI '19
Subtitle of host publication: Proceedings of the 2019 International Conference on Multimodal Interaction
Editors: Wen Gao, Helen Mei Ling Meng, Matthew Turk, Susan R. Fussell, Bjorn Schuller, Yale Song, Kai Yu
Place of publication: New York
Publisher: Association for Computing Machinery (ACM)
Number of pages: 5
ISBN (Print): 978-1-4503-6860-5
Publication status: Published - 2019
Event: 21st ACM International Conference on Multimodal Interaction, ICMI 2019 - Suzhou, China
Duration: 14 Oct 2019 - 18 Oct 2019


Conference: 21st ACM International Conference on Multimodal Interaction, ICMI 2019

Research areas: Emotion recognition, Machine learning, MAHNOB-HCI database, Pupil diameter, Skin conductance response
