Emotion Recognition from Multimodal Physiological Signals for Emotion Aware Healthcare Systems

Ayata D., Yaslan Y., Kamaşak M. E.

JOURNAL OF MEDICAL AND BIOLOGICAL ENGINEERING, vol.40, no.2, pp.149-157, 2020 (SCI-Expanded)

  • Publication Type: Article
  • Volume: 40 Issue: 2
  • Publication Date: 2020
  • Doi Number: 10.1007/s40846-019-00505-7
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Compendex, EMBASE, INSPEC
  • Page Numbers: pp.149-157
  • Keywords: Physiological data, Emotion recognition, Multi-sensor data fusion, Relevance
  • Istanbul Technical University Affiliated: Yes


Purpose: This paper proposes a novel emotion recognition algorithm that uses multimodal physiological signals for emotion-aware healthcare systems. Physiological signals are collected from respiratory belt (RB), photoplethysmography (PPG), and fingertip temperature (FTT) sensors; these signals were chosen because advances in ergonomic wearable technologies have made their collection easy.

Methods: Arousal and valence levels are recognized from the fused physiological signals by exploiting the relationship between physiological signals and emotions. Recognition is performed with several machine learning methods, including random forest, support vector machine, and logistic regression, and the performance of these methods is compared.

Results: With decision-level fusion, accuracy improved from 69.86% to 73.08% for arousal and from 69.53% to 72.18% for valence. These results indicate that using multiple sources of physiological signals and fusing them increases the accuracy of emotion recognition.

Conclusion: This study demonstrated a framework for emotion recognition using multimodal physiological signals from respiratory belt, photoplethysmography, and fingertip temperature sensors. Decision-level fusion of multiple classifiers (one per signal source) improved emotion recognition accuracy for both the arousal and valence dimensions.
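The decision-level fusion described above can be sketched as a majority vote over the labels produced by one classifier per signal source. The sketch below is a minimal illustration, not the authors' implementation: the three per-signal classifiers are hypothetical threshold rules standing in for the trained random forest / SVM / logistic regression models, and the feature values are placeholders.

```python
# Minimal sketch of decision-level fusion: one classifier per
# physiological signal (RB, PPG, FTT), fused by majority vote.
# The classifier internals here are illustrative placeholders,
# NOT the models trained in the paper.

from collections import Counter

def majority_vote(predictions):
    """Fuse per-signal binary labels (e.g. high/low arousal) by majority."""
    return Counter(predictions).most_common(1)[0][0]

# Hypothetical stand-in classifiers: each maps a scalar feature
# extracted from one signal to a binary arousal label.
def rb_classifier(x):
    return "high" if x > 0.5 else "low"

def ppg_classifier(x):
    return "high" if x > 0.4 else "low"

def ftt_classifier(x):
    return "high" if x > 0.6 else "low"

def fuse(rb_feat, ppg_feat, ftt_feat):
    """Decision-level fusion: collect each classifier's label, then vote."""
    votes = [rb_classifier(rb_feat),
             ppg_classifier(ppg_feat),
             ftt_classifier(ftt_feat)]
    return majority_vote(votes)

print(fuse(0.7, 0.3, 0.9))  # two of three classifiers vote "high"
```

Because fusion happens on the predicted labels rather than on raw features, each classifier can be trained and tuned independently on its own signal, which matches the one-classifier-per-source design the conclusion describes.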