28th Signal Processing and Communications Applications Conference (SIU), held online, 5–7 October 2020
The need for social interaction between humans and robots is extensively highlighted in recent studies of social robots. Language, emotions, postures, and gestures are commonly used to increase the quality of human-computer interaction. In this study, we focus on the design of a cognitive architecture that models emotions and their dynamics in order to implement artificial empathy during human-computer interaction. In developmental robotics theory, human-like empathy is considered an emergent behavior grounded in social interaction with humans, gut feelings, the mirroring system, and associations between external stimuli and emotions. Building on this theory, our study presents a simulation of the internal emotional states of an agent/robot. Furthermore, it demonstrates a model of how the robot's affective state changes from one emotion to another, in synchronization with the emotions expressed by its human partner. After training, the robot can adjust its inner state and mood in harmony with the emotional state of the human partner. Simulations are performed, and the proposed computational affective system is evaluated subjectively by human participants.
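The abstract describes an affective state that shifts between emotions in synchronization with the human partner. A minimal sketch of this idea is given below, under the assumption (not stated in the paper) that emotions can be placed as anchor points in a two-dimensional valence-arousal space, with the robot's mood drifting toward the anchor of the emotion the partner currently expresses. The anchor coordinates, the update rule, and the rate constant are all illustrative assumptions, not the authors' actual model.

```python
import numpy as np

# Assumed (valence, arousal) anchor points for a few basic emotions.
# These coordinates are hypothetical, chosen only for illustration.
EMOTION_ANCHORS = {
    "joy":     (0.8,  0.6),
    "sadness": (-0.7, -0.4),
    "anger":   (-0.6,  0.7),
    "calm":    (0.3, -0.5),
}

def update_mood(mood, partner_emotion, rate=0.2):
    """Move the robot's mood a fraction `rate` of the way toward the
    anchor of the emotion expressed by the human partner."""
    target = np.array(EMOTION_ANCHORS[partner_emotion])
    return mood + rate * (target - mood)

# Start from a neutral inner state and observe a short emotion sequence.
mood = np.zeros(2)
for emotion in ["joy", "joy", "sadness"]:
    mood = update_mood(mood, emotion)
```

With this rule, repeated exposure to the same expressed emotion moves the mood asymptotically toward that emotion's anchor, while a change in the partner's expression pulls it toward the new anchor, giving the gradual, synchronized transitions the abstract describes.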