Towards Automated Human-Robot Mutual Gaze


Broz F., Kose-Bagci H., Nehaniv C. L., Dautenhahn K.

4th International Conference on Advances in Computer-Human Interactions (ACHI), Gosier, France, 23 - 28 February 2011, pp.222-227

  • Publication Type: Conference Paper / Full-Text Paper
  • City of Publication: Gosier
  • Country of Publication: France
  • Page Numbers: pp.222-227
  • Affiliated with Istanbul Technical University: Yes

Abstract

The role of gaze in interaction has been an area of increasing interest in the field of human-robot interaction. Mutual gaze, the pattern of behavior that arises when humans look directly at each other's faces, sends important social cues that communicate attention and personality traits and help regulate conversational turn-taking. In preparation for learning a computational model of mutual gaze that can be used as a controller for a robot, data from human-human pairs performing a conversational task were collected using a gaze-tracking system and a face-tracking algorithm. The overall amount of mutual gaze observed between pairs agreed with predictions from the psychology literature. However, the duration of mutual gaze was shorter than predicted, and the amount of direct eye contact detected was, surprisingly, almost nonexistent. The results presented show the potential of this automated method to capture detailed information about human gaze behavior, and future applications to interaction-based robot language learning are discussed. The analysis of human-human mutual gaze using automated tracking allows past results that relied on hand-coding to be tested and extended, and it can provide both a method of data collection and input for the control of interactive robots.
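The analysis described in the abstract reduces to detecting frames in which both participants' gaze falls on the other's face, then summarizing the proportion and episode durations of those overlaps. The sketch below is a minimal, hypothetical Python illustration of that computation; it assumes the tracking pipeline has already produced time-aligned per-frame boolean streams, one per participant. The function name `mutual_gaze_episodes`, the frame rate, and the toy data are illustrative assumptions, not details from the paper.

```python
"""Minimal sketch of mutual-gaze summary statistics from two gaze streams.

Hypothetical illustration only: the paper's actual gaze-tracking and
face-tracking pipeline is not specified here. Assumes each participant's
data has already been reduced to a boolean per frame indicating whether
their gaze fell on the partner's face.
"""

from typing import List, Tuple


def mutual_gaze_episodes(
    a_on_face: List[bool], b_on_face: List[bool], fps: float
) -> Tuple[float, List[float]]:
    """Return (proportion of time in mutual gaze, episode durations in s).

    Mutual gaze at a frame means both participants' gaze is on the
    other's face simultaneously.
    """
    assert len(a_on_face) == len(b_on_face), "streams must be time-aligned"
    mutual = [a and b for a, b in zip(a_on_face, b_on_face)]

    episodes: List[float] = []
    run = 0  # length (in frames) of the current mutual-gaze episode
    for frame in mutual:
        if frame:
            run += 1
        elif run:
            episodes.append(run / fps)  # episode ended; record its duration
            run = 0
    if run:  # stream ended while still in mutual gaze
        episodes.append(run / fps)

    proportion = sum(mutual) / len(mutual) if mutual else 0.0
    return proportion, episodes


if __name__ == "__main__":
    # Toy example: 10 frames at 30 fps containing a 3-frame and a
    # 1-frame mutual-gaze episode.
    a = [True, True, True, True, False, False, True, True, False, False]
    b = [False, True, True, True, True, False, False, True, False, False]
    prop, eps = mutual_gaze_episodes(a, b, fps=30.0)
    print(f"mutual gaze proportion: {prop:.2f}, episodes (s): {eps}")
```

Working from boolean on-face streams rather than raw gaze coordinates keeps the summary step independent of the particular tracker, which is consistent with the abstract's framing of the method as a general, automated alternative to hand-coding.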