iSign: An Architecture for Humanoid Assisted Sign Language Tutoring


Köse H., Akalın N., Yorganci R., Ertugrul B. S., Kıvrak H., Kavak S., et al.

INTELLIGENT ASSISTIVE ROBOTS: RECENT ADVANCES IN ASSISTIVE ROBOTICS FOR EVERYDAY ACTIVITIES, vol. 106, pp. 157-184, 2015 (Scopus)

Abstract

This paper investigates the role of interaction and communication kinesics in human-robot interaction. It is based on a project on Sign Language (SL) tutoring through interaction games with humanoid robots. The aim of the study is to design a computational framework that motivates children with communication problems (e.g., ASD and hearing impairments) to understand and imitate signs performed by the robot using basic upper-torso gestures and sound in a turn-taking manner. The framework consists of modular computational components that endow the robot with the capability of perceiving the children's actions, carrying out a game or storytelling task, and tutoring the children in a desired mode, i.e., supervised or semi-supervised. Visual (colored cards), vocal (storytelling, music), touch (tactile sensors on the robot used for communication), and motion (recognition and performance of gestures, including signs) cues are proposed for multimodal communication between the robot, the child, and the therapist/parent. We present an empirical and exploratory study investigating the effect of basic non-verbal gestures (hand movements, body and face gestures) expressed by a humanoid robot; having comprehended the word, the child gives relevant feedback to the robot in SL or visually, according to the context of the game.
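The turn-taking tutoring loop summarized in the abstract (robot demonstrates a sign, child imitates, robot decides whether to repeat or move on) can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the sign vocabulary, the `perceive_feedback` stand-in for the perception module, and the retry limit are all assumptions introduced here.

```python
import random

SIGNS = ["apple", "water", "thank_you"]  # illustrative vocabulary (assumption)

def perceive_feedback(expected_sign, accuracy=0.8):
    """Stand-in for the perception component: returns the sign the child
    is judged to have produced. Here it is simulated with a fixed
    recognition accuracy rather than real gesture recognition."""
    if random.random() < accuracy:
        return expected_sign
    return random.choice(SIGNS)

def tutoring_session(signs, mode="semi-supervised", max_retries=2):
    """One turn-taking game: for each sign the robot demonstrates it,
    observes the child's imitation, and either advances or repeats.
    In supervised mode a therapist would confirm each judgment; in
    semi-supervised mode the robot decides on its own (simplified here)."""
    log = []
    for sign in signs:
        for attempt in range(1, max_retries + 1):
            observed = perceive_feedback(sign)
            correct = observed == sign
            log.append((sign, attempt, correct))
            if correct:
                break  # child imitated the sign; move to the next one
    return log

random.seed(0)
history = tutoring_session(SIGNS)
print(history)
```

Each log entry records the target sign, the attempt number, and whether the imitation was judged correct, which mirrors the kind of per-turn data a supervised or semi-supervised tutoring mode would act on.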