- Title
Visual Perception Based Engagement Awareness for Multiparty Human-Robot Interaction.
- Authors
Li, Liyuan; Xu, Qianli; Wang, Gang S.; Yu, Xinguo; Tan, Yeow Kee; Li, Haizhou
- Abstract
Computational systems for human-robot interaction (HRI) could benefit from visual perception of social cues that are commonly employed in human-human interactions. However, existing systems focus on only one or two cues for attention or intention estimation. This research investigates how social robots may exploit a wide spectrum of visual cues for multiparty interactions. It is proposed that the vision system for social cue perception should be supported by two dimensions of functionality, namely, vision functionality and cognitive functionality. A vision-based system is proposed for a robot receptionist to embrace both functionalities for multiparty interactions. The vision functionality module consists of a suite of methods that computationally recognize potential visual cues related to social behavior understanding. The performance of these models is validated against a ground-truth annotated dataset. The cognitive functionality module consists of two computational models that (1) quantify users' attention saliency and engagement intentions, and (2) facilitate engagement-aware behaviors that allow the robot to adjust its direction of attention and manage the conversational floor. The performance of the robot's engagement-aware behaviors is evaluated in a multiparty dialog scenario. The results show that the robot's engagement-aware behavior based on visual perception significantly improves the effectiveness of communication and positively affects user experience.
- Subjects
VISUAL perception; APPERCEPTION; MANAGEMENT; ANIMATRONICS; ROBOTS
- Publication
International Journal of Humanoid Robotics, 2015, Vol. 12, Issue 4
- ISSN
0219-8436
- Publication type
Article
- DOI
10.1142/S021984361550019X