- Title
A multimodal human-robot sign language interaction framework applied in social robots.
- Authors
Jie Li; Junpei Zhong; Ning Wang
- Abstract
Deaf-mute people face many difficulties in daily interactions with hearing people through spoken language, and sign language is an important means of expression and communication for them. Breaking the communication barrier between the deaf-mute and hearing communities is therefore significant for facilitating their integration into society. To help them integrate into social life better, we propose a multimodal Chinese sign language (CSL) gesture interaction framework for social robots. CSL gesture information, covering both static and dynamic gestures, is captured by two sensors of different modalities: a wearable Myo armband collects arm surface electromyography (sEMG) signals, and a Leap Motion sensor collects 3D hand vectors. The two modalities of gesture data are preprocessed and fused before being sent to the classifier, which improves recognition accuracy and reduces the network's processing time. Since the inputs to the proposed framework are temporal gesture sequences, a long short-term memory (LSTM) recurrent neural network is used to classify them. Comparative experiments on an NAO robot validate our method, which effectively improves CSL gesture recognition accuracy and has potential applications in a variety of gesture interaction scenarios beyond social robots.
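- Illustrative sketch

The abstract gives the shape of the pipeline (per-frame sEMG and Leap Motion features fused into one sequence, then classified by an LSTM) but no implementation details. The following is a minimal sketch of that kind of architecture, not the authors' code: the concatenation-based fusion, the Leap Motion feature dimension (15), the hidden size (64), and the 30-class output are assumed placeholders, and only the 8 sEMG channels reflect the Myo armband's actual hardware.

```python
# Minimal sketch (not the authors' code) of the pipeline the abstract
# describes: per-frame features from two sensors are fused by
# concatenation and classified with an LSTM over the time dimension.
# Assumptions: the 15-dim Leap Motion feature, 64-dim hidden state, and
# 30 gesture classes are illustrative placeholders.
import torch
import torch.nn as nn

class FusedGestureLSTM(nn.Module):
    def __init__(self, semg_dim=8, leap_dim=15, hidden_dim=64, num_classes=30):
        super().__init__()
        # The LSTM consumes the concatenated (fused) feature vector per time step.
        self.lstm = nn.LSTM(semg_dim + leap_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, num_classes)

    def forward(self, semg_seq, leap_seq):
        # semg_seq: (batch, time, semg_dim); leap_seq: (batch, time, leap_dim)
        fused = torch.cat([semg_seq, leap_seq], dim=-1)  # early feature fusion
        _, (h_n, _) = self.lstm(fused)                   # final hidden state
        return self.head(h_n[-1])                        # class logits

# Usage: a batch of 4 gestures, each 100 time steps long.
model = FusedGestureLSTM()
logits = model(torch.randn(4, 100, 8), torch.randn(4, 100, 15))
print(logits.shape)  # torch.Size([4, 30])
```

Fusing before the recurrent layer, as sketched here, lets a single LSTM model cross-modal temporal correlations; the paper's actual fusion and preprocessing steps may differ.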
- Subjects
SOCIAL robots; SIGN language; HAND signals; RECURRENT neural networks; MOTION detectors; HEARING; RECOGNITION (Psychology)
- Publication
Frontiers in Neuroscience, 2023, Vol 17, p1
- ISSN
1662-4548
- Publication type
Article
- DOI
10.3389/fnins.2023.1168888