- Title
Deep Reinforcement Learning for Flipper Control of Tracked Robots in Urban Rescuing Environments.
- Authors
Pan, Hainan; Chen, Xieyuanli; Ren, Junkai; Chen, Bailiang; Huang, Kaihong; Zhang, Hui; Lu, Huimin
- Abstract
Tracked robots equipped with flippers and LiDAR sensors have been widely used in urban search and rescue. Achieving autonomous flipper control is important for enhancing the intelligent operation of tracked robots within complex urban rescue environments. Whereas existing methods rely mainly on laborious manual modeling, this paper proposes a novel Deep Reinforcement Learning (DRL) approach named ICM-D3QN for autonomous flipper control over complex urban rescue terrains. Specifically, ICM-D3QN comprises three modules: a feature extraction and fusion module for extracting and integrating robot and environment state features, a curiosity module for improving the efficiency of flipper action exploration, and a deep Q-Learning control module for learning the robot-control policy. In addition, a specific reward function is designed that considers both safety and passing smoothness. Furthermore, simulation environments are constructed using the Pymunk and Gazebo physics engines for training and testing. The learned policy is then transferred directly to our self-designed tracked robot in a real-world environment for quantitative analysis. The consistently high performance of the proposed approach validates its superiority over hand-crafted control models and state-of-the-art DRL strategies for crossing complex terrains.
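The name ICM-D3QN suggests a Dueling Double DQN combined with an Intrinsic Curiosity Module (ICM). The paper's exact architecture is not given in this record, but the three standard ingredients behind those acronyms can be sketched as follows; all function names, the feature vectors `phi_*`, and the scaling factor `eta` are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def dueling_q(value, advantages):
    # Dueling aggregation (assumed): Q(s,a) = V(s) + A(s,a) - mean_a A(s,a)
    return value + advantages - advantages.mean()

def double_dqn_target(reward, gamma, q_online_next, q_target_next, done):
    # Double-DQN target (assumed): the online net selects the next action,
    # the target net evaluates it, which reduces overestimation bias.
    a_star = int(np.argmax(q_online_next))
    return reward + (0.0 if done else gamma * float(q_target_next[a_star]))

def icm_intrinsic_reward(phi_next_pred, phi_next, eta=0.1):
    # ICM curiosity bonus (assumed): scaled forward-model prediction error
    # in feature space; high error marks under-explored flipper actions.
    return eta * 0.5 * float(np.sum((phi_next_pred - phi_next) ** 2))

# The agent would be trained on the combined signal:
#   r_total = r_extrinsic (safety + smoothness terms) + icm_intrinsic_reward(...)
```

In such a setup, the hand-designed extrinsic reward (safety and passing smoothness, per the abstract) is simply summed with the curiosity bonus before computing the Q-learning target.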
- Subjects
DEEP reinforcement learning; ROBOT control systems; REINFORCEMENT learning; ARTIFICIAL intelligence; ENGINE testing; ROBOT programming; MANUAL labor
- Publication
Remote Sensing, 2023, Vol 15, Issue 18, p4616
- ISSN
2072-4292
- Publication type
Article
- DOI
10.3390/rs15184616