- Title
Grasping Pose Estimation for Robots Based on Convolutional Neural Networks.
- Authors
Zheng, Tianjiao; Wang, Chengzhi; Wan, Yanduo; Zhao, Sikai; Zhao, Jie; Shan, Debin; Zhu, Yanhe
- Abstract
By learning manipulation in typical scenes, robots are gradually acquiring the ability to plan grasping actions in unknown scenes. Grasping pose estimation, an end-to-end approach, has developed rapidly in recent years because of its strong generalization. In this paper, we present a grasping pose estimation method for robots based on convolutional neural networks. The method employs a convolutional neural network model that outputs the grasping success rate, approach angle, and gripper opening width for an input voxel. A grasping dataset was produced, and the model was trained in a physical simulator. A position optimization for robotic grasping, based on the distribution of the object centroid, was proposed to improve the grasping success rate. An experimental platform for robot grasping was established, and 11 common everyday objects were selected for the experiments. Grasping experiments were performed on the 11 objects individually, on multiple objects in clutter, and in a dark environment without illumination. The results show that the method adapts to objects of different geometries, including irregular shapes, and is not influenced by lighting conditions. The total grasping success rate was 88.2% for individual objects and 81.1% for the cluttered scene.
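The abstract describes a network that maps an input voxel grid to three quantities: a grasping success rate, an approach angle, and a gripper opening width. The paper's actual architecture is not reproduced in this record; the sketch below only illustrates what such a multi-head mapping looks like, with all layer sizes, weights, and names invented for illustration (random placeholder weights stand in for learned 3D-convolutional features).

```python
import numpy as np

rng = np.random.default_rng(0)

def multi_head_grasp_net(voxel, hidden_dim=64):
    """Hypothetical sketch of a voxel-to-grasp mapping with three output
    heads, as described in the abstract. A real model would use learned
    3D-convolutional weights; here the weights are random placeholders."""
    x = voxel.astype(float).ravel()                  # flatten the occupancy grid
    w1 = rng.standard_normal((hidden_dim, x.size)) * 0.01
    h = np.maximum(w1 @ x, 0.0)                      # ReLU feature layer
    # one weight vector per output head (all invented for illustration)
    w_q = rng.standard_normal(hidden_dim) * 0.01
    w_a = rng.standard_normal(hidden_dim) * 0.01
    w_w = rng.standard_normal(hidden_dim) * 0.01
    quality = 1.0 / (1.0 + np.exp(-(w_q @ h)))       # success rate, squashed to (0, 1)
    angle = np.pi * np.tanh(w_a @ h)                 # approach angle in (-pi, pi)
    width = np.maximum(w_w @ h, 0.0)                 # non-negative opening width
    return quality, angle, width

# a 32^3 occupancy grid with a sparse random "object"
voxel = (rng.random((32, 32, 32)) > 0.95).astype(np.uint8)
q, a, w = multi_head_grasp_net(voxel)
```

The choice of output squashing (sigmoid for a rate, scaled tanh for an angle, ReLU for a width) is one common way to bound each head to its physical range; the abstract does not state which parameterization the authors used.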
- Subjects
CONVOLUTIONAL neural networks; ROBOT hands; ROBOTS
- Publication
Machines, 2023, Vol 11, Issue 10, p974
- ISSN
2075-1702
- Publication type
Article
- DOI
10.3390/machines11100974