- Title
UnVELO: Unsupervised Vision-Enhanced LiDAR Odometry with Online Correction.
- Authors
Li, Bin; Ye, Haifeng; Fu, Sihan; Gong, Xiaojin; Xiang, Zhiyu
- Abstract
Due to the complementary characteristics of visual and LiDAR information, these two modalities have been fused to facilitate many vision tasks. However, current studies of learning-based odometries mainly focus on either the visual or LiDAR modality, leaving visual–LiDAR odometries (VLOs) under-explored. This work proposes a new method to implement an unsupervised VLO, which adopts a LiDAR-dominant scheme to fuse the two modalities. We therefore refer to it as unsupervised vision-enhanced LiDAR odometry (UnVELO). It converts 3D LiDAR points into a dense vertex map via spherical projection and generates a vertex color map by colorizing each vertex with visual information. Further, a point-to-plane distance-based geometric loss and a photometric-error-based visual loss are placed on locally planar regions and cluttered regions, respectively. Last but not least, we design an online pose-correction module to refine the pose predicted by the trained UnVELO during test time. In contrast to the vision-dominant fusion scheme adopted in most previous VLOs, our LiDAR-dominant method adopts dense representations for both modalities, which facilitates visual–LiDAR fusion. Besides, our method uses accurate LiDAR measurements instead of predicted noisy dense depth maps, which significantly improves the robustness to illumination variations, as well as the efficiency of the online pose correction. The experiments on the KITTI and DSEC datasets showed that our method outperformed previous two-frame-based learning methods. It was also competitive with hybrid methods that integrate a global optimization over multiple or all frames.
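The spherical projection step mentioned in the abstract can be illustrated with a short sketch. The snippet below is a generic range-image projection commonly used for rotating LiDARs, not code from the paper itself; the image size and vertical field-of-view values are assumptions typical of a 64-beam sensor.

```python
import numpy as np

def spherical_projection(points, H=64, W=1024, fov_up=3.0, fov_down=-25.0):
    """Project an (N, 3) LiDAR point cloud onto a dense H x W vertex map.

    Each pixel stores the (x, y, z) coordinates of the nearest point that
    projects there; empty pixels stay zero. fov_up/fov_down (degrees) are
    assumed sensor parameters, not values taken from the paper.
    """
    fov_up_rad = np.radians(fov_up)
    fov_down_rad = np.radians(fov_down)
    fov = fov_up_rad - fov_down_rad

    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    depth = np.linalg.norm(points, axis=1)

    yaw = np.arctan2(y, x)                                 # azimuth in [-pi, pi]
    pitch = np.arcsin(np.clip(z / np.maximum(depth, 1e-8), -1.0, 1.0))

    # Map angles to pixel coordinates: yaw -> column, pitch -> row.
    u = 0.5 * (1.0 - yaw / np.pi) * W
    v = (1.0 - (pitch - fov_down_rad) / fov) * H

    u = np.clip(np.floor(u), 0, W - 1).astype(np.int32)
    v = np.clip(np.floor(v), 0, H - 1).astype(np.int32)

    # Write far points first so nearer points overwrite them.
    order = np.argsort(-depth)
    vertex_map = np.zeros((H, W, 3), dtype=np.float32)
    vertex_map[v[order], u[order]] = points[order]
    return vertex_map
```

Colorizing each vertex with the corresponding camera pixel would then yield the vertex color map described above.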
- Subjects
LIDAR; LOW vision; SPHERICAL projection; GLOBAL optimization; MULTISENSOR data fusion; PRIOR learning
- Publication
Sensors, 2023, Vol. 23, Issue 8, Article 3967
- ISSN
1424-8220
- Publication type
Article
- DOI
10.3390/s23083967