- Title
Effective fusion module with dilation convolution for monocular panoramic depth estimation.
- Authors
Han, Cheng; Cai, Yongqing; Pan, Xinpeng; Wang, Ziyun
- Abstract
Depth estimation from a monocular panoramic image is a crucial step in 3D reconstruction, which has a close relationship with virtual reality and metaverse technologies. In recent years, some methods, such as HRDFuse, BiFuse++, and UniFuse, have employed a two-branch neural network leveraging two common projections: equirectangular and cubemap projections (CMPs). The equirectangular projection (ERP) provides a complete field of view but introduces distortion, while the CMP avoids distortion but introduces discontinuity at the boundaries of the cube. To address the issues of distortion and discontinuity, the authors propose an efficient depth estimation fusion module to balance the feature mappings of the two projections. Moreover, for the ERP branch, the authors propose a novel dilated network architecture to extend the receptive field and effectively harness visual information. Extensive experiments show that the authors' method predicts clearer boundaries and more accurate depth results while outperforming mainstream panoramic depth estimation algorithms.
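The receptive-field benefit of dilation that the abstract refers to is a standard property of dilated convolutions, not a detail taken from the paper itself. As a rough illustration (the kernel sizes and dilation rates below are hypothetical, not the authors' configuration), stacking stride-1 convolutions with growing dilation rates widens the span of input pixels each output pixel sees, at no extra parameter cost:

```python
def receptive_field(kernel_sizes, dilations):
    """Effective receptive field (in pixels, one dimension) of a stack of
    stride-1 dilated convolution layers: rf = 1 + sum((k - 1) * d)."""
    rf = 1
    for k, d in zip(kernel_sizes, dilations):
        rf += (k - 1) * d
    return rf

# Three 3x3 layers with dilations 1, 2, 4 cover a 15-pixel span,
# versus only 7 pixels for the same stack without dilation.
print(receptive_field([3, 3, 3], [1, 2, 4]))  # 15
print(receptive_field([3, 3, 3], [1, 1, 1]))  # 7
```

The same three-layer stack more than doubles its receptive field simply by increasing the dilation rates, which is why dilation is a common way to capture wider context in dense prediction tasks such as depth estimation.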
- Subjects
MONOCULARS; STEREO image processing; VIRTUAL reality; MAP projection; SHARED virtual environments; CONVOLUTION codes
- Publication
IET Image Processing (Wiley-Blackwell), 2024, Vol 18, Issue 4, p1073
- ISSN
1751-9659
- Publication type
Article
- DOI
10.1049/ipr2.13007