- Title
Perceptual self-position estimation based on gaze tracking in virtual reality.
- Authors
Liu, Hongmei; Qin, Huabiao
- Abstract
Human depth perception diverges between virtual and real space; this depth discrepancy distorts the user's spatial judgment in a virtual space, so the user cannot precisely locate their self-position there. Existing localization methods ignore this discrepancy and concentrate only on increasing location accuracy in real space; the discrepancy therefore persists in virtual space and induces visual discomfort. In this paper, a localization method based on depth perception is proposed to measure the user's self-position in a virtual environment. Using binocular gaze tracking, the method estimates perceived depth and constructs an eye matrix by measuring gaze convergence on a target. By comparing the eye matrix with the camera matrix, the method automatically calculates the actual depth of the viewed target, so the difference between the actual depth and the perceived depth can be explicitly estimated without markers. The position of the virtual camera is then compensated by this depth difference to obtain the perceptual self-position. Furthermore, a virtual reality system is redesigned by adjusting the virtual camera position, so that the distance from the user to an object feels the same in virtual and real space. Experimental results demonstrate that the redesigned system improves the user's visual experience, which validates the superiority of the proposed localization method.
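The core idea of the abstract, estimating perceived depth from the convergence of the two eyes' gaze rays, can be sketched as a ray-triangulation problem. The snippet below is a minimal illustration, not the paper's actual algorithm: the eye positions, gaze directions, interpupillary distance, and the `actual_depth` value are all hypothetical inputs, and the "eye matrix"/"camera matrix" comparison is reduced to a scalar depth difference along the view axis.

```python
import numpy as np

def closest_point_between_rays(o1, d1, o2, d2):
    """Midpoint of the shortest segment between two rays.

    o1, o2: ray origins (eye positions); d1, d2: unit gaze directions.
    For converging gaze rays this midpoint approximates the fixation
    point, whose distance is the perceived depth.
    """
    # Solve for t1, t2 minimizing |(o1 + t1*d1) - (o2 + t2*d2)|
    w0 = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b          # ~0 only if the gaze rays are parallel
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    return (o1 + t1 * d1 + o2 + t2 * d2) / 2.0

# Hypothetical setup: eyes 64 mm apart, both fixating a point 1 m ahead
ipd = 0.064
left_eye = np.array([-ipd / 2, 0.0, 0.0])
right_eye = np.array([ipd / 2, 0.0, 0.0])
target = np.array([0.0, 0.0, 1.0])
gl = (target - left_eye) / np.linalg.norm(target - left_eye)
gr = (target - right_eye) / np.linalg.norm(target - right_eye)

fixation = closest_point_between_rays(left_eye, gl, right_eye, gr)
perceived_depth = fixation[2]      # recovers ~1.0 m for this synthetic case

# If the actual (rendered) depth of the viewed target differs, the virtual
# camera can be shifted along the view axis by the difference.
actual_depth = 1.1                 # hypothetical value from the scene
camera_offset = actual_depth - perceived_depth
```

In practice the gaze rays from an eye tracker are noisy and rarely intersect exactly, which is why the midpoint of the shortest connecting segment, rather than a true intersection, is the usual estimator.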
- Subjects
VIRTUAL reality; EYE tracking; DEPTH perception; VISUAL perception; BINOCULAR vision; GAZE; CAMERAS; HUMAN-computer interaction
- Publication
Virtual Reality, 2022, Vol 26, Issue 1, p269
- ISSN
1359-4338
- Publication type
Article
- DOI
10.1007/s10055-021-00553-y