Computational Methods for Natural Walking in Virtual Reality
Abstract
Virtual reality (VR) allows users to feel as though they are truly present in a computer-generated virtual environment (VE). A key component of an immersive virtual experience is the ability to interact with the VE, including the ability to explore it. Exploration of VEs is rarely straightforward, since the virtual environment is usually shaped differently from the user's physical environment. As a result, users may walk along virtual routes whose corresponding physical routes are obstructed by unseen physical objects or by the boundaries of the tracked physical space. In this dissertation, we develop new algorithms to understand and enable natural-walking exploration of large VEs while incurring fewer collisions with physical objects in the user's surroundings. Our methods leverage concepts of alignment between the physical and virtual spaces, robot motion planning, and statistical models of human visual perception. Through a series of user studies and simulations, we show that our algorithms enable users to explore large VEs with fewer collisions, allow us to predict the navigability of a pair of environments without collecting any locomotion data, and deepen our understanding of how human perception functions during locomotion in VR.