Synchronized Scene Views in Mixed Virtual Reality for Guided Viewing
Virtual reality devices are available with different resolutions and fields of view. Users can interact simultaneously within shared environments on head-mounted displays, cell phones, tablets, and PowerWalls. Sharing scenes across devices requires solutions that smoothly synchronize shared navigation, minimize jitter, and avoid visual confusion. In this paper we present a system that allows a single guide to remotely direct many users within a virtual reality environment. A variety of mixed-device environments are supported so that different users can connect to the system. Techniques are implemented to minimize jitter, synchronize views, and handle differing fields of view.
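The abstract does not describe the paper's actual synchronization algorithm, but one common way to smooth a follower's camera toward a guide's broadcast pose, rather than snapping to each network update, is per-frame interpolation (lerp for position, slerp for orientation). The sketch below is a hypothetical illustration of that idea, not the authors' implementation; the function names and the smoothing factor `alpha` are assumptions.

```python
import math

def lerp(a, b, t):
    """Linear interpolation between two 3-vectors."""
    return [a[i] + (b[i] - a[i]) * t for i in range(3)]

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions (w, x, y, z)."""
    dot = sum(p * q for p, q in zip(q0, q1))
    if dot < 0.0:            # flip to take the shorter rotational arc
        q1 = [-c for c in q1]
        dot = -dot
    if dot > 0.9995:         # nearly parallel: fall back to normalized lerp
        q = [p + (q_ - p) * t for p, q_ in zip(q0, q1)]
        n = math.sqrt(sum(c * c for c in q))
        return [c / n for c in q]
    theta = math.acos(dot)
    s0 = math.sin((1.0 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return [s0 * p + s1 * q for p, q in zip(q0, q1)]

def smooth_toward(local_pos, local_rot, guide_pos, guide_rot, alpha=0.15):
    """Each frame, move the follower camera a fraction alpha toward the
    guide's latest pose instead of snapping, which suppresses jitter
    from irregular network updates (hypothetical helper)."""
    return lerp(local_pos, guide_pos, alpha), slerp(local_rot, guide_rot, alpha)
```

Applied every frame, this converges exponentially on the guide's pose while absorbing small fluctuations between updates.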
Vazquez, Iker, and Cutchin, Steve. (2016). "Synchronized Scene Views in Mixed Virtual Reality for Guided Viewing." In D. Reiners, D. Iwai, and F. Steinicke (Eds.), ICAT-EGVE 2016: International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments (pp. 77-84). Eurographics Association. https://doi.org/10.2312/egve.20161438