When we look out at the world, we see geometry that tells us about the layout of our surroundings. Researchers have spent considerable effort studying how people perceive distance and scale, and virtual reality (VR) has become an important tool for studying this perception. However, VR has limitations of its own that affect people's perceptions and actions. A common problem is distance compression, in which people perceive their surroundings as closer than they actually are, essentially making the world look smaller. Most space perception research in VR has focused on this topic. However, we see more than just distances; we also see orientations, or angular relationships between objects. Very little work has examined whether orientation perception is similarly affected by VR. The work we discussed in our presentation was the first study to investigate this question.
We found that people are quite inaccurate when judging orientations in both VR and the real world (1). Interestingly, the compression seen in VR distance perception does not appear to affect orientation perception. This finding has implications for tasks that require accurate judgment of spatial positions, such as air-traffic control or ground-traffic coordination. It may also affect how we design tools and computer interfaces for reading or displaying orientations. Fortunately, the orientation errors people make are very consistent, which may allow us to model and correct them in real time.
- Jones, J. Adam, Jonathan E. Hopper, Mark T. Bolas, and David M. Krum (2019). "Orientation Perception in Real and Virtual Environments." IEEE Transactions on Visualization and Computer Graphics 25, 2050–2060.
J. Adam Jones
Department of Computer and Information Science
University of Mississippi