This slide illustrates the relative motion between an eye and the display. Here the eye counter-rotates to remain fixated on a tree in the real world, while the head, and consequently the head-mounted display, rotates 20 degrees, something that can easily happen in a hundred milliseconds or less. The red dot shows the pixel on the display that maps to the tree's perceived location in the real world, and thus to the orientation of the eye, before and after the rotation; in the right-hand diagram, the original position is shown as a faint red dot. You can see that the red dot shifts a considerable distance during the rotation. While it's true that the effect is exaggerated here because the tree is only about six inches away, it's also true that the eyes and the display can move a long way relative to one another in a short period of time.
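To make the geometry concrete, here's a minimal sketch of how far the fixated point's pixel moves when the display rotates under it. It assumes a simple pinhole projection and illustrative display parameters (a 100-degree horizontal field of view and 1200 horizontal pixels, not any particular headset's specs), and it treats the fixated point as effectively distant, so the exaggeration from the six-inch tree doesn't apply:

```python
import math

def pixel_shift(head_rotation_deg, fov_deg=100.0, width_px=1200):
    """Approximate horizontal shift, in pixels, of the display pixel
    that maps to a fixated distant point when the head (and thus the
    display) rotates by head_rotation_deg.

    Assumes a simple pinhole projection; fov_deg and width_px are
    illustrative values, not any specific headset's specifications.
    """
    # Focal length in pixels implied by the assumed field of view.
    f = (width_px / 2) / math.tan(math.radians(fov_deg / 2))
    # The fixated point starts at the display center (offset 0); after
    # the display rotates, the point sits head_rotation_deg off-axis.
    return f * math.tan(math.radians(head_rotation_deg))

print(round(pixel_shift(20)))  # shift for the slide's 20-degree rotation
```

Under these assumed numbers, a 20-degree head rotation moves the point roughly 180 pixels across the display, which gives a sense of why the red dot travels so far.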
This rapid relative motion makes it very challenging to keep virtual images in fixed positions relative to the real world, and matters are further complicated by the fact that displays update only once per frame.
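As a back-of-the-envelope illustration of the once-per-frame problem: the slide's 20 degrees in roughly 100 milliseconds is 200 degrees per second, and at an assumed 60 Hz refresh rate (an illustrative figure, not a claim about any specific display) that works out to more than 3 degrees of head rotation between display updates:

```python
def rotation_per_frame(deg_per_s=200.0, refresh_hz=60.0):
    """Head rotation accumulated between consecutive display updates.

    200 deg/s matches the slide's 20 degrees in about 100 ms;
    60 Hz is an assumed, illustrative refresh rate.
    """
    return deg_per_s / refresh_hz

print(rotation_per_frame())  # degrees of rotation per displayed frame
```

Any image drawn at the start of such a frame is several degrees away from where it should be by the end of it, unless something compensates.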
That results in a set of major problems for VR that don't exist in the real world, and that are at worst barely noticeable on monitors, because there are no useful cases in which your eyes move very quickly relative to a monitor while still being able to see clearly.
Much of the rest of this talk will be dedicated to exploring some of the implications of the relationships between a head-mounted display, the eyes, and the real world.
The first implication has to do with tracking.
By tracking, I mean determining the position and orientation of the head in the real world, together known as its pose.
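As a sketch of what a pose amounts to in code, one common representation, assuming nothing about any particular tracking system or SDK, is a 3D position plus a unit-quaternion orientation:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """A head pose: 3D position plus orientation.

    A unit quaternion (w, x, y, z) is a common choice for orientation;
    this layout is illustrative, not any particular SDK's type.
    """
    position: tuple[float, float, float]       # meters, in world space
    orientation: tuple[float, float, float, float]  # unit quaternion

# The identity pose: at the world origin, facing the reference direction.
identity = Pose((0.0, 0.0, 0.0), (1.0, 0.0, 0.0, 0.0))
```

The tracker's job is to deliver a fresh, accurate Pose to the renderer every frame; everything drawn that frame is positioned relative to it.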
Images have to be in exactly the right place relative to both the head and the real world every frame in order to seem real; otherwise the visual system will detect an anomaly, no matter how brief it is, and that will destroy the illusion of reality.
The human perceptual system has evolved to be very effective at detecting such anomalies, because an anomaly might be thinking about eating you, or might be tasty.