The eye is the window into the soul. Technology aims to seize it.
Eye tracking is one of the key technologies behind Virtual Reality (VR) and Augmented Reality (AR). Precise eye tracking tells an AR/VR system what the user is noticing within the field of view, furnishing data it can use to adjust how images are presented.
This data will be useful in nearly every aspect of VR and AR, but how a system responds to it (for example, when generating stereoscopic 3D illusions) is necessarily particular to the usage environment.
So the foremost use of AR/VR eye tracking will be to improve general AR/VR headset comfort and usability. Why? Eye tracking is a method that detects the point of gaze for each individual eye.
Over the years, many techniques have been developed, including dynamic analysis of eye images and dynamic analysis of the electrooculogram. As a direct input device, eye tracking is actually fairly frustrating and of little use.
However, as a general contextual signal indicating possible user intent or attentional focus, it is still quite useful. Many use cases for eye tracking will work in the background, and will probably include the following:
Graphics Rendering Resource Allocation
If a person is looking somewhere, more graphics rendering resources can be allocated in that general direction. This can provide better-quality output for a given amount of rendering power.
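This technique is commonly known as foveated rendering. Below is a minimal sketch of how a renderer might bucket screen directions into quality tiers by angular distance from the gaze point; the function name and the threshold angles are illustrative assumptions, not any vendor's API:

```python
import math

def quality_tier(pixel_dir, gaze_dir, fovea_deg=5.0, mid_deg=15.0):
    """Assign a rendering quality tier based on angular distance from gaze.

    pixel_dir, gaze_dir: unit 3D direction vectors (tuples).
    Returns 0 (full resolution), 1 (half), or 2 (quarter).
    """
    dot = sum(p * g for p, g in zip(pixel_dir, gaze_dir))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
    if angle <= fovea_deg:
        return 0   # foveal region: render at full resolution
    if angle <= mid_deg:
        return 1   # parafoveal: reduced resolution is hard to notice here
    return 2       # periphery: quarter resolution saves most of the budget

# Looking straight ahead; a direction 30 degrees off-gaze gets the cheapest tier.
gaze = (0.0, 0.0, 1.0)
ahead = (0.0, 0.0, 1.0)
off = (math.sin(math.radians(30)), 0.0, math.cos(math.radians(30)))
print(quality_tier(ahead, gaze), quality_tier(off, gaze))  # → 0 2
```

A real renderer would apply the tier per tile or via variable-rate shading, but the allocation decision reduces to exactly this kind of angular test.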
Predictive Data Fetching
Some VR data operations require time to complete, e.g. looking up something in an online database.
If a person glances in a particular direction, data fetching can begin in the background even before the person selects an item to interact with.
This improves the perceived responsiveness of the VR environment, which can be especially useful, e.g. over mobile data networks.
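A minimal sketch of gaze-driven prefetching, assuming a dwell-time threshold as the trigger; `GazePrefetcher`, `slow_lookup`, and the 150 ms threshold are illustrative choices, not from the source:

```python
import threading
import time

class GazePrefetcher:
    """Speculatively fetch item data once gaze dwells on it long enough."""

    def __init__(self, fetch_fn, dwell_threshold=0.15):
        self.fetch_fn = fetch_fn            # e.g. an online database lookup
        self.dwell_threshold = dwell_threshold
        self.cache = {}
        self._gaze_start = {}

    def on_gaze_sample(self, item_id, timestamp):
        """Call on every gaze sample; starts a background fetch after dwell."""
        start = self._gaze_start.setdefault(item_id, timestamp)
        if timestamp - start >= self.dwell_threshold and item_id not in self.cache:
            t = threading.Thread(target=self._fetch, args=(item_id,))
            t.start()                       # fetch in background; UI never blocks
            return t
        return None

    def _fetch(self, item_id):
        self.cache[item_id] = self.fetch_fn(item_id)

    def on_select(self, item_id):
        # By the time the user actually selects, the data is often cached.
        return self.cache.get(item_id) or self.fetch_fn(item_id)

def slow_lookup(item_id):
    time.sleep(0.05)                        # pretend this is a network round trip
    return {"id": item_id, "detail": "placeholder"}

pf = GazePrefetcher(slow_lookup)
pf.on_gaze_sample("door", 0.0)              # dwell just began, nothing happens
t = pf.on_gaze_sample("door", 0.2)          # threshold crossed: fetch starts
t.join()
print(pf.on_select("door")["id"])           # → door
```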
Multi-Modal Smart 3D Object Selection
In VR, pointing at a small object in a cluttered environment can be quite difficult. Eye tracking can help disambiguate which object the user intends to select by combining the gaze information with the controller input.
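One way to combine the two signals is to score each candidate object by a weighted blend of its angular error against the controller ray and against the gaze ray; the weighting and the function names below are illustrative assumptions:

```python
import math

def _unit(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def angular_error(ray_dir, target_pos):
    """Angle in degrees between a pointing ray and the direction to a target
    (both measured from the headset origin)."""
    r, t = _unit(ray_dir), _unit(target_pos)
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(r, t))))
    return math.degrees(math.acos(dot))

def pick_target(candidates, controller_dir, gaze_dir, gaze_weight=0.5):
    """Choose the candidate minimizing a blend of controller and gaze error."""
    def cost(pos):
        return ((1 - gaze_weight) * angular_error(controller_dir, pos)
                + gaze_weight * angular_error(gaze_dir, pos))
    return min(candidates, key=lambda item: cost(item[1]))

# Two small objects close together: the controller ray is ambiguous,
# but gaze tips the decision toward the cube.
objects = [("cube", (0.1, 0.0, 2.0)), ("sphere", (-0.1, 0.0, 2.0))]
controller = (0.0, 0.0, 1.0)    # pointing midway between both
gaze = (0.05, 0.0, 1.0)         # eyes resting on the cube
print(pick_target(objects, controller, gaze)[0])  # → cube
```

The gaze weight would be tuned per application: gaze is noisy but fast, while the controller is deliberate, so a production system might also gate the blend on controller motion.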
Automatic Headset Calibration
A headset which knows where the user’s eyes are can better adjust its own image output parameters. This could become a feature of desktop-class VR headsets in the near future, perhaps integrated into the high-end Vive or Oculus headsets.
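For instance, a headset that tracks both eye positions could measure the user's interpupillary distance (IPD) and set the stereo render cameras accordingly, with no manual slider. This sketch assumes eye positions in millimetres in the headset frame; the function names are illustrative:

```python
import math

def measure_ipd(left_eye_pos, right_eye_pos):
    """Interpupillary distance (mm) from tracked 3D eye positions."""
    return math.sqrt(sum((l - r) ** 2 for l, r in zip(left_eye_pos, right_eye_pos)))

def stereo_camera_offsets(ipd_mm):
    """Half-IPD lateral offsets (in metres) for the left/right render cameras."""
    half = ipd_mm / 2000.0          # mm → m, split across the two cameras
    return (-half, half)

ipd = measure_ipd((-31.5, 0.0, 0.0), (31.5, 0.0, 0.0))
print(ipd)                          # → 63.0
print(stereo_camera_offsets(ipd))   # → (-0.0315, 0.0315)
```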
Perhaps eye tracking is important enough to remain a feature as it gets miniaturized into the mainstream mobile VR headsets of the future as well.
Motion Sickness Mitigation
There is a well-known automatic effect linking eye movements to changes in the vestibular system (the vestibulo-ocular reflex).
Knowing the movements of the eyes as well as those of the headset (via accelerometers) allows deductions about the state of the user’s vestibular system.
This opens the door to systematic manipulations that heighten changes in balance, or possibly reduce the effects of motion sickness during VR use.
Emotional State Detection
Pupil dilation can become a cue or trigger for certain key moments that an AI will act upon. What does the pupil look like when someone is sad, happy, lonely, or pleased?
We expect fine distinctions like this to arise in every area of implementation considered here. Some solutions will prove harder than others, but overall, good designs will emerge.
So we can be confident in predicting that eye tracking technology will prove useful across the board, even as designers are challenged by the particulars of each category.
What all of these use cases have in common is that, when working well, you don’t notice that they are doing anything.
In fact, we would go as far as to say that the above use cases make eye tracking essential for enabling truly useable AR/VR for mass-market applications.
Excerpts from this report were written as answers to the question “In which areas of virtual reality are we likely to find eye-tracking technology within the next five years?” on Quora.