STEREO GRADING

Gaze Prediction using Machine Learning for
Dynamic Stereo Manipulation in Games

George Alex Koulieris

Type
Funded Project

Principal Investigator
Katerina Mania

Collaborators
• G. Drettakis, INRIA
• D. Cunningham, TU Cottbus

Research Domain
• Gaze Prediction
• Real-Time Graphics

Abstract

Comfortable, high-quality 3D stereo viewing is becoming a requirement for interactive applications today. Previous research shows that manipulating disparity can alleviate some of the discomfort caused by 3D stereo, but it is best to do this locally, around the object the user is gazing at. The main challenge is thus to develop a gaze predictor in the demanding context of real-time, heavily task-oriented applications such as games. Our key observation is that player actions are highly correlated with the present state of a game, encoded by game variables. Based on this, we train a classifier to learn these correlations using an eye-tracker which provides the ground-truth object being looked at. The classifier is used at runtime to predict object category – and thus gaze – during game play, based on the current state of game variables. We use this prediction to propose a dynamic disparity manipulation method, which provides rich and comfortable depth. We evaluate the quality of our gaze predictor numerically and experimentally, showing that it predicts gaze more accurately than previous approaches. A subjective rating study demonstrates that our localized disparity manipulation is preferred over previous methods.
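The sketch below illustrates the pipeline described above: a classifier is trained offline on game-variable snapshots with eye-tracked gaze categories as ground truth, then queried at runtime to decide where to preserve full disparity. This is a minimal illustration only; the classifier type (random forest here), the toy game variables, the object categories, and the simple disparity-scaling step are all assumptions, not the paper's exact method.

```python
# Minimal sketch of a gaze-prediction + localized disparity pipeline.
# Assumptions (not from the source): a scikit-learn random forest, a toy
# 4-variable game state, and a uniform compression factor for non-gazed objects.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

# --- Offline training ----------------------------------------------------
# Each sample: a snapshot of game variables (e.g. player health, distance to
# nearest enemy, time since last event, ...). Labels: the object category the
# eye-tracker recorded as gazed at (0 = enemy, 1 = pickup, 2 = HUD).
rng = np.random.default_rng(0)
X_train = rng.random((1000, 4))           # hypothetical game-variable vectors
y_train = rng.integers(0, 3, size=1000)   # ground-truth gaze categories

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(X_train, y_train)

# --- Runtime --------------------------------------------------------------
def predict_gaze_category(game_state: np.ndarray) -> int:
    """Predict which object category the player is most likely looking at."""
    return int(clf.predict(game_state.reshape(1, -1))[0])

def adjust_disparity(base_disparity: float, gazed: bool,
                     comfort_scale: float = 0.5) -> float:
    """Keep full depth around the predicted gaze target; compress elsewhere."""
    return base_disparity if gazed else base_disparity * comfort_scale

# Example frame: predict gaze from the current game state, then scale the
# disparity assigned to each object category accordingly.
current_state = rng.random(4)
target = predict_gaze_category(current_state)
for category, disparity in [(0, 0.020), (1, 0.015), (2, 0.005)]:
    print(category, adjust_disparity(disparity, category == target))
```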


Koulieris, George Alex, George Drettakis, Douglas Cunningham, and Katerina Mania. "Gaze Prediction Using Machine Learning for Dynamic Stereo Manipulation in Games." In 2016 IEEE Virtual Reality (VR), pp. 113-120. IEEE, 2016.
