CONTEXT-GAD: A Context-Aware Gaze Adaptive Dwell model for Gaze-based Selections in XR Environments
Georgios Ramiotis
Type: Research · Supervisor: Prof. Katerina Mania
Abstract
Gaze-based selection, via techniques such as gaze dwell, is one of the most common hands-free interactions performed by users in eXtended Reality (XR) environments. However, a small constant dwell threshold for activating a target can cause unintended selections, known as the Midas Touch problem, while a large threshold leads to eye fatigue. Prior research has proposed adapting dwell thresholds based on the probability of the user activating a given target, estimated from past interactions, or by predicting intent from gaze features. However, relying on past inputs or gaze features biases the system heavily towards an individual's strategy or physiology and does not generalize to other XR scenarios or users. In this work, we propose a novel context-aware system that leverages visual features of the task environment and user behavioral features, such as the frequency of interactions, gaze speed variance, and head rotation velocity, to adapt dwell thresholds across three distinct levels. We conducted a data collection experiment with twenty participants performing gaze dwell interactions in a general User Interface (UI) navigation task and a visual search task. We trained a hierarchical machine learning model that predicts the induced cognitive load and adapts dwell thresholds across three levels accordingly. We evaluated our system using standard machine learning metrics and by conducting a user study (n=17) with quantitative and qualitative measures. Our system achieves a classification accuracy of 70.72% at the first level and 85.43% at the second. In addition, it significantly reduces task completion time in less complex tasks and reduces error rates in more cognitively demanding scenes.
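The abstract leaves the model internals unspecified; the sketch below is one plausible, hypothetical reading of a two-stage hierarchical classifier that maps behavioral and context features to one of three dwell levels. The class name, the feature set, the choice of random forests, and the dwell-threshold values in `DWELL_MS` are all illustrative assumptions, not the paper's implementation. The hierarchical split mirrors the two reported accuracies: stage 1 decides whether the induced load is above the lowest level, and stage 2 disambiguates the two upper levels.

```python
# Minimal sketch (assumed, not the paper's implementation): a two-stage
# hierarchical classifier selecting one of three dwell thresholds.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical dwell thresholds (ms) per level; actual values are not
# reported in the abstract.
DWELL_MS = {0: 400, 1: 700, 2: 1000}

class HierarchicalDwellModel:
    """Stage 1 separates level 0 from levels {1, 2};
    stage 2 disambiguates level 1 vs. level 2."""

    def __init__(self):
        self.stage1 = RandomForestClassifier(n_estimators=100, random_state=0)
        self.stage2 = RandomForestClassifier(n_estimators=100, random_state=0)

    def fit(self, X, y):
        # X rows: assumed features, e.g. [interaction_frequency,
        # gaze_speed_variance, head_rotation_velocity, visual context...].
        # y: cognitive-load level in {0, 1, 2}.
        X, y = np.asarray(X), np.asarray(y)
        self.stage1.fit(X, (y > 0).astype(int))                 # 0 vs. {1, 2}
        upper = y > 0
        self.stage2.fit(X[upper], (y[upper] == 2).astype(int))  # 1 vs. 2
        return self

    def predict_threshold_ms(self, X):
        X = np.asarray(X)
        levels = np.zeros(len(X), dtype=int)
        upper = self.stage1.predict(X).astype(bool)
        if upper.any():
            levels[upper] = 1 + self.stage2.predict(X[upper])
        return np.array([DWELL_MS[lvl] for lvl in levels])

# Usage (feature layout is assumed):
# model = HierarchicalDwellModel().fit(X_train, y_train)
# thresholds = model.predict_threshold_ms(X_test)
```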
Publications
- Ramiotis, G., & Mania, K. (2025, November). CONTEXT-GAD: A Context-Aware Gaze Adaptive Dwell model for Gaze-based Selections in XR Environments. In Proceedings of the 31st ACM Symposium on Virtual Reality Software and Technology (pp. 1-11). LINK
