Assignment: Investigating Gaze-Contingent LLM Systems for Context-Aware Assistance in XR

Problem Statement:

In XR environments, retrieving relevant information quickly and intuitively is crucial, especially when users are multitasking or operating hands-free. Traditional interaction methods such as voice commands or manual selection can be slow, cognitively taxing, or impractical. A promising alternative is gaze-contingent interaction, in which the system intelligently reacts to where the user is looking. However, current solutions are limited in adaptiveness and rely on predefined responses. This project explores how integrating eye-tracking data with large language models (LLMs) can enable seamless, context-aware assistance based on gaze behavior.
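To make the intended interaction loop concrete, the sketch below illustrates one possible way gaze data could drive an LLM query: recent gaze samples are aggregated to infer which object the user is attending to, and that object is then turned into a context-aware prompt. This is only a minimal illustration; the names used here (GazeSample, dominant_gaze_target, build_prompt, query_llm) are hypothetical placeholders, not the APIs of any particular eye-tracking SDK or LLM backend.

```python
# Minimal sketch of a gaze-contingent LLM query loop.
# Assumptions: gaze samples already carry the label of the scene object hit by
# the gaze ray, and query_llm() stands in for whichever LLM backend is used.

from dataclasses import dataclass
from collections import Counter

@dataclass
class GazeSample:
    timestamp: float      # seconds since session start
    target_label: str     # label of the scene object the gaze ray hits

def dominant_gaze_target(samples: list[GazeSample], window_s: float = 1.5) -> str | None:
    """Return the object the user has mostly looked at within the last time window."""
    if not samples:
        return None
    latest = samples[-1].timestamp
    recent = [s.target_label for s in samples if latest - s.timestamp <= window_s]
    if not recent:
        return None
    label, count = Counter(recent).most_common(1)[0]
    # Require a minimum share of samples so brief glances do not trigger assistance.
    return label if count / len(recent) >= 0.6 else None

def build_prompt(target: str, task_context: str) -> str:
    """Turn the inferred gaze target into a context-aware LLM query."""
    return (
        f"The user is currently performing: {task_context}. "
        f"They have been looking at '{target}' for a sustained period. "
        "Give a brief, relevant hint about this object for their current step."
    )

def query_llm(prompt: str) -> str:
    """Placeholder for the LLM call used by the prototype."""
    raise NotImplementedError

if __name__ == "__main__":
    # Simulated fixation: 2 seconds of gaze on one object at 10 Hz.
    samples = [GazeSample(t * 0.1, "torque_wrench") for t in range(20)]
    target = dominant_gaze_target(samples)
    if target:
        print(build_prompt(target, "assembling a bicycle brake"))
```

A real prototype would replace the pre-labeled samples with gaze rays cast against the XR scene graph and send the resulting prompt to the chosen model; the thresholds (window length, fixation share) are exactly the kind of parameters the project would need to tune and evaluate.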

Task:

You will explore the design and implementation of an AI-enhanced gaze interaction system within XR. This includes developing prototypes, experimenting with gaze data integration, and evaluating how effectively the system supports complex tasks. Whether you're interested in interaction design, machine learning, programming, or cognitive UX research, this project offers a valuable opportunity to contribute to cutting-edge XR research.

Research Scope:

1. Development of a Gaze-Contingent AI System
2. User Interaction Design
3. Evaluation of User Performance

Work:

Contact:

Gwen Qin (gwen.qin@utwente.nl)