Master Assignment

[B] Egocentric vision for human behaviour understanding

Type: Master EE/CS/HMI

Period: TBD

Student: (Unassigned)

If you are interested, please contact:

This project has several MSc topics available:

  1. First-Person-View Emotion-related Body Pose Dataset Curation: data curation for facial and body pose emotion recognition from first-person-view (FPV) data, using state-of-the-art pose estimation and body posture analysis techniques (see the code sketch after this list).
  2. From Exo- to Ego-View Behaviour Analysis: adapting behaviour analysis across viewpoints, from exocentric (third-person) views to FPV videos, using the Ego-Exo4D dataset.
  3. Emotion Recognition through Pose Analysis: analyzing skeletal trajectories extracted from Ego4D social interaction videos and correlating them with emotion-related body postures.
  4. Other project topics of interest to the MSc student can also be discussed.
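
To give a feel for the kind of starting point topics 1 and 3 could build on, the sketch below runs an off-the-shelf pose estimator over a first-person clip and collects per-frame skeletal keypoints into a trajectory. It is only a minimal illustration, assuming OpenCV and MediaPipe's pose solution; the file name fpv_clip.mp4 is a placeholder and not part of Ego4D or Ego-Exo4D.

    import cv2
    import mediapipe as mp

    mp_pose = mp.solutions.pose

    cap = cv2.VideoCapture("fpv_clip.mp4")  # placeholder FPV clip
    trajectory = []  # one entry per frame: list of (x, y, z, visibility) keypoints

    with mp_pose.Pose(static_image_mode=False, model_complexity=1) as pose:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB input; OpenCV reads frames as BGR
            results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.pose_landmarks:
                # 33 landmarks with normalized image coordinates and a visibility score
                trajectory.append([(lm.x, lm.y, lm.z, lm.visibility)
                                   for lm in results.pose_landmarks.landmark])
    cap.release()

The resulting per-frame keypoint lists could then be stored alongside video metadata for curation (topic 1) or stacked into arrays for trajectory-level analysis (topic 3).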

Who Should Apply?

This opportunity is open to MSc students with a background in computer vision, machine learning, or a related field. Prior experience with pose estimation models or emotion recognition is a plus but not required.

More context on Egocentric Perception

Egocentric perception in computer vision aims to develop algorithms and models that enable machines to understand and interpret the world from the perspective of a person wearing a camera. This type of perception, also known as first-person vision, focuses on analyzing the visual and sensor data captured from the wearer’s point of view. The key objectives in this field include:

  1. Understanding Human Actions and Interactions: Recognizing and analyzing the wearer's actions, as well as their interactions with objects and other people, in everyday tasks or specific environments.
  2. Modelling Intentions and Attention: Inferring the wearer's focus of attention, gaze, or intent based on the visual and sensor data, such as identifying what objects the person is likely to interact with or where they are looking (a small gaze-to-object illustration follows this list).
  3. Environmental Awareness: Understanding the wearer's surroundings, such as recognizing objects, places, and scene contexts, while considering the dynamic and often chaotic nature of real-world environments.
  4. Behaviour Analysis: Analyzing both the wearer’s behaviour and the behaviours of people around them, which is valuable in applications such as social interaction analysis, sports training, and surveillance.
  5. Personal Assistance: Supporting the development of assistive technologies, like wearable devices that can provide context-aware assistance in daily activities, health monitoring, or navigation for visually impaired individuals.
  6. Task Monitoring and Performance Enhancement: Improving task understanding and execution in specific fields such as surgery, cooking, or industrial work, where a first-person perspective helps track and evaluate performance or automate assistance.
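
To make the attention-modelling objective slightly more concrete, here is a toy sketch of mapping a gaze point to the object it most likely rests on, given detection boxes. Both the gaze coordinates and the boxes are assumed inputs (e.g. from an eye tracker and an object detector); this is an illustrative sketch, not a method from any of the datasets mentioned above.

    import numpy as np

    def fixated_object(gaze_xy, boxes):
        """Return the index of the box containing the gaze point, or None.

        gaze_xy: (x, y) gaze point in image coordinates.
        boxes:   (N, 4) array of object boxes as (x1, y1, x2, y2).
        """
        x, y = gaze_xy
        inside = (boxes[:, 0] <= x) & (x <= boxes[:, 2]) & \
                 (boxes[:, 1] <= y) & (y <= boxes[:, 3])
        candidates = np.flatnonzero(inside)
        if candidates.size == 0:
            return None
        # If the gaze falls in several boxes, pick the smallest (most specific) one
        areas = (boxes[candidates, 2] - boxes[candidates, 0]) * \
                (boxes[candidates, 3] - boxes[candidates, 1])
        return int(candidates[np.argmin(areas)])

    # Example: two detected objects, gaze resting on the smaller, nested one
    boxes = np.array([[100, 100, 500, 400], [380, 280, 460, 350]])
    print(fixated_object((420, 310), boxes))  # -> 1

In practice, gaze would typically be aggregated over fixations rather than taken from single frames, but a point-in-box assignment like this could serve as a first step.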

By focusing on interpreting the world as experienced directly by an individual, egocentric perception aims to bring more personalized, context-aware, and interactive systems into applications such as augmented reality, robotics, sports training, lifelogging, and assistive technologies.