Multimodal Human Activity Recognition Project
Introduction
Smart devices are becoming increasingly integrated into our daily lives, and various types of data are collected from different devices, data which may be used for the same purposes.
Objective
The inability to translate one data type into another is both a major challenge and a missed opportunity: it often results in training many separate models for very specific tasks, while these models could potentially learn from one another. In this project, you are expected to provide a solution to a Human Activity Recognition (HAR) task in which the data comes from different sources and is of different natures.
You will be working on a multi-modal neural network (based on the study in [1]). You are expected to explore modality-specific 'data layers' that are retrained and further trained depending on the data type. You are also expected to work with different data types, such as, but not limited to, IMU and Wi-Fi CSI; a minimal architecture sketch is given below.
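As an illustration only (not the project's prescribed architecture), the following minimal PyTorch sketch shows modality-specific encoders ('data layers') feeding a shared classification head. All shapes, channel counts, window lengths, and the number of classes are assumed placeholders.

# Minimal sketch, assuming 6-channel IMU windows, 52-subcarrier CSI windows,
# a window length of 128 samples, and 8 activity classes (all illustrative).
import torch
import torch.nn as nn

class ModalityEncoder(nn.Module):
    """Modality-specific 'data layers': a small 1D CNN per input type."""
    def __init__(self, in_channels: int, embed_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(32, embed_dim, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # -> (batch, embed_dim, 1)
            nn.Flatten(),              # -> (batch, embed_dim)
        )

    def forward(self, x):              # x: (batch, channels, time)
        return self.net(x)

class SharedHead(nn.Module):
    """Shared classification layers reused across modalities."""
    def __init__(self, embed_dim: int = 64, num_classes: int = 8):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(embed_dim, 64), nn.ReLU(),
                                 nn.Linear(64, num_classes))

    def forward(self, z):
        return self.net(z)

# One encoder per modality, one shared head: the encoders can be trained on
# their own modality, while the shared head transfers knowledge between them.
imu_encoder = ModalityEncoder(in_channels=6)
csi_encoder = ModalityEncoder(in_channels=52)
head = SharedHead(num_classes=8)

imu_batch = torch.randn(4, 6, 128)    # (batch, channels, window length)
csi_batch = torch.randn(4, 52, 128)
print(head(imu_encoder(imu_batch)).shape)   # torch.Size([4, 8])
print(head(csi_encoder(csi_batch)).shape)   # torch.Size([4, 8])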
Project Description
You will be tasked with:
• Literature Review: Conduct a review of existing research on human activity recognition, with a particular focus on multimodal data.
• System Design: Design a system that uses both signal data and accelerometer data to recognize and record various activities.
• Data Collection & Analysis: Collect and analyze data of both types to gain insights into the activities.
• System Testing & Evaluation: Evaluate the system and present a demo of your classification experiment (a minimal evaluation sketch is given after this list).
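For the evaluation and demo step, a minimal scikit-learn sketch of reporting standard classification metrics is shown below; the activity label set, ground-truth labels, and predictions are illustrative placeholders, not project data.

# Minimal evaluation sketch, assuming predictions and ground-truth labels
# are available as integer class indices (all values are placeholders).
import numpy as np
from sklearn.metrics import accuracy_score, confusion_matrix, classification_report

ACTIVITIES = ["walking", "sitting", "standing", "lying"]   # placeholder label set

y_true = np.array([0, 1, 2, 3, 0, 1, 2, 3])                # ground truth
y_pred = np.array([0, 1, 2, 2, 0, 1, 3, 3])                # model output

print("accuracy:", accuracy_score(y_true, y_pred))
print(confusion_matrix(y_true, y_pred))
print(classification_report(y_true, y_pred, target_names=ACTIVITIES))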
Benefits
This project will provide you with the opportunity to:
• Work on and gain hands-on experience with a cutting-edge topic at the intersection of human-activity recognition, machine learning (ML), and data science.
• Learn how to transfer knowledge acquired in one domain to another with the help of ML.
Pre-Requisites
Prospective students should have a background in Computer Science, Embedded Systems, Data Science, or a related field. Knowledge of machine learning, data analysis, and experience with HAR would also be beneficial.
Contact
Jeroen Klein Brinke (j.kleinbrinke@utwente.nl)
Egemen Işgüder (egemen.isguder@utwente.nl)
References
[1] Egemen Işgüder and Özlem Durmaz İncel. FedOpenHAR: Federated Multi-Task Transfer Learning for Sensor-Based Human Activity Recognition. 2023. arXiv: 2311.07765 [cs.LG]. URL: https://arxiv.org/abs/2311.07765.