Advancing technology for multimodal analysis of emotion expression in everyday life
Emotional expression plays a crucial role in everyday functioning. It is a continuous process involving many behavioral, facial, vocal, and verbal features. Given this complexity, few psychological studies have addressed emotion recognition in everyday contexts. Recent technological innovations in affective computing could result in a scientific breakthrough, as they open up new possibilities for the ecological assessment of emotions. However, existing technologies still pose major challenges in the field of big data analytics, and little is known about how these lab-based technologies generalize to real-world problems. Rather than offering a one-size-fits-all solution, existing tools need to be adapted to specific user groups in more natural settings, and they need to take large individual differences into account. We take up these challenges by studying emotional expression in dementia. In this context, emotional functioning is strongly at risk, yet highly important for maintaining quality of life in person-centered care. Our domain-specific goal is to gain a better understanding of how dementia affects emotion expression. We carry out a pilot study, a comparative study of Alzheimer's patients and matched healthy older adults, and a longitudinal study on the development of emotion expression in Alzheimer's patients over time. We develop a unique corpus, use state-of-the-art machine learning techniques to advance technologies for multimodal emotion recognition, and develop visualization and statistical models to assess multimodal patterns of emotion expression. We test their usability in a workshop for international researchers and make them available through the eScience Technology Platform.
Funding: NWO / Netherlands eScience Center
Partners: Department of Human Media Interaction, UMCG Universitair Netwerk Ouderenzorg
Supervisors: Gerben Westerhof, Dirk Heylen, Khiet Truong
Years: 2017-2021
PhD student: Deniece Nazareth