Social Signal Processing Network
Project Manager: Dr. Dirk Heylen
Faculty of Electrical Engineering, Mathematics and Computer Science - EEMCS
Tel.: +31 (53) 489 3745
The ability to understand and manage the social signals of a person we are communicating with is the core of social intelligence, a facet of human intelligence that has been argued to be indispensable, and perhaps the most important, for success in life. Although each of us understands the importance of social signals in everyday situations, and despite recent advances in machine analysis and synthesis of relevant behavioural cues such as blinks, smiles, crossed arms, head nods, and laughter, research efforts in machine analysis and synthesis of human social signals such as empathy, politeness, and (dis)agreement remain few and tentative. The main reasons for this are the absence of a research agenda and the lack of suitable resources for experimentation.
The mission of the SSPNet is to create sufficient momentum by integrating a large body of existing knowledge and available resources across the Social Signal Processing (SSP) research domains, including cognitive modelling, machine understanding, and the synthesis of social behaviour, and so:
- enable the creation of a European and worldwide research agenda in SSP,
- provide efficient and effective access to SSP-relevant tools and data repositories to the research community within and beyond the SSPNet, and
- further develop the complementary and multidisciplinary expertise necessary to push forward the cutting edge of SSP research.
The collective SSPNet research effort will be directed towards the integration of existing SSP theories and technologies, and towards the identification and exploration of the potential and limitations of SSP. More specifically, the framework of the SSPNet will revolve around two research foci selected for their primacy and significance: Human-Human Interaction (HHI) and Human-Computer Interaction (HCI). A particular scientific challenge that binds the SSPNet partners is the synergistic combination of human-human interaction models and automated tools for human behaviour sensing and synthesis within socially adept multimodal interfaces.
Project duration: 1 February 2009 – 1 February 2014
Project budget: 8.2 M€ / 6,268 k€ funding
Number of person-years: 30 fte / 6 fte per year
Project Coordinator: IDIAP Research Institute
Participants: IDIAP Research Institute, Imperial College, University of Edinburgh, UT, Università di Roma Tre, Queen’s University Belfast, DFKI, CNRS, Université de Genève, TU Delft
Project budget CTIT: 392.5 k€ / 295.6 k€ funding
Number of person-years CTIT: 2.3 fte / 0.5 fte per year
Involved groups: Human Media Interaction (HMI)