Touch is an important non-verbal form of social interaction, used to communicate emotions and social messages. However, the identification of social touch has received far less research attention than vision and audition. To bridge this gap, research is needed on the detection, recognition, and interpretation of touch gestures. Automatic recognition of social touch is necessary to bring interaction in the tactile modality to areas such as Human-Robot Interaction (HRI). If a robot can understand social touch behavior, it can respond accordingly, resulting in richer and more natural interaction.
For this grand challenge, two data sets will be made available containing labeled pressure sensor data of social hand touch gestures: (1) CoST: Corpus of Social Touch and (2) HAART: the Human-Animal Affective Robot Touch gesture set. Each set was collected from a similar matrix-type sensor grid, but under conditions reflecting different application orientations. Participants can choose to work on one of the data sets or on both. The purpose of this challenge is to develop relevant features and classification methods for recognizing social touch gestures. Participants will share their findings at the ACM International Conference on Multimodal Interaction (ICMI '15) in Seattle, USA. Accepted papers will be included in the challenge proceedings of ICMI '15.
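As a rough illustration of the task, the sketch below extracts a few simple features from a sequence of pressure-sensor frames and assigns a gesture label with a nearest-centroid rule. The grid size, feature choices, and gesture labels here are illustrative assumptions, not part of the challenge specification; real entries would use the provided CoST/HAART data and more sophisticated classifiers.

```python
# Hedged sketch: minimal feature extraction and classification for
# matrix-type pressure-sensor gesture sequences. Frame sizes, features,
# and labels below are assumptions for illustration only.

def extract_features(frames):
    """frames: list of 2D lists (rows x cols) of pressure readings.
    Returns (mean pressure, peak pressure, duration in frames)."""
    flat = [v for frame in frames for row in frame for v in row]
    mean_p = sum(flat) / len(flat)
    peak_p = max(flat)
    return (mean_p, peak_p, len(frames))

def nearest_centroid(feature_vec, centroids):
    """centroids: dict mapping a gesture label to a feature tuple.
    Returns the label whose centroid is closest in Euclidean distance
    (a stand-in for a trained classifier)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(centroids, key=lambda lbl: dist(feature_vec, centroids[lbl]))

# Toy usage with two synthetic 2x2 frames standing in for sensor data.
tap_frames = [[[0, 1], [1, 2]],
              [[0, 0], [1, 1]]]
features = extract_features(tap_frames)  # (0.75, 2, 2)
centroids = {"tap": (0.75, 2, 2), "stroke": (5.0, 10.0, 30)}
label = nearest_centroid(features, centroids)
```

In practice, frame-level statistics like these would be complemented by temporal and spatial descriptors before training a classifier on the labeled gesture data.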
To participate in the challenge, please send an email with your name(s) and affiliation(s) to firstname.lastname@example.org. Registration is due by May 1st, 2015.
For further questions contact: email@example.com