eNHANCE – an intelligent multimodal interface to improve the autonomy of people with impaired upper-extremity motor function
The objective of the eNHANCE project (http://www.enhance-motion.eu/) is to develop, demonstrate, and initiate the exploitation of new concepts that enhance and train upper-extremity motor function in people with physical disabilities during daily life. These concepts center on an intelligent, natural, multimodal adaptive interface featuring high-performance natural intention detection, based on low-cost, high-data-rate eye tracking and supporting sensor modalities, combined with intelligently controlled dexterous arm and hand support. The intelligent control ensures that the user can perform the intended motor tasks with minimal support, while effectively motivating the user in a personalized and context-aware manner to maximize his or her own contribution and thereby the therapeutic effect.
People with upper-extremity motor impairments of muscular or neurological origin are severely limited in their daily lives, making them reliant on caregivers and partners for their most basic and personal activities. This reduces their self-esteem and causes their physical abilities to deteriorate quickly. Robotic arm support systems have been developed to enable users to perform daily-life tasks autonomously. However, these systems lack powerful intention detection that reacts quickly without demanding the user's attention, and they do not effectively ensure maximal user contribution in a motivating and personalized manner. The therapeutic effects are therefore suboptimal.
The eNHANCE project strives to provide these people with an intelligently controlled upper-extremity arm support that offers high-performance functional support of the user's motor intentions, while optimizing therapeutic effects in a context-aware, motivating and personalized manner.
Enabling motor-impaired people to maximize their daily-life performance, while keeping their condition and abilities at the highest possible level, is an important European priority in the face of the growing number of elderly people with multiple chronic diseases, who depend on extensive caregiver support and risk a downward spiral of increasing disability due to inactivity. It is also a Europe-wide societal challenge to improve the independence and life expectancy of a smaller but important group of young people with seriously disabling motor disorders such as Duchenne muscular dystrophy. At the same time, it is an economic and business challenge to create the intelligent, personalized multimodal interfaces required to address this pressing social problem, to bring them to the world market, and to apply them to other fields such as ergonomic motor-skills training.
Now is the right time to create the envisioned solutions: powerful intention detection, dexterous arm/hand support, and behavioral modeling approaches are currently being developed and becoming available, but their integration into the intelligent, personalized multimodal interface needed to address this pressing societal problem is still lacking.
The eNHANCE consortium provides the unique combination of companies, knowledge institutes and clinical institutes required to address this societal problem. Its expertise covers fields as diverse as body-mounted sensing, high-performance intention detection, machine intelligence, behavioral monitoring and modeling, dexterous mechatronic arm and hand-function support, intelligent and user-enabling daily-life care, and the economic and clinical exploitation of such new approaches.
The new concepts include: 1) a multimodal adaptive interface, consisting of a user control-input and observation interface with eye tracking as the key modality, an environmental observation interface to create context awareness, and a symbiotic mechatronic user support and activation interface; 2) a personalized motor-support intelligence that optimizes the functional and therapeutic characteristics of the eNHANCE support and drives the multimodal adaptive interface.
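The core control idea behind concept 2 — supporting the intended motor task with minimal assistance so the user's own contribution stays maximal — resembles what the rehabilitation-robotics literature calls assist-as-needed control. The following minimal sketch illustrates one possible such rule; the function name, signature and the simple force-deficit formula are hypothetical illustrations, not the project's actual controller:

```python
from dataclasses import dataclass


@dataclass
class SupportCommand:
    """Hypothetical command sent to the arm support."""
    assist_gain: float           # fraction of the required force the robot supplies
    target: tuple                # reach target (x, y, z) inferred from eye tracking


def assist_as_needed(gaze_target, user_force, required_force, max_assist=1.0):
    """Toy assist-as-needed rule: the robot supplies only the force the
    user cannot produce, so the user's own effort is never replaced.

    gaze_target    -- (x, y, z) point the user is looking at (intention estimate)
    user_force     -- force the user currently contributes (N)
    required_force -- force needed to complete the reach (N)
    """
    if required_force <= 0.0:
        gain = 0.0
    else:
        deficit = max(0.0, required_force - user_force)
        gain = min(max_assist, deficit / required_force)
    return SupportCommand(assist_gain=gain, target=gaze_target)


# Example: the task needs 10 N, the user manages 4 N,
# so the support covers the remaining 60 %.
cmd = assist_as_needed((0.3, 0.1, 0.2), user_force=4.0, required_force=10.0)
print(cmd.assist_gain)  # → 0.6
```

A personalized controller as envisioned in eNHANCE would additionally adapt `max_assist` over time from behavioral monitoring, lowering it as the user's capability improves.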
2015 – 2019