HMI Student Assignments

If you are a bachelor's or master's student looking for a final thesis assignment, capita selecta, research topic, or internship, you can choose from a large number of internal (at HMI) or external (at a company, research institute, or user organisation) assignments. You can also choose to create one yourself. Searching for a suitable location or a suitable topic often starts here: below you will find a number of topics you can work on.

Note that in preparation for a final MSc assignment you must carry out a Research Topics project, resulting in a final project proposal. For Research Topics, you need to hand in this form to the bureau of educational affairs (BOZ).

ASSIGNMENTS AND TOPICS THAT CAN BE CARRIED OUT INTERNALLY AT HMI

  • Master’s Assignments: Design of Smart Objects for Hand Rehabilitation after Stroke

    Stroke impacts many people and is one of the leading causes of death and disability worldwide [1]–[3] and in the Netherlands [4], [5]. The predicted acceleration of the ageing population is expected to raise the absolute number of stroke survivors who need care [7]. 80% of all stroke patients suffer from function loss, need professional caregivers [8], [9], and experience a lower quality of life due to their limited ability to participate in social activities, work, and daily activities [10], [11].

    The hand is the highly functional endpoint of the human arm, as it enables a vast variety of daily activities related to a high quality of life [12]. Only 12% of stroke patients recover arm and hand function in the first 6 months [13]. For the remaining patients, the limited ability to use their hand has a financial and psychological impact on them and their families, as it limits the execution of daily activities [14]. A treatment with substantial evidence for its effectiveness is CIMT (Constraint-Induced Movement Therapy) [15]. CIMT usually employs intensive sessions focused on task-specific exercises, combined with constraining the unaffected hand and forcing patients to use their affected hand. CIMT relies on the principle of ‘use it or lose it’ [16] and requires patients to use their affected hand.

    So far, attempts at creating effective home training methods have focused on the direct translation of clinical exercises to home training, by designing them to be executed regardless of the location of the patient [17]. Monitoring with the use of smart objects [18]–[21] compensates for the lack of direct supervision, and gaming and virtual reality elements have been added to make training more challenging [22]. Such methods assume that patients are motivated, able and willing to clear time in their schedule to engage in training, and/or to sit down at a specific location in their house to execute it. We need a new method that applies this principle in a more flexible way by engaging people in clinically meaningful activities in their daily routine. This way, patients will seamlessly perform functional training activities at a much higher dose than can be achieved in clinics.

    Our key objective is to develop a new method using smart objects in which training exercises will be seamlessly integrated into the daily routine of a patient at home.

    This method will aim to use the performance on these activities as a functional training set over the day, leading to improved hand function and therefore motivation to perform the activities again in the future [23]–[25]. Patients will not have to schedule their training, but the exercise will be part of their regular daily activities. We will do this by investigating a way of transferring clinical exercises to a home setting using smart objects. Smart objects can be integrated into the daily activities of patients and trigger (by design) a certain user behaviour. The focus in our proposal is for these objects to go beyond simply monitoring [18]–[21], and to create a stimulating environment where people feel invited to train and intrinsically motivated to perform the task again in the future. Think of a smart toothbrush that is designed to promote the use of the affected hand and enables operation only when used by that hand! Fundamental research into the transferability of clinical hand rehabilitation to a smart object home-based setting is needed to theoretically underpin our method. Using smart objects and artificial intelligence, personalized health will become more accessible, and the wealth of data will allow future clinicians more flexibility and overall control of the rehabilitation process.

    In this assignment, the master’s student is expected to:

    1. Review literature on existing technologies (sensors, actuators, AI, etc.) of smart objects for rehabilitation to identify gaps/opportunities

    2. Specify the requirements for the design of smart daily objects that can drive seamless rehabilitation with the use of technology

    3. Design and validate a product concept in a co-design manner, involving clinicians, users and developers

    What do we offer?

    We offer an interdisciplinary network of researchers with expertise in, among other areas, hand rehabilitation and rehabilitation technology (dr. ir. Kostas Nizamis-DPM), Artificial Intelligence, smart technology and stroke rehabilitation (dr. ir. Juliet A.M. Haarman-HMI), and behaviour change and design research (dr. Armağan Karahanoğlu-IxD). Additionally, the student will collaborate closely with clinicians from Roessingh Research & Development (RRD), who aspire to be the end users of the product.

    Bibliography

    [1] S. S. Virani et al., “Heart Disease and Stroke Statistics—2020 Update,” Circulation, vol. 141, no. 9, Mar. 2020.

    [2] S. Sennfält, B. Norrving, J. Petersson, and T. Ullberg, “Long-Term Survival and Function After Stroke,” Stroke, 2019.

    [3] E. R. Coleman et al., “Early Rehabilitation After Stroke: a Narrative Review,” Current Atherosclerosis Reports. 2017.

    [4] “StatLine.” [Online]. Available: https://opendata.cbs.nl/statline/#/CBS/en/. [Accessed: 21-Apr-2020].

    [5] C. M. Koolhaas et al., “Physical activity and cause-specific mortality: The Rotterdam study,” Int. J. Epidemiol., 2018.

    [6] R. Waziry et al., “Time Trends in Survival Following First Hemorrhagic or Ischemic Stroke Between 1991 and 2015 in the Rotterdam Study,” Stroke, 2020.

    [7] A. G. Thrift et al., “Global stroke statistics,” International Journal of Stroke. 2017.

    [8] W. Pont et al., “Caregiver burden after stroke: changes over time?,” Disabil. Rehabil., 2020.

    [9] P. Langhorne, F. Coupar, and A. Pollock, “Motor recovery after stroke: a systematic review,” The Lancet Neurology. 2009.

    [10] M. J. M. Ramos-Lima, I. de C. Brasileiro, T. L. de Lima, and P. Braga-Neto, “Quality of life after stroke: Impact of clinical and sociodemographic factors,” Clinics, 2018.

    [11] Q. Chen, C. Cao, L. Gong, and Y. Zhang, “Health related quality of life in stroke patients and risk factors associated with patients for return to work,” Medicine (Baltimore)., vol. 98, no. 16, p. e15130, Apr. 2019.

    [12] R. Morris and I. Q. Whishaw, “Arm and hand movement: Current knowledge and future perspective,” Frontiers in Neurology, vol. 6, no. FEB, 2015.

    [13] G. Kwakkel, B. J. Kollen, J. V. Van der Grond, and A. J. H. Prevo, “Probability of regaining dexterity in the flaccid upper limb: Impact of severity of paresis and time since onset in acute stroke,” Stroke, 2003.

    [14] J. E. Harris and J. J. Eng, “Paretic Upper-Limb Strength Best Explains Arm Activity in People With Stroke,” Phys. Ther., 2007.

    [15] G. Kwakkel, J. M. Veerbeek, E. E. H. van Wegen, and S. L. Wolf, “Constraint-induced movement therapy after stroke,” The Lancet Neurology. 2015.

    [16] Y. Hidaka, C. E. Han, S. L. Wolf, C. J. Winstein, and N. Schweighofer, “Use it and improve it or lose it: Interactions between arm function and use in humans post-stroke,” PLoS Comput. Biol., vol. 8, no. 2, 2012.

    [17] Y. Levanon, “The advantages and disadvantages of using high technology in hand rehabilitation,” Journal of Hand Therapy. 2013.

    [18] M. Bobin, M. Boukallel, M. Anastassova, and M. Ammi, “Smart objects for upper limb monitoring of stroke patients during rehabilitation sessions,” 2017.

    [19] M. Bobin, F. Bimbard, and M. Boukallel, “Smart Health SpECTRUM: Smart ECosystem for sTRoke patient’s Upper limbs Monitoring,” Smart Health, vol. 13, p. 100066, 2019.

    [20] M. Bobin, H. Amroun, M. Boukalle, M. Anastassova, and M. A. Limsi-cnrs, “Smart Cup to Monitor Stroke Patients Activities during Everyday Life,” 2018 IEEE Int. Conf. Internet Things IEEE Green Comput. Commun. IEEE Cyber, Phys. Soc. Comput. IEEE Smart Data, pp. 189–195, 2018.

    [21] G. Yang, J. Deng, G. Pang, H. Zhang, and J. Li, “An IoT-Enabled Stroke Rehabilitation System Based on Smart Wearable Armband and Machine Learning,” IEEE J. Transl. Eng. Health Med., vol. 6, pp. 1–10, 2018.

    [22] L. Pesonen, L. Otieno, L. Ezema, and D. Benewaa, “Virtual Reality in rehabilitation: a user perspective,” pp. 1–8, 2017.

    [23] A. L. Van Ommeren et al., “The Effect of Prolonged Use of a Wearable Soft-Robotic Glove Post Stroke - A Proof-of-Principle,” in Proceedings of the IEEE RAS and EMBS International Conference on Biomedical Robotics and Biomechatronics, 2018.

    [24] G. B. Prange-Lasonder, B. Radder, A. I. R. Kottink, A. Melendez-Calderon, J. H. Buurke, and J. S. Rietman, “Applying a soft-robotic glove as assistive device and training tool with games to support hand function after stroke: Preliminary results on feasibility and potential clinical impact,” in IEEE International Conference on Rehabilitation Robotics, 2017.

    [25] B. Radder, “The Wearable Hand Robot: Supporting Impaired Hand Function in Activities of Daily Living and Rehabilitation,” University of Twente, Enschede, 2018.

  • Supporting healthy eating

    Contact: Juliet Haarman (HMI – j.a.m.haarman@utwente.nl), Roelof de Vries (BSS – r.a.j.devries@utwente.nl)

    Project Summary:

    Eating is more than the consumption of food. Eating is often a social activity. We sit together with friends, family, colleagues and fellow students to connect, share and celebrate aspects of life. Sticking to a personal diet plan can be challenging in these situations. The social discomfort associated with having a different diet than the rest of a group greatly contributes to this. Additionally, it is well known that we unconsciously influence each other while we eat: not just the type of food that we choose, but also the quantity of food that we consume and the speed with which we consume it are affected by our eating partners.

    A variety of assignments focusing on this topic is available; they are specified below.

    The interactive dining table

    The interactive dining table was created to open up the concept of healthy eating in a social context: where individual table members feel supported, yet still experience a positive group setting. The table is embedded with 199 load cells and 8358 LED lights, located below the tabletop surface. Machine learning can be applied to the sensor data from the table to detect weight shifts over the course of a meal, identify individual bite sizes and classify interactions between table members and food items. Simultaneously, the LEDs can be used to provide real-time feedback about eating behavior, give perspective regarding eating choices, or alter the ambience of the eating experience as a whole. Light interactions can change over time and between settings, depending on the composition of the group at the table or the type of meal that is consumed.
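
    As a purely illustrative sketch of the kind of signal processing such assignments involve (this is not part of the existing table software, and the sampling rate, threshold and data layout are assumptions), candidate bite events could be detected from a single load-cell channel by looking for sustained drops in plate weight:

    ```python
    import numpy as np

    def detect_bites(weight, fs=10.0, min_drop_g=5.0, window_s=2.0):
        """Detect candidate bite events as sustained drops in plate weight.

        weight     : 1-D array of plate weight samples in grams (assumed layout)
        fs         : sampling rate in Hz (assumption)
        min_drop_g : minimum weight decrease counted as a bite (assumption)
        window_s   : time window over which the drop must persist
        """
        win = int(window_s * fs)
        # Smooth the raw signal to suppress sensor noise and table vibrations.
        smooth = np.convolve(weight, np.ones(win) / win, mode="same")
        # A bite shows up as the smoothed weight settling at a lower level.
        drops = smooth[:-win] - smooth[win:]
        events = np.flatnonzero(drops > min_drop_g)
        # Merge indices that belong to the same drop (closer than one window).
        bites = [events[0]] if events.size else []
        for idx in events[1:]:
            if idx - bites[-1] > win:
                bites.append(idx)
        return [i / fs for i in bites]  # event times in seconds

    # Example: a simulated plate losing ~8 g twice during one minute.
    t = np.arange(0, 60, 0.1)
    plate = 400 - 8 * (t > 20) - 8 * (t > 40) + np.random.normal(0, 0.5, t.size)
    print(detect_bites(plate))  # two events, one around each simulated drop
    ```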

    An indication of the assignments that are possible within this topic:

    -          Adding intelligence to the table. Are we able to track the course of the meal over time? This includes questions such as: How much has been put on the plates of all table members? At what times have they taken a bite? How much are they putting on their fork? Are they going in for seconds? Etc. Keywords: Machine learning, sensor fusion and finding signal characteristics

     

    -          Creating LED interactions that provide the user with feedback about his/her behavior. Which types of interactions work for a variety of target groups? How should interactions be shaped, such that a single subject in a group feels supported? How can we implicitly steer people towards healthy(er) behavior, without coercion, and without making it the focus of the meal?

    Keywords: HCI, user experience, co-design, Unity, sensor signals

     

    -          Togetherness around eating, or ‘commensality’, is a relatively new direction for HCI research. Recent work has distinguished ‘digital commensality’, eating together through digital technology, and ‘computational commensality’, physical or mediated multimodal interaction around eating. We are currently exploring how commensality mediated by technology can be used to support dietary behavior change, in a broader context than the interactive table alone. How does commensality influence dietary habits, and how can this influence be used and acknowledged in dietary behavior change technology?

    Keywords: behavior change strategies, behavior change technology, commensality, technology-mediated commensality

     

    Wearables to automatically log eating behavior

    Gaining insight into the current eating behavior of a person is a first step towards better health. Professionals still use conventional methods for this, such as logbooks: they ask the user to manually report on their eating behavior throughout the day. Memory and logging bias are common problems with this method. Often, users simply forget to write down what and when they have been eating. The presence of unknown ingredients in the food, difficulties in estimating portion size, and social discomfort while logging the food also affect the reliability of this method.

    One way to lower the chance of bias is to use technology that automatically detects events of food intake. Accelerometers on the wrist, strain gauges on the jaw or RIP sensors that monitor the breathing signal of the subject are examples of technologies that are used to identify intake gestures and chewing/swallowing movements – indicating that the user is eating. Many of these technologies have not yet been tested outside a standardized laboratory environment, and therefore their practical validity is often unknown – and should be investigated.
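
    To give a feel for the kind of processing involved in such a study, here is a minimal, hedged sketch (assuming 3-axis wrist accelerometer data in g at 50 Hz; the threshold rule is a naive illustration, not one of the methods under investigation):

    ```python
    import numpy as np

    def candidate_intake_gestures(acc, fs=50.0, pitch_thresh_deg=40.0, min_gap_s=3.0):
        """Flag candidate hand-to-mouth gestures from wrist accelerometer data.

        acc : (N, 3) array of accelerometer samples in g (assumed x, y, z layout).
        A gesture is flagged when the estimated forearm pitch exceeds a threshold,
        i.e. the wrist is raised towards the mouth. Purely illustrative heuristic.
        """
        # Estimate pitch from the gravity direction (only valid for slow movements).
        pitch = np.degrees(np.arctan2(acc[:, 0], np.sqrt(acc[:, 1]**2 + acc[:, 2]**2)))
        raised = pitch > pitch_thresh_deg
        onsets = np.flatnonzero(np.diff(raised.astype(int)) == 1)
        # Keep onsets separated by at least min_gap_s seconds (one bite at a time).
        gestures, last = [], -np.inf
        for idx in onsets:
            if (idx - last) / fs >= min_gap_s:
                gestures.append(idx / fs)
                last = idx
        return gestures  # gesture onset times in seconds

    # Example with 10 s of simulated data: wrist raised between seconds 4 and 6.
    t = np.arange(0, 10, 1 / 50.0)
    acc = np.zeros((t.size, 3))
    acc[:, 2] = 1.0                       # flat wrist: gravity on the z axis
    lifted = (t > 4) & (t < 6)
    acc[lifted, 0], acc[lifted, 2] = 1.0, 0.0   # raised wrist: gravity on x
    print(candidate_intake_gestures(acc))  # one onset, around 4.0 s
    ```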

    -          We are currently investigating several detection methods, individually and in combination. Which methods work well in which types of situations? What data processing steps should be taken to get there? We are still measuring at lab level and want to bring this to an in-the-wild setting.

    Keywords: data gathering, user testing, data processing, machine learning

    Cooking skills and kitchen habits

    Eating is often the end-point of preparing a meal. Eating healthy often starts with cooking healthy and picking your ingredients. But what if you do not have the advanced cooking skills that are needed to follow a certain recipe or prepare the ingredients correctly? What if your cooking skills hold you back from trying out new recipes? What if your perception of your eating habits differs from your actual eating habits? For instance, what if your kitchen habits are such that you always grab a bag of chips once you arrive home from work, or that you very often consume snacks while you might think this only happens occasionally during the week?

    By tracking and processing data that is gathered in and around the kitchen area, we could gain better insights into the eating habits of individuals. This might be an important step in supporting the individual towards healthier behavior.

    -          We are currently investigating several technologies and measurement set-ups that are needed to support this type of research. What types of sensors should be placed at which locations in the house? How do they communicate with each other? What type of information should be gathered, and what could serve as a trigger for feedback towards the user? In what way can we support the user in choosing different ingredients, trying out new recipes or breaking his or her unwanted eating habits?

    Keywords: Design of sensor systems, prototyping, data gathering, data processing, user interactions

     

    Operationalizing behavior change strategies

    We are exposed daily to many strategies that try to support or influence us in changing our behaviors. Just think of the app that wants you to set a ‘goal for the week’ (or sets it for you), or the website that informs you that ‘there is only one left!’. These features are usually based on a theoretical understanding of what influences us. For example, a goal setting feature can be based on goal setting theory. However, goal setting theory argues that for goals to motivate us, they have to be feasible as well as challenging. Another theory that is used as a theoretical underpinning of a feature is social comparison theory, which argues that people can be motivated by comparing themselves to others (upward, downward, or lateral comparison). An example of how this is implemented is a leaderboard, where you can see how you are doing with respect to a certain statistic. However, is a leaderboard really a good operationalization of social comparison theory? And is an app with a textbox where you can set a goal really a good operationalization of goal setting theory? What can we learn from the features about the theory they are based on when these features work or do not work?

     

    -          These are questions that we would like to see investigated, for the theories and features used as examples here, but also for other features and theories.

    Keywords: behavior change theory, design, behavior change strategies, behavior change technology   

  • Posture and movement feedback in sport @ HMI - Enschede, NL

    Haptics as a research field concerns touch and applying it in human-computer interaction. Think about research into touch perception, human-human touch interaction, human-computer touch interaction, mediated social touch, etc.

    Touch feedback is a great way to relay or strengthen a message. One way to give touch feedback to users is through wearables, for example to give haptic feedback for posture correction in sports.

    When you are exercising or participating in sports you may perform some movements in an incorrect way. These incorrect movements may cause you to be less efficient or harm yourself.

    If you join a sports club there usually is a coach or advanced player who can help you master the techniques without injuring yourself, by giving instructions and correcting your pose and movement with a gentle touch.

    If you are a beginner at an individual sport like running, or you want to practice more outside of the schedule of your club, you risk injuring yourself. Posture correction and feedback on movements by a coach can be mimicked by haptic stimuli from a wearable. This way you can receive the gentle correction at any time and improve your posture.

    The assignment is to design a haptic wearable for sports that is able to give real-time feedback on posture and helps the user to correct their posture and movement. Pick any sport you like and have affinity with. It could be useful to focus on a limited number of movements. During this project it is key to explore a variety of touch patterns and end-users’ responses. Consider talking to a sports coach and practicing the sport yourself as a part of your preliminary research.
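
    As a very rough sketch of what such a feedback loop could look like (the sensor and motor functions below are simulated placeholders, not an existing API; the thresholds and mapping are arbitrary assumptions):

    ```python
    import random
    import time

    # Placeholder hardware hooks: in a real prototype these would wrap an IMU
    # driver and a vibration-motor driver; here they are simulated for illustration.
    def read_trunk_angle_deg():
        """Simulated trunk flexion angle; replace with a real IMU reading."""
        return 10.0 + random.gauss(0.0, 8.0)

    def set_vibration(intensity):
        """Simulated motor command; replace with a PWM call to the motor driver."""
        print(f"vibration intensity: {intensity:.2f}")

    def posture_feedback_loop(target_deg=10.0, tolerance_deg=5.0, rate_hz=20, duration_s=2.0):
        """Vibrate proportionally to how far the measured posture deviates from the target."""
        for _ in range(int(duration_s * rate_hz)):
            error = abs(read_trunk_angle_deg() - target_deg)
            if error > tolerance_deg:
                # Map the excess deviation onto a capped vibration intensity.
                set_vibration(min(1.0, (error - tolerance_deg) / 20.0))
            else:
                set_vibration(0.0)
            time.sleep(1.0 / rate_hz)

    posture_feedback_loop()
    ```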

    In this assignment you will:

    * use vibration motors to design a wearable that is able to give real-time feedback on the posture and movements of an athlete or hobbyist in the chosen sport.

    * design different touch patterns to indicate that a movement is performed incorrectly and how to correct the movement.

    * perform an analysis of correct posture of the chosen techniques.

    Your own creative ideas and input for this assignment are more than welcome!

    Contact: Judith Weda

  • CHATBOTS FOR HEALTHCARE – THE eCG FAMILY CLINIC @ HMI - Enschede, NL in cooperation with UMCU - Utrecht, NL

    In collaboration with Universitair Medisch Centrum Utrecht we will design and develop the eCG family clinic: the electronic Cardiovascular Genetic family clinic to facilitate genetic screening in family members. In inherited cardiovascular diseases, first-degree relatives are at 50% risk of inheriting the disease-causing mutation. For these diseases, preventive measures and treatment options are readily available and effective. Relatives may undergo predictive DNA testing to find out whether they carry the mutation. More than half of at-risk relatives do not attend genetic counselling and/or cardiac evaluation.

    To increase the number of people who attend genetic counselling and/or cardiac evaluation, the eCG Family Clinic will be developed. The eCG Family Clinic is an online platform where family members are provided with general information (e.g. on the specific family disease, mode of inheritance, pros and cons of genetic testing, and the testing procedure). The users of the platform will be able to interact with a chatbot.

    Within this research project we have student assignments available such as:
    ·       Designing and developing a chatbot and its functions and roles within the platform
    ·       Translating current treatment protocols into prototypes of the chatbot
    ·       Evaluating user experience and user satisfaction

    We are open to alternative assignments or perspectives on the example assignments above.

    Contact person: Randy Klaassen, r.klaassen@utwente.nl

  • Smart Information System for Ethics Assessments @ HMI - Enschede, NL

    Whenever we do research with humans, we have to deal with consent forms, Ethics Committee assessment requests, and related matters. For people who are not very familiar with the procedures and rules, this process can be daunting; even though it often should take only 2 minutes to figure everything out, the maze of information causes people to struggle when navigating this task.

    Over the past month, we have been developing prototype information bundles that should help people do this task more quickly and with higher quality. In this assignment you will turn this information bundle into a smart interactive interface that helps people carry out their Ethics Assessment request, including building consent forms and information brochures that are complete and concise. Initial steps have already been taken; an illustrative prototype has been developed in an earlier student project. Next steps involve an in-depth validation of the problem statement (clarifying the actual form in which researchers and students struggle with parts of the process), turning the illustrative prototype into a usable and flexible tool, and validating the tool with students and researchers who have to carry out research in various domains.

    This assignment lends itself well to a BSc or MSc thesis.

    Contact person: Dennis Reidsma

  • Touch Interactions and Haptics @ HMI - Enschede, NL

    In daily life, we use our sense of touch to interact with the world and everything in it. Yet, in Human-Computer Interaction, the sense of touch is somewhat underexposed, in particular when compared with the visual and auditory modalities. To advance the use of our sense of touch in HCI, we have defined three broad themes in which several assignments (Capita Selecta, Research Topics, Graduation Projects) can be defined.

    Designing haptic interfaces

    Many devices use basic vibration motors to provide feedback. While such motors are easy to work with and sufficient for certain applications, the advances in current manufacturing technologies (e.g. 3D printing) and in electronics provide opportunities for creating new forms of haptic feedback. Innovative forms of haptic feedback may even open up completely new application domains. The challenge for the students is twofold: 1) exploring the opportunities and limitations of (combinations of) materials, textures, and (self-made) actuators, and 2) coming up with potential use cases.

    Multimodal perception of touch

    The experience of haptic feedback may not only be governed by what is sensed through the skin, but may also be influenced by other modalities; in particular by the visual modality. VR and AR technologies are prime candidates for studying touch perception, and haptic feedback is even considered ‘the holy grail’ for VR. Questions surrounding for instance body ownership in VR, or visuo-haptic illusions in VR (e.g. elongated arms, a third arm) can be interesting starting points for developing valuable multimodal experiences, and for studying the multimodal perception of touch.

    Touch as a social cue

    Research in psychology has shown that social touch (i.e. being touched by another person) can profoundly influence both the toucher and the recipient of a touch (e.g. decreasing stress, motivating, or showing affect). Current technologies for remote communication could potentially be enriched by adding haptic technology that allows for social touch interactions to take place over a distance. In addition, with social robots becoming more commonplace in both research and everyday life, the question arises how we should engage in social touch with such social robots in a beneficial, appropriate and safe manner. Applications of social touch technology can range from applications related to training and coaching, to entertainment, and to providing care and intimacy. Potential projects in this domain could focus on the development of new forms of social touch technology (interactions), and/or on the empirical investigation of the effects such artificial social touch interactions can have on people.

    Contact: Dirk Heylen

  • Wearables and tangibles assisting young adults with autism in independent living @ IDE - Enschede, NL

    In this project we seek socially capable and technically smart students with an interest in technology and health care, to investigate how physical-digital technology may support young adults with autism (age 17-22) in developing independence in daily living. We build further on insights from earlier projects such as Dynamic Balance and MyDayLight.

    (see more about both projects here: http://www.jellevandijk.org/embodiedempowerment/ )

    Your assignment is to engage in participatory design in order to conceptualize, prototype and evaluate a new assistive product concept, together with young adults with autism, their parents, and health professionals. You can focus more on the design of concepts, the prototyping of concepts, technological work on building an adaptive, flexible platform that can be personalized by each individual user, or the development of the ‘co-design’ methods we use with young adults with autism, their parents, and the care professionals.

    As a starting point we consider the opportunities offered by wearables with bio-sensing, in combination with ambient intelligent objects (internet of things, e.g. interactive light, ambient audio) in the home.

    The project forms part of a research collaboration with Karakter, a large youth psychiatric health organization, and various related organizations, who will provide participating families. One goal is to present a proof-of-concept of a promising assistive device; another goal is to explore the most suitable participatory design methods in this use context. Depending on your interests you can focus more on the product or on the method. The ultimate goal of the overall research project is to realize a flexible, adaptive interactive platform that can be tailored to the needs of each individual user – this master project is a first step in that direction.

    Contact: jelle.vandijk@utwente.nl

  • Interactive Surfaces and Tangibles for Creative Storytelling @ HMI - Enschede, NL

    In the research project coBOTnity, a collection of affordable robots (called surfacebots) was developed for use in collaborative creative storytelling. Surfacebots are moving tablets embodying a virtual character. Using a moving tablet allows us to show a digital representation of the character’s facial expressions and intentions on screen while also allowing it to move around in a physical play area.

    The surfacebots offer diverse student assignment opportunities in the form of Capita Selecta, HMI project, BSc Research or Design project, or MSc thesis research. These assignments can deal with technology development aspects, empirical studies evaluating the effectiveness of some existing component, or a balance of both types of work (technology development + evaluation).

    As a sample of what could be done in these assignments (but not limited to this), students could be interested in developing new AI for the surfacebot to become more intelligent and responsive in the interactive space, studying interactive storytelling with surfacebots, developing mechanisms to orchestrate multiple surfacebots as a means of expression (e.g. to tell a story), evaluating strategies to make the use of surfacebots more effective, developing and evaluating an application to support users’ creativity/learning, etc.

    You can find more information about the coBOTnity project at: https://www.utwente.nl/ewi/hmi/cobotnity/

    Contact: Mariët Theune (m.theune@utwente.nl)

  • Interpersonal engagement in human-robot relations @ HMI - Enschede, NL

    Modern media technology enables people to have social interactions with the technology itself. Robots are a new form of media that people can communicate with as independent entities. Although robots are becoming naturalized in social roles involving companionship, customer service and education, little is known about the extent to which people can feel interpersonal closeness with robots and how social norms around close personal acts apply to robots. What behaviors do people feel comfortable engaging in with robots that have different types of social roles, such as companion robots, customer service robots and teacher robots? Will robots that people can touch, talk to, lead and follow result in social acceptance of behaviors that express interpersonal closeness between a person and a robot? Are such behaviors intrinsically rewarding when done with a responsive robot?

    Contact: Dirk Heylen

  • Sports, Data, and Interaction: Interaction Technology for Digital-Physical Sports Training @ HMI - Enschede, NL

    The proposed project focuses on new forms of (volleyball and other) sports training. Athletes perform training exercises in a “smart sports hall” that provides high-quality video display across the surface of the playing field and has unobtrusive pressure sensors embedded in the floor, or using smart sports setups such as immersive VR with a rowing machine. A digital-physical training system offers tailored, interactive exercise activities. Exercises incorporate visual feedback from the trainer as well as feedback given by the system. They can be tailored through a combination of selecting the most fitting exercises and setting the right parameters. This allows the exercises to be adapted in real time in response to the team’s behaviour and performance, and to be selected and parameterized to fit their level of competition and the demands of, e.g., youth sport. To this end, expertise from the domains of embodied gaming and of instruction and pedagogy in sports training is combined. Computational models are developed for the automatic management of personalization and adaptation; initial validation of such models is done by repeatedly evaluating versions of the system with athletes of various levels. We collect, and automatically analyse, data from the sensors to build continuous models of the behaviour of individual athletes as well as the team. Based on this data, the trainer or system can instantly decide to change the ongoing exercises, or provide visual feedback to the team via the displays and other modalities. In extrapolation, we foresee future development towards higher competition performance for teams, by building upon the basic principles and systems developed in this project.
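
    Purely as an illustration of the kind of sensor-data analysis such assignments could start from (the actual sensor layout and data format of the smart sports hall are not assumed here), a minimal sketch that estimates where the floor is being loaded from one frame of gridded pressure readings:

    ```python
    import numpy as np

    def centre_of_pressure(frame, threshold=0.05):
        """Estimate the centre of pressure from one frame of floor-sensor data.

        frame : 2-D array of pressure readings on a regular grid (assumed layout).
        Returns (row, col) in grid coordinates, or None if the floor is unloaded.
        """
        active = np.where(frame > threshold, frame, 0.0)
        total = active.sum()
        if total == 0:
            return None
        rows, cols = np.indices(frame.shape)
        return (rows * active).sum() / total, (cols * active).sum() / total

    # Example: a single simulated footprint on a 10 x 10 patch of sensors.
    frame = np.zeros((10, 10))
    frame[6:8, 2:4] = 1.0
    print(centre_of_pressure(frame))  # approximately (6.5, 2.5)
    ```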

    Assignments in this project can be done on user studies, automatic behaviour detection from sensors, novel interactive exercise design, and related topics.

    Contact person: Dees Postma, d.b.w.postma@utwente.nl, Dennis Reidsma, d.reidsma@utwente.nl

  • Dialogue and Natural Language Understanding & Generation for Social and Creative Applications @ HMI - Enschede, NL

    Applications involving the processing and generation of human language have become increasingly capable and popular in recent years; think for example of automatic translation and summarization, or of the virtual assistants that are becoming a part of everyday life. However, dealing with the social and creative aspects of human language is still challenging. We can ask our virtual assistant to check the weather, set an alarm or play some music, but we cannot have a meaningful conversation with it about what we want to do with our life. We can feed systems with big data to automatically generate texts such as business reports, but generating an interesting and suspenseful novel is another story.

    At HMI we are generally open to supervising different kinds of assignments in the area of dialogue and natural language understanding & generation, but we are specifically interested in research aimed at social and creative applications. Some possible assignment topics are given below.

    Conversational agents and social chatbots. The interaction with most current virtual assistants and chatbots (or 'conversational agents') is limited to giving them commands and asking questions. What we want is to develop agents you can have an actual conversation with, and that are interesting to engage with. An important question here is: how can we keep the interaction interesting over a longer period of time? Assignments in this area can include question generation for dialogue (so the agent can show some interest in what you are telling it), story generation for dialogue (so the agent can make a relevant contribution to the current conversation topic) and user modeling via dialogue (so the agent can get to know you). The overall goal is to create (virtual) agents that show verbal social behaviours. In the case of embodied agents, such as robots or virtual characters, we are also interested in the accompanying non-verbal social behaviours.

    Affective language processing or generation. Emotions are part of everyday language, but detecting emotions in a text, or having the computer produce emotional language, are still challenging tasks. Assignments in this area include sentiment analysis in texts, for Dutch in particular, and generating emotional language in, for example, the context of games (emotional character dialogue or 'flavor text' as explained below) or in the context of automatically generated soccer reports.

    Creative language generation. Here we can think of generating creative language such as puns, jokes, and metaphors but also stories. It is already possible to generate reports from data (for example sports or game-play data) but such reports tend to be boring and factual. How can we give them a more narrative quality with a nice flow, atmosphere, emotions and maybe even some suspense? Instead of generating non-fiction based on real-world data, another area is generating fiction. An example is generating so-called 'flavor text' for use in games. This is text that is not essential to the main game narrative, but creates a feeling of immersion for the player, such as fictional newspaper articles and headlines or fake social media messages related to the game. Another example of fiction generation is the generation of novel-length stories. Here an important challenge is how to keep the story coherent, which is a lot more difficult for long texts than for short ones.

    Contact: Mariët Theune (m.theune@utwente.nl)

  • Explorative applications for Cozmo robot for use by social care professionals @ HMI - Enschede, NL

    This thesis project concerns a choice of several topics, namely:

    • explorative development of possible applications for social robots in social care context using the Anki Cozmo robot
    • exploration of the state of the art of teaching robotics to social care students, and investigation, through robot co-design activities, of how we can support social care professionals in obtaining the necessary skills and attitude to be able to work with social robots in their daily practice
    • exploration of activity design with social robots, in which social care workers (or students) use a simple robot platform with a simple set of behaviours to develop a rich and varied collection of activities for use with their clients.

    There are already many examples of applications of social robots in care (e.g. elderly homes), and often these have been evaluated for their impact on the client. However, a special challenge that is often overlooked is that, when such social robots become commonplace in social care settings, the care professionals also have to become engaged with these robots. But generally they are not taught in their educational program about robots -- what they are, what they are good for, and what you can do with them.

    This challenge is addressed in the Prospero project ( http://prospero.via.dk/EN ). Besides the practical side of robots in care settings (e.g., lifting robots or medical delivery robots), many examples of social robots are more like "activities that one can do with a client". In this project, we aim to develop a number of activities one can do with a client, supported by a robot. The assignment involves designing the activities as well as the robot behavior and interaction. Based on related work, literature, and discussions with care workers in the Netherlands and Spain (some of the discussions done by the students, some by the researchers in the project) you may develop multiple prototypes of activities with the Anki Cozmo robot and evaluate these prototypes and their potential for rich activity design. Hopefully, depending on COVID19 circumstances, some of this can be done remotely with care workers and experts in the Netherlands and Spain (in Spain, the activities are carried out by the Spanish partner in the project) to gain more insight into the potential for engaging care professionals with the possibilities of using social robots in their work. By exploring multiple prototypes, the insights that we gather will hopefully go beyond the impact of a single application.

    Contact person: Dennis Reidsma

  • Group activity detection

    Social interaction is necessary for both mental and physical health, and participating in group activities encourages social interaction. While there are opportunities to attend a variety of different group activities, some prefer solitary activities. This project aims to design an algorithm to extract the patterns of group and solitary activities from GPS (Global Positioning System) and motion sensors, including accelerometer, gyroscope, and magnetometer. The extracted patterns would enable us to detect whether an individual is involved in a group or a solitary activity.

    This project is defined within a larger project, namely the Schoolyard project. In the Schoolyard project, we captured data from pupils in the school playground during breaks via GPS and motion sensors. The collected data will be used to validate the designed algorithm. You need to be creative in designing the method to cover different types of group activities in the playground, including parallel games (e.g., swings), ball games (e.g., football), tag games (e.g., catch and run), etc.
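
    As a naive illustration of one possible building block (this is not the project's method; GPS fixes are assumed to have been projected to a local metric coordinate frame already), pupils could be grouped per time window by spatial proximity:

    ```python
    import numpy as np

    def proximity_groups(positions, radius_m=3.0):
        """Group individuals whose positions lie within radius_m of each other.

        positions : (N, 2) array of x/y coordinates in metres for one time window.
        Returns a list of groups, each a list of indices; singletons indicate
        solitary activity under this naive heuristic.
        """
        n = len(positions)
        # Pairwise Euclidean distances.
        diff = positions[:, None, :] - positions[None, :, :]
        dist = np.sqrt((diff ** 2).sum(axis=-1))
        adjacent = dist <= radius_m
        # Connected components over the proximity graph (simple BFS).
        unvisited, groups = set(range(n)), []
        while unvisited:
            queue, group = [unvisited.pop()], []
            while queue:
                i = queue.pop()
                group.append(i)
                for j in [j for j in list(unvisited) if adjacent[i, j]]:
                    unvisited.remove(j)
                    queue.append(j)
            groups.append(sorted(group))
        return groups

    # Example: two pupils close together, one on their own.
    pos = np.array([[0.0, 0.0], [2.0, 1.0], [20.0, 5.0]])
    print(proximity_groups(pos))  # [[0, 1], [2]] (order of groups may vary)
    ```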

    The research involves steps such as:

    ·        Literature review 

    ·        Data preparation and identifying benchmark datasets

    ·        Designing an algorithm to identify the group activity patterns

    ·        Validating the result via ground truth, simulated data or benchmark datasets

    We are looking for candidates that match the following profile:

    ·        Having a creative mindset 

    ·        Strong programming skills in python 

     

    Many recent studies have focused on detecting group activities from videos. However, using video to detect activities is computationally expensive and raises significant privacy concerns. Below is a paper related to this topic, which used motion sensors together with beacons to identify group activity.

    https://www.sciencedirect.com/science/article/pii/S0360132319303348

    For information about the Schoolyard project, you can contact Mitra Baratchi, Assistant Professor, email: m.baratchi@liacs.leidenuniv.nl.

    You will be jointly supervised by Dr. Gwenn Englebienne, Assistant Professor, University of Twente, with external supervision from Dr. Mitra Baratchi, Assistant Professor, Leiden Institute of Advanced Computer Science (LIACS), and Maedeh Nasri, Ph.D. candidate at Leiden University.

  • A Framework for Longitudinal Influence Measurement between Spatial Features and Social Networks

    The features of the environment may enhance or discourage social interactions among people. The question is how environmental features influence social participation and how the influence may vary over time. To answer this question, you need to design a framework that combines features of the spatial network with the parameters of the social network while addressing the longitudinal characteristics of such a combination.

    To the best of our knowledge, no study has been conducted on analyzing the longitudinal influence between social networks and spatial features of the environment.

    This project is defined within a larger project, namely the Schoolyard project. In the Schoolyard project, we observed the behavior of children in the playground using RFID tags and GPS loggers. The RFID data are used to build a social network. The longitudinal influence between the social network and spatial features may be analyzed in three stages: 1) before the renovation, 2) after the renovation, and 3) after adaptation of the playground. The collected data can be used to validate the designed framework.
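
    One very simple ingredient such a framework might build on is sketched below (illustrative only; it assumes the RFID contacts have already been aggregated into one contact graph per observation period, and it uses networkx and plain Pearson correlation with no significance testing):

    ```python
    import numpy as np
    import networkx as nx

    def lagged_correlation(spatial_feature, graphs, max_lag=3):
        """Correlate a spatial feature with social-network density at several lags.

        spatial_feature : 1-D array, one value per observation period (assumption).
        graphs          : list of networkx graphs, one contact network per period.
        Returns {lag: Pearson correlation}, where lag > 0 means the spatial
        feature leads the network density.
        """
        density = np.array([nx.density(g) for g in graphs])
        x = np.asarray(spatial_feature, dtype=float)
        out = {}
        for lag in range(0, max_lag + 1):
            a = x[: len(x) - lag] if lag else x
            b = density[lag:]
            out[lag] = float(np.corrcoef(a, b)[0, 1])
        return out

    # Example with synthetic data: 10 periods, density loosely following the feature.
    rng = np.random.default_rng(0)
    feature = rng.random(10)
    graphs = [nx.gnp_random_graph(20, 0.1 + 0.3 * f, seed=i) for i, f in enumerate(feature)]
    print(lagged_correlation(feature, graphs))
    ```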

    We are looking for candidates that match the following profile:

    ·        Knowledge about network analysis

    ·        Knowledge about multilevel time series analysis

    ·        Strong programming skills in python 

     

    The paper below presents a general framework for measuring the dynamic bidirectional influence between communication content and social networks. The authors used a publication database to build the social network, and the relationship between the network and the communication content is studied longitudinally.

    http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.208.4144&rep=rep1&type=pdf

     

    For information about the Schoolyard project, you can contact Mitra Baratchi, Assistant Professor, email: m.baratchi@liacs.leidenuniv.nl

    You will be jointly supervised by Dr. Shenghui Wang, Assistant Professor, University of Twente, with external supervision from Dr. Mitra Baratchi, Assistant Professor, Leiden Institute of Advanced Computer Science (LIACS), and Maedeh Nasri, Ph.D. candidate at Leiden University.

COMPANIES, EXTERNAL RESEARCH INSTITUTES, AND END USER ORGANISATIONS

Below you will find some of the organisations that are willing to host master students from HMI. Keep in mind that you are not allowed to have both an external (non-research institute) internship and an external final assignment. If you work for a company that is interested in providing internships or final assignments, please contact D.K.J.Heylen[at]utwente.nl

  • Adding Speech to Multi Agent Dialogues with a Council of Coaches

    Context

    In the EU project Council of Coaches (COUCH) we are developing a team of virtual coaches that can help older adults achieve their health goals. Each coach offers insight and advice based on their expertise. For example, the activity coach may talk about the importance of physical exercise, while the social coach may ask the user about their friends and family. Our system enables fluent multi-party interaction between multiple coaches and our users; in addition to talking directly with the user, the coaches may also have dialogues amongst themselves. Integration of full spoken interaction with the platform developed in COUCH will make a major leap possible for our embodied agent projects.

    More information: https://council-of-coaches.eu/project/overview/

    Challenge

    Currently in COUCH the user interacts with the coaches by selecting one of several predefined multiple-choice responses on a tablet or computer interface. Although this is a reliable way to capture input from the user, it may not be ideal for our target user group of older adults. Perhaps spoken dialogue can offer a better user experience?

    In the past, researchers found that it was quite difficult to sustain dialogues that relied on automatic speech recognition (ASR) (e.g. see [1] Bickmore & Picard, 2005). However, recent commercial systems like Apple’s Siri and Amazon’s Alexa offer considerable improvements in recognising users’ speech. Such state-of-the-art systems might now be sufficiently reliable for supporting high-quality spoken dialogues between our coaches and the user.

    Assignment

    In your project you will adapt the COUCH system to support spoken interactions. In addition to incorporating ASR, you will investigate smart ways to organise the dialogue to facilitate adequate recognition in noisy and uncertain settings while keeping the conversation going. Finally, you will also evaluate the user experience and the quality of dialogue progress in various settings, and thereby the suitability of state-of-the-art speech recognition for live, open-setting spoken conversation.
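
    A minimal sketch of one possible first step is given below. It assumes the open-source SpeechRecognition Python package and reuses the existing predefined response options: the spoken transcript is mapped back onto one of those options, which keeps a spoken front end compatible with the current dialogue engine. This is an illustration, not the project's prescribed approach.

    ```python
    import difflib
    import speech_recognition as sr  # assumes the SpeechRecognition package is installed

    def spoken_choice(options, recognizer=None):
        """Capture one spoken utterance and map it onto a predefined response option.

        options : list of the multiple-choice response strings already used by COUCH.
        Returns the best-matching option, or None if nothing was understood.
        """
        recognizer = recognizer or sr.Recognizer()
        with sr.Microphone() as source:
            recognizer.adjust_for_ambient_noise(source)  # helps in noisy settings
            audio = recognizer.listen(source)
        try:
            transcript = recognizer.recognize_google(audio)  # cloud ASR; swap as needed
        except (sr.UnknownValueError, sr.RequestError):
            return None
        # Fuzzy-match the transcript against the predefined options.
        lowered = [o.lower() for o in options]
        matches = difflib.get_close_matches(transcript.lower(), lowered, n=1, cutoff=0.4)
        if not matches:
            return None
        return options[lowered.index(matches[0])]
    ```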

    You will carry out the work in collaboration with Roessingh Research and Development (http://www.rrd.nl/) and researchers at the Human Media Interaction group of the University of Twente.

    Contact

    Dennis Reidsma (d.reidsma@utwente.nl)
     


    [1] Timothy W. Bickmore and Rosalind W. Picard. 2005. Establishing and maintaining long-term human-computer relationships. ACM Trans. Comput.-Hum. Interact. 12, 2 (June 2005), 293–327. DOI: https://doi.org/10.1145/1067860.1067867

  • Large-scale data mining & NLP @ OCLC - Leiden, NL

    OCLC is a global library cooperative that provides shared technology services, original research and community programs for its membership and the library community at large. Collectively with member libraries, OCLC maintains WorldCat, the world’s most comprehensive database of information about library collections. WorldCat now hosts more than 460 million bibliographic records in 483 languages, aggregated from 18,000 libraries in 123 countries.

    As WorldCat continues to grow, OCLC is actively exploring data science, advanced machine learning, linked data and visualisation technologies to improve data quality, transform bibliographic descriptions into actionable knowledge, provide more functionality for professional cataloguers, and develop more services for end users of the libraries.

    OCLC is constantly looking for students who are enthusiastic to advance AI technologies for library and other cultural heritage data. Examples of student assignments are:

    • Fast and scalable semantic embedding for Information Retrieval (see the sketch after this list)
    • eXtreme Multi-label Text Classification (XMTC) for automatic subject prediction
    • Automatic image captioning for Cultural Heritage collections
    • Entity extraction and disambiguation
    • Entity matching across different media (e.g. books, articles, cultural heritage objects, etc.) or across languages
    • Hierarchical clustering of bibliographic records
    • Constructing knowledge graphs around books, authors, subjects, publishers, etc.  
    • Interactive visualisation of library data on geographic maps and/or along a time dimension
    • Concept drift (i.e., how meaning changes over time) and its effects on Information Retrieval 
    • Scientometrics-related topics based on co-authoring networks and/or citation networks
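
    For the first item above, a minimal sketch of dense semantic retrieval over bibliographic titles is given below, assuming the sentence-transformers package and a small pretrained model; data at WorldCat scale would of course require an approximate-nearest-neighbour index rather than this brute-force search.

    ```python
    import numpy as np
    from sentence_transformers import SentenceTransformer  # assumed to be installed

    # Toy stand-ins for bibliographic records; real input would come from WorldCat.
    titles = [
        "Introduction to information retrieval",
        "A history of Dutch painting in the seventeenth century",
        "Machine learning for library metadata",
    ]

    model = SentenceTransformer("all-MiniLM-L6-v2")  # small pretrained model (assumption)
    doc_emb = model.encode(titles, normalize_embeddings=True)

    def search(query, top_k=2):
        """Return the top_k titles ranked by cosine similarity to the query."""
        q = model.encode([query], normalize_embeddings=True)[0]
        scores = doc_emb @ q  # cosine similarity, since embeddings are normalised
        best = np.argsort(-scores)[:top_k]
        return [(titles[i], float(scores[i])) for i in best]

    print(search("golden age art in the Netherlands"))
    ```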

    More details are available on request. 

    Contact: Shenghui Wang
    Email: shenghui.wang@utwente.nl

  • Robotics and mechatronics @ Heemskerk Innovative Technology (Delft)

    Company Information:

    Heemskerk Innovative Technology provides advice and support to innovative high-tech projects in the field of robotics and mechatronics. Our mission: convert basic research into innovative business concepts and real-world applications by creating solutions for performing actions in places people themselves cannot reach, making the world smaller and better integrated, in an intuitive way.

    Focus areas:
    Haptics
    Dexterous manipulation
    Master-slave control
    Dynamic contact
    Augmented Reality

    https://heemskerk-innovative.nl

    Example assignments (to be carried out in the first half of 2021):

    Current assignments focus on user-robot interaction, object detection and autonomous object manipulation in real-life settings, and human detection and tracking for navigation in human-populated environments, as part of developing the ROSE healthcare robot. A background in C++/Python and ROS is a plus for students working on these assignments.

    Contact:
    Mariët Theune (EEMCS) <m.theune@utwente.nl>

  • Tasty Bits 'n' Bytes: Food Technology @ het Lansink - Enschede, NL

    The current popularity of ICTs that offer augmented or virtual reality experiences, such as Oculus Rift, Google Glass, and Microsoft HoloLens, suggests that these technologies will become increasingly commonplace in our daily lives. With this, the question arises of how these mixed reality technologies will be of benefit to us in our day-to-day activities. One such activity that could take advantage of mixed reality technologies is the consumption of food and beverages. Considering that the perception of food is highly multisensory, being governed not only by taste and smell but to a strong degree by our visual, auditory and tactile senses, mixed reality technologies could be used to enhance our dining experiences. In the Tasty Bits and Bytes project we will explore the use of mixed reality technology to digitally (using visual, auditory, olfactory, and tactile stimuli) enhance the experience of consuming food and beverages.

    The setting for these challenges and projects is a mixed reality restaurant table at Het Lansink that hosts a variety of technologies to provide a novel food and beverage experience.

    Assignments that can be carried out in collaboration with Het Lansink concern, for example: actuated plates; projection mapping on the table; force feedback; and multimodal taste sensations.

    Website: http://www.tastybitsandbytes.com/

    Contact: Dirk Heylen, Juliet Haarman

  • Addiction, Coaching and Games @ Tactus - Enschede, NL

    Tactus is specialized in the care and treatment of addiction. They offer help to people who suffer from problems as a result of their addiction to alcohol, drugs, medication, gambling or eating. They help by identifying addiction problems as well as preventing and breaking patterns of addiction. They also provide information and advice to parents, teachers and other groups on how to deal with addiction.

    Assignment possibilities include developing game-like support and coaching apps.

    Website: https://www.tactus.nl/enschede

    Contact: Randy Klaassen

  • Enhancing Music Therapy with Technology @ ArtEZ - Enschede, NL

    ArtEZ School of Music has a strong department in Neurologic Music Therapy, which not only trains new therapists but also engages in fundamental research towards evaluating and improving the impact of Music Therapy.

    Music is a powerful tool for influencing people, not only because music is pleasant, but also because music actually has neurological effects on the motor system. That is why music therapy is also used as a rehabilitation instrument for people with various conditions. In this assignment you will work with professional music therapists on developing interactive products to enrich music therapy sessions for various purposes.

    Possibilities include design, development and research assignments on systems such as home practice applications, sound spaces for embodied training, sensing to provide insights to the therapist and/or feedback to the client, etcetera.

    Contact: Dennis Reidsma

  • Stories and Language @ Meertens Institute - Amsterdam, NL

    The Meertens Institute, established in 1926, has been a research institute of the Royal Netherlands Academy of Arts and Sciences (KNAW) since 1952. They study the diversity in language and culture in the Netherlands, with a focus on contemporary research into factors that play a role in determining social identities in Dutch society. Their main fields are:

    • ethnological study of the function, meaning and coherence of cultural expressions
    • structural, dialectological and sociolinguistic study of language variation within Dutch in the Netherlands, with the emphasis on grammatical and onomastic variation.

    Apart from research, the institute also concerns itself with documentation and providing information to third parties in the field of Dutch language and culture. The institute has a large library with numerous collections and an extensive documentation system, of which databases are a substantial part.

    Assignments include text mining, classification and language technology, but also usability and interaction design.

    Website of the institute: http://www.meertens.knaw.nl/cms/

    Contact: Mariët Theune

  • Language and Retrieval @ Elsevier - Amsterdam, NL

    Elsevier is the world's biggest scientific publisher, established in 1880. Elsevier publishes over 2,500 impactful journals including Tetrahedron, Cell and The Lancet. Flagship products include ScienceDirect, Scopus and Reaxys. Increasingly, Elsevier is becoming a major scientific information provider. For specific domains, structured scientific knowledge is extracted for querying and searching from millions of Elsevier and third-party scientific publications (journals, patents and books). In this way, Elsevier is positioning itself as the leading information provider for the scientific and corporate research community.

    Assignment possibilities include text mining, information retrieval, language technology, and other topics.

    Contact: Mariët Theune

  • Interactive Technology for Music Education @ ArtEZ - Enschede, NL

    The bachelor's programme Music in Education of ArtEZ Academy of Music in Enschede increasingly profiles itself with a focus on technology in service of music education. Students and teachers apply digital learning methods for teaching music and experiment with all kinds of digital instruments and music apps. Applying technology in music education goes beyond the application of these tools: interactive music systems have potential in supporting (pre-service) teachers in teaching music in primary education. Still, much research needs to be done.

    Current questions include: What is an optimal medium for presenting direct feedback on the quality of rhythmic music making? What should this feedback look like? 

    HMI students are warmly invited to contribute to this research by creating applications concerning feedback and visualisations for rhythmic music making in primary education. Design playful, interactive musical instruments that engage children in playing rhythms together, or come up with interactive (augmented) solutions that support teachers in guiding children in making music.

    You will work in collaboration with one of the main teachers in the bachelor's programme Music in Education, who is doing his PhD project on this topic.

    Contact: Benno Spieker, Dennis Reidsma

  • Using (neuro)physiological signals @ TNO -- Soesterberg, NL

    At TNO Soesterberg (department of Perceptual and Cognitive Systems) we investigate how we can exploit physiological signals such as EEG brain signals, heart rate, skin conductance, pupil size and eye gaze in order to improve (human-machine) performance and evaluation. An example of a currently running project is predicting individual head rotations from EEG in order to reduce delays in streaming images in head-mounted displays. Other running projects deal with whether and how different physiological measures reflect food experience. Part of the research is done for international customers.

    More examples of projects, as reflected in papers, can be found on Google Scholar.

    We welcome students with skills in machine learning and signal processing and/or who would like to set up experiments, work with human participants and use advanced measurement technology.

    Contact: Jan van Erp <j.b.f.vanerp@utwente.nl>

  • Social VR User Experiences @ TNO -- Den Haag, NL

    In the TNO MediaLab (The Hague), we create innovative media technologies aimed at providing people with new and rich media experiences, which they can enjoy wherever, whenever and with whomever they want. To enable this, we mainly develop video streaming solutions, working from the capture side to the rendering side and looking at many of the aspects involved: coding, transport, synchronisation, orchestration, digital rights management, service delivery, etc. In many cases, we do this work directly for customers such as broadcasters and content distributors.

    As part of this, we currently work on what we call Social VR, or VR conferencing. Virtual Reality applications excel at immersing the user in another reality. But past the wow effect, users may feel the lack of the social interactions that would happen in real life. TNO is exploring ways to bring the experience of sharing moments of life with friends and family into the virtual world. We do this using advanced video-based solutions.

    We are actively looking for students to contribute to:

    - evaluating, analysing and improving the user experience of the services developed, e.g. working on user embodiment, presence, HCI-aspects, etc.

    - the technical development of the platform (i.e. prototyping), e.g. working on spatial audio, 3D video, spatial orchestration, etc.

    Focus of assignments can be on one aspect or the other, or a combination of both.

    More info of what we do at TNO Medialab can be found here: https://tnomedialab.github.io/

    Contact: Jan van Erp <j.b.f.vanerp@utwente.nl>

  • Social Robots in Care @ Vierstroom - Gouda, NL

    Vierstroom is a care organisation offering care at people's homes (to help people live at home, independently, for a longer time) and in a care home. As part of that, Vierstroom also wants to explore how novel technology, such as social robots but also other kinds of technology, can improve the services they offer to their clients.

    As part of this, they are looking for creative and independent thesis students who can help them explore novel interventions with social robots for elderly people, for example in the setting of a "buurthuis" (community centre). For this they have at least an iPal robot available, and possibly other robot platforms. Assignments will include the design, implementation, and evaluation of the intervention.

    Contact person: Dennis Reidsma

  • AR for Movement and Health @ Holomoves - Utrecht, NL

    Holomoves is a company in Utrecht that combines Hololens Augmented Reality with expertise in health and physiotherapy, to offer new interventions for rehabilitation and healthy movement in a medical setting. Students can work with them on a variety of assignments including design, user studies, and/or technology development.

    More information on the company: https://holomoves.nl/

    Contact person: Robby van Delden, Dennis Reidsma 

  • Artificial Intelligence & NLP @ Info Support - Veenendaal, NL

    Info Support is a software company that makes high-end custom technology solutions for companies in the financial technology, health, energy, public transport, and agricultural technology sectors. Info Support is located in Veenendaal/Utrecht, NL with research locations in Amsterdam, Den Bosch, and Mechelen, Belgium.

    Info Support has extensive experience in supervising graduating students, with assignments that not only have added scientific value but also impact the clients of Info Support and their clients’ clients. As a university-level graduating student, you will become part of the Research Center within Info Support. This is a group of colleagues who, on top of their job as a consultant, have a strong affinity with scientific research. The Research Center facilitates and stimulates scientific research, with the objective of staying ahead in Artificial Intelligence, Software Architecture, and Software Methodologies that most likely will affect our future.

    Various research assignments in Artificial Intelligence, Machine Learning and Natural Language Processing can be carried out at Info Support.

    Examples of assignments include:

    • finding a way to anonymize streaming data in such a way that it will not affect the utility of AI and Machine Learning models
    • improving the usability of Machine Learning model explanations to make them accessible for people without statistical knowledge
    • generating new scenarios for software testing, based on requirements written in a natural language and definitions of logical steps within the application

    More details are available on request.

    Contact: Mariët Theune

  • New doctor-patient conversations thanks to technology @ MST - Enschede, NL

    New technology makes a difference in medicine: not just by transforming the education of professionals and by inventing new ways to treat patients, but also by changing the communication between doctor and patient. Patients know more than ever thanks to "Doctor Google", and the way doctors communicate is already changing to account for such developments. At the same time, it remains challenging to explain to patients the intricacies of their condition and their treatment. In this assignment, you will explore how novel technology such as 3D printing or augmented reality can be used to transform the nature of doctor-patient conversations in the context of surgery.

    This assignment can be done as a thesis or as an internship, and will be carried out in the context of the 3D print lab of the regional hospital MST.

    https://www.mst.nl/p/mst-opent-medisch-3d-print-laboratorium/

    Contact person: Dennis Reidsma