HMI Student Assignments

If you are a bachelor's or master's student looking for a final thesis assignment, capita selecta, research topic, or internship, you can choose from a large number of internal (at HMI) or external (at a company, research institute, or user organisation) assignments. You can also choose to create one yourself. Searching for a suitable location or a suitable topic often starts here: below you will find a number of topics you can work on.

Note that, in preparation for a final MSc assignment, you must carry out a Research Topics project resulting in a final project proposal. For Research Topics, you need to hand in this form to the Bureau of Educational Affairs (BOZ).

ASSIGNMENTS AND TOPICS THAT CAN BE CARRIED OUT INTERNALLY AT HMI

Repair behaviors for social robots @ HMI - Enschede, NL

When social robots make ‘mistakes’, that can influence how people think about them – can we change this effect by having a robot repair its mistakes?

Robots are becoming more prevalent in situations where they have to interact with people. At the University of Twente we investigate social robots in a variety of situations, from the telepresence robot TERESA, which mediates conversations for elderly people, to EASEL, which supports peer learning in elementary schools.

It is inevitable that robots will make mistakes in such social situations. If not because of technical limitations, then simply because social situations are inherently complex. Even we humans, despite our extensive experience, make mistakes quite commonly. Because of this, there is a whole range of behaviours used in human-human interaction that could repair such mistakes, from blushing to apologizing.

How could a robot try to repair mistakes, and what would be the effect of that? Preliminary results by one of our students, Derk Snijders, suggest that people perceive a robot that apologizes after invading their personal space as more sensitive than one that does not invade their personal space at all. Well-chosen repair behaviours could thus mitigate the effect of social mistakes, and might even have positive effects on a variety of attitudes – from perceived reliability to perceived friendliness.

Your assignment, should you choose to accept it, will be to investigate this further. You will be expected to formulate relevant and feasible research questions within this topic based on existing literature, and to conduct a clean and well-designed experiment to investigate these research questions. 

In addition to conducting research, you will be expected to design the repair behaviours for the robot. Programming these behaviours will not be necessary, since you can instead remote-control the robot (a ‘Wizard of Oz’ approach).

Contact: Jered Vroon <j.h.vroon@utwente.nl>

Human-Robot proxemics @ HMI - Enschede, NL

Much of the social robotics research at the University of Twente focuses on mobile robots having conversations with people. Even though the applications are diverse (e.g. airport robot SPENCER, heritage site tour guide FROG, telepresence robot TERESA), a central question is that of social positioning: how should a social mobile robot position itself with respect to the people with whom it interacts? What distance should a robot keep from the people with whom it tries to have a conversation?

Proxemics, a term first coined by the anthropologist Edward Hall, provides a deceptively direct answer to this question. It defines a range of social distances for different levels of intimacy. For example, the distance for interaction with acquaintances would be between 1.2 and 3.6 meters.

For this reason, proxemics has been actively used in a variety of mobile social robots. Some authors simply use a distance of approximately 1.2 meters as a minimum for their approach and navigation behaviour. Others have looked at different factors that influence the appropriate proxemic distance, such as ‘familiarity with the robot’, ‘gender’, ‘properties of the robot’, and ‘history of pet ownership’.

At the same time, these findings, along with a variety of findings from psychology, suggest that there is more to proxemics than meets the eye; internal factors such as familiarity and self-construal can apparently also have an effect on what is appropriate proxemic behaviour. If that is the case, it should be possible to change what is perceived as appropriate proxemic behaviour of a robot with the right priming/framing. And if that is indeed the case, an interesting argument can be made against straightforwardly applying static proxemic distances to robotics.

Assignments in this area can include empirical studies (user experiments) and/or the design of proper robot behavior concerning proxemics.

Contact: Jered Vroon, Michiel Joosse, Gwenn Englebienne

Touch Interactions and Haptics @ HMI - Enschede, NL

In daily life, we use our sense of touch to interact with the world and everything in it. Yet, in Human-Computer Interaction, the sense of touch is somewhat underexposed; in particular when compared with the visual and auditory modalities. To advance the use of our sense of touch in HCI, we have defined three broad themes in which several assignments (Capita Selecta, Research Topics, Graduation Projects) can be defined. 

Designing haptic interfaces

Many devices use basic vibration motors to provide feedback. While such motors are easy to work with and sufficient for certain applications, the advances in current manufacturing technologies (e.g. 3D printing) and in electronics provide opportunities for creating new forms of haptic feedback. Innovative forms of haptic feedback may even open up completely new application domains. The challenge for students is twofold: (1) exploring the opportunities and limitations of (combinations of) materials, textures, and (self-made) actuators, and (2) coming up with potential use cases.

Multimodal perception of touch

The experience of haptic feedback may not only be governed by what is sensed through the skin, but may also be influenced by other modalities; in particular by the visual modality. VR and AR technologies are prime candidates for studying touch perception, and haptic feedback is even considered ‘the holy grail’ for VR. Questions surrounding, for instance, body ownership in VR, or visuo-haptic illusions in VR (e.g. elongated arms, a third arm), can be interesting starting points for developing valuable multimodal experiences, and for studying the multimodal perception of touch.

Touch as a social cue

Research in psychology has shown that social touch (i.e. being touched by another person) can profoundly influence both the toucher and the recipient of a touch (e.g. decreasing stress, motivating, or showing affect). Current technologies for remote communication could potentially be enriched by adding haptic technology that allows for social touch interactions to take place over a distance. In addition, with social robots becoming more commonplace in both research and everyday life, the question arises how we should engage in social touch with such social robots in a beneficial, appropriate and safe manner. Applications of social touch technology can range from applications related to training and coaching, to entertainment, and to providing care and intimacy. Potential projects in this domain could focus on the development of new forms of social touch technology (interactions), and/or on the empirical investigation of the effects such artificial social touch interactions can have on people.

Contact: Christian Willemse, Gijs Huisman, Merel Jung, Dirk Heylen

Wearables and tangibles assisting young adults with autism in independent living @ IDE - Enschede, NL

In this project we seek socially capable and technically smart students with an interest in technology and health care, to investigate how physical-digital technology may support young adults with autism (ages 17-22) in developing independence in daily living. We build further on insights from earlier projects such as Dynamic Balance and MyDayLight.

(see more about both projects here: http://www.jellevandijk.org/embodiedempowerment/ )

Your assignment is to engage in participatory design in order to conceptualize, prototype and evaluate a new assistive product concept, together with young adults with autism, their parents, and health professionals. You can focus more on the design of concepts, on the prototyping of concepts, on technological work building an adaptive, flexible platform that can be personalized by each individual user, or on developing the ‘co-design’ methods we use with young adults with autism, their parents, and the care professionals.

As a starting point we consider the opportunities offered by wearables with bio-sensing, in combination with ambient intelligent objects in the home (internet-of-things devices, e.g. interactive lighting, ambient audio).

The project forms part of a research collaboration with Karakter, a large youth psychiatric health organization, and various related organizations, which will provide participating families. One goal is to present a proof-of-concept of a promising assistive device – another goal is to explore the most suitable participatory design methods in this use context. Depending on your interests you can focus more on the product or on the method. The ultimate goal of the overall research project is to realize a flexible, adaptive interactive platform that can be tailored to the needs of each individual user – this master project is a first step in that direction.

Contact: jelle.vandijk@utwente.nl

Holodeck: Immersive VR and AR @ HMI - Enschede, NL

What You See Is How It’s Done -- Mixed VR Streaming & Recording for On-Site Support and Training

An invitation to look over the shoulder of an expert (or: having an expert look over your shoulder) is a useful and effective method for receiving assistance or tutoring. One gets to take the expert's perspective while he or she demonstrates how to solve a problem or perform a task, with the subject matter in front of both the expert and the learner. Or, when roles are reversed, the expert can give feedback on the learner's approach while he or she is actively engaged with the problem.

However, there are situations where this form of tutoring cannot be provided. There may not be enough experts for all learners, or the expert may not be able to be at the same location as a person requiring assistance.

One possible solution here is technology that allows one to be present at one or more remote locations at the same time: telepresence. With our system, we aim to explore ‘looking over someone's shoulder’ as a design pattern for developing a new generation of such telepresence systems, specifically for remote assistance and training of “hands-on” tasks. Think of bomb defusal, archaeological digging, doing repairs on an oil rig, or simply learning how to cook pancakes.

The setup we envisage, and for which a prototype is already being developed, consists of two main components. First, the “on-site kit”, consisting of a number of depth sensors and an Augmented Reality Wearable (Microsoft HoloLens). The sensors record and stream (in 3D) the environment of the user, who performs a task in that environment, to a remote viewer (or someone who reviews the recordings at a later time). The remote viewer uses the “remote kit”, consisting only of a common VR system such as the HTC Vive. With it, the recorded or streamed environment can be freely navigated. There is a voice connection between the two and there is the capability to perform (live) annotation in the environment.

With such a system, we can record and stream an expert performing a task using the on-site kit, with multiple learners looking over his or her shoulder at the same time using the remote kit. On the other hand, an expert could also use the remote kit to look over the shoulder of a novice who is using an on-site kit and needs assistance (think of the bomb-defusal scenario).

There are several research challenges for students to work on within this context. On the one hand, there are computer vision & graphics oriented challenges to improve the performance of the streamed data and the visual quality of reconstructed scenes. On the other hand, there are questions on the user interface and experience of such a system, as well as on the evaluation given a scenario.

If you’re interested, get in contact with Dennis Reidsma <d.reidsma@utwente.nl> and Jan Kolkmeier <j.kolkmeier@utwente.nl>

Interactive Surfaces and Tangibles for Creative Storytelling @ HMI - Enschede, NL

Co-located interactive spaces allow people to carry out collaborative tasks in such a way that interaction is enhanced and mediated by technology. It typically involves touch displays and specialized tangible objects that effectively make use of the space around the users.

In this general context, a more concrete research project, coBOTnity, aims to provide an affordable community of hybrid artificial agents (called surface-bots) to be used in collaborative creative storytelling. Technologically speaking, it can be understood as a distributed user interface consisting of intelligent and moveable surfaces.

Thus, there are diverse assignment opportunities for students to make a significant contribution by means of Capita Selecta, Research Topics, Bachelorreferaat, HMI project, Design project, or Graduation Project. Assignments can deal with technology development aspects, with empirical studies evaluating the effectiveness of some existing component, or with a balance of both types of work (technology development + evaluation).

As a sample of what could be done in these assignments (though not limited to this), students could be interested in developing new AI to make the surface-bots more intelligent and responsive in the interactive space, studying interactive storytelling with surface-bots, developing mechanisms to orchestrate multiple surface-bots as a means of expression (e.g. to tell a story), evaluating strategies to make the use of surface-bots more effective, developing and evaluating an application to support users' creativity/learning, etc.

You can find more information about the project at: https://www.utwente.nl/ewi/hmi/cobotnity/

Contact: Alejandro Catalá (a.catala@utwente.nl)

Interpersonal engagement in human-robot relations @ HMI - Enschede, NL

Modern media technology enables people to have social interactions with the technology itself. Robots are a new form of media that people can communicate with as independent entities. Although robots are becoming naturalized in social roles involving companionship, customer service and education, little is known about the extent to which people can feel interpersonal closeness with robots and how social norms around close personal acts apply to robots. What behaviors do people feel comfortable engaging in with robots that have different types of social roles, such as companion robot, customer service robot and teacher robot? Will robots that people can touch, talk to, lead and follow result in social acceptance of behaviors that express interpersonal closeness between a person and a robot? Are such behaviors intrinsically rewarding when done with a responsive robot?

Contact: Jamy Li

Autonomous vehicles @ HMI - Enschede, NL

We study how people react to autonomous vehicles and how those vehicles should be designed for road users both inside and outside of the car. We want to do this using new methods like VR-in-a-car (i.e., active and passive virtual drive systems) and on-the-road field studies in the Twente region. Some of the topics we’re looking at are novelty effects, communication of the vehicle’s intent and different types of road users such as pedestrians.

Contact: David Goedicke, Vicky Charisi, Jamy Li

Social robotics for children on the autism spectrum @ HMI - Enschede, NL

Using a robot to teach autistic children social skills can make learning those skills fun. Robots are, after all, novel and exciting, and give the child certain bragging rights afterwards. However, there might be some unique reasons why robots are particularly useful to autistic children that go beyond this superficial novelty.

Children on the autism spectrum are characterized by difficulties in social communication and interaction, and by repetitive, stereotyped behaviour. They may talk to Siri about the weather day in and day out, fascinated by Siri's knowledge and ability to answer the same question over and over again. The deficits in social communication and interaction may mean that these children have difficulties understanding another person's emotions and how their own behaviour affects others. Some autistic children will avoid eye contact or are unable to communicate verbally. These are just some examples of what kind of behaviour to expect – the user group is notoriously heterogeneous.

While social interaction can be very difficult for autistic children, and something they often try to avoid, they do interact socially with robots. One explanation of why autistic children respond well to robots is that robots can be very predictable in their behaviour, unlike humans. The robot can answer the same question over and over again, and always respond in the same manner. It doesn't have all the, sometimes frightening, complexities of interacting with another human. Yet autistic children can interact with the robot in a manner similar to how we interact with another human being. Could it be that a robot can provide these children with a safe, more understandable, and less stressful environment in which to learn social skills? If so, how can we design an intervention with a robot where the children can learn social skills? How do we improve the robot's technical systems so that it can respond to the special needs of autistic children?

Assignments in this area can include, but are not limited to (!):

  • designing interactive games with the robot.
  • interface design for therapist and/or child.
  • studying why these children perceive and respond to robots differently.
  • developing technical systems for the robot.

For more information, look at http://de-enigma.eu/

Contact: Bob Schadenberg, Pauline Chevalier, Vicky Charisi, Jamy Li

COMPANIES, EXTERNAL RESEARCH INSTITUTES, AND END USER ORGANISATIONS

Here you can find some of the organisations that are willing to host master students from HMI. Keep in mind that you are not allowed to have both an external (non-research institute) internship and an external final assignment. If you work for a company that is interested in providing internships or final assignments, please contact D.K.J.Heylen[at]utwente.nl

Tasty Bits 'n' Bytes: Food Technology @ het Lansink - Enschede, NL

The current popularity of ICTs that offer augmented or virtual reality experiences, such as Oculus Rift, Google Glass, and Microsoft HoloLens, suggests that these technologies will become increasingly commonplace in our daily lives. With this, the question arises of how these mixed reality technologies will be of benefit to us in our day-to-day activities. One such activity that could take advantage of mixed reality technologies is the consumption of food and beverages. Considering that the perception of food is highly multisensory, governed not only by taste and smell but also, to a strong degree, by our visual, auditory and tactile senses, mixed reality technologies could be used to enhance our dining experiences. In the Tasty Bits and Bytes project we will explore the use of mixed reality technology to digitally enhance the experience of consuming food and beverages, using visual, auditory, olfactory, and tactile stimuli.

The setting for these challenges and projects is a mixed reality restaurant table at Het Lansink that hosts a variety of technologies to provide a novel food and beverage experience.

Assignments that can be carried out in collaboration with Het Lansink concern, for example: actuated plates; projection mapping on the table; force feedback; and multimodal taste sensations.

Website: http://www.tastybitsandbytes.com/

Contact: Merijn Bruijnes, Gijs Huisman, Dirk Heylen

Addiction, Coaching and Games @ Tactus - Enschede, NL

Tactus is specialized in the care and treatment of addiction. They offer help to people who suffer from problems as a result of their addiction to alcohol, drugs, medication, gambling or eating. They help by identifying addiction problems as well as preventing and breaking patterns of addiction. They also provide information and advice to parents, teachers and other groups on how to deal with addiction.

Assignment possibilities include developing game-like support and coaching apps.

Website: https://www.tactus.nl/enschede

Contact: Randy Klaassen

VR, AR, and Interactive Play @ 100%FAT - Enschede, NL

100%FAT creatively merges the latest technological innovations to create new applications and experiences. The result is a unique mix of sensors and actuators that transforms interactivity into a sensory experience. They turn corporate, educational and/or cultural messages into eye-catching, innovative presentations and installations. They have expertise in the design and realization of interactive media concepts including augmented and virtual reality, projection mapping, positional audio, computer vision, motion graphics, 3D art, light, 2D animation and video.

Assignments concern outdoor play, interactive museum experiences, and virtual and augmented reality.

Website: http://100fat.nl/

Contact: Dennis Reidsma

Outdoor Interactive Play @ YALP - Goor, NL

YALP (a subsidiary of Lappset) is a world leader in interactive outdoor play. Over 400 of their interactive systems (Sona dance arch, Sutu soccer wall, Fono DJ booth, Toro sports court and Memo activity zone) are placed in over 20 countries.

YALP has its own R&D department with three academically schooled designers. They have collaborated in multiple research projects with several institutes, including research on using playgrounds with the elderly, measuring movement intensity on a Sona, and testing sports equipment (TNO, 2008-2009); field research on a Memo playground in a field lab setting (TU Delft / Profit, since 2014); research on play intensity on a Toro at the Marc Lammers Plaza (Hogeschool InHolland, 2012); and research on the Fono DJ booth (Hogeschool Saxion, 2014). With these projects, they also created opportunities for student projects, and several of those students ended up in the creative industry.

Assignments include topics such as the design and implementation of new game concepts, systematic evaluation of play, and exploration of smartphone enhanced outdoor play.

Website: https://www.yalpinteractive.com/

Contact: Dennis Reidsma, Robby van Delden

Enhancing Music Therapy with Technology @ ArtEZ - Enschede, NL

ArtEZ School of Music has a strong department in Neurologic Music Therapy, which not only trains new therapists but also engages in fundamental research towards evaluating and improving the impact of Music Therapy.

Music is a powerful tool for influencing people. Not only because music is nice, but also because music has actual neurological effects on the motor system. That is why music therapy is also used as a rehabilitation instrument for people with various conditions. In this assignment you will work with professional music therapists on developing interactive products to enrich music therapy sessions for various purposes.

Possibilities include design, development and research assignments on systems such as home practice applications, sound spaces for embodied training, sensing to provide insights to the therapist and/or feedback to the client, etcetera.

Contact: Dennis Reidsma

Stories and Language @ Meertens Institute - Amsterdam, NL

The Meertens Institute, established in 1926, has been a research institute of the Royal Netherlands Academy of Arts and Sciences (KNAW) since 1952. They study the diversity in language and culture in the Netherlands, with a focus on contemporary research into factors that play a role in determining social identities in Dutch society. Their main fields are:

  • ethnological study of the function, meaning and coherence of cultural expressions
  • structural, dialectological and sociolinguistic study of language variation within Dutch in the Netherlands, with the emphasis on grammatical and onomastic variation.

Apart from research, the institute also concerns itself with documentation and with providing information to third parties in the field of Dutch language and culture. It possesses a large library with numerous collections and a substantial documentation system, of which databases are a major part.

Assignments include text mining, classification, and language technology, but also usability and interaction design.

Website of the institute: http://www.meertens.knaw.nl/cms/

Contact: Mariët Theune

Language, Retrieval, and User Interfaces @ Elsevier - Amsterdam, NL

Elsevier is the world's biggest scientific publisher, established in 1880. Elsevier publishes over 2,500 impactful journals including Tetrahedron, Cell and The Lancet. Flagship products include ScienceDirect, Scopus and Reaxys. Increasingly, Elsevier is becoming a major scientific information provider. For specific domains, structured scientific knowledge is extracted for querying and searching from millions of Elsevier and third-party scientific publications (journals, patents and books). In this way, Elsevier is positioning itself as the leading information provider for the scientific and corporate research community.

Assignment possibilities include text mining, information retrieval, language technology, and other topics.

Contact: Mariët Theune

Interactive Technology for Music Education @ ArtEZ - Enschede, NL

The educational programme for school music teachers at ArtEZ School of Music increasingly profiles itself with a focus on technology in service of music education. One of the main teachers in the programme recently started his PhD project on this topic, in which he explores how interactive technology can improve joint music making in music education in primary schools. To this end he has developed prototypes of a music system for playing together rhythmically.

In this context, students can do assignments concerning feedback and visualisations for rhythmic music making, but also concerning new types of musical instruments and other kinds of creative applications of technology for music education. An important aspect of these projects is that systems have to be tested with actual pupils in primary schools.

Contact: Benno Spieker, Dennis Reidsma

Using (neuro)physiological signals @ TNO -- Soesterberg, NL

At TNO Soesterberg (department of Perceptual and Cognitive Systems) we investigate how we can exploit physiological signals such as EEG brain signals, heart rate, skin conductance, pupil size and eye gaze in order to improve (human-machine) performance and evaluation. An example of a currently running project is predicting individual head rotations from EEG in order to reduce delays in streaming images in head mounted displays. Other running projects deal with whether and how different physiological measures reflect food experience. Part of the research is done for international customers. 

More examples of projects, as reflected in papers, can be found on Google Scholar.

We welcome students with skills in machine learning and signal processing, and/or students who would like to set up experiments and work with human participants and advanced measurement technology.

Contact: Jan van Erp <j.b.f.vanerp@utwente.nl>

Social VR User Experiences @ TNO -- Den Haag, NL

In the TNO MediaLab (The Hague), we create innovative media technologies aimed at providing people with new and rich media experiences, which they can enjoy wherever, whenever and with whomever they want. To enable this, we mainly develop video streaming solutions, working from the capture side to the rendering side, looking at many of the aspects involved: coding, transport, synchronisation, orchestration, digital rights management, service delivery, etc. In many cases, we do this work directly for customers such as broadcasters and content distributors.

As part of this, we currently work on what we call Social VR, or VR conferencing. Virtual Reality applications excel at immersing the user in another reality. But past the wow-effect, users may feel the lack of the social interactions that would happen in real life. TNO is exploring ways to bring into the virtual world the experience of sharing moments of life with friends and family. We do this using advanced video-based solutions.

We are actively looking for students to contribute to:

- evaluating, analysing and improving the user experience of the services developed, e.g. working on user embodiment, presence, HCI-aspects, etc.

- the technical development of the platform (i.e. prototyping), e.g. working on spatial audio, 3D video, spatial orchestration, etc.

Focus of assignments can be on one aspect or the other, or a combination of both.

More info on what we do at the TNO MediaLab can be found here: https://tnomedialab.github.io/

Contact: Jan van Erp <j.b.f.vanerp@utwente.nl>

Embodied interaction with VR & AR for health @ Philips Healthcare - Eindhoven, NL

Philips Eindhoven deals with the subject of “Connected Care”. Connected care is about allowing medical professionals to maintain an effective and efficient connection to their patients. Providing clear information about the patient is key in this process. 

At this R&D group of Philips there is a keen interest in applying VR or AR in such a health care setting. In the longer run, this might be combined with several of Philips' existing and future appliances. As part of the worldwide research and development department of Philips, their interest is in exploring the possibilities of AR & VR for settings such as physical rehabilitation and MRI & CT visualisation.

Your assignment is to research and eventually design a stimulating or informative VR application, starting, of course, by familiarising yourself with the current state of the art and then specifying a better-framed project. It has to include a well-designed experiment to investigate fitting research questions.

In addition to conducting research, you will be expected to design a virtual environment with Unity. Therefore, prior knowledge of and skills in Unity and/or modelling are essential. Help can be provided by the host institute and tutors for the VR-related specifics and (serious) game design mechanics.

Contact: Robby van Delden <r.w.vandelden@utwente.nl>

Automatic subject indexing of text @ OCLC - Leiden, NL

Because of the ever-increasing number of documents that information systems deal with, automatic subject indexing, i.e., identifying and describing the subject(s) of documents to increase their findability, is one of the most desirable features for many such systems. Subject index terms are normally taken from knowledge organization systems (e.g., thesauri, subject heading systems) and classification systems (e.g., the Dewey Decimal Classification), which easily contain thousands or tens of thousands of terms or codes, making the problem a massively multi-class classification problem. Automatically assigning the correct labels to a text is therefore very challenging.

The goal of this project is to explore deep learning methods to train a classifier which can assign a document a small subset of relevant subject labels out of tens of thousands of target labels [1]. Different from traditional binary or multi-class classification problems, this problem of extreme multi-label text classification does not assume that the target labels are independent or mutually exclusive. The training process suffers from data sparsity, i.e., there is a long tail of labels with only a small number of positive training instances. Scalability is another challenge.
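To make the multi-label formulation concrete, below is a minimal sketch (purely illustrative, not the project's actual pipeline) of how it differs from ordinary multi-class classification: the model produces one independent sigmoid output per subject label and is trained with binary cross-entropy, so any subset of labels can be active for a document. The feature dimension, network layout, and toy data are assumptions for illustration only; methods such as [1] additionally use tricks (label trees, sparse output layers) to cope with the huge label space.

    import torch
    import torch.nn as nn

    NUM_LABELS = 90_000   # order of magnitude of the MeSH vocabulary (illustrative)
    FEATURE_DIM = 300     # assumed size of a pre-computed document representation

    class MultiLabelClassifier(nn.Module):
        """One logit per subject label; labels are scored independently."""
        def __init__(self, feature_dim, num_labels):
            super().__init__()
            self.hidden = nn.Linear(feature_dim, 512)
            self.out = nn.Linear(512, num_labels)

        def forward(self, x):
            return self.out(torch.relu(self.hidden(x)))  # raw logits, no softmax

    model = MultiLabelClassifier(FEATURE_DIM, NUM_LABELS)
    criterion = nn.BCEWithLogitsLoss()   # independent binary cross-entropy per label
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

    # Toy batch: 4 "documents" with sparse multi-hot targets (a few labels each).
    docs = torch.randn(4, FEATURE_DIM)
    targets = torch.zeros(4, NUM_LABELS)
    targets[0, [10, 2500]] = 1.0   # document 0 is indexed with two subject labels
    targets[1, [77]] = 1.0         # document 1 with one label

    logits = model(docs)
    loss = criterion(logits, targets)
    loss.backward()
    optimizer.step()

    # At prediction time, keep labels whose probability exceeds a threshold,
    # or take the top-k highest-scoring labels per document.
    probs = torch.sigmoid(logits)
    top5 = probs.topk(k=5, dim=1).indices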

The dataset used in this project is a subset of WorldCat which contains the metadata (title, authors, abstract, etc.) of 27 million Medline articles, mostly indexed with the Medical Subject Headings (MeSH), which contain over 90,000 entry terms [2]. The student is expected to apply or adapt state-of-the-art methods to automatically assign MeSH terms to Medline articles, as well as to design evaluation experiments that measure the performance of the different methods.

[1] Liu, Jingzhou, Wei-Cheng Chang, Yuexin Wu and Yiming Yang. “Deep Learning for Extreme Multi-label Text Classification.” SIGIR (2017).

[2] https://www.nlm.nih.gov/pubs/factsheets/mesh.html

Contact: Gwenn Englebienne