HFC - Tools for Society

Human Factors and Cognition - Tools and Research Outputs for Society

Please explore our research outputs beyond journal articles, including databases, surveys, and supplementary materials.

1.    Guided Survey for Advancing Psychological Interventions with Extended Reality (XR)

Description:

In the rapidly evolving field of psychological research, Extended Reality (XR) technologies offer unprecedented opportunities for innovative intervention design. To harness this potential, the Guided Survey for Researchers: Designing Psychological Interventions with XR was developed at the University of Twente. This survey provides researchers with a structured tool to refine their XR-based psychological intervention ideas through critical considerations and research-driven feedback. Drawing from systematic reviews, it integrates a dual-phase structure addressing foundational and advanced design aspects, culminating in a personalized PDF report with actionable recommendations. Designed to take only 5–10 minutes, the survey inspires novel approaches, identifies key elements, and fosters methodological rigor, contributing to the broader advancement of XR in psychological science.

For questions, contact:

·         Dr. Funda Yıldırım (f.yildirim@utwente.nl)

Link to Survey:

Access the Survey

2.    Immersive Environments for Human Factors and Cognition Research

Description:

Interested in experiencing the virtual environments our group uses for experiments? On this pilot webpage, developed by our intern Evelien Hartog, you can learn about past student projects in this area, find information on VR-related courses offered at the University of Twente, and more. 

For questions, contact:

·         Dr. Funda Yıldırım (f.yildirim@utwente.nl)

Access the webpage here

3.    The Chatbot Usability Scale (BUS-11)

Description:

The BUS-11 is a comprehensive scale for assessing user experience with chatbots and conversational service systems. It measures usability across key dimensions, helping researchers and designers improve the interaction quality of their conversational agents. The scale is available in English, Dutch, Spanish, German, Italian, Turkish, and Chinese.

For questions, contact:

·         Dr. Simone Borsci (s.borsci@utwente.nl)

Access the Chatbot Usability Scale (BUS-11) 
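
For illustration, the sketch below shows how responses to an 11-item questionnaire of this kind could be aggregated into a single score per respondent. The 5-point response format and the mean-based scoring are assumptions made for this sketch and do not represent the validated BUS-11 scoring procedure; please consult the scale documentation before use.

    # Illustrative only: aggregates 11 Likert-type responses per respondent
    # into a single score by averaging. The 5-point format and mean-based
    # scoring are assumptions for this sketch, not the published BUS-11
    # scoring procedure.

    from statistics import mean

    def score_bus11(responses, scale_max=5):
        """Return the mean item score for one respondent's 11 answers."""
        if len(responses) != 11:
            raise ValueError("BUS-11 has 11 items; got %d responses." % len(responses))
        if not all(1 <= r <= scale_max for r in responses):
            raise ValueError("Responses must lie between 1 and %d." % scale_max)
        return mean(responses)

    # Example: one respondent's answers to the 11 items.
    print(score_bus11([4, 5, 4, 3, 5, 4, 4, 5, 3, 4, 5]))  # ~4.18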

4.    The User Experience in Human Robot Collaboration Interaction Scale (UX-HRCI Scale)


Description:

The UX-HRCI Scale is designed to evaluate the quality of interactions between humans and collaborative robots (cobots). It helps assess user satisfaction and usability and identify areas for improving human-robot interaction in industrial and research settings from an experiential point of view. The scale has been pre-validated and is open for testing. More information is available in this preprint article: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4922500.

For questions, contact:

·         Dr. Simone Borsci (s.borsci@utwente.nl)

Access the UX-HRCI Scale*

*The scale is free to use. Please cite even partial usage as: Borsci, S., Prati, E., Willemse, C., Stefánsdóttir, Á., & Peruzzini, M. (2024). Development and Face Validity of The User Experience in Human Robot Collaboration Interaction Scale (UX-HRCI Scale). Available at SSRN.

5.    Human Factors Toolkit for Railways – Human-in-the-Loop Simulation Studies

Description:

This toolkit focuses on enhancing human factors research in railway systems. It provides tools and guidelines for conducting human-in-the-loop (HITL) simulation studies that support safety, efficiency, and user-centered design in railway operations. This is version 1 of the toolkit.

For questions, contact:

·         Dr. Simone Borsci (s.borsci@utwente.nl)

Access the Human Factors Toolkit for Railways for HITL studies*

*The toolkit is free to use. Please cite even partial usage as: Kusumastuti, S. A., Kolkman, T. H., Lo, J. C., & Borsci, S. (2025). Charting the landscape of rail human factors and automation: A systematic scoping review. Transportation Research Interdisciplinary Perspectives, 30, 101350.

6.    Naturalistic Motor Learning Paradigm


Description:

Most motor learning paradigms are performed on a keyboard. To investigate motor learning naturalistically, I have adapted one of the most popular motor sequence learning tasks, the Discrete Sequence Production (DSP) Task, into a dance. The paradigm now incorporates kinematic, behavioural, and EEG measurements for a more complete picture of motor and cognitive functions during motor learning.

For questions, contact:

·         Dr. Russell Chan (r.w.chan@utwente.nl)

Video protocol is here.

The E-Prime script paradigm is here.

The associated preprint is here.

*The paradigm is free to use. Please cite even partial usage as: Chan, R. W., Wiechmann, E., & Verwey, W. (2022, October 1). Motor Sequencing Learning from Dance Step: A whole-body version of the Discrete Sequence Production Task. https://doi.org/10.31234/osf.io/ypt7n.
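
To make the structure of the original keyboard-based DSP task concrete, the sketch below computes per-element response times and sequence accuracy from logged stimulus and response events. The event format, target sequence, and variable names are invented for illustration and do not reflect the E-Prime script or the dance adaptation.

    # Illustrative only: analyse one keyboard-style Discrete Sequence
    # Production (DSP) trial by computing per-element response times and
    # whether the whole sequence was reproduced correctly. The event format
    # (stimulus onsets and timestamped key presses, in seconds) and the
    # target sequence are assumptions for this sketch.

    TARGET_SEQUENCE = ["d", "f", "j", "k", "f", "j"]  # hypothetical 6-key sequence

    def analyse_trial(stimulus_onsets, key_presses):
        """stimulus_onsets: one onset time per sequence element.
        key_presses: (key, press_time) tuples in response order.
        Returns (per-element RTs in ms, sequence-correct flag)."""
        rts_ms = []
        correct = len(key_presses) == len(TARGET_SEQUENCE)
        for i, onset in enumerate(stimulus_onsets):
            if i >= len(key_presses):
                break
            key, press_time = key_presses[i]
            rts_ms.append((press_time - onset) * 1000.0)
            if key != TARGET_SEQUENCE[i]:
                correct = False
        return rts_ms, correct

    # Example: simulated timestamps (seconds) for one trial.
    onsets = [0.00, 0.45, 0.90, 1.35, 1.80, 2.25]
    presses = [("d", 0.32), ("f", 0.78), ("j", 1.21),
               ("k", 1.69), ("f", 2.10), ("j", 2.58)]
    rts, ok = analyse_trial(onsets, presses)
    print([round(rt) for rt in rts], ok)  # [320, 330, 310, 340, 300, 330] True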

7.    YLab: Your Lab to Set Up Sensors and Collect Data

Description:

The goal of YLab is to create an educational platform for learning about physiological measurement and sensor arrays. The idea is a low-budget software/hardware platform that allows students to own their devices.

The platform can be used to teach students about collecting, analysing, and interpreting physiological data, such as electrodermal activity (EDA) or electromyography (EMG). At the same time, it offers a fun and accessible way to teach students programming.

For questions, contact:

·         Martin Schmettow (m.schmettow@utwente.nl)

Tutorial and code can be found on GitHub.
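
As a taste of the kind of analysis students might carry out with YLab data, the sketch below loads a recording of EDA samples and computes a few descriptive features. The single-column CSV layout and the sampling rate are assumptions made for this sketch; the actual YLab data format and tooling are documented in the GitHub repository.

    # Illustrative only: summarise a recording of electrodermal activity (EDA).
    # The single-column CSV layout and 10 Hz sampling rate are assumptions for
    # this sketch; see the YLab repository for the real data format.

    import csv

    SAMPLING_HZ = 10  # assumed sampling rate

    def load_eda(path):
        """Read one EDA value (microsiemens) per CSV row."""
        with open(path, newline="") as f:
            return [float(row[0]) for row in csv.reader(f) if row]

    def summarise(samples):
        """Return duration, mean level, and range of an EDA recording."""
        duration_s = len(samples) / SAMPLING_HZ
        mean_level = sum(samples) / len(samples)
        return {"duration_s": duration_s,
                "mean_uS": round(mean_level, 3),
                "range_uS": round(max(samples) - min(samples), 3)}

    # Example with a short synthetic recording (use load_eda("file.csv") for real data).
    print(summarise([2.1, 2.2, 2.4, 2.9, 3.1, 2.8, 2.5, 2.3, 2.2, 2.1]))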

8.    YET: Your (Ultra-Low-Cost) Eye Tracker


Description:

YET (Your Eye Tracker) is a low-budget eye-tracking platform based on a cheap endoscope camera, 3D-printed parts, and OpenCV image processing.

For questions, contact:

·         Martin Schmettow (m.schmettow@utwente.nl)

Tutorial and code can be found on GitHub.
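
To give an impression of the OpenCV processing involved, the sketch below implements a simplified dark-pupil detection step: grab one camera frame, threshold the dark pupil, and take the centroid of the largest dark blob. The camera index, blur size, and threshold value are assumptions made for this sketch; YET's actual pipeline is documented in the GitHub repository.

    # Illustrative only: a simplified dark-pupil detection step with OpenCV.
    # The camera index, blur size, and threshold value are assumptions for
    # this sketch; YET's actual pipeline lives in the GitHub repository.

    import cv2

    def find_pupil(frame, threshold=60):
        """Return the (x, y) centroid of the largest dark blob, or None."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        blurred = cv2.GaussianBlur(gray, (7, 7), 0)
        # The pupil is the darkest region: keep pixels below the threshold.
        _, mask = cv2.threshold(blurred, threshold, 255, cv2.THRESH_BINARY_INV)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        m = cv2.moments(max(contours, key=cv2.contourArea))
        if m["m00"] == 0:
            return None
        return (m["m10"] / m["m00"], m["m01"] / m["m00"])

    cap = cv2.VideoCapture(0)  # assumed index of the endoscope camera
    ok, frame = cap.read()
    cap.release()
    if ok:
        print("Estimated pupil centre (px):", find_pupil(frame))
    else:
        print("No camera frame available.")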