Bachelor thesis

Human Factors

https://upload.wikimedia.org/wikipedia/commons/e/e6/Flickr_-_Official_U.S._Navy_Imagery_-_Doctors_perform_surgery_together..jpg

BHF1 - REDEFINING AN EXPLICIT MEASURE OF TRUST BEFORE USE: REDESIGN PHASE OF AN ONLINE SURVEY

SUPERVISOR: DR. SIMONE BORSCI


Introduction

Every day, people use multiple technologies to perform complex tasks, such as buying products online, informing their decision making, or supporting their work activities. Several independent lines of evidence in the literature converge on the idea that multiple elements affect people's expectations toward the use of a technology. These include individual attitudes, skills, and capabilities, as well as technology-related aspects such as a product's aesthetics and usability as perceived before use, fluency, brand, and price.

In many cases, (high-risk) processes depend on technology to deliver the appropriate service. It is reasonable to assume that the implicit agreement of this technology-driven world is that people trust the technology they use for task performance and decision making in terms of its performance, functionality, and the reliability of its outcomes. Trust towards technology does not arise immediately; rather, it is built throughout the relationship between user and artefact. It is a set of beliefs about a product's characteristics (e.g., functioning, reliability, safety) that derives from the experience people gain in using different technologies over time. A user's overall trust is therefore strongly related to the concept of user experience: experience with (and exposure to) different products enables people to develop a set of general attitudes and beliefs toward those technologies, including overall trust.

Aim

Redesign a survey on the basis of previous experimental data, and perform a new expert and usability evaluation.

References

Borsci, S., Lawson, G., Salanitri, D., & Jha, B. (2016). When simulated environments make the difference: the effectiveness of different types of training of car service procedures. Virtual Reality, 20(2), 83-99. doi: 10.1007/s10055-016-0286-8

Corbitt, B. J., Thanasankit, T., & Yi, H. (2003). Trust and e-commerce: a study of consumer perceptions. Electronic Commerce Research and Applications, 2(3), 203-215. doi: 10.1016/S1567-4223(03)00024-3

Fruhling, A. L., & Lee, S. M. (2006). The influence of user interface usability on rural consumers' trust of e-health services. International Journal of Electronic Healthcare, 2(4), 305-321. doi: 10.1504/ijeh.2006.010424

Gefen, D. (2000). E-commerce: the role of familiarity and trust. Omega, 28(6), 725-737. doi: 10.1016/S0305-0483(00)00021-9

Karat, C. M., Brodie, C., Karat, J., Vergo, J., & Alpert, S. R. (2003). Personalizing the user experience on ibm.com. IBM Systems Journal, 42(4), 686-701. doi: 10.1147/sj.424.0686

Lankton, N. K., & McKnight, D. H. (2011). What does it mean to trust Facebook? Examining technology and interpersonal trust beliefs. SIGMIS Database, 42(2), 32-54. doi: 10.1145/1989098.1989101

Lawson, G., Salanitri, D., & Waterfield, B. (2016). Future directions for the development of virtual reality within an automotive manufacturer. Applied Ergonomics, 53(Part B), 323-330. doi: 10.1016/j.apergo.2015.06.024

Lippert, S. K., & Swiercz, P. M. (2005). Human resource information systems (HRIS) and technology trust. Journal of Information Science, 31(5), 340-353. doi: 10.1177/0165551505055399

Roy, M. C., Dewit, O., & Aubert, B. A. (2001). The impact of interface usability on trust in Web retailers. Internet Research, 11(5), 388-398. doi: 10.1108/10662240110410165

McKnight, D. H., Carter, M., Thatcher, J. B., & Clay, P. F. (2011). Trust in a specific technology: An investigation of its components and measures. ACM Transactions on Management Information Systems, 2(2), 1-25. doi: 10.1145/1985347.1985353

Montague, E. N. H., Winchester, W. W., & Kleiner, B. M. (2010). Trust in medical technology by patients and healthcare providers in obstetric work systems. Behaviour & Information Technology, 29(5), 541-554. doi: 10.1080/01449291003752914

Pennington, R., Wilcox, H. D., & Grover, V. (2003). The Role of System Trust in Business-to-Consumer Transactions. Journal of Management Information Systems, 20(3), 197-226. doi: 10.1080/07421222.2003.11045777

Salanitri, D., Hare, C., Borsci, S., Lawson, G., Sharples, S., & Waterfield, B. (2015). Relationship Between Trust and Usability in Virtual Environments: An Ongoing Study. In M. Kurosu (Ed.), Human-Computer Interaction: Design and Evaluation: 17th International Conference, HCI International 2015, Los Angeles, CA, USA, August 2-7, 2015, Proceedings, Part I (pp. 49-59). Cham: Springer International Publishing.

Salanitri, D., Lawson, G., & Waterfield, B. (2016). The Relationship Between Presence and Trust in Virtual Reality. Paper presented at the Proceedings of the European Conference on Cognitive Ergonomics, Nottingham, United Kingdom.

Shin, D.-H. (2013). User experience in social commerce: in friends we trust. Behaviour & Information Technology, 32(1), 52-67. doi: 10.1080/0144929x.2012.692167

Ziefle, M., Röcker, C., & Holzinger, A. (2011, July 18-22). Medical Technology in Smart Homes: Exploring the User's Perspective on Privacy, Intimacy and Trust. Paper presented at the 2011 IEEE 35th Annual Computer Software and Applications Conference Workshops.

BHF2 - WHO’S GONNA FALL INTO THE UNCANNY VALLEY?

SUPERVISOR: DR. MARTIN SCHMETTOW


It is expected that robots will soon appear in the areas of senior and health care, where they will support staff (e.g., with lifting patients) and act as social companions (e.g., for elderly and handicapped persons). Yet, people are sometimes skeptical towards technology, especially when they do not understand it. Therefore, an important condition for success is that robots are accepted by clients.

It is commonly assumed that making a social robot more human-like in appearance and behavior will improve acceptance. However, there is a problem with that approach: the emotional response towards a robot increases only up to a certain point. When a robot face closely resembles a human face, without being indistinguishable from it, the emotional response takes a sudden drop. This is called the Uncanny Valley, and the cognitive mechanisms behind this strange phenomenon are currently unclear. Mathur & Reichling (2016) provide evidence that the Uncanny Valley exists. However, in their data analysis, they estimated the relation between human likeness and emotional response only on average. At the same time, MacDorman & Entezari (2015) found individual differences in sensitivity to the Uncanny Valley.

To clarify the underlying cognitive mechanisms, it is crucial to understand whether the Uncanny Valley effect is universal, that is, whether everyone falls into it (not just on average), or whether it affects only a particular group of people.

In this study, you will replicate the study of Mathur & Reichling (2016) using a more powerful experimental design and an extended set of stimuli. By means of multi-level analysis, you will answer the question: does everyone fall into the Uncanny Valley? Depending on the results, you will evaluate how likely common theories of the Uncanny Valley are to be true.

Interested? Ask Martin Schmettow (m.schmettow@utwente.nl)

MacDorman, K. F., & Entezari, S. O. (2015). Individual differences predict sensitivity to the uncanny valley. Interaction Studies, 2(May 2016), 1–47. https://doi.org/10.1075/is.16.2.01mac

Mathur, M. B., & Reichling, D. B. (2016). Navigating a social world with robot partners: A quantitative cartography of the Uncanny Valley. Cognition, 146, 22–32. https://doi.org/10.1016/j.cognition.2015.09.008