Abstract
Trust plays a critical role in both the acceptance and the use of robotic systems. Because trust is a multidimensional, context-dependent construct, its differences and common themes were examined to identify critical considerations within human–robot interaction (HRI). To examine the role of trust within HRI, a measurement tool was generated based on five attributes: team configuration, team processes, context, task, and system (Yagoda in Human Factors and Ergonomics Society Annual Meeting, San Francisco, CA, pp. 304–308, 2010). The HRI trust scale was developed through two studies. The first study used subject matter experts (SMEs) to conduct a content validity assessment of preliminary items generated from a review of previous research in HRI and automation. The second study assessed the quality of each trust scale item derived from the first study. The results were then compiled to generate the HRI trust measurement tool.
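The abstract does not specify which content validity index Study 1 applied; because Lawshe (1975) is cited in the references, a reasonable assumption is that the SME ratings were summarized with his content validity ratio (CVR). The Python sketch below illustrates that computation only; the item names, expert count, and rating tallies are hypothetical.

def content_validity_ratio(n_essential: int, n_experts: int) -> float:
    """Lawshe's CVR = (n_e - N/2) / (N/2), where n_e is the number of SMEs
    rating an item 'essential' and N is the total number of SMEs."""
    half = n_experts / 2
    return (n_essential - half) / half

# Hypothetical SME tallies for three candidate trust-scale items,
# named after attributes listed in the abstract.
N_EXPERTS = 10
ratings = {
    "team_configuration_item": 9,  # SMEs rating the item 'essential'
    "task_item": 7,
    "system_item": 4,
}

for item, n_essential in ratings.items():
    print(f"{item}: CVR = {content_validity_ratio(n_essential, N_EXPERTS):+.2f}")

Items with CVR values below Lawshe's tabled critical value for the panel size would typically be dropped or revised; whether the authors applied that exact retention rule is not stated in the abstract.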
References
Barber B (1983) The logic and limits of trust. Rutgers University Press, New Brunswick
Biros D, Daly M, Gunsch G (2004) The influence of task load and automation trust on deception detection. Group Decis Negot 13:173–189. doi:10.1023/B:GRUP.0000021840.85686.57
Butler JK, Cantrell RS (1984) A behavioral decision theory approach to modeling dyadic trust in superiors and subordinates. Psychol Rep 55:19–28
Casper J, Murphy RR (2003) Human-robot interactions during the robot-assisted urban search and rescue response at the World Trade Center. IEEE Trans Syst Man Cybern, Part B, Cybern 33(3):367–385. doi:10.1109/TSMCB.2003.811794
Cook J, Wall T (1980) New work attitude measures of trust, organizational commitment and personal need non-fulfilment. J Occup Psychol 53:39–52
Deutsch M (1960) The effect of motivational orientation upon trust and suspicion. Hum Relat 13(2):123–139. doi:10.1177/001872676001300202
Gabarro JJ (1978) The development of trust influence and expectations. In: Athos AG, Gabarro JJ (eds) Interpersonal behavior: communication and understanding in relationships. Prentice Hall, Englewood Cliffs, pp 230–290
Groom V, Nass C (2007) Can robots be teammates? Benchmarks in human–robot teams. Interact Stud 8(3):483–500
Hancock PA, Billings DR, Schaefer KE, Chen JYC, de Visser EJ, Parasuraman R (2011) A meta-analysis of factors affecting trust in human-robot interaction. Hum Factors 53(5):517–527. doi:10.1177/0018720811417254
Hovland CI (1953) Communication and persuasion: psychological studies of opinion change. Yale University Press, New Haven
Jennings EE (1967) The mobile manager: a study of the new generation of top executives. Bureau of Industrial Relations, University of Michigan, Ann Arbor
Kauppinen S, Brain C, Moore M (2002) European medium-term conflict detection field trials. In: Proceedings of the 21st digital avionics systems conference. IEEE Press, New York, vol 1, pp 2C1-1–2C1-12. doi:10.1109/DASC.2002.1067918
Langan-Fox J, Sankey MJ, Canty JM (2009) Human factors measurement for future air traffic control systems. Hum Factors 51(5):595–637. doi:10.1177/0018720809355278
Lawshe CH (1975) A quantitative approach to content validity. Pers Psychol 28(4):563–575. doi:10.1111/j.1744-6570.1975.tb01393.x
Lee JD, See KA (2004) Trust in automation: designing for appropriate reliance. Hum Factors 46:50–80. doi:10.1518/hfes.46.1.50
Mayer RC, Davis JH, Schoorman FD (1995) An integrative model of organizational trust. Acad Manag Rev 20(3):709–734
Merritt SM, Ilgen DR (2008) Not all trust is created equal: dispositional and history-based trust in human-automation interactions. Hum Factors 50(2):194–210. doi:10.1518/001872008X288574
Mishra AK (1996) Organizational response to crisis: the centrality of trust. In: Tyler TR, Kramer RM (eds) Trust in organizations: frontiers of theory and research. Sage, Thousand Oaks, pp 261–287
Moorman C, Deshpande R, Zaltman G (1993) Factors affecting trust in market research relationships. J Mark 57(1):81–101. doi:10.2307/1252059
Muir BM (1988) Trust between humans and machines, and the design of decision aids. In: Hollnagel E, Mancini G, Woods DD (eds) Cognitive engineering in complex dynamic worlds. Academic Press, London, pp 71–83
Parasuraman R, Riley V (1997) Humans and automation: use, misuse, disuse, abuse. Hum Factors 39(2):230–253
Rempel JK, Holmes JG, Zanna MP (1985) Trust in close relationships. J Pers Soc Psychol 49(1):95–112
Ross JM (2008) Moderators of trust and reliance across multiple decision aids. University of Central Florida
Rotter JB (1967) A new scale for the measurement of interpersonal trust. J Pers 35(4):651–665. Retrieved from http://www.ncbi.nlm.nih.gov/pubmed/4865583
Rotter JB (1971) Generalized expectancies for interpersonal trust. Am Psychol 26(5):443–452. doi:10.1037/h0031464
Sheridan TB (1975) Considerations in modeling the human supervisory controller. In: IFAC 6th world congress, Laxenburg, Austria, pp 1–6
Sheridan TB, Hennessy RT (1984) Research and modeling of supervisory control behavior. National Academy Press, Washington
Sitkin SB, Roth NL (1993) Explaining the limited effectiveness of legalistic “remedies” for trust/distrust. Organ Sci 4(3):367–392
Taboada M, Martínez-Tomás R, Ferrández JM (2011) New perspectives on the application of expert systems. Expert Syst 28(4):285–287. doi:10.1111/j.1468-0394.2011.00599.x
Yagoda RE (2010) Development of the human robot interaction workload measurement tool. In: Human factors and ergonomics society annual meeting, San Francisco, CA, pp 304–308. doi:10.1177/154193121005400408
Yagoda RE, Coovert MD (2012) How to work and play with robots: an approach to modeling human–robot interaction. Comput Hum Behav 28(1):60–68. doi:10.1016/j.chb.2011.08.011
Yagoda R, Coovert MD (2009) Modeling human-robot interaction with Petri-nets. In: Human factors and ergonomics society annual meeting, San Antonio, TX, pp 1413–1417. doi:10.1177/154193120905301852
Zuboff S (1988) In the age of the smart machine: the future of work and power. Basic Books, New York
Acknowledgements
This research was funded by the United States Army Research Lab Human Research and Engineering Directorate (ARL-HRED) through the United States Army Research Office (ARO), #W911NF-07-R-0001-04.
Cite this article
Yagoda, R.E., Gillan, D.J. You Want Me to Trust a ROBOT? The Development of a Human–Robot Interaction Trust Scale. Int J of Soc Robotics 4, 235–248 (2012). https://doi.org/10.1007/s12369-012-0144-0