Abstract
Advances in artificial intelligence and robotics are providing the technical capabilities that will allow autonomous systems to perform complex tasks in uncertain situations. Despite these technical advances, a lack of human trust leads to inefficient system deployment, increases supervision workload and fails to remove humans from harm’s way. Conversely, excessive trust in autonomous systems may lead to increased risk and potentially catastrophic mission failure. In response to this challenge, trusted autonomy has emerged as the scientific field that aims to establish the foundations and framework for developing trusted autonomous systems.
This paper investigates the use of modelling and simulation (M&S) to advance research into trusted autonomy. The work focuses on a comprehensive M&S-based synthetic environment that monitors operator inputs and provides outputs in a series of interactive, end-user-driven events designed to better understand trust in autonomous systems.
As part of this analysis, a suite of prototype model-based planning, simulation and analysis tools has been designed, developed and tested in the first of a series of distributed interactive events. In each of these events, the applied M&S methodologies were assessed for their ability to answer the question: what are the key mechanisms that affect trust in autonomous systems?
The potential shown by M&S throughout this work paves the way for a wide range of future applications that can be used to better understand trust in autonomous systems and to remove a key barrier to their widespread adoption in the future of defence.
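As a rough illustration of the kind of operator monitoring the abstract describes, the following is a minimal sketch (not taken from the paper) of a synthetic-environment event loop that logs whether an operator relies on or overrides an autonomous planner's recommendation, and reports a simple reliance ratio as a behavioural proxy for trust alongside a post-event survey score. All names, the operator model and the trust-update rules are hypothetical assumptions for illustration only.

# Minimal sketch (illustrative only, not the paper's implementation): log operator
# reliance on an autonomous planner during a simulated event and derive a simple
# behavioural trust proxy. The random "operator" model is a stand-in assumption.
import random
from dataclasses import dataclass, field


@dataclass
class TrustLog:
    accepted: int = 0
    overridden: int = 0
    survey_scores: list = field(default_factory=list)

    def reliance_ratio(self) -> float:
        # Fraction of recommendations the operator accepted (behavioural trust proxy).
        total = self.accepted + self.overridden
        return self.accepted / total if total else 0.0


def autonomous_recommendation(step: int) -> str:
    # Stand-in for the model-based planner: recommend a waypoint label.
    return f"waypoint-{step % 4}"


def operator_decision(recommendation: str, trust_level: float) -> bool:
    # Hypothetical operator model: higher trust makes acceptance more likely.
    return random.random() < trust_level


def run_event(steps: int, initial_trust: float, log: TrustLog) -> None:
    trust = initial_trust
    for step in range(steps):
        rec = autonomous_recommendation(step)
        if operator_decision(rec, trust):
            log.accepted += 1
            trust = min(1.0, trust + 0.02)   # accepted reliance nudges trust up
        else:
            log.overridden += 1
            trust = max(0.0, trust - 0.05)   # overrides nudge trust down
    # Post-event questionnaire score (1-7 Likert), here simulated from final trust.
    log.survey_scores.append(round(1 + 6 * trust, 1))


if __name__ == "__main__":
    random.seed(42)
    log = TrustLog()
    run_event(steps=50, initial_trust=0.6, log=log)
    print(f"Behavioural trust proxy (reliance ratio): {log.reliance_ratio():.2f}")
    print(f"Self-reported trust scores: {log.survey_scores}")

In a real event, the simulated operator would be replaced by live operator inputs captured by the synthetic environment, and the logged reliance data would be analysed alongside questionnaire responses.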
Acknowledgements
The work reported in this paper has been funded by NATO Allied Command Transformation (ACT) Innovation Hub.
Copyright information
© 2022 Springer Nature Switzerland AG
About this paper
Cite this paper
Mansfield, T. et al. (2022). Building Trust in Autonomous Systems: Opportunities for Modelling and Simulation. In: Mazal, J., et al. Modelling and Simulation for Autonomous Systems. MESAS 2021. Lecture Notes in Computer Science, vol 13207. Springer, Cham. https://doi.org/10.1007/978-3-030-98260-7_27
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-98259-1
Online ISBN: 978-3-030-98260-7