Abstract
Much of the practical work conducted to minimise the risk of human error and the loss of human reliability, at least in the oil and gas, chemicals and other process industries, over the past 30 or so years has been based on the model of basic error types known as the Generic Error Modelling System (GEMS). Over roughly the same period, psychologists and behavioural economists have developed a rich understanding of the nature and characteristics of what, in simplified terms, are widely considered to be two styles of thinking, often referred to as "System 1" and "System 2". This paper explores the relationship between the GEMS model and what is known of the functioning of these two styles of thinking, in particular the characteristics and biases associated with System 1 thinking. While many of the ideas behind the two styles of thinking are embedded in the GEMS model, there are some important omissions.
© 2018 Springer International Publishing AG
Cite this paper
McLeod, R.W. (2018). From Reason and Rasmussen to Kahneman and Thaler: Styles of Thinking and Human Reliability in High Hazard Industries. In: Boring, R. (eds) Advances in Human Error, Reliability, Resilience, and Performance. AHFE 2017. Advances in Intelligent Systems and Computing, vol 589. Springer, Cham. https://doi.org/10.1007/978-3-319-60645-3_11
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-60644-6
Online ISBN: 978-3-319-60645-3