Abstract
Drones are often used in contexts where they interact with human users, yet they lack the social cues that their robotic counterparts have. If drones possessed such cues, would people respond to them more positively? This paper investigates people's evaluations of a drone with eyes versus one without. Results show mainly positive effects: a drone with eyes is seen as more social and human-like than one without, and people are more willing to interact with it. These findings imply that adding eyes to a drone designed to interact with humans may make this interaction more natural, and as such enable a successful introduction of social drones.
Copyright information
© 2018 Springer Nature Switzerland AG
About this paper
Cite this paper
Ruijten, P.A.M., Cuijpers, R.H. (2018). If Drones Could See: Investigating Evaluations of a Drone with Eyes. In: Ge, S., et al. (eds) Social Robotics. ICSR 2018. Lecture Notes in Computer Science, vol 11357. Springer, Cham. https://doi.org/10.1007/978-3-030-05204-1_7
DOI: https://doi.org/10.1007/978-3-030-05204-1_7
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-05203-4
Online ISBN: 978-3-030-05204-1