Abstract
Our goal is to enable robots to communicate information about their internal state using expressive light signals. Since nonverbal cues are typically given context by a robot’s other actions, we combined light signals, varying their color, pattern, and frequency, with robot base motion and investigated the effects of these nonverbal signals on human observers’ attribution of certain task-related properties (e.g., urgency and safety) to the robot. The results show that variations in light signal parameters affect human attribution of common notification properties. Using these findings, we present a first step towards systematically generating new light signals that can convey different state-related properties relevant for human-robot interaction, validated in a video-based study.
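The abstract describes light signals parameterized by color, pattern, and frequency. Below is a minimal Python sketch of what such a parameterization could look like; the names (LightSignal, intensity, rgb_at) and the specific pattern set are illustrative assumptions, not the paper's implementation.

```python
from dataclasses import dataclass
import math

@dataclass
class LightSignal:
    """Hypothetical parameterization of an expressive light signal."""
    color: tuple          # base (r, g, b), each channel 0-255
    pattern: str          # "solid", "blink", or "pulse" (sinusoidal fade)
    frequency_hz: float   # cycles per second for periodic patterns

    def intensity(self, t: float) -> float:
        """Brightness in [0, 1] at time t (seconds)."""
        if self.pattern == "solid":
            return 1.0
        phase = (t * self.frequency_hz) % 1.0
        if self.pattern == "blink":
            return 1.0 if phase < 0.5 else 0.0
        if self.pattern == "pulse":
            return 0.5 * (1.0 + math.sin(2.0 * math.pi * phase))
        raise ValueError(f"unknown pattern: {self.pattern}")

    def rgb_at(self, t: float) -> tuple:
        """Scale the base color by the instantaneous intensity."""
        level = self.intensity(t)
        return tuple(int(c * level) for c in self.color)

# One point in the (color, pattern, frequency) design space,
# e.g. a fast red blink; 4 Hz gives a 0.25 s period.
signal = LightSignal(color=(255, 0, 0), pattern="blink", frequency_hz=4.0)
print(signal.rgb_at(0.05))  # on-phase: (255, 0, 0)
print(signal.rgb_at(0.15))  # off-phase: (0, 0, 0)
```

In this framing, each point in the (color, pattern, frequency) space is a candidate signal, and the study measures how varying those parameters, alongside robot base motion, shifts observers' attribution of properties such as urgency and safety.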
Cite this paper
Cha, E., Fitter, N.T., Kim, Y., Fong, T., Matarić, M. (2020). Generating Expressive Light Signals for Appearance-Constrained Robots. In: Xiao, J., Kröger, T., Khatib, O. (eds) Proceedings of the 2018 International Symposium on Experimental Robotics. ISER 2018. Springer Proceedings in Advanced Robotics, vol 11. Springer, Cham. https://doi.org/10.1007/978-3-030-33950-0_51