
Can Robots Elicit Different Comfortability Levels?

  • Conference paper
  • First Online:
Social Robotics (ICSR 2020)

Abstract

Social interactions often entail complex and dynamic situations that follow non-explicit, unwritten rules. Comprehending those signals and knowing how to respond is key to the success of any social communication. Thus, in order to integrate a robot into a social context, it should be capable of (at least) understanding others’ emotional states. Nonetheless, mastering such a skill is beyond the reach of current robotics, which is why we introduce the single internal state that we believe reveals the most about interactive communications. We named it Comfortability and defined it as the (disapproving or approving of the) situation that arises as a result of a social interaction, which influences one’s own desire to maintain or withdraw from it.

Consequently, in this paper we aim to show that Comfortability can be evoked by robots, investigating at the same time its connection with other emotional states. To do that, we performed two online experiments with 196 participants, asking them to imagine being interviewed by a reporter on a sensitive topic. The interviewer’s actions were presented in two different formats: the first experiment (the Narrative Context) presented the actions as text, whereas the second experiment (the Visual Context) presented the actions as videos performed by the humanoid robot iCub. The actions were designed to evoke different Comfortability levels. According to the experimental results, Comfortability differs from the other reported emotional and affective states and, more importantly, it can be evoked by both humans and robots in an imaginary interaction.
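To make the between-format comparison concrete, below is a minimal, purely illustrative sketch (not the authors’ analysis pipeline) of how self-reported Comfortability ratings from the Narrative and Visual contexts could be compared. The 1–7 rating scale, the group sizes, and the simulated data are assumptions introduced only for this example.

```python
# Hypothetical sketch: compare Comfortability ratings between the
# Narrative (text) and Visual (iCub video) contexts.
# The rating scale, group sizes, and data below are invented placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated 1-7 Comfortability ratings, one group per presentation format.
narrative_ratings = rng.integers(1, 8, size=98)  # placeholder data
visual_ratings = rng.integers(1, 8, size=98)     # placeholder data

# A non-parametric test is a common choice for ordinal rating data.
u_stat, p_value = stats.mannwhitneyu(
    narrative_ratings, visual_ratings, alternative="two-sided"
)
print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_value:.3f}")
```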



Acknowledgements

Alessandra Sciutti is supported by a Starting Grant from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation program. G.A. No. 804388, wHiSPER.

Author information


Corresponding author

Correspondence to Maria Elena Lechuga Redondo.



Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Redondo, M.E.L., Vignolo, A., Niewiadomski, R., Rea, F., Sciutti, A. (2020). Can Robots Elicit Different Comfortability Levels? In: Wagner, A.R., et al. Social Robotics. ICSR 2020. Lecture Notes in Computer Science, vol. 12483. Springer, Cham. https://doi.org/10.1007/978-3-030-62056-1_55


  • DOI: https://doi.org/10.1007/978-3-030-62056-1_55

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-62055-4

  • Online ISBN: 978-3-030-62056-1

  • eBook Packages: Computer Science; Computer Science (R0)
