Abstract
Virtual Reality (VR) environments have been used for training, education, and entertainment because of the interactive and embodied experiences the technology provides. Studies have shown that VR can help support individuals with intellectual disabilities in various aspects of their lives. Likewise, conversational agents such as chatbots can bolster competence training and well-being management for this user population. This paper addresses the need for inclusive job interview practice for individuals with intellectual disabilities by presenting the development of AllyChat, a VR application. A two-part development phase is described, with pilot testing at each stage. First, a conversational AI chatbot was built and pilot tested; following positive feedback, it was iterated upon to create an immersive mock job interview experience in VR. A second pilot study with university students then tested the functionality of the high-fidelity prototype. Future work includes further refinement of the VR application and additional testing.
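The paper's title indicates that the conversational agent is driven by few-shot learning with a language model. The abstract does not specify the model or prompt format, so the sketch below is only an illustration of one common few-shot approach: conditioning a language model on a persona plus a handful of example interviewer/candidate turns before the user's latest reply. All names and example turns here are hypothetical, not taken from the AllyChat system.

```python
# Hypothetical few-shot prompt assembly for a mock job interview agent.
# A real system would send the resulting prompt to a large language model;
# here we only show the prompt-construction step.

FEW_SHOT_EXAMPLES = [
    ("Tell me about yourself.",
     "I enjoy working with people, and I am reliable and on time."),
    ("Why do you want this job?",
     "I like helping customers, and I want to learn new skills."),
]

def build_interview_prompt(user_reply: str, examples=FEW_SHOT_EXAMPLES) -> str:
    """Assemble a few-shot prompt: persona, example turns, then the new reply."""
    persona = ("You are a friendly job interviewer. Ask one clear, simple "
               "question at a time and give encouraging feedback.")
    lines = [persona, ""]
    for question, answer in examples:
        lines.append(f"Interviewer: {question}")
        lines.append(f"Candidate: {answer}")
    # Append the user's latest answer and cue the model for the next question.
    lines.append(f"Candidate: {user_reply}")
    lines.append("Interviewer:")
    return "\n".join(lines)

prompt = build_interview_prompt("I worked at a grocery store last summer.")
```

In a few-shot setup like this, the example turns steer the model's tone and question style without any fine-tuning, which is one reason the approach suits rapid prototyping of a specialized agent.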
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Garcia-Pi, B. et al. (2023). AllyChat: Developing a VR Conversational AI Agent Using Few-Shot Learning to Support Individuals with Intellectual Disabilities. In: Abdelnour Nocera, J., Kristín Lárusdóttir, M., Petrie, H., Piccinno, A., Winckler, M. (eds) Human-Computer Interaction – INTERACT 2023. INTERACT 2023. Lecture Notes in Computer Science, vol 14145. Springer, Cham. https://doi.org/10.1007/978-3-031-42293-5_43
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-42292-8
Online ISBN: 978-3-031-42293-5