Abstract
There has been a wave of interest in affect recognition among researchers in affective computing. Most of this research takes a context-independent approach. Since humans may misinterpret others' observed facial, vocal, or body behavior without contextual knowledge, we question whether any of these human-centric affect-sensitive systems can be robust without it. To answer this question, we conducted a study using previously studied audio files under three conditions: no contextual indication, one level of contextual knowledge (either action or relationship/environment), and two levels of contextual knowledge (both action and relationship/environment). Our results confirm that contextual knowledge can indeed improve the recognition of human emotion.
Copyright information
© 2017 Springer International Publishing AG
Cite this paper
Marpaung, A., Gonzalez, A. (2017). Can an Affect-Sensitive System Afford to Be Context Independent?. In: Brézillon, P., Turner, R., Penco, C. (eds) Modeling and Using Context. CONTEXT 2017. Lecture Notes in Computer Science(), vol 10257. Springer, Cham. https://doi.org/10.1007/978-3-319-57837-8_38
Print ISBN: 978-3-319-57836-1
Online ISBN: 978-3-319-57837-8