Abstract
The HUMAINE project is concerned with developing interfaces that will register and respond to emotion, particularly pervasive emotion (the forms of feeling, expression and action that colour most of human life). The HUMAINE Database provides naturalistic clips that record this kind of material in multiple modalities, together with labelling techniques suited to describing it.
Copyright information
© 2007 Springer-Verlag Berlin Heidelberg
Cite this paper
Douglas-Cowie, E. et al. (2007). The HUMAINE Database: Addressing the Collection and Annotation of Naturalistic and Induced Emotional Data. In: Paiva, A.C.R., Prada, R., Picard, R.W. (eds) Affective Computing and Intelligent Interaction. ACII 2007. Lecture Notes in Computer Science, vol 4738. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-74889-2_43
Print ISBN: 978-3-540-74888-5
Online ISBN: 978-3-540-74889-2