Abstract
Facial expressions have long attracted attention as a form of nonverbal communication. In visual applications such as movies, games, and animation, exaggerated expressions tend to be preferred over regular ones because they convey emotions more vividly. In this paper, we propose an automatic method for exaggerating facial expressions from motion-captured data according to a given personality type. The exaggerated expressions are generated by an exaggeration mapping (EM) that transforms facial motions into exaggerated motions. Since individuals differ in personality, the mapping must account for each individual's personality type when exaggerating expressions. The Myers–Briggs Type Indicator, a popular method for classifying personality types, is employed to define the personality-type-based EM. We experimentally validate the EM and the resulting facial expression simulations.
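The core idea of the abstract can be illustrated with a minimal sketch: displacements of motion-capture markers from a neutral pose are amplified by a gain that depends on personality type. Note that the paper's actual EM is derived from motion-capture data; the function name, the per-type gain table, and all numeric values below are hypothetical assumptions for illustration only.

```python
import numpy as np

# Illustrative gains for a few MBTI types (assumed values, not from the paper).
PERSONALITY_GAIN = {"ENFP": 1.6, "ESTJ": 1.3, "INTJ": 1.1, "ISTP": 1.0}

def exaggerate(frame, neutral, mbti_type, base_gain=1.2):
    """Scale each marker's displacement from the neutral pose.

    frame, neutral: (n_markers, 3) arrays of marker positions.
    Unknown types fall back to a gain of 1.0 (no personality scaling).
    """
    gain = base_gain * PERSONALITY_GAIN.get(mbti_type, 1.0)
    return neutral + gain * (frame - neutral)

# Three markers displaced from a zero neutral pose.
neutral = np.zeros((3, 3))
frame = np.array([[0.0, 1.0, 0.0],
                  [0.5, 0.0, 0.0],
                  [0.0, 0.0, 0.2]])
out = exaggerate(frame, neutral, "ENFP")  # displacements scaled by 1.2 * 1.6
```

This linear amplification is only the simplest conceivable form of an exaggeration mapping; the paper's method operates on captured facial motions rather than a single static frame.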
Acknowledgments
This research was partially supported by a Korea Research Foundation Grant (KRF-521-D00398).
Cite this article
Chin, S., Lee, C.Y. & Lee, J. An automatic method for motion capture-based exaggeration of facial expressions with personality types. Virtual Reality 17, 219–237 (2013). https://doi.org/10.1007/s10055-013-0227-8