Abstract
Synthesizing human motions from existing motion capture data is the approach of choice in most applications requiring high-quality visual results. To synthesize motion, short motion segments are usually concatenated into longer sequences by finding transitions at points where character poses are similar. If similarity is measured only by posture correlation, without accounting for the stylistic variations of movement, the resulting motion may exhibit unnatural discontinuities. Highly stylized motions, such as dance performances, are particularly prone to this problem. This work presents a motion analysis framework, based on Laban Movement Analysis, that also accounts for stylistic variations of the movement. Implemented in the context of Motion Graphs, it is used to eliminate potentially problematic transitions and synthesize style-coherent animation without requiring prior labeling of the data. The effectiveness of our method is demonstrated by synthesizing contemporary dance performances that span a variety of emotional states. The algorithm is able to compose highly stylized motions, reminiscent of dance scenarios, using only plausible movements from existing clips.
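As a rough illustration of the idea summarized above, the following Python sketch shows how a motion-graph transition cost might combine plain pose similarity with a windowed style-feature distance. The function names, the feature definitions, the weighting, and the window size are hypothetical placeholders, not the paper's actual implementation.

```python
# Illustrative sketch only (not the paper's implementation): pruning motion-graph
# transitions with a combined pose + style cost. `pose_distance` and
# `lma_features` are hypothetical placeholders for the real descriptors.
import numpy as np

def pose_distance(frame_a, frame_b):
    """Euclidean distance between two poses given as flat arrays of joint positions."""
    return np.linalg.norm(np.asarray(frame_a) - np.asarray(frame_b))

def lma_features(window):
    """Placeholder for per-window LMA-style descriptors (e.g., simple statistics)."""
    window = np.asarray(window)
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

def transition_cost(clip_a, i, clip_b, j, half_window=17, style_weight=0.5):
    """Cost of transitioning from frame i of clip_a to frame j of clip_b."""
    pose_term = pose_distance(clip_a[i], clip_b[j])
    window_a = clip_a[max(0, i - half_window): i + half_window + 1]
    window_b = clip_b[max(0, j - half_window): j + half_window + 1]
    style_term = np.linalg.norm(lma_features(window_a) - lma_features(window_b))
    return pose_term + style_weight * style_term

def candidate_transitions(clip_a, clip_b, threshold):
    """Keep only (i, j) frame pairs whose combined cost is below the threshold."""
    return [(i, j)
            for i in range(len(clip_a))
            for j in range(len(clip_b))
            if transition_cost(clip_a, i, clip_b, j) < threshold]
```

In this sketch, raising `style_weight` penalizes transitions between windows whose style descriptors differ, which is the general intuition behind pruning style-incoherent transitions; the paper's concrete features and thresholds may differ.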







Notes
The volume features (\(f^{19}{-}f^{23}\)), apart from describing the LMA Shape component as given in [3], can also hint at the Space component, since they additionally reveal the character’s kinesphere.
Previous Motion Graph implementations have suggested using a shorter window; however, we set it to 35 to make it comparable to the LMA window (a minimal illustrative sketch follows these notes).
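Purely as a hedged illustration of the two notes above, the snippet below sketches one plausible way to compute a volume-style descriptor of the kinesphere over a 35-frame window. The bounding-box definition and the function names are assumptions for illustration, not the paper’s \(f^{19}{-}f^{23}\) features.

```python
# Minimal sketch (assumed, not the paper's exact code): a volume-style feature
# hinting at the kinesphere, evaluated over 35-frame windows.
import numpy as np

WINDOW = 35  # frames, matching the LMA feature window mentioned in the note

def bounding_volume(frame):
    """Volume of the axis-aligned bounding box of the joint positions in one frame.
    `frame` is an (n_joints, 3) array; this is one plausible 'volume' descriptor."""
    frame = np.asarray(frame)
    extents = frame.max(axis=0) - frame.min(axis=0)
    return float(np.prod(extents))

def windowed_volume_feature(clip, start):
    """Mean and spread of the bounding volume over a 35-frame window of the clip."""
    vols = [bounding_volume(clip[min(start + k, len(clip) - 1)]) for k in range(WINDOW)]
    return float(np.mean(vols)), float(np.std(vols))
```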
References
Alexiadis, D.S., Daras, P.: Quaternionic signal processing techniques for automatic evaluation of dance performances from mocap data. IEEE Trans. Multimedia 99, 1–16 (2014)
Arikan, O., Forsyth, D.A.: Interactive motion generation from examples. ACM Trans. Gr. 21(3), 483–490 (2002)
Aristidou, A., Charalambous, P., Chrysanthou, Y.: Emotion analysis and classification: understanding the performers’ emotions using the LMA entities. Comput. Gr. Forum 34(6), 262–276 (2015)
Aristidou, A., Stavrakis, E., Charalambous, P., Chrysanthou, Y., Himona, S.L.: Folk dance evaluation using Laban movement analysis. ACM J. Comput. Cult. Herit. 8(4), 20:1–20:19 (2015)
Aristidou, A., Zeng, Q., Stavrakis, E., Yin, K., Cohen-Or, D., Chrysanthou, Y., Chen, B.: Emotion control of unstructured dance movements. In: Proceedings of the ACM SIGGRAPH/Eurographics Symposium on Computer Animation, SCA ’17. Eurographics Association, Aire-la-Ville, Switzerland (2017)
Brand, M., Hertzmann, A.: Style machines. In: Proceedings of SIGGRAPH ’00, pp. 183–192. ACM Press/Addison-Wesley Publishing Co., New York (2000)
Calvert, T., Wilke, W., Ryman, R., Fox, I.: Applications of computers to dance. IEEE Comput. Gr. Appl. 25(2), 6–12 (2005)
Chan, J., Leung, H., Tang, J., Komura, T.: A virtual reality dance training system using motion capture technology. IEEE Trans. Learn. Technol. 4(2), 187–195 (2011)
Chao, M.W., Lin, C.H., Assa, J., Lee, T.Y.: Human motion retrieval from hand-drawn sketch. IEEE Trans. Visual. Comput. Gr. 18(5), 729–740 (2012)
Chi, D., Costa, M., Zhao, L., Badler, N.: The emote model for effort and shape. In: Proceedings of SIGGRAPH ’00, pp. 173–182. ACM, New York (2000)
Durupinar, F., Kapadia, M., Deutsch, S., Neff, M., Badler, N.I.: PERFORM: perceptual approach for adding OCEAN personality to human motion using Laban movement analysis. ACM Trans. Gr. 36(1), 6:1–6:16 (2016)
Hartmann, B., Mancini, M., Pelachaud, C.: Formational parameters and adaptive prototype instantiation for MPEG-4 compliant gesture synthesis. In: Proceedings of the Computer Animation, CA ’02, pp. 111–120. IEEE Computer Society, Washington, DC, USA (2002)
Hartmann, B., Mancini, M., Pelachaud, C.: Implementing expressive gesture synthesis for embodied conversational agents. In: Proceedings of GW ’05, pp. 188–199. Springer, Berlin (2006)
Hartmann, S., Trunz, E., Krüger, B., Klein, R., Hullin, M.B.: Efficient multi-constrained optimization for example-based synthesis. Vis. Comput./Proc. Comput. Gr. Int. (CGI 2015) 31(6–8), 893–904 (2015)
Holden, D., Saito, J., Komura, T.: A deep learning framework for character motion synthesis and editing. ACM Trans. Gr. 35(4), 138:1–138:11 (2016)
Hsu, E., Pulli, K., Popović, J.: Style translation for human motion. ACM Trans. Gr. 24(3), 1082–1089 (2005)
Kapadia, M., Chiang, I.k., Thomas, T., Badler, N.I., Kider Jr., J.T.: Efficient motion retrieval in large motion databases. In: Proceedings of I3D ’13, pp. 19–28. ACM, New York (2013)
Kovar, L., Gleicher, M.: Automated extraction and parameterization of motions in large data sets. ACM Trans. Gr. 23(3), 559–568 (2004)
Kovar, L., Gleicher, M., Pighin, F.: Motion graphs. ACM Trans. Gr. 21(3), 473–482 (2002)
Krüger, B., Tautges, J., Weber, A., Zinke, A.: Fast local and global similarity searches in large motion capture databases. In: Proceedings of SCA ’10, pp. 1–10 (2010)
Laban, R., Ullmann, L.: The Mastery of Movement, 4th edn. Dance Books Ltd, Binsted (2011)
Lee, J., Chai, J., Reitsma, P.S.A., Hodgins, J.K., Pollard, N.S.: Interactive control of avatars animated with human motion data. ACM Trans. Gr. 21(3), 491–500 (2002)
Li, Y., Wang, T., Shum, H.Y.: Motion texture: a two-level statistical model for character motion synthesis. ACM Trans. Gr. 21(3), 465–472 (2002)
Luo, P., Neff, M.: A perceptual study of the relationship between posture and gesture for virtual characters. In: Motion in Games, pp. 254–265 (2012)
Min, J., Chai, J.: Motion graphs++: a compact generative model for semantic motion analysis and synthesis. ACM Trans. Gr. 31(6), 153:1–153:12 (2012)
Min, J., Liu, H., Chai, J.: Synthesis and editing of personalized stylistic human motion. In: Proceedings of the ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games, I3D ’10, pp. 39–46. ACM, New York (2010)
Müller, M., Röder, T.: Motion templates for automatic classification and retrieval of motion capture data. In: Proceedings of the ACM SIGGRAPH/Eurographics Symposium on Computer Animation, SCA ’06, pp. 137–146 (2006)
Müller, M., Röder, T., Clausen, M.: Efficient content-based retrieval of motion capture data. ACM Trans. Gr. 24(3), 677–685 (2005)
Nakata, T., Mori, T., Sato, T.: Analysis of impression of robot bodily expression. J. Robot. Mechatron. 14(1), 27–36 (2002)
Okajima, S., Wakayama, Y., Okada, Y.: Human motion retrieval system based on LMA features using IEC method. In: Innovations in Intelligent Machines, pp. 117–130 (2012)
Oliveira, J.L., Naveda, L.A., Gouyon, F., Leman, M., Reis, L.P.: Synthesis of variable dancing styles based on a compact spatiotemporal representation of dance. In: Proceedings of IROS ’10 (2010)
Pearson, K.: Notes on the history of correlation. Biometrika 13(1), 25–45 (1920)
Pejsa, T., Pandzic, I.S.: State of the art in example-based motion synthesis for virtual characters in interactive applications. Comput. Gr. Forum 29(1), 202–226 (2010)
Ren, L., Patrick, A., Efros, A.A., Hodgins, J.K., Rehg, J.M.: A data-driven approach to quantifying natural human motion. ACM Trans. Gr. 24(3), 1090–1097 (2005)
Russell, J.A.: A circumplex model of affect. J. Personal. Soc. Psychol. 39, 1161–1178 (1980)
Senecal, S., Cuel, L., Aristidou, A., Magnenat-Thalman, N.: Continuous body emotion recognition system during theater performances. Comput. Anim. Virtual Worlds 27(3–4), 311–320 (2016)
Shapiro, A., Cao, Y., Faloutsos, P.: Style components. In: Proceedings of Graphics Interface, GI ’06, pp. 33–39. Canadian Information Processing Society, Toronto, Canada (2006)
Shiratori, T., Nakazawa, A., Ikeuchi, K.: Dancing-to-music character animation. Comput. Gr. Forum 25(3), 449–458 (2006)
Tang, J.K.T., Chan, J.C.P., Leung, H.: Interactive dancing game with real-time recognition of continuous dance moves from 3d human motion capture. In: Proceedings of ICUIMC ’11, pp. 50:1–50:9. ACM, New York (2011)
Torresani, L., Hackney, P., Bregler, C.: Learning motion style synthesis from perceptual observations. In: Proceedings of NIPS’06, pp. 1393–1400 (2006)
Urtasun, R., Glardon, P., Boulic, R., Thalmann, D., Fua, P.: Style-based motion synthesis. Comput. Gr. Forum 23(4), 799–812 (2004)
Vasilescu, M.A.O.: Human motion signatures: analysis, synthesis, recognition. In: Proceedings of the International Conference on Pattern Recognition, ICPR ’02, pp. 456–460. IEEE, Washington, DC (2002)
Wakayama, Y., Okajima, S., Takano, S., Okada, Y.: IEC-based motion retrieval system using Laban movement analysis. In: Proceedings of KES’10, pp. 251–260 (2010)
Wilke, L., Calvert, T., Ryman, R., Fox, I.: From dance notation to human animation: the LabanDancer project: motion capture and retrieval. Comput. Anim. Virtual Worlds 16(3–4), 201–211 (2005)
Xia, S., Wang, C., Chai, J., Hodgins, J.: Realtime style transfer for unlabeled heterogeneous human motion. ACM Trans. Gr. 34(4), 119:1–119:10 (2015)
Yang, Y., Leung, H., Yue, L., Deng, L.: Generating a two-phase lesson for guiding beginners to learn basic dance movements. Comput. Educ. 61, 1–20 (2013)
Yumer, M.E., Mitra, N.J.: Spectral style transfer for human motion between independent actions. ACM Trans. Gr. 35(4), 137 (2016)
Zhao, L., Badler, N.I.: Acquiring and validating motion qualities from live limb gestures. Gr. Models 67(1), 1–16 (2005)
Acknowledgements
This work is co-financed by the European Regional Development Fund and the Republic of Cyprus through the Research Promotion Foundation under contract DIDAKTOR/0311/73.
Electronic supplementary material
Below is the link to the electronic supplementary material.
Supplementary material 1 (mp4 16901 KB)
Cite this article
Aristidou, A., Stavrakis, E., Papaefthimiou, M. et al. Style-based motion analysis for dance composition. Vis Comput 34, 1725–1737 (2018). https://doi.org/10.1007/s00371-017-1452-z