{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,2,21]],"date-time":"2025-02-21T23:36:15Z","timestamp":1740180975114,"version":"3.37.3"},"reference-count":51,"publisher":"Frontiers Media SA","license":[{"start":{"date-parts":[[2022,10,14]],"date-time":"2022-10-14T00:00:00Z","timestamp":1665705600000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"DOI":"10.13039\/501100020963","name":"Moonshot Research and Development Program","doi-asserted-by":"publisher","award":["JPMJMS2011"],"id":[{"id":"10.13039\/501100020963","id-type":"DOI","asserted-by":"publisher"}]}],"content-domain":{"domain":["frontiersin.org"],"crossmark-restriction":true},"short-container-title":["Front. Artif. Intell."],"abstract":"Emotion recognition is useful in many applications such as preventing crime or improving customer satisfaction. Most of current methods are performed using facial features, which require close-up face information. Such information is difficult to capture with normal security cameras. The advantage of using gait and posture over conventional biometrics such as facial features is that gaits and postures can be obtained unobtrusively from faraway, even in a noisy environment. This study aims to investigate and analyze the relationship between human emotions and their gaits or postures. We collected a dataset made from the input of 49 participants for our experiments. Subjects were instructed to walk naturally in a circular walking path, while watching emotion-inducing videos on Microsoft HoloLens 2 smart glasses. An OptiTrack motion-capturing system was used for recording the gaits and postures of participants. The angles between body parts and walking straightness were calculated as features for comparison of body-part movements while walking under different emotions. Results of statistical analyses show that the subjects' arm swings are significantly different among emotions. And the arm swings on one side of the body could reveal subjects' emotions more obviously than those on the other side. Our results suggest that the arm movements together with information of arm side and walking straightness can reveal the subjects' current emotions while walking. That is, emotions of humans are unconsciously expressed by their arm swings, especially by the left arm, when they are walking in a non-straight walking path. We found that arm swings in happy emotion are larger than arm swings in sad emotion. To the best of our knowledge, this study is the first to perform emotion induction by showing emotion-inducing videos to the participants using smart glasses during walking instead of showing videos before walking. This induction method is expected to be more consistent and more realistic than conventional methods. 
Our study will be useful for implementation of emotion recognition applications in real-world scenarios, since our emotion induction method and the walking direction we used are designed to mimic the real-time emotions of humans as they walk in a non-straight walking direction.<\/jats:p>","DOI":"10.3389\/frai.2022.989860","type":"journal-article","created":{"date-parts":[[2022,10,14]],"date-time":"2022-10-14T07:27:05Z","timestamp":1665732425000},"update-policy":"https:\/\/doi.org\/10.3389\/crossmark-policy","source":"Crossref","is-referenced-by-count":6,"title":["Emotional characteristic analysis of human gait while real-time movie viewing"],"prefix":"10.3389","volume":"5","author":[{"given":"Nitchan","family":"Jianwattanapaisarn","sequence":"first","affiliation":[]},{"given":"Kaoru","family":"Sumi","sequence":"additional","affiliation":[]},{"given":"Akira","family":"Utsumi","sequence":"additional","affiliation":[]},{"given":"Nirattaya","family":"Khamsemanan","sequence":"additional","affiliation":[]},{"given":"Cholwich","family":"Nattee","sequence":"additional","affiliation":[]}],"member":"1965","published-online":{"date-parts":[[2022,10,14]]},"reference":[{"key":"B1","first-page":"46","article-title":"Score and rank-level fusion for emotion recognition using genetic algorithm,","volume-title":"2018 IEEE 17th International Conference on Cognitive Informatics &Cognitive Computing (ICCI* CC)","author":"Ahmed","year":"2018"},{"key":"B2","doi-asserted-by":"publisher","DOI":"10.48550\/arXiv.2102.04204","article-title":"The rise of technology in crime prevention: opportunities, challenges and practitioners perspectives","author":"Anderez","year":"2021","journal-title":"arXiv[Preprint].arXiv:2102.04204"},{"key":"B3","doi-asserted-by":"publisher","first-page":"634","DOI":"10.1016\/j.gaitpost.2015.01.012","article-title":"Wavelet-based characterization of gait signal for neurological abnormalities","volume":"41","author":"Baratin","year":"2015","journal-title":"Gait Posture"},{"key":"B4","doi-asserted-by":"publisher","first-page":"159","DOI":"10.1007\/s00221-012-3357-4","article-title":"Expression of emotion in the kinematics of locomotion","volume":"225","author":"Barliya","year":"2013","journal-title":"Exp. Brain Res"},{"key":"B5","doi-asserted-by":"crossref","first-page":"77","DOI":"10.1109\/ACII.2015.7344554","article-title":"Deep learning vs. 
{"key":"B6","doi-asserted-by":"crossref","first-page":"3","DOI":"10.1007\/978-3-319-68533-5_1","article-title":"A survey of using biometrics for smart visual surveillance: gait recognition,","volume-title":"Surveillance in Action","author":"Bouchrika","year":"2018"},{"key":"B7","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1109\/ICoCS.2019.8930761","article-title":"Appreciation of customer satisfaction through analysis facial expressions and emotions recognition,","volume-title":"2019 4th World Conference on Complex Systems (WCCS)","author":"Bouzakraoui","year":"2019"},{"key":"B8","doi-asserted-by":"crossref","first-page":"205","DOI":"10.1145\/1027933.1027968","article-title":"Analysis of emotion recognition using facial expressions, speech and multimodal information,","volume-title":"Proceedings of the 6th International Conference on Multimodal Interfaces","author":"Busso","year":"2004"},{"key":"B9","doi-asserted-by":"crossref","first-page":"800","DOI":"10.1109\/PERCOMW.2018.8480374","article-title":"Emotion recognition through gait on mobile devices,","volume-title":"2018 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops)","author":"Chiu","year":"2018"},{"key":"B10","doi-asserted-by":"publisher","first-page":"201","DOI":"10.1016\/S0167-9457(96)00051-6","article-title":"Principal component models of knee kinematics and kinetics: normal vs. pathological gait patterns","volume":"16","author":"Deluzio","year":"1997","journal-title":"Hum. Movement Sci"},{"key":"B11","doi-asserted-by":"crossref","first-page":"7452","DOI":"10.1109\/EMBC.2013.6611281","article-title":"The influences of emotional intensity for happiness and sadness on walking,","volume-title":"2013 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC)","author":"Destephe","year":"2013"},{"key":"B12","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1109\/ICETST49965.2020.9080735","article-title":"Age estimation and gender classification based on human gait analysis,","volume-title":"2020 International Conference on Emerging Trends in Smart Technologies (ICETST)","author":"Gillani","year":"2020"},{"key":"B13","doi-asserted-by":"publisher","first-page":"202","DOI":"10.1016\/j.humov.2011.05.001","article-title":"Effort-shape and kinematic assessment of bodily expression of emotion during gait","volume":"31","author":"Gross","year":"2012","journal-title":"Hum. Movement Sci"},{"key":"B14","doi-asserted-by":"publisher","first-page":"478","DOI":"10.1016\/j.humov.2017.11.008","article-title":"Not all is noticed: kinematic cues of emotion-specific gait","volume":"57","author":"Halovic","year":"2018","journal-title":"Hum. Movement Sci"},{"key":"B15","doi-asserted-by":"publisher","first-page":"41","DOI":"10.1016\/j.patrec.2018.04.020","article-title":"Multiview gait-based gender classification through pose-based voting","volume":"126","author":"Isaac","year":"2019","journal-title":"Pattern Recogn. Lett"},{"key":"B16","doi-asserted-by":"publisher","first-page":"317","DOI":"10.1016\/S0021-9290(98)00171-7","article-title":"Discrete wavelet transform: a tool in smoothing kinematic data","volume":"32","author":"Ismail","year":"1999","journal-title":"J. Biomech"},{"key":"B17","doi-asserted-by":"publisher","first-page":"79","DOI":"10.1007\/s10919-007-0045-3","article-title":"Recognition of emotions in gait patterns by means of artificial neural nets","volume":"32","author":"Janssen","year":"2008","journal-title":"J. Nonverbal Behav"},{"key":"B18","doi-asserted-by":"publisher","first-page":"341","DOI":"10.1016\/j.humov.2015.01.009","article-title":"Emotional influences on sit-to-walk in healthy young adults","volume":"40","author":"Kang","year":"2015","journal-title":"Hum. Movement Sci"},{"key":"B19","doi-asserted-by":"publisher","first-page":"4022","DOI":"10.1016\/j.jbiomech.2016.10.044","article-title":"The effect of emotion on movement smoothness during gait in healthy young adults","volume":"49","author":"Kang","year":"2016","journal-title":"J. Biomech"},{"key":"B20","doi-asserted-by":"publisher","first-page":"1050","DOI":"10.1109\/TSMCB.2010.2044040","article-title":"Recognition of affect based on gait patterns","volume":"40","author":"Karg","year":"2010","journal-title":"IEEE Trans. Syst. Man Cybernet. B"},{"key":"B21","doi-asserted-by":"publisher","first-page":"119","DOI":"10.1109\/TIFS.2017.2738611","article-title":"Human identification from freestyle walks using posture-based gait feature","volume":"13","author":"Khamsemanan","year":"2017","journal-title":"IEEE Trans. Inform. Forensics Sec"},{"key":"B22","doi-asserted-by":"publisher","first-page":"142","DOI":"10.1016\/j.jelekin.2018.02.007","article-title":"Impacts of using a head-worn display on gait performance during level walking and obstacle crossing","volume":"39","author":"Kim","year":"2018","journal-title":"J. Electromyogr. Kinesiol"},{"key":"B23","doi-asserted-by":"crossref","first-page":"485","DOI":"10.1109\/CIS-RAM47153.2019.9095797","article-title":"Gender classification from gait silhouette using observation angle-based GEIs,","volume-title":"2019 IEEE International Conference on Cybernetics and Intelligent Systems (CIS) and IEEE Conference on Robotics, Automation and Mechatronics (RAM)","author":"Kitchat","year":"2019"},{"key":"B24","doi-asserted-by":"publisher","first-page":"1141","DOI":"10.3389\/fpsyg.2016.01141","article-title":"Inducing sadness and anxiousness through visual media: measurement techniques and persistence","volume":"7","author":"Kuijsters","year":"2016","journal-title":"Front. Psychol"},{"key":"B25","doi-asserted-by":"publisher","first-page":"277","DOI":"10.1016\/S0022-3956(00)00017-0","article-title":"Spatiotemporal gait patterns during over ground locomotion in major depression compared with healthy controls","volume":"34","author":"Lemke","year":"2000","journal-title":"J. Psychiatr. Res"},{"key":"B26","doi-asserted-by":"publisher","first-page":"585","DOI":"10.1109\/TAFFC.2016.2637343","article-title":"Identifying emotions from non-contact gaits information based on Microsoft Kinects","volume":"9","author":"Li","year":"2016","journal-title":"IEEE Trans. Affect. Comput"},{"key":"B27","doi-asserted-by":"publisher","first-page":"e2364","DOI":"10.7717\/peerj.2364","article-title":"Emotion recognition using Kinect motion capture data of human gaits","volume":"4","author":"Li","year":"2016","journal-title":"PeerJ"},{"key":"B28","doi-asserted-by":"publisher","first-page":"3430","DOI":"10.1109\/TIFS.2020.2985535","article-title":"View-independent gait recognition using joint replacement coordinates (JRCs) and convolutional neural network","volume":"15","author":"Limcharoen","year":"2020","journal-title":"IEEE Trans. Inform. Forensics Sec"},{"key":"B29","doi-asserted-by":"publisher","first-page":"112057","DOI":"10.1109\/ACCESS.2021.3102936","article-title":"Gait recognition and re-identification based on regional LSTM for 2-second walks","volume":"9","author":"Limcharoen","year":"2021","journal-title":"IEEE Access"},{"key":"B30","doi-asserted-by":"publisher","first-page":"761","DOI":"10.1109\/TIFS.2010.2069560","article-title":"Gait-based human age estimation","volume":"5","author":"Lu","year":"2010","journal-title":"IEEE Trans. Inform. Forensics Sec"},{"key":"B31","doi-asserted-by":"publisher","first-page":"580","DOI":"10.1097\/PSY.0b013e3181a2515c","article-title":"Embodiment of sadness and depression\u2013gait patterns associated with dysphoric mood","volume":"71","author":"Michalak","year":"2009","journal-title":"Psychosom. Med"},{"key":"B32","doi-asserted-by":"publisher","first-page":"33","DOI":"10.1007\/BF00999605","article-title":"The identification of emotions from gait information","volume":"11","author":"Montepare","year":"1987","journal-title":"J. Nonverbal Behav"},{"key":"B33","doi-asserted-by":"publisher","first-page":"116","DOI":"10.1049\/iet-bmt.2016.0176","article-title":"Gait-based human age classification using a silhouette model","volume":"7","author":"Nabila","year":"2018","journal-title":"IET Biometrics"},{"key":"B34","doi-asserted-by":"publisher","first-page":"2647","DOI":"10.1016\/j.jbiomech.2005.08.014","article-title":"Classification of gait patterns in the time-frequency domain","volume":"39","author":"Nyan","year":"2006","journal-title":"J. Biomech"},{"key":"B35","doi-asserted-by":"publisher","first-page":"814","DOI":"10.1093\/ptj\/78.8.814","article-title":"Multivariate examination of data from gait analysis of persons with stroke","volume":"78","author":"Olney","year":"1998","journal-title":"Phys. Therapy"},{"key":"B36","doi-asserted-by":"crossref","DOI":"10.7551\/mitpress\/1140.001.0001","volume-title":"Affective Computing","author":"Picard","year":"2000"},{"key":"B37","doi-asserted-by":"publisher","first-page":"e10153","DOI":"10.2196\/10153","article-title":"Emotion recognition using smart watch sensor data: mixed-design study","volume":"5","author":"Quiroz","year":"2018","journal-title":"JMIR Mental Health"},{"key":"B38","doi-asserted-by":"publisher","first-page":"15","DOI":"10.1167\/9.6.15","article-title":"Critical features for the perception of emotion from gait","volume":"9","author":"Roether","year":"2009","journal-title":"J. Vision"},{"key":"B39","doi-asserted-by":"publisher","first-page":"243","DOI":"10.1016\/S0167-9457(96)00054-1","article-title":"Functional gait asymmetry in able-bodied subjects","volume":"16","author":"Sadeghi","year":"1997","journal-title":"Hum. Movement Sci"},{"key":"B40","doi-asserted-by":"publisher","first-page":"126","DOI":"10.1016\/j.gaitpost.2020.07.014","article-title":"A head-worn display (\u201csmart glasses\u201d) has adverse impacts on the dynamics of lateral position control during gait","volume":"81","author":"Sedighi","year":"2020","journal-title":"Gait Posture"},{"key":"B41","doi-asserted-by":"publisher","first-page":"e0195106","DOI":"10.1371\/journal.pone.0195106","article-title":"Information presentation through a head-worn display (\u201csmart glasses\u201d) has a smaller influence on the temporal structure of gait variability during dual-task gait compared to handheld displays (paper-based system and smartphone)","volume":"13","author":"Sedighi","year":"2018","journal-title":"PLoS ONE"},{"key":"B42","doi-asserted-by":"publisher","first-page":"605","DOI":"10.1007\/BF02442775","article-title":"Representing and clustering electromyographic gait patterns with multivariate techniques","volume":"19","author":"Shiavi","year":"1981","journal-title":"Med. Biol. Eng. Comput"},{"key":"B43","doi-asserted-by":"publisher","first-page":"617","DOI":"10.1007\/s12369-017-0427-6","article-title":"Automatic affect perception based on body gait and posture: a survey","volume":"9","author":"Stephens-Fripp","year":"2017","journal-title":"Int. J. Soc. Robot"},{"key":"B44","doi-asserted-by":"publisher","first-page":"428","DOI":"10.1016\/j.gaitpost.2017.09.001","article-title":"Self-esteem recognition based on gait pattern using Kinect","volume":"58","author":"Sun","year":"2017","journal-title":"Gait Posture"},{"key":"B45","doi-asserted-by":"crossref","first-page":"24","DOI":"10.1007\/978-3-030-22244-4_4","article-title":"Analysis and prediction of student emotions while doing programming exercises,","volume-title":"International Conference on Intelligent Tutoring Systems","author":"Tiam-Lee","year":"2019"},{"key":"B46","doi-asserted-by":"publisher","first-page":"621","DOI":"10.1007\/s12369-014-0243-1","article-title":"Recognizing emotions conveyed by human gait","volume":"6","author":"Venture","year":"2014","journal-title":"Int. J. Soc. Robot"},{"key":"B47","doi-asserted-by":"publisher","first-page":"247","DOI":"10.1002\/jor.1100080214","article-title":"Dynamic electromyography. I. Numerical representation using principal component analysis","volume":"8","author":"Wootten","year":"1990","journal-title":"J. Orthopaed. Res"},{"key":"B48","article-title":"Emotion recognition from gait analyses: current research and future directions","volume-title":"arXiv[Preprint].arXiv:2003.11461","author":"Xu","year":"2020"},{"key":"B49","doi-asserted-by":"crossref","first-page":"1230","DOI":"10.1109\/ICECA49313.2020.9297630","article-title":"Study of emotion recognition models for socially aware robots and subsequent path mapping,","volume-title":"2020 4th International Conference on Electronics, Communication and Aerospace Technology (ICECA)","author":"Yelwande","year":"2020"},{"key":"B50","doi-asserted-by":"crossref","first-page":"3834","DOI":"10.1109\/ICPR.2010.934","article-title":"Age classification base on gait using HMM,","volume-title":"2010 20th International Conference on Pattern Recognition","author":"Zhang","year":"2010"},{"key":"B51","doi-asserted-by":"publisher","first-page":"e2258","DOI":"10.7717\/peerj.2258","article-title":"Emotion recognition based on customized smart bracelet with built-in accelerometer","volume":"4","author":"Zhang","year":"2016","journal-title":"PeerJ"}],"container-title":["Frontiers in Artificial Intelligence"],"original-title":[],"link":[{"URL":"https:\/\/www.frontiersin.org\/articles\/10.3389\/frai.2022.989860\/full","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2022,10,14]],"date-time":"2022-10-14T07:27:35Z","timestamp":1665732455000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.frontiersin.org\/articles\/10.3389\/frai.2022.989860\/full"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2022,10,14]]},"references-count":51,"alternative-id":["10.3389\/frai.2022.989860"],"URL":"https:\/\/doi.org\/10.3389\/frai.2022.989860","relation":{},"ISSN":["2624-8212"],"issn-type":[{"type":"electronic","value":"2624-8212"}],"subject":[],"published":{"date-parts":[[2022,10,14]]}}}