High-Level Features for Recognizing Human Actions in Daily Living Environments Using Wearable Sensors
Abstract
1. Introduction
2. Related Work
3. Methods
3.1. Setup
3.2. Environment
3.3. Feature Extraction
- Shoulder joint: flexion/extension, abduction/adduction, and internal/external rotation.
- Elbow joint: flexion/extension and pronation/supination.
- Hip joint: flexion/extension, abduction/adduction, and internal/external rotation.
- Knee joint: flexion/extension and internal/external rotation (see the sketch following this list).
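These joint angles are obtained from the orientations of the adjacent body segments tracked by the wearable inertial sensors. Below is a minimal sketch of that derivation, assuming segment orientations are available as unit quaternions and that an intrinsic ZXY Euler sequence stands in for the anatomical decomposition; the function and variable names are illustrative, not the authors' implementation:

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def joint_angles(q_proximal, q_distal, seq="ZXY"):
    """Euler decomposition of the distal segment's rotation relative to
    the proximal one (e.g., upper arm -> forearm for the elbow).

    q_* are (N, 4) unit quaternions in scipy's (x, y, z, w) order;
    returns (N, 3) angles in degrees, which map to the anatomical
    flexion/extension, abduction/adduction and rotation components
    once the sensors are anatomically calibrated.
    """
    rel = R.from_quat(q_proximal).inv() * R.from_quat(q_distal)
    return rel.as_euler(seq, degrees=True)

# Example: a pure 45-degree rotation about the shared x-axis shows up
# as 45 degrees in the middle (X) component of the ZXY sequence.
q_up = R.identity(1).as_quat()
q_fo = R.from_euler("x", [45], degrees=True).as_quat()
print(joint_angles(q_up, q_fo))   # -> [[ 0. 45.  0.]]
```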
Algorithm 1: Summary of the high-level feature extraction

Inputs: T, a segmented time series comprising the joint-angle signals; w, the window size for searching high-level features; mode, the type of high-level feature search (static or dynamic); m, the magnitude value for the templates (one for static mode, one for dynamic mode).
Output: F, the high-level feature vector.
Notation: S, E, H, and K refer to the human joints shoulder, elbow, hip, and knee. fl, ex, ab, ad, ir, er, pr, and su denote the movements of flexion, extension, abduction, adduction, internal rotation, external rotation, pronation, and supination, respectively. d is the distance calculated by the algorithm, and F is the set of high-level features of the joints.
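Algorithm 1 scans each joint-angle signal with windows of size w and matches them against templates of magnitude m, counting matches as high-level features. The following is a minimal sketch of that search, in which the Euclidean distance and the fixed threshold are illustrative stand-ins for the distance d; a DTW distance would fit the same role:

```python
import numpy as np

def count_high_level(signal, w, mode, m, threshold):
    """Count windows of `signal` that match a static or dynamic template."""
    # Constant template for static mode, ramp template for dynamic mode.
    template = np.full(w, m) if mode == "static" else np.linspace(0.0, m, w)
    count = 0
    for start in range(0, len(signal) - w + 1, w):   # non-overlapping windows
        d = np.linalg.norm(signal[start:start + w] - template)  # distance d
        if d < threshold:
            count += 1
    return count

# F collects one such count per joint/movement pair; e.g., the number of
# elbow-flexion episodes in a segmented action:
elbow_flexion = np.concatenate([np.linspace(0, 90, 50), np.linspace(90, 0, 50)])
print(count_high_level(elbow_flexion, w=50, mode="dynamic", m=90, threshold=50.0))  # -> 1
```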
3.4. Action Classification
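The classifiers compared in the results below are Naive Bayes (NB), k-nearest neighbours (KNN), and AdaBoost (AB). A minimal sketch of such a comparison using scikit-learn, where X and y are placeholders for the extracted feature vectors and action labels, and the fold count and hyperparameters (5-fold CV, k = 3, default AdaBoost) are assumptions rather than the paper's protocol:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score

# Placeholder data: 90 instances (as in the dataset) with hypothetical
# feature vectors; labels stand in for the ten actions act01..act10.
X = np.random.rand(90, 24)
y = np.repeat(np.arange(10), 9)

classifiers = {
    "NB": GaussianNB(),
    "KNN": KNeighborsClassifier(n_neighbors=3),  # k=3 is an assumption
    "AB": AdaBoostClassifier(),
}
for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=5)    # stratified 5-fold CV
    print(f"{name}: {scores.mean():.3f} ({scores.std():.3f})")
```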
4. Results
5. Discussion
6. Conclusions
Acknowledgments
References
Action | # of Instances | Action | # of Instances |
---|---|---|---|
Cooking | 7 | Ascending stairs | 3 |
Doing housework | 11 | Descending stairs | 4 |
Eating | 10 | Sitting | 7 |
Grooming | 8 | Standing | 9 |
Mouth care | 8 | Walking | 23 |
Action | NB Recall | NB Specificity | NB F-measure | KNN Recall | KNN Specificity | KNN F-measure | AB Recall | AB Specificity | AB F-measure
---|---|---|---|---|---|---|---|---|---
Cooking | 0.857 | 0.988 | 0.857 | 0.857 | 1.000 | 0.923 | 0.857 | 0.952 | 0.706 |
Doing housework | 0.909 | 0.987 | 0.909 | 0.909 | 0.962 | 0.833 | 0.727 | 1.000 | 0.842 |
Eating | 1.000 | 1.000 | 1.000 | 1.000 | 1.000 | 1.000 | 0.700 | 0.962 | 0.700 |
Grooming | 0.750 | 0.976 | 0.750 | 0.625 | 0.976 | 0.667 | 0.750 | 0.988 | 0.800 |
Mouth care | 0.750 | 0.976 | 0.750 | 1.000 | 1.000 | 1.000 | 0.500 | 0.976 | 0.571 |
Ascending stairs | 0.333 | 0.874 | 0.133 | 0.333 | 1.000 | 0.500 | 0.667 | 1.000 | 0.800 |
Descending stairs | 0.750 | 0.953 | 0.545 | 0.500 | 0.988 | 0.571 | 1.000 | 0.988 | 0.889 |
Sitting | 0.857 | 1.000 | 0.923 | 0.571 | 0.964 | 0.571 | 0.857 | 1.000 | 0.923 |
Standing | 1.000 | 1.000 | 1.000 | 0.889 | 0.963 | 0.800 | 1.000 | 0.988 | 0.947 |
Walking | 0.435 | 0.970 | 0.571 | 0.913 | 0.955 | 0.894 | 1.000 | 0.955 | 0.939 |
Action | NB Recall | NB Specificity | NB F-measure | KNN Recall | KNN Specificity | KNN F-measure | AB Recall | AB Specificity | AB F-measure
---|---|---|---|---|---|---|---|---|---
Cooking | 0.857 | 1.000 | 0.923 | 1.000 | 1.000 | 1.000 | 0.571 | 0.976 | 0.615 |
Doing housework | 0.818 | 0.962 | 0.783 | 0.727 | 0.924 | 0.640 | 0.727 | 0.949 | 0.696 |
Eating | 1.000 | 1.000 | 1.000 | 1.000 | 1.000 | 1.000 | 0.800 | 0.962 | 0.762 |
Grooming | 0.625 | 0.951 | 0.588 | 0.625 | 0.963 | 0.625 | 0.500 | 0.951 | 0.500 |
Mouth care | 0.875 | 0.976 | 0.824 | 0.500 | 1.000 | 0.667 | 0.625 | 0.963 | 0.625 |
Ascending stairs | 0.333 | 1.000 | 0.500 | 0.667 | 0.954 | 0.444 | 0.667 | 1.000 | 0.800 |
Descending stairs | 0.500 | 1.000 | 0.667 | 0.500 | 0.977 | 0.500 | 0.750 | 0.988 | 0.750 |
Sitting | 0.429 | 0.964 | 0.462 | 0.429 | 0.952 | 0.429 | 0.286 | 0.928 | 0.267 |
Standing | 0.667 | 0.938 | 0.600 | 0.778 | 0.914 | 0.609 | 0.333 | 0.938 | 0.353 |
Walking | 0.913 | 0.955 | 0.894 | 0.522 | 0.940 | 0.615 | 0.913 | 0.970 | 0.913 |
Action | NB Recall | NB Specificity | NB F-measure | KNN Recall | KNN Specificity | KNN F-measure | AB Recall | AB Specificity | AB F-measure
---|---|---|---|---|---|---|---|---|---
Cooking | 0.857 | 0.988 | 0.857 | 1.000 | 1.000 | 1.000 | 0.857 | 0.964 | 0.750 |
Doing housework | 0.909 | 0.975 | 0.870 | 1.000 | 0.975 | 0.917 | 0.727 | 1.000 | 0.842 |
Eating | 1.000 | 1.000 | 1.000 | 1.000 | 1.000 | 1.000 | 0.600 | 0.962 | 0.632 |
Grooming | 0.625 | 0.976 | 0.667 | 0.750 | 0.976 | 0.750 | 0.625 | 0.951 | 0.588 |
Mouth care | 0.750 | 0.976 | 0.750 | 0.750 | 1.000 | 0.857 | 0.250 | 0.951 | 0.286 |
Ascending stairs | 0.333 | 0.977 | 0.333 | 0.333 | 1.000 | 0.500 | 0.667 | 0.989 | 0.667 |
Descending stairs | 0.750 | 0.988 | 0.750 | 0.500 | 0.988 | 0.571 | 0.750 | 0.988 | 0.750 |
Sitting | 0.714 | 0.988 | 0.769 | 0.429 | 0.964 | 0.462 | 0.857 | 1.000 | 0.923 |
Standing | 0.889 | 0.988 | 0.889 | 1.000 | 0.951 | 0.818 | 1.000 | 0.988 | 0.947 |
Walking | 0.870 | 0.940 | 0.851 | 0.870 | 0.955 | 0.870 | 1.000 | 0.955 | 0.939 |
Actual \ Classified as | act01 | act02 | act03 | act04 | act05 | act06 | act07 | act08 | act09 | act10
---|---|---|---|---|---|---|---|---|---|---
act01 Cooking | 6 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 |
act02 Doing housework | 0 | 10 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 |
act03 Eating | 0 | 0 | 10 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
act04 Grooming | 0 | 2 | 0 | 5 | 1 | 0 | 0 | 0 | 0 | 0 |
act05 Mouth care | 1 | 0 | 0 | 1 | 6 | 0 | 0 | 0 | 0 | 0 |
act06 Ascending stairs | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
act07 Descending stairs | 0 | 0 | 0 | 0 | 0 | 0 | 3 | 0 | 0 | 1 |
act08 Sitting | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 | 1 | 1 |
act09 Standing | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 8 | 0 |
act10 Walking | 0 | 0 | 0 | 0 | 0 | 2 | 1 | 0 | 0 | 20 |
Actual \ Classified as | act01 | act02 | act03 | act04 | act05 | act06 | act07 | act08 | act09 | act10
---|---|---|---|---|---|---|---|---|---|---
act01 Cooking | 7 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
act02 Doing housework | 0 | 11 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
act03 Eating | 0 | 0 | 10 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
act04 Grooming | 0 | 2 | 0 | 6 | 0 | 0 | 0 | 0 | 0 | 0 |
act05 Mouth care | 0 | 0 | 0 | 2 | 6 | 0 | 0 | 0 | 0 | 0 |
act06 Ascending stairs | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 |
act07 Descending stairs | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 2 |
act08 Sitting | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 | 4 | 0 |
act09 Standing | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 | 0 |
act10 Walking | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 2 | 0 | 20 |
Actual \ Classified as | act01 | act02 | act03 | act04 | act05 | act06 | act07 | act08 | act09 | act10
---|---|---|---|---|---|---|---|---|---|---
act01 Cooking | 6 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
act02 Doing housework | 0 | 8 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
act03 Eating | 2 | 0 | 6 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
act04 Grooming | 0 | 0 | 0 | 5 | 3 | 0 | 0 | 0 | 0 | 0 |
act05 Mouth care | 1 | 0 | 2 | 3 | 2 | 0 | 0 | 0 | 0 | 0 |
act06 Ascending stairs | 0 | 0 | 0 | 0 | 0 | 2 | 1 | 0 | 0 | 0 |
act07 Descending stairs | 0 | 0 | 0 | 0 | 0 | 1 | 3 | 0 | 0 | 0 |
act08 Sitting | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 | 1 | 0 |
act09 Standing | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 | 0 |
act10 Walking | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 23 |
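The per-class values reported above are consistent with recall, specificity, and F-measure derived from these confusion matrices, whose rows are actual classes and columns are predicted classes. A minimal sketch of that derivation:

```python
import numpy as np

def per_class_metrics(cm):
    """Recall, specificity and F-measure for each class of a confusion
    matrix with actual classes in rows and predictions in columns."""
    cm = np.asarray(cm, dtype=float)
    tp = np.diag(cm)                 # correctly classified instances
    fn = cm.sum(axis=1) - tp         # actual class, predicted as another
    fp = cm.sum(axis=0) - tp         # other classes predicted as this one
    tn = cm.sum() - tp - fn - fp
    recall = tp / (tp + fn)
    specificity = tn / (tn + fp)
    precision = tp / np.maximum(tp + fp, 1e-12)
    f_measure = 2 * precision * recall / np.maximum(precision + recall, 1e-12)
    return recall, specificity, f_measure

cm = [[6, 1, 0], [2, 10, 0], [0, 1, 8]]   # toy 3-class example
for vals in per_class_metrics(cm):
    print(np.round(vals, 3))

# E.g., the Walking row/column of the first matrix above gives
# recall 20/23 = 0.870, specificity 63/67 = 0.940 and F-measure 0.851,
# values that also appear in the Walking row of the last per-class table.
```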
Algorithm | Metric | High-Level | Frequency-Domain | Time-Domain
---|---|---|---|---
NB | Recall | 0.822 (0.136) | 0.778 (0.168) | 0.844 (0.255)
NB | Specificity | 0.973 (0.021) | 0.981 (0.020) | 0.986 (0.029)
NB | F-measure | 0.821 (0.125) | 0.777 (0.091) | 0.825 (0.257)
KNN | Recall | 0.833 (0.205) | 0.833 (0.146) | 0.911 (0.189)
KNN | Specificity | 0.975 (0.020) | 0.976 (0.015) | 0.986 (0.015)
KNN | F-measure | 0.826 (0.166) | 0.829 (0.122) | 0.906 (0.142)
AB | Recall | 0.778 (0.223) | 0.800 (0.156) | 0.800 (0.156)
AB | Specificity | 0.971 (0.019) | 0.973 (0.014) | 0.980 (0.010)
AB | F-measure | 0.771 (0.198) | 0.795 (0.108) | 0.802 (0.159)
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).