What Lies Beneath One’s Feet? Terrain Classification Using Inertial Data of Human Walk
Abstract
1. Introduction
- We collected gait data from 40 healthy participants using body-mounted inertial sensors (embedded in smartphones) attached at two body locations: the chest and the lower back. The data were collected on six different types of terrain: carpet, concrete floor, grass, asphalt, soil, and tiles (as explained in Section 2.4). The data can be freely accessed by sending an email to the corresponding author.
- We propose a set of 194 spatio-spectral hand-crafted features per stride, which can be used to train supervised learning classifiers (random forest and support vector machine) to predict terrains. The prediction accuracy remained above 90% for terrains grouped into different classes, such as indoor–outdoor, hard–soft, and combinations of binary, ternary, quaternary, quinary, and senary terrain classes (details in Section 2.4 and Section 3).
- From the experimental results, we found that the lower back is a more suitable sensor placement than the chest for terrain classification, as it produced the highest classification accuracies (details in Section 4.1).
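The pipeline described in the bullets above — per-stride temporal and spectral features feeding a random forest or SVM — can be sketched with scikit-learn, which the paper cites. This is a minimal illustration, not the authors' implementation: the feature set below is a small stand-in for their 194 features, and the two synthetic "terrains" (strides dominated by different frequencies) stand in for real inertial recordings.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def stride_features(stride):
    """stride: (n_samples, 6) array of 3D acceleration + 3D angular velocity."""
    feats = []
    for axis in stride.T:  # one signal per sensor axis
        # temporal features (illustrative subset)
        feats += [axis.mean(), axis.std(), axis.min(), axis.max(),
                  np.median(axis), np.ptp(axis)]
        # spectral features: magnitudes of the first FFT bins
        feats += list(np.abs(np.fft.rfft(axis))[:8])
    return np.array(feats)

def make_stride(freq):
    """Synthetic 100-sample, 6-axis 'stride' dominated by one frequency."""
    t = np.linspace(0.0, 1.0, 100)
    return np.stack(
        [np.sin(2 * np.pi * freq * t + p) + 0.1 * rng.standard_normal(100)
         for p in np.linspace(0.0, 1.0, 6)], axis=1)

# two synthetic "terrains" with different dominant stride frequencies
X = np.array([stride_features(make_stride(f)) for f in [2] * 50 + [5] * 50])
y = np.array([0] * 50 + [1] * 50)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())
```

An `sklearn.svm.SVC` could be dropped in for the random forest unchanged; the paper's k-fold accuracies in Section 3 compare exactly these two classifier families over the same feature matrix.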
2. Materials and Methods
2.1. Selection of Terrains
2.2. Characteristics of the Population
2.3. Placement of Sensors
2.4. Data Collection
2.5. Segmentation of Signals into Strides
2.6. Features Extraction
2.7. Classification
3. Results
3.1. Binary Classification
3.1.1. Indoor–Outdoor Classification
3.1.2. Hard–Soft Classification
3.1.3. Pair-Wise Classification
3.2. Senary Classification
4. Discussion
4.1. Summary of Findings
4.2. Comparison with Existing Approaches
4.3. Limitations
5. Conclusions
Author Contributions
Funding
Conflicts of Interest
Appendix A
Appendix A.1. Ternary Classification
Classification Category | Sensor | All Features (SVM) | All Features (RF) | Temporal (SVM) | Temporal (RF) | Spectral (SVM) | Spectral (RF) | 3D Accel. (SVM) | 3D Accel. (RF) | 3D Ang. Vel. (SVM) | 3D Ang. Vel. (RF) | Precision (SVM) | Precision (RF) | Recall (SVM) | Recall (RF) | f1-Score (SVM) | f1-Score (RF)
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
Ternary | Lower Back | 93.77 | 93.55 | 93.31 | 93.87 | 89.48 | 89.23 | 90.49 | 90.11 | 85.48 | 88.74 | 93.8 | 94.0 | 93.7 | 93.9 | 93.7 | 93.9
Ternary | Chest | 92.78 | 92.29 | 89.56 | 92.99 | 89.73 | 88.40 | 91.91 | 90.78 | 84.69 | 82.63 | 93.0 | 92.5 | 93.0 | 92.4 | 93.0 | 92.4
Quaternary | Lower Back | 91.52 | 91.93 | 87.97 | 91.97 | 85.71 | 86.19 | 87.03 | 88.20 | 80.98 | 86.00 | 92.0 | 91.8 | 91.9 | 91.6 | 91.9 | 91.6
Quaternary | Chest | 90.08 | 89.93 | 85.70 | 90.82 | 84.40 | 82.43 | 87.35 | 89.02 | 77.21 | 76.77 | 90.3 | 89.6 | 90.2 | 89.6 | 90.2 | 89.6
Quinary | Lower Back | 89.67 | 90.24 | 85.11 | 90.42 | 82.69 | 83.70 | 84.15 | 85.77 | 77.21 | 83.39 | 89.6 | 90.5 | 89.5 | 90.2 | 89.5 | 90.2
Quinary | Chest | 87.69 | 87.88 | 82.25 | 88.96 | 80.94 | 79.21 | 84.28 | 86.75 | 72.91 | 72.86 | 87.3 | 87.0 | 87.3 | 87.0 | 87.3 | 87.0

All values are in %; accuracies are reported per feature subset, with precision, recall, and f1-score for each classifier.
Appendix A.2. Quaternary Classification
Appendix A.3. Quinary Classification
References
- Manduchi, R.; Castano, A.; Talukder, A.; Matthies, L. Obstacle Detection and Terrain Classification for Autonomous Off-Road Navigation. Auton. Robot. 2005, 18, 81–102.
- Bedi, P.; Sinha, A.; Agarwal, S.; Awasthi, A.; Prasad, G.; Saini, D. Influence of terrain on modern tactical combat: Trust-based recommender system. Def. Sci. J. 2010, 60, 405–411.
- Wu, X.A.; Huh, T.M.; Mukherjee, R.; Cutkosky, M. Integrated ground reaction force sensing and terrain classification for small legged robots. IEEE Robot. Autom. Lett. 2016, 1, 1125–1132.
- Giguere, P.; Dudek, G. A simple tactile probe for surface identification by mobile robots. IEEE Trans. Robot. 2011, 27, 534–544.
- Belter, D.; Skrzypczyński, P. Rough terrain mapping and classification for foothold selection in a walking robot. J. Field Robot. 2011, 28, 497–528.
- Dornik, A.; Drăguţ, L.; Urdea, P. Classification of soil types using geographic object-based image analysis and Random Forest. Pedosphere 2017.
- Laible, S.; Khan, Y.N.; Bohlmann, K.; Zell, A. 3D lidar- and camera-based terrain classification under different lighting conditions. In Autonomous Mobile Systems; Springer: Berlin/Heidelberg, Germany, 2012; pp. 21–29.
- Schilling, F.; Chen, X.; Folkesson, J.; Jensfelt, P. Geometric and visual terrain classification for autonomous mobile navigation. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Vancouver, BC, Canada, 24–28 September 2017; pp. 2678–2684.
- Ma, X.; Hao, S.; Cheng, Y. Terrain classification of aerial image based on low-rank recovery and sparse representation. In Proceedings of the IEEE 20th International Conference on Information Fusion, Xi’an, China, 10–13 July 2017; pp. 1–6.
- Weszka, J.S.; Dyer, C.R.; Rosenfeld, A. A comparative study of texture measures for terrain classification. IEEE Trans. Syst. Man Cybern. 1976.
- Anantrasirichai, N.; Burn, J.; Bull, D. Terrain classification from body-mounted cameras during human locomotion. IEEE Trans. Cybern. 2015, 45, 2249–2260.
- Peterson, J.; Chaudhry, H.; Abdelatty, K.; Bird, J.; Kochersberger, K. Online Aerial Terrain Mapping for Ground Robot Navigation. Sensors 2018, 18, 630.
- Ojeda, L.; Borenstein, J.; Witus, G.; Karlsen, R. Terrain characterization and classification with a mobile robot. J. Field Robot. 2006, 23, 103–122.
- Zhang, H.; Wu, R.; Li, C.; Zang, X.; Zhang, X.; Jin, H.; Zhao, J. A force-sensing system on legs for biomimetic hexapod robots interacting with unstructured terrain. Sensors 2017, 17, 1514.
- Valada, A.; Spinello, L.; Burgard, W. Deep feature learning for acoustics-based terrain classification. In Robotics Research; Springer: Cham, Switzerland, 2018; pp. 21–37.
- Rothrock, B.; Kennedy, R.; Cunningham, C.; Papon, J.; Heverly, M.; Ono, M. SPOC: Deep learning-based terrain classification for Mars rover missions. AIAA SPACE 2016.
- Brooks, C.A.; Iagnemma, K. Vibration-based terrain classification for planetary exploration rovers. IEEE Trans. Robot. 2005, 21, 1185–1191.
- Zhu, Y.; Jia, C.; Ma, C.; Liu, Q. SURF-BRISK-Based Image Infilling Method for Terrain Classification of a Legged Robot. Appl. Sci. 2019, 9, 1779.
- DuPont, E.M.; Moore, C.A.; Collins, E.G.; Coyle, E. Frequency response method for terrain classification in autonomous ground vehicles. Auton. Robot. 2008, 24, 337–347.
- Lu, L.; Ordonez, C.; Collins, E.G.; DuPont, E.M. Terrain surface classification for autonomous ground vehicles using a 2D laser stripe-based structured light sensor. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, St. Louis, MO, USA, 10–15 October 2009; pp. 2174–2181.
- Delmerico, J.; Giusti, A.; Mueggler, E.; Gambardella, L.M.; Scaramuzza, D. “On-the-spot training” for terrain classification in autonomous air-ground collaborative teams. In International Symposium on Experimental Robotics; Springer: Cham, Switzerland, 2016; pp. 574–585.
- Christie, J.; Kottege, N. Acoustics based terrain classification for legged robots. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May 2016; pp. 3596–3603.
- Bao, L.; Intille, S.S. Activity recognition from user-annotated acceleration data. In International Conference on Pervasive Computing; Springer: Berlin/Heidelberg, Germany, 2004; pp. 1–17.
- Riaz, Q.; Vögele, A.; Krüger, B.; Weber, A. One small step for a man: Estimation of gender, age and height from recordings of one step by a single inertial sensor. Sensors 2015, 15, 31999–32019.
- Zhang, K.; Gao, C.; Guo, L.; Sun, M.; Yuan, X.; Han, T.X.; Zhao, Z.; Li, B. Age Group and Gender Estimation in the Wild with Deep RoR Architecture. IEEE Access 2017, 5, 22492–22503.
- Flora, J.; Lochtefeld, D.; Bruening, D.; Iftekharuddin, K. Improved gender classification using non pathological gait kinematics in full-motion video. IEEE Trans. Hum.-Mach. Syst. 2015, 45, 304–314.
- Janssen, D.; Schöllhorn, W.I.; Lubienetzki, J.; Fölling, K.; Kokenge, H.; Davids, K. Recognition of emotions in gait patterns by means of artificial neural nets. J. Nonverbal Behav. 2008, 32, 79–92.
- Khamsemanan, N.; Nattee, C.; Jianwattanapaisarn, N. Human Identification From Freestyle Walks Using Posture-Based Gait Feature. IEEE Trans. Inf. Forensics Secur. 2018, 13, 119–128.
- Wu, Z.; Huang, Y.; Wang, L.; Wang, X.; Tan, T. A comprehensive study on cross-view gait based human identification with deep CNNs. IEEE Trans. Pattern Anal. Mach. Intell. 2017, 39, 209–226.
- Liew, C.S.; Wah, T.Y.; Shuja, J.; Daghighi, B. Mining personal data using smartphones and wearable devices: A survey. Sensors 2015, 15, 4430–4469.
- Son, D.; Lee, J.; Qiao, S.; Ghaffari, R.; Kim, J.; Lee, J.E.; Song, C.; Kim, S.J.; Lee, D.J. Multifunctional wearable devices for diagnosis and therapy of movement disorders. Nat. Nanotechnol. 2014, 9, 397.
- Riaz, Q.; Tao, G.; Krüger, B.; Weber, A. Motion reconstruction using very few accelerometers and ground contacts. Graph. Model. 2015, 79, 23–38.
- Hu, B.; Dixon, P.; Jacobs, J.; Dennerlein, J.; Schiffman, J. Machine learning algorithms based on signals from a single wearable inertial sensor can detect surface- and age-related differences in walking. J. Biomech. 2018, 71, 37–42.
- Diaz, J.P.; da Silva, R.L.; Zhong, B.; Huang, H.H.; Lobaton, E. Visual Terrain Identification and Surface Inclination Estimation for Improving Human Locomotion with a Lower-Limb Prosthetic. In Proceedings of the 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Honolulu, HI, USA, 18–21 July 2018; pp. 1817–1820.
- Riaz, Q.; Hashmi, M.Z.U.H.; Hashmi, M.A.; Shahzad, M.; Errami, H.; Weber, A. Move Your Body: Age Estimation Based on Chest Movement During Normal Walk. IEEE Access 2019, 7, 28510–28524.
- Steven Eyobu, O.; Han, D. Feature representation and data augmentation for human activity classification based on wearable IMU sensor data using a deep LSTM neural network. Sensors 2018, 18, 2892.
- Sztyler, T.; Stuckenschmidt, H. On-body localization of wearable devices: An investigation of position-aware activity recognition. In Proceedings of the 2016 IEEE International Conference on Pervasive Computing and Communications (PerCom), Sydney, Australia, 14–19 March 2016; pp. 1–9.
- Chung, S.; Lim, J.; Noh, K.J.; Kim, G.; Jeong, H. Sensor Data Acquisition and Multimodal Sensor Fusion for Human Activity Recognition Using Deep Learning. Sensors 2019, 19, 1716.
- Multon, F.; France, L.; Cani-Gascuel, M.P.; Debunne, G. Computer animation of human walking: A survey. J. Vis. Comput. Animat. 1999, 10, 39–54.
- Boenig, D.D. Evaluation of a clinical method of gait analysis. Phys. Ther. 1977, 57, 795–798.
- Azami, H.; Mohammadi, K.; Bozorgtabar, B. An improved signal segmentation using moving average and Savitzky-Golay filter. J. Signal Inf. Process. 2012, 3, 39.
- Guiñón, J.L.; Ortega, E.; García-Antón, J.; Pérez-Herranz, V. Moving average and Savitzky-Golay smoothing filters using Mathcad. In Proceedings of the International Conference on Engineering and Education 2007, Coimbra, Portugal, 3–7 September 2007; pp. 30–39.
- Li, F.; Zhao, C.; Ding, G.; Gong, J.; Liu, C.; Zhao, F. A Reliable and Accurate Indoor Localization Method Using Phone Inertial Sensors. In Proceedings of the ACM Conference on Ubiquitous Computing, Pittsburgh, PA, USA, 5–8 September 2012; ACM: New York, NY, USA, 2012; pp. 421–430.
- Derawi, M.O.; Nickel, C.; Bours, P.; Busch, C. Unobtrusive User-Authentication on Mobile Phones Using Biometric Gait Recognition. In Proceedings of the Sixth International Conference on Intelligent Information Hiding and Multimedia Signal Processing, Darmstadt, Germany, 15–17 October 2010; pp. 306–311.
- Zijlstra, W. Assessment of spatio-temporal parameters during unconstrained walking. Eur. J. Appl. Physiol. 2004, 92, 39–44.
- Pedregosa, F.; Varoquaux, G.; Gramfort, A.; Michel, V.; Thirion, B.; Grisel, O.; Blondel, M.; Prettenhofer, P. Scikit-learn: Machine learning in Python. J. Mach. Learn. Res. 2011, 12, 2825–2830.
- Bengio, Y.; Grandvalet, Y. No unbiased estimator of the variance of k-fold cross-validation. J. Mach. Learn. Res. 2004, 5, 1089–1105.
- Libby, J.; Stentz, A.J. Using sound to classify vehicle-terrain interactions in outdoor environments. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Saint Paul, MN, USA, 14–18 May 2012; pp. 3559–3566.
- Anthony, D.; Basha, E.; Ostdiek, J.; Ore, J.P.; Detweiler, C. Surface classification for sensor deployment from UAV landings. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Seattle, WA, USA, 26–30 May 2015; pp. 3464–3470.
- Yang, K.; Wang, K.; Bergasa, L.; Romera, E.; Hu, W.; Sun, D.; Sun, J.; Cheng, R.; Chen, T.; López, E. Unifying terrain awareness for the visually impaired through real-time semantic segmentation. Sensors 2018, 18, 1506.
- Massad, I.; Dalyot, S. Towards the Crowdsourcing of Massive Smartphone Assisted-GPS Sensor Ground Observations for the Production of Digital Terrain Models. Sensors 2018, 18, 898.
Study | Category | Year | Sensor | Classifier |
---|---|---|---|---|
Anantrasirichai [11] | Vision | 2015 | Camera | SVM |
Dornik [6] | Vision | 2017 | Camera | Random Forest |
Ma et al. [9] | Vision | 2017 | Camera | SRC |
Christie et al. [22] | Acoustic | 2016 | Microphone | SVM |
Valada et al. [15] | Acoustic | 2018 | Microphone | Deep Learning |
Ojeda [13] | Robotics | 2006 | IMU, Motor | ANN |
Giguere et al. [4] | Robotics | 2011 | Tactile | ANN |
Wu et al. [3] | Robotics | 2016 | Tactile | SVM |
Manduchi et al. [1] | Autonomous off-road driving | 2005 | Ladar & Camera | Gaussian Process |
Lu et al. [20] | Autonomous off-road driving | 2009 | Laser | PNN |
Hu et al. [33] | Human Gait | 2018 | IMU | LSTM |
Diaz et al. [34] | Human Gait | 2018 | Camera, IMU | BoW model |
Terrain | Type | Environment |
---|---|---|
Tiles | Hard | Indoor |
Carpet | Soft | Indoor |
Concrete Floor | Hard | Indoor |
Grass | Soft | Outdoor |
Asphalt | Hard | Outdoor |
Soil | Soft | Outdoor |
Variable | Characteristics |
---|---|
Participants | 40 |
Male:Female | 30:10 |
Age (years, mean ± SD) | 29.2 ± 11.4 |
Height (cm, mean ± SD) | 171.2 ± 8.2 |
Weight (kg, mean ± SD) | 67.1 ± 13.3 |
Specification | Accelerometer | Gyroscope |
---|---|---|
Axes | 3 | 3 |
Noise | 0.0029 m/s²/√Hz | 0.01 deg/s/√Hz |
Output Rate | 5 to 100 Hz | 5 to 100 Hz |
Range | ±2 g, ±4 g, ±8 g, ±16 g | ±2000 deg/s |
Resolution | 16 bits | 16 bits |
Classification Category | Sensor | All Features (SVM) | All Features (RF) | Temporal (SVM) | Temporal (RF) | Spectral (SVM) | Spectral (RF) | 3D Accel. (SVM) | 3D Accel. (RF) | 3D Ang. Vel. (SVM) | 3D Ang. Vel. (RF) | Precision (SVM) | Precision (RF) | Recall (SVM) | Recall (RF) | f1-Score (SVM) | f1-Score (RF)
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
Indoor–Outdoor | Lower Back | 97.48 | 96.52 | 96.73 | 96.93 | 94.47 | 93.58 | 95.80 | 93.83 | 92.25 | 93.33 | 97.2 | 96.6 | 97.2 | 96.6 | 97.1 | 96.6
Indoor–Outdoor | Chest | 97.11 | 95.72 | 95.88 | 95.89 | 93.83 | 91.32 | 96.11 | 96.66 | 89.90 | 88.61 | 97.5 | 95.5 | 97.5 | 95.5 | 97.5 | 95.5
Hard–Soft | Lower Back | 90.64 | 92.08 | 86.69 | 91.81 | 85.19 | 87.16 | 85.19 | 88.46 | 79.87 | 86.39 | 89.1 | 91.8 | 89.0 | 91.7 | 89.1 | 91.7
Hard–Soft | Chest | 89.57 | 89.08 | 84.41 | 89.38 | 83.76 | 82.34 | 85.98 | 88.50 | 78.07 | 77.45 | 89.4 | 87.7 | 89.4 | 87.6 | 89.3 | 87.6
Pair-wise | Lower Back | 96.53 | 96.00 | 95.05 | 96.10 | 94.04 | 93.13 | 94.70 | 94.33 | 91.56 | 93.62 | 96.3 | 95.9 | 96.3 | 95.9 | 96.3 | 95.9
Pair-wise | Chest | 96.00 | 95.28 | 94.28 | 95.69 | 93.41 | 91.51 | 94.97 | 94.96 | 89.78 | 88.42 | 96.3 | 95.7 | 96.3 | 95.6 | 96.3 | 95.6

All values are in %; accuracies are reported per feature subset, with precision, recall, and f1-score for each classifier.
Classification Category | Sensor | All Features (SVM) | All Features (RF) | Temporal (SVM) | Temporal (RF) | Spectral (SVM) | Spectral (RF) | 3D Accel. (SVM) | 3D Accel. (RF) | 3D Ang. Vel. (SVM) | 3D Ang. Vel. (RF) | Precision (SVM) | Precision (RF) | Recall (SVM) | Recall (RF) | f1-Score (SVM) | f1-Score (RF)
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
Senary | Lower Back | 87.55 | 88.70 | 82.51 | 89.10 | 80.10 | 81.67 | 81.46 | 84.08 | 74.78 | 81.74 | 86.9 | 87.2 | 86.7 | 86.5 | 86.8 | 86.6
Senary | Chest | 85.78 | 85.99 | 79.54 | 87.56 | 77.91 | 76.76 | 81.45 | 84.76 | 69.13 | 69.67 | 85.7 | 86.1 | 85.7 | 86.2 | 85.6 | 86.1
Study | Category | Indoor–Outdoor (%) | Hard–Soft (%) | Pair-Wise (%) | Ternary (%) | Quaternary (%) | Quinary (%) | Senary (%)
---|---|---|---|---|---|---|---|---
Anthony et al. [49] | Robot | - | 90.00 (Indoor) | - | - | 63.75 (Indoor), 62.34 (Outdoor) | - | -
Hu et al. (2018) [33] | IMU | 96.3 (Indoor only) | - | - | - | - | - | -
Anantrasirichai et al. (2015) [11] | Vision | - | - | - | 82.0 | - | - | -
Diaz et al. (2018) [34] | Vision, IMU | - | - | - | - | - | - | 86.0
Libby et al. [48] | Acoustic | - | - | - | - | - | - | 92.00
Proposed Approach | IMU | 97.48 | 92.08 | 96.53 | 93.87 | 91.97 | 90.42 | 89.10
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Hashmi, M.Z.U.H.; Riaz, Q.; Hussain, M.; Shahzad, M. What Lies Beneath One’s Feet? Terrain Classification Using Inertial Data of Human Walk. Appl. Sci. 2019, 9, 3099. https://doi.org/10.3390/app9153099