Suitability of the Kinect Sensor and Leap Motion Controller—A Literature Review
Abstract
1. Introduction
2. Methods
3. Presenting the Sensors
3.1. Kinect Devices
3.2. The LMC
4. Motion Tracking State of the Art
5. Discussion: Comparisons to Other Sensors
5.1. Kinect Sensor
5.2. Leap Motion Controller
6. Discussion: Accuracy and Precision
6.1. Kinect Sensor
6.1.1. Depth Sensor
- Technical Error of Measurement (TEM)
- Relative Technical Error of Measurement (%TEM)
- Intraclass Correlation Coefficient (ICC)
- Reliability Coefficient (R)
- Standard Error of Measurement (SEM)
- Coefficient of Variation (CV)
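Several of the measures above can be computed directly from repeated sensor readings. A minimal Python sketch for TEM, %TEM, and CV, using hypothetical paired depth readings in millimeters (the values and the two-trial design are illustrative, not from any cited study):

```python
import math

def tem(trial1, trial2):
    """Technical error of measurement for two repeated trials of the same targets."""
    d2 = sum((a - b) ** 2 for a, b in zip(trial1, trial2))
    return math.sqrt(d2 / (2 * len(trial1)))

def relative_tem(trial1, trial2):
    """%TEM: TEM expressed as a percentage of the grand mean of all readings."""
    grand_mean = (sum(trial1) + sum(trial2)) / (len(trial1) + len(trial2))
    return 100 * tem(trial1, trial2) / grand_mean

def coefficient_of_variation(values):
    """CV: sample standard deviation as a percentage of the mean."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / (len(values) - 1)
    return 100 * math.sqrt(var) / mean

# Hypothetical repeated depth readings (mm) of the same four targets.
t1 = [1000.0, 1002.0, 998.0, 1001.0]
t2 = [1001.0, 1000.0, 999.0, 1003.0]
print(round(tem(t1, t2), 3))                      # absolute error, mm
print(round(relative_tem(t1, t2), 3))             # error as % of mean
print(round(coefficient_of_variation(t1), 3))     # spread of trial 1, %
```

ICC, R, and SEM additionally require a between-subjects variance decomposition (e.g. from a repeated-measures ANOVA), so they are omitted from this sketch.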
6.1.2. Skeleton Stream
- Global movements—where the whole body is used
- Bounded movements—where the movement uses only a subset of the whole body
- Symmetric movements—where it is enough to measure “one half” of the body
- With the “Specialized Body Parts Analysis” method: They used three different strategies with three different bounds to calculate the exact rate of correct movement prediction for four different movements, two of which can be performed with either half of the body. Using only the arms or the legs gave the worst results (52.45%), mainly because the legs could not be tracked in almost every instance; using the whole body performed better (92.42%); and, interestingly, using only the arms or the legs together with the trunk gave the best results (97.37%).
- With the “Stricter Restrictions for the Most Easily Detectable Activities” method: This method tries to reduce postural coincidences, where movements look similar and are sometimes inferred from other joints. To achieve this, restrictions were applied by introducing a barrier value—a minimum prediction limit—which had to be tested repeatedly to check that it was not too strict. After selecting suitable values, they concluded that the arm movements achieved good results with 92% and 93.6% accuracy, and the leg movements reached 22.4% and 24.8%, respectively, which are great improvements over the first method.
- With the “Combination of Body Parts and Stricter Limits” method: This method combines the previous two and gives the best results, which are achieved by using only the arms or the legs together with the trunk: a 96%–100% accuracy rate for the arm movements and 92.8%–96% for the leg movements.
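The barrier-value idea in the methods above can be sketched as a simple acceptance rule: a candidate activity is accepted only when the similarity scores of the chosen body-part subset clear a minimum limit. All joint names, scores, and the 0.8 barrier below are illustrative assumptions, not values from the cited study:

```python
# Joints considered by the "arms or legs plus trunk" strategy (hypothetical names).
ARMS_AND_TRUNK = {"shoulder_l", "shoulder_r", "elbow_l", "elbow_r", "spine"}

def classify(similarities, candidate, subset=ARMS_AND_TRUNK, barrier=0.8):
    """Accept the candidate activity only if the subset's mean similarity
    clears the barrier value; otherwise report the activity as unknown."""
    scores = [s for joint, s in similarities.items() if joint in subset]
    mean = sum(scores) / len(scores)
    return candidate if mean >= barrier else "unknown"

# Per-joint similarity scores for one frame; the poorly tracked legs are
# simply ignored because they are outside the chosen subset.
frame = {"shoulder_l": 0.95, "shoulder_r": 0.90, "elbow_l": 0.85,
         "elbow_r": 0.88, "spine": 0.92, "knee_l": 0.10, "knee_r": 0.15}
print(classify(frame, "raise_arm"))  # → raise_arm
```

Restricting the subset is what lets unreliable leg tracking degrade gracefully instead of dragging the whole-body score below the barrier.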
6.2. Leap Motion Controller
- Face-to-face: This setup helps when the palm performs a 180° rotation; a 90° rotation is still susceptible to occlusion.
- Orthogonal: This setup helps with a 90° rotation; here, the 180° rotation is susceptible to occlusion.
- 120° angle: This setup was considered the best in the study and was the one mainly used. It provides the best results as long as the palm rotation stays under 180°; beyond that, both sensors show worse recognition performance.
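A hedged sketch of how a two-controller setup might resolve occlusion in software: per frame, keep the reading from whichever sensor reports the higher tracking confidence, so an occluded view is discarded. The dictionary fields and the 0.5 threshold are assumptions for illustration; this is not the LMC API:

```python
def fuse(reading_a, reading_b, min_confidence=0.5):
    """Pick the higher-confidence palm reading from two sensors;
    return None if both views are too occluded to trust."""
    best = max(reading_a, reading_b, key=lambda r: r["confidence"])
    return best if best["confidence"] >= min_confidence else None

# Hypothetical readings of the same palm from two differently angled sensors.
a = {"palm_angle_deg": 180, "confidence": 0.2}  # occluded in this view
b = {"palm_angle_deg": 180, "confidence": 0.9}  # clearly visible here
print(fuse(a, b)["confidence"])  # → 0.9
```

The 120° arrangement reduces how often both confidences drop at once, which is why the study favored it.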
6.3. Using the Two Devices Together
7. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- Sherman, W.R.; Craig, A.B. Understanding Virtual Reality—Interface, Application, and Design. Presence Teleoperators Virtual Env. 2003. [Google Scholar] [CrossRef]
- Kipper, G.; Rampolla, J. Augmented Reality: An Emerging Technologies Guide to AR; Syngress: Waltham, MA, USA, 2012; ISBN 9781597497336. [Google Scholar]
- Tamura, H.; Yamamoto, H.; Katayama, A. Mixed reality: Future dreams seen at the border between real and virtual worlds. IEEE Comput. Graph. Appl. 2001. [Google Scholar] [CrossRef]
- Hantono, B.S.; Nugroho, L.E.; Santosa, P.I. Review of augmented reality agent in education. In Proceedings of the 2016 6th International Annual Engineering Seminar, Yogyakarta, Indonesia, 1–3 August 2016. [Google Scholar] [CrossRef]
- Larsen, E.; Ummiger, F.; Ye, X.; Rimon, N.; Stafford, J.R.; Lou, X. Methods and Systems for User Interaction within Virtual Reality Scene Using Head Mounted Display. U.S. Patent Application No 10/073,516, 2018. [Google Scholar]
- Meena, K.; Sivakumar, R. Human-Computer Interaction; PHI Learning Pvt. Ltd.: New Delhi, Delhi, India, 2014. [Google Scholar]
- What Are the Topics in Human Computer Interaction That Every Student in Human Computer Interaction Should Know? Available online: https://www.quora.com/What-are-the-top-topics-in-human-computer-interaction-that-every-student-in-human-computer-interaction-should-know (accessed on 5 November 2018).
- Zhao, W. A concise tutorial on human motion tracking and recognition with Microsoft Kinect. Sci. China Inf. Sci. 2016. [Google Scholar] [CrossRef]
- Wozniak, P.; Vauderwange, O.; Mandal, A.; Javahiraly, N.; Curticapean, D. Possible applications of the LEAP motion controller for more interactive simulated experiments in augmented or virtual reality. SPIE 2016. [Google Scholar] [CrossRef]
- Taylor, J.R. An introduction to error analysis. J. Acoust. Soc. Am. 1997. [Google Scholar] [CrossRef]
- E3: Microsoft Shows Off Gesture Control Technology for Xbox 360. Available online: https://latimesblogs.latimes.com/technology/2009/06/microsofte3.html (accessed on 4 November 2018).
- Leap Motion Launches Software Development Program, Sends Test Units. Available online: https://thenextweb.com/apple/2012/10/29/leap-motion-launches-software-developer-program-and-starts-sending-test-units-of-its-3d-controller/ (accessed on 4 November 2018).
- PRISMA Guidelines. Available online: http://prisma-statement.org/PRISMAStatement/FlowDiagram.aspx (accessed on 4 November 2018).
- Kinect Sales Reach 24 Million—GameSpot. Available online: https://www.gamespot.com/articles/kinect-sales-reach-24-million/1100-6403766/ (accessed on 3 December 2018).
- Why Xbox Kinect didn’t Take Off—Business Insider. Available online: https://www.businessinsider.com/why-microsoft-xbox-kinect-didnt-take-off-2015-9 (accessed on 14 February 2019).
- Leap Motion Lays Off 10% Of Its Workforce After Missing On First Year Sales Estimates. TechCrunch. Available online: https://techcrunch.com/2014/03/20/leap-motion-lays-off-10-of-its-workforce-after-missing-on-first-year-sales-estimates/?guccounter=1 (accessed on 3 December 2018).
- Report: Apple Nearly Acquired Leap Motion but the Deal Fell Through. Available online: https://www.roadtovr.com/report-apple-nearly-acquired-leap-motion-but-the-deal-fell-through/ (accessed on 14 February 2019).
- How Does the Kinect Work? – kinect.pdf. Available online: ftp://labattmot.ele.ita.br/ele/jricardo/Leitura/Kinect/kinect.pdf (accessed on 26 February 2019).
- Slide 1 – Lecture 22 – How the Kinect works – CP Fall 2017.pdf. Available online: https://courses.engr.illinois.edu/cs445/fa2017/lectures/Lecture%2022%20-%20How%20the%20Kinect%20Works%20-%20CP%20Fall%202017.pdf (accessed on 26 February 2019).
- Kinect Sensor for Xbox Gaming – download. Available online: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.476.2368&rep=rep1&type=pdf (accessed on 26 February 2019).
- Xbox One Kinect Teardown—iFixit. Available online: https://www.ifixit.com/Teardown/Xbox+One+Kinect+Teardown/19725 (accessed on 1 December 2018).
- How It Works: Xbox Kinect. Available online: https://www.jameco.com/jameco/workshop/howitworks/xboxkinect.html (accessed on 8 November 2018).
- Gamasutra: Daniel Lau’s Blog—The Science Behind Kinects or Kinect 1.0 versus Kinect 2.0. Available online: http://www.gamasutra.com/blogs/DanielLau/20131127/205820/The_Science_Behind_Kinects_or_Kinect_10_versus_20.php (accessed on 1 December 2018).
- What’s Inside?—Vol. 1: Leap Motion—Candemir Orsan—Medium. Available online: https://medium.com/@candemir/taking-things-apart-vol-1-leap-motion-36adaa137a0a (accessed on 2 December 2018).
- Wright, T.; de Ribaupierre, S.; Eagleson, R. Leap Motion Performance in an Augmented Reality Workspace: Integrating Devices with an Interactive Platform. IEEE Consum. Electron. Mag. 2019, 8, 36–41. [Google Scholar] [CrossRef]
- How Does the Leap Motion Controller Work? Available online: http://blog.leapmotion.com/hardware-to-software-how-does-the-leap-motion-controller-work/ (accessed on 1 December 2018).
- Introducing the Skeletal Tracking Model—Leap Motion C# SDK v2.3 documentation. Available online: https://developer-archive.leapmotion.com/documentation/csharp/devguide/Intro_Skeleton_API.html (accessed on 2 December 2018).
- Gheran, B.F.; Cramariuc, G.; Rusu, I.; CrǍciun, E.G. Tools for collecting users’ touch and free-hand gestures on mobile devices in experimental settings. In Proceedings of the 13th International Conference on Development and Application Systems, Suceava, Romania, 19–21 May 2016. [Google Scholar] [CrossRef]
- Silva, E.S.; De Abreu, J.A.O.; De Almeida, J.H.P.; Teichrieb, V.; Ramalho, G.L. A Preliminary Evaluation of the Leap Motion Sensor as Controller of New Digital Musical Instruments. In Proceedings of the 14th SBCM, Brazilian Symposium on Computer Music, Brasília, Brazil, 31 October–2 November 2013. [Google Scholar]
- Understanding Latency: Part 1—Leap Motion Blog. Available online: http://blog.leapmotion.com/understanding-latency-part-1/ (accessed on 14 November 2018).
- Understanding Latency: Part 2—Leap Motion Blog. Available online: http://blog.leapmotion.com/understanding-latency-part-2/ (accessed on 14 November 2018).
- Zhou, H.; Hu, H. Human motion tracking for rehabilitation-A survey. Biomed. Signal Process. Control 2008. [Google Scholar] [CrossRef]
- Song, W.; Liu, L.; Tian, Y.; Sun, G.; Fong, S.; Cho, K. A 3D localisation method in indoor environments for virtual reality applications. Hum. Cent. Comput. Inf. Sci. 2017, 7, 39. [Google Scholar] [CrossRef]
- Hsu, H.J. The Potential of Kinect in Education. Int. J. Inf. Educ. Technol. 2011. [Google Scholar] [CrossRef]
- Bacca, J.; Baldiris, S.; Fabregat, R.; Graf, S.; Kinshuk. Augmented reality trends in education: A systematic review of research and applications. Educ. Technol. Soc. 2014, 17, 133–149. [Google Scholar]
- Mousavi Hondori, H.; Khademi, M. A review on technical and clinical impact of microsoft Kinect on physical therapy and rehabilitation. J. Med. Eng. 2014. [Google Scholar] [CrossRef] [PubMed]
- Reis, H.; Isotani, S.; Gasparini, I. Rehabilitation Using Kinect and an Outlook on Its Educational Applications: A Review of the State of the Art. In Brazilian Symposium on Computers in Education (Simpósio Brasileiro de Informática na Educação-SBIE); Sociedade Brasileira de Computação: Porto Alegre, Brazil, 2015; Volume 26, p. 802. [Google Scholar]
- Da Gama, A.; Fallavollita, P.; Teichrieb, V.; Navab, N. Motor Rehabilitation Using Kinect: A Systematic Review. Games Health J. 2015. [Google Scholar] [CrossRef] [PubMed]
- Zhang, M.; Zhang, Z.; Chang, Y.; Aziz, E.S.; Esche, S.; Chassapis, C. Recent developments in game-based virtual reality educational laboratories using the microsoft kinect. Int. J. Emerg. Technol. Learn. 2018. [Google Scholar] [CrossRef]
- Kourakli, M.; Altanis, I.; Retalis, S.; Boloudakis, M.; Zbainos, D.; Antonopoulou, K. Towards the improvement of the cognitive, motoric and academic skills of students with special educational needs using Kinect learning games. Int. J. Child-Comput. Interact. 2017. [Google Scholar] [CrossRef]
- Amjad, I.; Toor, H.; Niazi, I.K.; Pervaiz, S.; Jochumsen, M.; Shafique, M.; Haavik, H.; Ahmed, T. Xbox 360 Kinect Cognitive Games Improve Slowness, Complexity of EEG, and Cognitive Functions in Subjects with Mild Cognitive Impairment: A Randomized Control Trial. Games Health J. 2018. [Google Scholar] [CrossRef] [PubMed]
- Matallaoui, A.; Koivisto, J.; Hamari, J.; Zarnekow, R. How effective is “exergamification”? A systematic review on the effectiveness of gamification features in exergames. In Proceedings of the 50th Hawaii International Conference on System Sciences, Hilton Waikoloa Village, HI, USA, 4–7 January 2017. [Google Scholar]
- Mateo, F.; Soria-Olivas, E.; Carrasco, J.; Bonanad, S.; Querol, F.; Pérez-Alenda, S. HemoKinect: A Microsoft Kinect V2 Based Exergaming Software to Supervise Physical Exercise of Patients with Hemophilia. Sensors 2018, 18, 2439. [Google Scholar] [CrossRef] [PubMed]
- Szczepaniak, O.; Sawicki, D. Gesture controlled human–computer interface for the disabled. Med. Pr. 2017. [Google Scholar] [CrossRef] [PubMed]
- Malone, L.A.; Rowland, J.L.; Rogers, R.; Mehta, T.; Padalabalanarayanan, S.; Thirumalai, M.; Rimmer, J.H. Active Videogaming in Youth with Physical Disability: Gameplay and Enjoyment. Games Health J. 2016. [Google Scholar] [CrossRef] [PubMed]
- Pool, S.M.; Hoyle, J.M.; Malone, L.A.; Cooper, L.; Bickel, C.S.; McGwin, G.; Rimmer, J.H.; Eberhardt, A.W. Navigation of a virtual exercise environment with Microsoft Kinect by people post-stroke or with cerebral palsy. Assist. Technol. 2016. [Google Scholar] [CrossRef] [PubMed]
- Sin, H.; Lee, G. Additional virtual reality training using Xbox kinect in stroke survivors with hemiplegia. Am. J. Phys. Med. Rehabil. 2013. [Google Scholar] [CrossRef] [PubMed]
- Lee, G. Effects of Training Using Video Games on the Muscle Strength, Muscle Tone, and Activities of Daily Living of Chronic Stroke Patients. J. Phys. Ther. Sci. 2013. [Google Scholar] [CrossRef] [PubMed]
- Bachmann, D.; Weichert, F.; Rinkenauer, G. Review of Three-Dimensional Human-Computer Interaction with Focus on the Leap Motion Controller. Sensors 2018, 18, 2194. [Google Scholar] [CrossRef] [PubMed]
- Chuan, C.-H.; Regina, E.; Guardino, C. American Sign Language Recognition Using Leap Motion Sensor. In Proceedings of the 13th International Conference on Machine Learning and Applications, Detroit, MI, USA, 3–5 December 2014. [Google Scholar] [CrossRef]
- Chong, T.-W.; Lee, B.-G. American Sign Language Recognition Using Leap Motion Controller with Machine Learning Approach. Sensors 2018, 18, 3554. [Google Scholar] [CrossRef] [PubMed]
- Mohandes, M.; Aliyu, S.; Deriche, M. Arabic sign language recognition using the leap motion controller. In Proceedings of the IEEE International Symposium on Industrial Electronics, Istanbul, Turkey, 1–4 June 2014. [Google Scholar] [CrossRef]
- Elons, A.S.; Ahmed, M.; Shedid, H.; Tolba, M.F. Arabic sign language recognition using leap motion sensor. In Proceedings of the 9th IEEE International Conference on Computer Engineering and Systems, Cairo, Egypt, 22–23 December 2014. [Google Scholar] [CrossRef]
- Khelil, B.; Amiri, H.; Chen, T.; Kammüller, F.; Nemli, I.; Probst, C.W. Hand Gesture Recognition Using Leap Motion Controller for Recognition of Arabic Sign Language. Lect. Notes Comput. Sci. 2016. [Google Scholar] [CrossRef]
- Potter, L.E.; Araullo, J.; Carter, L. The Leap Motion controller: A view on sign language. In Proceedings of the 25th Australian Computer-Human Interaction Conference: Augmentation, Application, Innovation, Collaboration, Adelaide, Australia, 25–29 November 2013. [Google Scholar] [CrossRef]
- Simos, M.; Nikolaidis, N. Greek sign language alphabet recognition using the leap motion device. In Proceedings of the 9th Hellenic Conference on Artificial Intelligence, Thessaloniki, Greece, 18–20 May 2016. [Google Scholar] [CrossRef]
- Karthick, P.; Prathiba, N.; Rekha, V.B.; Thanalaxmi, S. Transforming Indian sign language into text using leap motion. Int. J. Innov. Res. Sci. Eng. Technol. 2014, 3, 10906–10910. [Google Scholar]
- Cohen, M.W.; Zikri, N.B.; Velkovich, A. Recognition of continuous sign language alphabet using leap motion controller. In Proceedings of the 11th International Conference on Human System Interaction, Gdańsk, Poland, 4–6 July 2018. [Google Scholar] [CrossRef]
- Nájera, L.O.R.; Sánchez, M.L.; Serna, J.G.G.; Tapia, R.P.; Llanes, J.Y.A. Recognition of Mexican Sign Language through the Leap Motion Controller. In Proceedings of the 14th International Conference on Scientific Computing, Las Vegas, NV, USA, 25–28 July 2016. [Google Scholar]
- Castañeda, M.A.; Guerra, A.M.; Ferro, R. Analysis on the gamification and implementation of Leap Motion Controller in the IED Técnico industrial de Tocancipá. Interact. Technol. Smart Educ. 2018, 15, 155–164. [Google Scholar] [CrossRef]
- Karashanov, A.; Manolova, A.; Neshov, N. Application for hand rehabilitation using leap motion sensor based on a gamification approach. Int. J. Adv. Res. Sci. Eng. 2016, 5, 61–69. [Google Scholar]
- Wang, Z.R.; Wang, P.; Xing, L.; Mei, L.P.; Zhao, J.; Zhang, T. Leap Motion-based virtual reality training for improving motor functional recovery of upper limbs and neural reorganization in subacute stroke patients. Neural Regen. Res. 2017, 12, 1823. [Google Scholar] [CrossRef] [PubMed]
- Alimanova, M.; Borambayeva, S.; Kozhamzharova, D.; Kurmangaiyeva, N.; Ospanova, D.; Tyulepberdinova, G.; Gaziz, G.; Kassenkhan, A. Gamification of hand rehabilitation process using virtual reality tools: Using leap motion for hand rehabilitation. In Proceedings of the 1st IEEE International Conference on Robotic Computing, Taichung, Taiwan, 10–12 April 2017. [Google Scholar] [CrossRef]
- Skraba, A.; Kolozvari, A.; Kofjac, D.; Stojanović, R. Wheelchair maneuvering using leap motion controller and cloud based speech control: Prototype realization. In Proceedings of the 4th Mediterranean Conference on Embedded Computing, Budva, Montenegro, 14–18 June 2015. [Google Scholar] [CrossRef]
- Bassily, D.; Georgoulas, C.; Güttler, J.; Linner, T.; Bock, T. Intuitive and adaptive robotic arm manipulation using the leap motion controller. In Proceedings of the 41st International Symposium on Robotics, Munich, Germany, 2–3 June 2014. [Google Scholar]
- Chen, S.; Ma, H.; Yang, C.; Fu, M. Hand gesture based robot control system using leap motion. In Proceedings of the 8th International Conference on Intelligent Robotics and Applications, Portsmouth, UK, 24–27 August 2015. [Google Scholar] [CrossRef]
- Travaglini, T.A.; Swaney, P.J.; Weaver, K.D.; Webster, R.J. Initial experiments with the leap motion as a user interface in robotic endonasal surgery. In Robotics and Mechatronics; Springer: Cham, Switzerland, 2016. [Google Scholar]
- Aditya, K.; Chacko, P.; Kumari, D.; Kumari, D.; Bilgaiyan, S. Recent Trends in HCI: A survey on Data Glove, LEAP Motion and Microsoft Kinect. In Proceedings of the 2018 IEEE International Conference on System, Computation, Automation and Networking (ICSCA), Pondicherry, India, 6–7 July 2018; pp. 1–5. [Google Scholar] [CrossRef]
- Unveiling Project North Star. Available online: http://blog.leapmotion.com/northstar/ (accessed on 13 February 2019).
- Download Kinect Gesture Data Set from Official Microsoft Download Center. Available online: https://www.microsoft.com/en-us/download/details.aspx?id=52283 (accessed on 13 February 2019).
- Example—Leap Motion Gallery. Available online: https://gallery.leapmotion.com/category/example/ (accessed on 13 February 2019).
- Gonzalez-Jorge, H.; Riveiro, B.; Vazquez-Fernandez, E.; Martínez-Sánchez, J.; Arias, P. Metrological evaluation of Microsoft Kinect and Asus Xtion sensors. Meas. J. Int. Meas. Confed. 2013. [Google Scholar] [CrossRef]
- Xtion. 3D Sensor. ASUS Global. Available online: https://www.asus.com/3D-Sensor/Xtion/specifications/ (accessed on 6 November 2018).
- Breedon, P.; Siena, F.L.; Byrom, B.; Muehlhausen, W. Enhancing the measurement of clinical outcomes using microsoft kinect. In Proceedings of the International Conference on Interactive Technologies and Games, Nottingham, UK, 26–27 October 2016. [Google Scholar] [CrossRef]
- Carfagni, M.; Furferi, R.; Governi, L.; Santarelli, C.; Servi, M.; Uccheddu, F.; Volpe, Y. Metrological and Critical Characterization of the Intel D415 Stereo Depth Camera. Sensors 2019, 19, 489. [Google Scholar] [CrossRef] [PubMed]
- Romero, V.; Amaral, J.; Fitzpatrick, P.; Schmidt, R.C.; Duncan, A.W.; Richardson, M.J. Can low-cost motion-tracking systems substitute a Polhemus system when researching social motor coordination in children? Behav. Res. Methods 2017. [Google Scholar] [CrossRef] [PubMed]
- Liberty Latus Brochure. Available online: https://polhemus.com/_assets/img/LIBERTY_LATUS_brochure_1.pdf (accessed on 5 November 2018).
- Sun, Y.; Li, C.; Li, G.; Jiang, G.; Jiang, D.; Liu, H.; Zheng, Z.; Shu, W. Gesture Recognition Based on Kinect and sEMG Signal Fusion. Mob. Netw. Appl. 2018. [Google Scholar] [CrossRef]
- Bogatinov, D.; Lameski, P.; Trajkovik, V.; Trendova, K.M. Firearms training simulator based on low cost motion tracking sensor. Multimed. Tools Appl. 2017. [Google Scholar] [CrossRef]
- Suma, E.A.; Lange, B.; Rizzo, A.; Krum, D.M.; Bolas, M. FAAST: The flexible action and articulated skeleton toolkit. In Proceedings of the IEEE Virtual Reality, Singapore, 19–23 March 2011. [Google Scholar] [CrossRef]
- Fournier, H.; Lapointe, J.-F.; Kondratova, I.; Emond, B. Crossing the Barrier: A Scalable Simulator for Course of Fire Training. In Proceedings of the Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC), Orlando, FL, USA, 3–6 December 2012. [Google Scholar]
- Rosell-Polo, J.R.; Gregorio, E.; Gene, J.; Llorens, J.; Torrent, X.; Arno, J.; Escola, A. Kinect v2 sensor-based mobile terrestrial laser scanner for agricultural outdoor applications. IEEE/ASME Trans. Mechatron. 2017. [Google Scholar] [CrossRef]
- Keightley, K.E.; Bawden, G.W. 3D volumetric modeling of grapevine biomass using Tripod LiDAR. Comput. Electron. Agric. 2010. [Google Scholar] [CrossRef]
- Amazon.com: Kinect for Windows: Computers & Accessories. Available online: https://www.amazon.com/Microsoft-L6M-00001-Kinect-for-Windows/dp/B006UIS53K (accessed on 6 November 2018).
- Amazon.com: Xbox One Kinect Sensor: Electronics. Available online: https://www.amazon.com/d/Xbox-One-Consoles/Xbox-One-Kinect-Sensor/B00INAX3Q2/ref=sr_1_2?s=electronics&ie=UTF8&qid=1541547422&sr=1-2&keywords=kinect+v2 (accessed on 6 November 2018).
- ASUS Xtion Motion Sensor for PC. Ebay. Available online: https://www.ebay.com/itm/ASUS-Xtion-Motion-Sensor-for-PC/283257432126?hash=item41f375683e:g:tWAAAOSw7fBbqTfv:rk:6:pf:0 (accessed on 6 November 2018).
- Asus Xtion PRO Color RGB 3D Motion Depth Sensor Developer XtionPRO. Ebay. Available online: https://www.ebay.com/itm/Asus-Xtion-PRO-Color-RGB-3D-Motion-Depth-Sensor-Developer-XtionPRO/163536637794?epid=1201479310&hash=item26138b0f62:g:6KAAAOSwZ2tcKU3z (accessed on 14 February 2019).
- 82535IVCHVM Intel. Mouser Europe. Available online: https://eu.mouser.com/ProductDetail/Intel/82535IVCHVM?qs=33AqgSIO4a6GrmoNr1kb8w%3d%3d (accessed on 14 February 2019).
- Intel RealSense Depth Camera D415. Available online: https://click.intel.com/intelr-realsensetm-depth-camera-d415.html (accessed on 14 February 2019).
- Ives, J.C.; Wigglesworth, J.K. Sampling rate effects on surface EMG timing and amplitude measures. Clin. Biomech. 2003. [Google Scholar] [CrossRef]
- BTS FreeEMG 1000. Available online: https://www.ebay.com/itm/Emg-bts-freeemg-1000-con-6-sondas-Inalambrico-/112404350001?_ul=CL (accessed on 7 November 2018).
- Optech ILRIS-3D Laser Scanner, 16.000,00€. Available online: https://shop.laserscanning-europe.com/Optech-ILRIS-3D-Laser-scanner (accessed on 6 November 2018).
- Measurement Sciences Smart Markers—Measurement Sciences. Available online: https://www.ndigital.com/msci/products/smart-markers/ (accessed on 14 December 2018).
- Tung, J.Y.; Lulic, T.; Gonzalez, D.A.; Tran, J.; Dickerson, C.R.; Roy, E.A. Evaluation of a portable markerless finger position capture device: Accuracy of the Leap Motion controller in healthy adults. Physiol. Meas. 2015, 36, 1025. [Google Scholar] [CrossRef] [PubMed]
- Chen, K.; Liang, H.N.; Yue, Y.; Craig, P. Infrared motion detection and electromyographic gesture recognition for navigating 3D environments. In Computer Animation and Virtual Worlds; John Wiley & Sons: Hoboken, NJ, USA, 2018. [Google Scholar]
- Amazon.com: Leap Motion: Stores. Available online: https://www.amazon.com/stores/Leap-Motion/node/8532032011?productGridPageIndex=1 (accessed on 14 December 2018).
- Myo Gesture Control Armband—Black—RobotShop. Available online: https://www.robotshop.com/en/myo-gesture-control-armband-black.html (accessed on 14 December 2018).
- Amazon.com: Creative Senz3D Depth and Gesture Recognition Camera for Personal Computers: Computers & Accessories. Available online: https://www.amazon.com/Creative-Gesture-Recognition-Personal-Computers/dp/B00EVWX7CG (accessed on 14 February 2019).
- Wasenmüller, O.; Stricker, D. Comparison of kinect v1 and v2 depth images in terms of accuracy and precision. In Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2017. [Google Scholar]
- Gonzalez-Jorge, H.; Rodriguez-Gonzalvez, P.; Martinez-Sanchez, J.; Gonzalez-Aguilera, D.; Arias, P.; Gesto, M.; Diaz-Vilarino, L. Metrological comparison between Kinect I and Kinect II sensors. Meas. J. Int. Meas. Confed. 2015. [Google Scholar] [CrossRef]
- Khoshelham, K.; Elberink, S.O. Accuracy and Resolution of Kinect Depth Data for Indoor Mapping Applications. Sensors 2012, 12, 1437–1454. [Google Scholar] [CrossRef] [PubMed]
- Yang, L.; Zhang, L.; Dong, H.; Alelaiwi, A.; El Saddik, A. Evaluating and improving the depth accuracy of Kinect for Windows v2. IEEE Sens. J. 2015. [Google Scholar] [CrossRef]
- Chan, T.; Lichti, D.; Jahraus, A.; Esfandiari, H.; Lahamy, H.; Steward, J.; Glanzer, M. An Egg Volume Measurement System Based on the Microsoft Kinect. Sensors 2018, 18, 2454. [Google Scholar] [CrossRef] [PubMed]
- Kadambi, A.; Bhandari, A.; Raskar, R. 3D Depth Cameras in Vision: Benefits and Limitations of the Hardware. In Computer Vision and Machine Learning with RGB-D Sensors; Shao, L., Han, J., Kohli, P., Zhang, Z., Eds.; Advances in Computer Vision and Pattern Recognition; Springer: Cham, Switzerland, 2014. [Google Scholar]
- Xbox Kinect Size/Dimensions|123Kinect.com. Available online: http://123kinect.com/xbox-kinect-dimensions-size/1761/ (accessed on 15 November 2018).
- Official Xbox One and Kinect 2 Dimensions Revealed. Available online: https://www.gamepur.com/news/12519-official-xbox-one-and-kinect-2-dimensions-revealed.html (accessed on 15 November 2018).
- Kinect for Xbox 360 and Kinect for Windows (KfW) v1 specs. George Birbilis @zoomicon. Available online: https://zoomicon.wordpress.com/2015/07/28/kinect-for-xbox-360-and-kinect-for-windows-kfw-v1-specs/ (accessed on 6 November 2018).
- Bragança, S.; Arezes, P.; Carvalho, M.; Ashdown, S.P.; Castellucci, I.; Leão, C. A comparison of manual anthropometric measurements with Kinect-based scanned measurements in terms of precision and reliability. Work 2018. [Google Scholar] [CrossRef] [PubMed]
- Mankoff, K.D.; Russo, T.A. The Kinect: A low-cost, high-resolution, short-range 3D camera. Earth Surf. Process. Landf. 2013, 38, 926–936. [Google Scholar] [CrossRef]
- Chikkanna, M.; Guddeti, R.M.R. Kinect based real-time gesture spotting using HCRF. In Proceedings of the International Conference on Advances in Computing, Communications and Informatics, Mysore, India, 22–25 August 2013. [Google Scholar] [CrossRef]
- Livingston, M.A.; Sebastian, J.; Ai, Z.; Decker, J.W. Performance measurements for the Microsoft Kinect skeleton. In Proceedings of the 2012 IEEE Virtual Reality (VR), Costa Mesa, CA, USA, 4–8 March 2012. [Google Scholar] [CrossRef]
- Otte, K.; Kayser, B.; Mansow-Model, S.; Verrel, J.; Paul, F.; Brandt, A.U.; Schmitz-Hübsch, T. Accuracy and reliability of the kinect version 2 for clinical measurement of motor function. PLoS ONE 2016. [Google Scholar] [CrossRef] [PubMed]
- Reither, L.R.; Foreman, M.H.; Migotsky, N.; Haddix, C.; Engsberg, J.R. Upper extremity movement reliability and validity of the Kinect version 2. Disabil. Rehabil. Assist. Technol. 2018. [Google Scholar] [CrossRef] [PubMed]
- Huber, M.E.; Seitz, A.L.; Leeser, M.; Sternad, D. Validity and reliability of Kinect skeleton for measuring shoulder joint angles: A feasibility study. Physiotherapy 2015. [Google Scholar] [CrossRef] [PubMed]
- Elgendi, M.; Picon, F.; Magenant-Thalmann, N. Real-time speed detection of hand gesture using Kinect. In Proceedings of the Workshop on Autonomous Social Robots and Virtual Humans, The 25th Annual Conference on Computer Animation and Social Agents (CASA 2012), Singapore, 9–11 May 2012. [Google Scholar]
- Gutiérrez López de la Franca, C.; Hervás, R.; Johnson, E.; Mondéjar, T.; Bravo, J. Extended Body-Angles Algorithm to Recognize Activities within Intelligent Environments. J. Ambient Intell. Hum. Comput. 2017, 8, 531–549. [Google Scholar] [CrossRef]
- Gutiérrez-López-Franca, C.; Hervás, R.; Johnson, E. Strategies to Improve Activity Recognition Based on Skeletal Tracking: Applying Restrictions Regarding Body Parts and Similarity Boundaries. Sensors 2018, 18, 1665. [Google Scholar] [CrossRef] [PubMed]
- Weichert, F.; Bachmann, D.; Rudak, B.; Fisseler, D. Analysis of the Accuracy and Robustness of the Leap Motion Controller. Sensors 2013, 13, 6380–6393. [Google Scholar] [CrossRef] [PubMed]
- Leap Motion Controller Specs—CNET. Available online: https://www.cnet.com/products/leap-motion-controller/specs/ (accessed on 13 November 2018).
- Guna, J.; Jakus, G.; Pogačnik, M.; Tomažič, S.; Sodnik, J. An analysis of the precision and reliability of the leap motion sensor and its suitability for static and dynamic tracking. Sensors 2014. [Google Scholar] [CrossRef] [PubMed]
- Vikram, S.; Li, L.; Russell, S. Handwriting and Gestures in the Air, Recognizing on the Fly. CHI 2013 Ext. Abstr. 2013, 13, 1179–1184. [Google Scholar] [CrossRef]
- Sharma, J.K.; Gupta, R.; Pathak, V.K. Numeral Gesture Recognition Using Leap Motion Sensor. In Proceedings of the 2015 International Conference on Computational Intelligence and Communication Networks, Jabalpur, India, 12–14 December 2015. [Google Scholar] [CrossRef]
- Zeng, W.; Wang, C.; Wang, Q. Hand gesture recognition using Leap Motion via deterministic learning. Multimed. Tools Appl. 2018. [Google Scholar] [CrossRef]
- Jin, H.; Chen, Q.; Chen, Z.; Hu, Y.; Zhang, J. Multi-LeapMotion sensor based demonstration for robotic refine tabletop object manipulation task. Caai Trans. Intell. Technol. 2016, 1, 104–113. [Google Scholar] [CrossRef]
- Lu, W.; Tong, Z.; Chu, J. Dynamic hand gesture recognition with leap motion controller. IEEE Signal Process. Lett. 2016, 23, 1188–1192. [Google Scholar] [CrossRef]
- Li, W.J.; Hsieh, C.Y.; Lin, L.F.; Chu, W.C. Hand gesture recognition for post-stroke rehabilitation using leap motion. In Proceedings of the 2017 IEEE International Conference on Applied System Innovation: Applied System Innovation for Modern Technology, Sapporo, Japan, 13–17 May 2017. [Google Scholar] [CrossRef]
- Mantecón, T.; del-Blanco, C.R.; Jaureguizar, F.; García, N. Hand gesture recognition using infrared imagery provided by leap motion controller. In Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2016. [Google Scholar]
- Marin, G.; Dominio, F.; Zanuttigh, P. Hand gesture recognition with leap motion and kinect devices. In Proceedings of the 2014 IEEE International Conference on Image Processing, Paris, France, 27–30 October 2014. [Google Scholar] [CrossRef]
- Penelle, B.; Debeir, O. Multi-sensor data fusion for hand tracking using Kinect and Leap Motion. In Proceedings of the 2014 Virtual Reality International Conference, Laval, France, 9–11 April 2014. [Google Scholar] [CrossRef]
- Craig, A.; Krishnan, S. Fusion of Leap Motion and Kinect Sensors for Improved Field of View and Accuracy for VR Applications. Virtual Reality Course Report; Stanford EE267. 2016. Available online: https://pdfs.semanticscholar.org/1167/96892f2df6e2b298aad8d543b3474e7f8a0b.pdf (accessed on 1 March 2019).
- Guzsvinecz, T.; Kovacs, C.; Reich, D.; Szucs, V.; Sik-Lanyi, C. Developing a virtual reality application for the improvement of depth perception. In Proceedings of the 9th IEEE International Conference on Cognitive Infocommunications, Budapest, Hungary, 22–24 August 2018; pp. 17–22. [Google Scholar]
Number of Results

| Keywords | Web of Science | PubMed | IEEE Xplore |
|---|---|---|---|
| Kinect review | 74 | 36 | 48 |
| Kinect accuracy | 635 | 200 | 884 |
| Kinect precision | 105 | 4 | 128 |
| Kinect skeleton | 183 | 99 | 578 |
| Kinect gesture recognition | 240 | 30 | 727 |
| Kinect medical applications | 27 | 19 | 183 |
| Kinect physical disability | 22 | 22 | 24 |
| Kinect education | 67 | 77 | 183 |
| Leap Motion review | 12 | 8 | 6 |
| Leap Motion accuracy | 66 | 18 | 83 |
| Leap Motion precision | 19 | 7 | 20 |
| Leap Motion gesture recognition | 36 | 9 | 147 |
| Leap Motion medical applications | 6 | 7 | 34 |
| Leap Motion physical disability | 3 | 3 | 3 |
| Leap Motion education | 10 | 12 | 25 |
Mode | Approximate Frame Rate | Delay
---|---|---
High Precision mode | 50 fps | 20 ms |
Balanced Tracking mode | 100 fps | 10 ms |
High Speed mode | 200 fps | 5 ms |
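The delay column in the table above is exactly one frame interval (1000 / fps), so the three LMC modes trade temporal detail directly against latency. A minimal sketch of that relationship (mode names and values are taken from the table; the helper function is illustrative, not part of any LMC API):

```python
# Frame interval implied by each LMC tracking mode.
# The delay column equals 1000 / fps, so doubling the frame
# rate halves the per-frame latency.

def frame_delay_ms(fps: float) -> float:
    """Time between consecutive frames, in milliseconds."""
    return 1000.0 / fps

modes = {"High Precision": 50, "Balanced Tracking": 100, "High Speed": 200}

for name, fps in modes.items():
    print(f"{name}: {fps} fps -> {frame_delay_ms(fps):.0f} ms delay")
```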
 | Kinect v1 | Kinect v2 | Xtion | Xtion Pro Live | Intel RealSense SR300 | Intel RealSense D415
---|---|---|---|---|---|---
Color camera resolution | 1280 × 720 at 12 fps, 640 × 480 at 30 fps | 1920 × 1080 at 30 fps | 640 × 480 at 30 fps | 1280 × 1024 at 15 fps, 640 × 480 at 30 fps | 1920 × 1080 at 30 fps, 1280 × 720 at 60 fps | 1920 × 1080 at 60 fps |
Depth camera resolution | 320 × 240 at 30 fps | 512 × 424 at 30 fps | 320 × 240 at 30 fps | 640 × 480 at 30 fps, 320 × 240 at 60 fps | 640 × 480 at 30 fps | 1280 × 720 at 90 fps |
Depth technology | Infrared | ToF | Infrared | Infrared | Coded light | Stereoscopic active infrared |
Field of view 1 | 57°H, 43°V | 70°H, 60°V | 58°H, 45°V | 58°H, 45°V | 73°H, 59°V | 69.4°H, 42.5°V |
Specified measuring distance | 0.4 or 0.8 m–4 m | 0.5–4.5 m | 0.8–3.5 m | 0.8–3.5 m | 0.3–2 m | 0.16–10 m |
Connectivity | USB 2.0 or 3.0 | USB 3.0 | USB 2.0 | USB 2.0 | USB 3.0 | USB 3.0 Type-C |
Name | Mapping | Sampling Rate 1 | Cost
---|---|---|---
Kinect v1 | Depth (IR) | 30 Hz | US$99.95 [84] |
Kinect v2 | Depth (ToF) | 30 Hz | US$99.99 [85] |
Xtion | Depth (IR) | 30 Hz | €50 [86] |
Xtion Pro Live | Depth (IR) | 15 Hz | US$140 [87] |
Intel RealSense SR300 | Depth (Coded light) | 30 Hz | €68.12 [88] |
Intel RealSense D415 | Depth (Stereo active IR) | 90 Hz | US$149 [89] |
Polhemus Liberty Latus | EM field | 188 Hz or 94 Hz | US$12,500–US$60,000 2 |
sEMG | Electrodes | 800 Hz–1 kHz 3 [90] | US$25,000 4 [91] |
MINT-PD | Laser | No information. | Not available. |
ILRIS 3D | Laser | 2500 points/s | €16,000 [92] |
Name | Mapping | Sampling Rate | Connectivity | Cost
---|---|---|---|---
LMC | Algorithmic | 50–200 Hz 1 | USB 2.0 or 3.0 | US$80 [96] |
Optotrak marker | Strobe | 120 Hz | Wired/Wireless | Not available. |
Myo Armband | Electrodes | 200 Hz | Bluetooth | US$200 [97] |
Creative SENZ3D | Depth | 30 Hz | USB 2.0 or 3.0 | US$79 [98] |
 | Kinect v1 | Kinect v2
---|---|---
Dimensions | 27.94 cm × 6.35 cm × 3.81 cm [105] | 24.9 cm × 6.6 cm × 6.7 cm [106] |
Color resolution and fps | 640 × 480 at 30 fps or 1280 × 720 at 12 fps | 1920 × 1080 at 30 fps
IR resolution and fps | 640 × 480 at 30 fps | 512 × 424 at 30 fps |
Depth resolution and fps | 320 × 240 at 30 fps | 512 × 424 at 30 fps |
Field of view wide-angle lens | 57° horizontal, 43° vertical | 70° horizontal, 60° vertical |
Specified min. distance | 0.4 m or 0.8 m | 0.5 m |
Recommended min. distance | 1.8 m | 1.4 m |
Tested min. distance | 1 m | 0.7 m |
Specified max. distance | 4 m | 4.5 m |
Tested max. distance | 6 m | 4 m |
Active infrared | Not available | Available |
Measurement method | Infrared structured light | Time of Flight |
Minimum latency | 102 ms | 20 ms |
Microphone array | 4 microphones, 16 kHz | 4 microphones, 48 kHz |
Tilt-motor | Available, ±27° [107] | Not available |
 | Kinect v1 | Kinect v2
---|---|---
Temperature | Weak correlation | Strong correlation
More distance | Less accuracy | Same accuracy |
Striped depth image | Increases with depth | No stripes on image |
Depth precision | Higher | Lower
Flying pixels | Not present | Present if surface is not flat |
Environment color | Depth estimation unaffected | Affects depth estimation |
Multipath interference | Not present | Present |
Angles affect precision | No | No |
Precision decrease with distance | Second-order polynomial | No clear mathematical pattern
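The second-order polynomial behavior reported for Kinect v1 can be illustrated with a toy model. This is only a sketch of the shape of the trend: the quadratic form follows the table's claim, but the coefficients `a`, `b`, and `c` below are entirely hypothetical placeholders, not measured values from any of the reviewed studies.

```python
# Toy model of Kinect v1 depth noise growing as a second-order
# polynomial of distance. Coefficients are hypothetical placeholders.

def v1_depth_noise_mm(distance_m: float, a: float = 1.5,
                      b: float = 0.5, c: float = 0.3) -> float:
    """Random depth error modeled as a*z^2 + b*z + c (millimeters)."""
    return a * distance_m ** 2 + b * distance_m + c

for z in (1.0, 2.0, 4.0):
    print(f"{z:.0f} m -> ~{v1_depth_noise_mm(z):.1f} mm noise")
```

The point is only the shape: doubling the distance roughly quadruples the dominant error term, whereas the v2 (ToF) sensor shows no comparable polynomial trend.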
 | Kinect | Manual Measurement
---|---|---
Precision | Less precise | More precise |
Measuring speed | Faster | Slower |
No. of best measurements | Eight “best” results | Twelve “best” results
No. of worst measurements | Six “worst” results | Five “worst” results
Nearest measured distance | 501 mm | 500 mm |
Farthest measured distance | 5050 mm | 5000 mm |
 | Kinect v1 | Kinect v2
---|---|---
Max. number of tracked people | 2 | 6 |
Available joints to track | 20 | 25 |
Tested distance | 0.85–4 m | 0.5–4.5 m |
 | Kinect v2 | LMC
---|---|---
Dimensions | 24.9 cm × 6.6 cm × 6.7 cm | 7.874 cm × 3.048 cm × 1.27 cm |
Tracking hardware | 2 depth cameras, IR emitter | 2 cameras, 3 IR LEDs |
Depth resolution | 512 × 424 at 30 fps | 640 × 240 at 60 fps |
Tracking the user | Full body tracking | Hand tracking |
Field of view | 70° horizontal, 60° vertical | 150° horizontal, 120° vertical |
Measuring distance | 50–450 cm 1 | 2.5–60 cm (−80 cm) |
Measurement method | ToF | Mathematical methods |
Access to raw data | Available | Available in recent versions |
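Field of view and measuring distance together determine each sensor's usable tracking volume. As a rough illustration (the formula width = 2·d·tan(FOV/2) is standard trigonometry, not a result from the reviewed studies), the Kinect v2's narrower FOV still spans more horizontal space at its maximum range, simply because that range is roughly an order of magnitude longer than the LMC's:

```python
import math

def coverage_width_cm(distance_cm: float, fov_deg: float) -> float:
    """Horizontal extent visible at a given distance for a given field of view."""
    return 2 * distance_cm * math.tan(math.radians(fov_deg) / 2)

# Kinect v2: 70° horizontal FOV at its 450 cm maximum range
print(f"Kinect v2: {coverage_width_cm(450, 70):.0f} cm wide")
# LMC: 150° horizontal FOV at its 60 cm maximum range
print(f"LMC:       {coverage_width_cm(60, 150):.0f} cm wide")
```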
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Guzsvinecz, T.; Szucs, V.; Sik-Lanyi, C. Suitability of the Kinect Sensor and Leap Motion Controller—A Literature Review. Sensors 2019, 19, 1072. https://doi.org/10.3390/s19051072