Abstract
With the advent of various mobile VR head-mounted displays (HMDs), mobile VR has been widely adopted in many applications because of the higher degree of immersion it provides. However, because most mobile VR HMDs offer only limited input interfaces, such as a click button and an inertial sensor, it is difficult for users to naturally navigate VR scenes and to interact effectively with VR content. This paper presents a new design and comparative analysis of a smartwatch metaphor-based hand gesture interface that supports more natural 3D navigation in mobile VR. Interaction with the smartwatch metaphor-based interface is built on the drone-flying principle, which supports user-centric 3D navigation as well as 3D manipulation, regardless of the location and the limited capability of the mobile device. Furthermore, quantitative and qualitative experiments are performed to compare and analyze task performance against different mobile VR interfaces and different desktop VR interfaces. In the first experiment, we compare the proposed approach with widely used mobile VR interfaces for 3D navigation tasks: 1) a button and inertial sensor-based interface and 2) a hand gesture interface operated in front of the mobile HMD. In the second experiment, we compare the proposed approach with two desktop VR interfaces: 1) a desktop VR interface with a keyboard and 2) a desktop VR interface with hand gestures. The experimental results show that the proposed smartwatch metaphor for mobile VR navigation outperforms traditional mobile VR interfaces. We also confirm that the task performance of the proposed approach is comparable to that of desktop VR interfaces. One of the main features of the proposed approach is that it decouples the degrees of freedom (DOF) of navigation from the DOF of visualization in mobile VR, so that the user's head and body can move freely during navigation and interaction.
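To make the decoupling idea concrete, the following is a minimal Python sketch, not the authors' implementation: drone-style hand gesture input drives the navigation DOF (position and heading), while the view direction comes solely from the HMD orientation. The class, method, and gesture field names (pitch, roll, twist) are hypothetical, and a y-up, left-handed coordinate convention is assumed.

# Minimal sketch (assumptions noted above): navigation DOF from hand gestures,
# visualization DOF from the HMD's inertial sensor.
import numpy as np

def yaw_pitch_to_direction(yaw, pitch):
    """Convert yaw/pitch angles (radians) into a unit forward vector (y-up)."""
    return np.array([
        np.cos(pitch) * np.sin(yaw),
        np.sin(pitch),
        np.cos(pitch) * np.cos(yaw),
    ])

class DecoupledNavigator:
    """Navigation pose from hand gestures; camera orientation from the HMD."""

    def __init__(self):
        self.position = np.zeros(3)  # navigation DOF: translation
        self.nav_yaw = 0.0           # navigation DOF: heading

    def update_navigation(self, gesture, dt, speed=2.0, turn_rate=1.0):
        # Drone-flying style mapping (hypothetical field names): hand pitch ->
        # forward/backward, hand roll -> strafe, wrist twist -> change heading.
        forward = yaw_pitch_to_direction(self.nav_yaw, 0.0)
        right = np.array([forward[2], 0.0, -forward[0]])
        self.nav_yaw += gesture["twist"] * turn_rate * dt
        self.position += (gesture["pitch"] * forward +
                          gesture["roll"] * right) * speed * dt

    def camera_pose(self, hmd_yaw, hmd_pitch):
        # Visualization DOF: the camera sits at the navigated position but
        # looks wherever the HMD says the head is pointing.
        return self.position, yaw_pitch_to_direction(hmd_yaw, hmd_pitch)

Because update_navigation never reads the HMD pose, looking around does not change the travel direction; this is the decoupling of navigation and visualization DOF that allows free head and body movement during navigation.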
Acknowledgements
This work was partly supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (NRF-2016R1D1A1B03934697), and by the “Development of IIoT-based manufacturing testbeds for the Korean manufacturing equipment industry” program, funded by the Ministry of Science and ICT (2015-0-00374). The authors would like to thank Minseok Kim for designing a new experimental study and performing its evaluation.
Cite this article
Park, K.B., Lee, J.Y. New design and comparative analysis of smartwatch metaphor-based hand gestures for 3D navigation in mobile virtual reality. Multimed Tools Appl 78, 6211–6231 (2019). https://doi.org/10.1007/s11042-018-6403-9