Sensors (Basel). 2014 Dec 24;15(1):214-33. doi: 10.3390/s150100214.

Evaluation of the Leap Motion Controller as a new contact-free pointing device

Daniel Bachmann et al. Sensors (Basel). 2014.

Abstract

This paper presents a Fitts' law-based analysis of user performance in selection tasks with the Leap Motion Controller compared with a standard mouse device. The Leap Motion Controller (LMC) is a new contact-free input system for gesture-based human-computer interaction with a declared sub-millimeter accuracy. To date, hardly any systematic evaluation of this new system has been available. With an error rate of 7.8% for the LMC versus 2.8% for the mouse, movement times about twice as long as with the mouse, and high overall effort ratings, the LMC's performance as an input device for everyday, generic computer pointing tasks is rather limited, at least with regard to the selection recognition it provides.
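For readers unfamiliar with these metrics, the sketch below shows how the index of difficulty (ID) and throughput (TP) reported in Figures 4-6 are typically computed. It assumes the Shannon formulation of Fitts' law (ID = log2(D/W + 1)) common in ISO 9241-9 style pointing evaluations; the movement times in the example are illustrative placeholders, not values from the study.

    import math

    def index_of_difficulty(distance_mm: float, width_mm: float) -> float:
        """Index of difficulty (ID) in bits, Shannon formulation: log2(D/W + 1)."""
        return math.log2(distance_mm / width_mm + 1)

    def throughput(id_bits: float, movement_time_s: float) -> float:
        """Throughput (TP) in bits/s: ID divided by the mean movement time."""
        return id_bits / movement_time_s

    # The target geometries below (distances 40/160 mm, widths 10/40 mm, as in
    # Figure 7) reproduce the extreme IDs of 1.0 and 4.1 bits shown in Figure 6.
    # The movement times are illustrative placeholders, not values from the study.
    for distance, width, mt in [(40, 40, 0.6), (160, 10, 1.4)]:
        id_bits = index_of_difficulty(distance, width)
        print(f"D={distance} mm, W={width} mm: ID={id_bits:.1f} bits, "
              f"TP={throughput(id_bits, mt):.2f} bits/s")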


Figures

Figure 1. Visualization of (a) a real (infrared image) and (b) a 3D model of the Leap Motion Controller with its coordinate system.
Figure 2. Experimental set-up: (a) a participant using the mouse device and (b) using the Leap Motion Controller (LMC) to perform selection tasks.
Figure 3. Scaled illustrations of the graphical user interface. Selected targets are displayed in gray; selectable targets are colored green: (a) the additional custom-designed training phase for the LMC pointing task; and (b) the interface for the pointing task.
Figure 4. Results of the experiment: (a) error rate (ER) by device and index of difficulty (ID); (b) movement time (MT) by device and ID with linear regression lines fitted to the data; (c) throughput (TP) by block and device; and (d) TP by block and gender. Error bars indicate the 95% confidence intervals.
Figure 5. Results of the experiment: throughput (TP), the main indicator used to estimate user performance, as a function of device and ID.
Figure 6. Aggregate accumulator plots for different indices of difficulty: (a) ID = 1.0, (b,c) ID = 2.3 and (d) ID = 4.1.
Figure 7. Analysis of the interaction between target distance (40 vs. 160 mm) and target width (10 vs. 40 mm). Trajectories were fitted to ellipses, and the ratios of the resulting radii were used as indices of circularity.
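A rough way to reproduce the circularity index described for Figure 7 is a covariance-based (PCA-style) ellipse fit of the pointer trajectory; this is only a sketch under that assumption, not necessarily the authors' exact fitting procedure, and the trajectory in the example is synthetic.

    import numpy as np

    def circularity_index(trajectory_xy: np.ndarray) -> float:
        """Ratio of minor to major ellipse radius (1.0 = circular, near 0 = elongated).

        The ellipse is estimated from the 2x2 covariance of the trajectory
        points (a PCA-style fit); the square roots of its eigenvalues serve
        as the two radii.
        """
        centered = trajectory_xy - trajectory_xy.mean(axis=0)
        eigvals = np.linalg.eigvalsh(np.cov(centered.T))  # principal variances
        radii = np.sqrt(np.maximum(eigvals, 0.0))
        return float(radii.min() / radii.max())

    # Synthetic example: a mostly straight 160 mm movement with a small lateral
    # deviation yields a low circularity index.
    x = np.linspace(0.0, 160.0, 50)
    y = 5.0 * np.sin(np.linspace(0.0, np.pi, 50))
    print(round(circularity_index(np.column_stack([x, y])), 3))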
