Abstract
Eye-control interfaces are human–computer interfaces in which interaction is mediated by the user’s gaze. Target selection by dwell time is hindered by the “Midas Touch” problem: because intentional selection cannot be separated from ordinary perceptual gaze, every fixation risks triggering an unwanted command. To address this problem, we investigated the influence of different dwell times on task performance. Results suggest that the optimal dwell time for triggering a click is 700 ms in the eye-control movement plus eye-control click mode and 200 ms in the hand-control movement plus eye-control click mode. In addition, the eye-control movement plus eye-control click mode has a lower completion rate than the hand-control movement plus eye-control click mode.
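The dwell-time selection described above can be sketched as a simple timer over the gaze stream: a click fires only once the gaze has rested on a target continuously for at least the dwell threshold, and leaving the target resets the timer. This is a minimal illustration, not the authors’ implementation; the sample format and function name are assumptions, and the 700 ms / 200 ms values echo the thresholds reported in the abstract.

```python
def dwell_click(samples, dwell_ms):
    """Illustrative dwell-time selector (not the authors' code).

    samples: iterable of (timestamp_ms, on_target) gaze samples.
    Returns the timestamp at which the dwell threshold is first met,
    or None if the gaze never dwells on the target long enough.
    """
    dwell_start = None
    for t, on_target in samples:
        if on_target:
            if dwell_start is None:
                dwell_start = t            # gaze just entered the target
            elif t - dwell_start >= dwell_ms:
                return t                   # dwell threshold met: trigger click
        else:
            dwell_start = None             # gaze left the target: reset timer
    return None


# Example: a 60 Hz gaze stream that enters the target at t = 0 and stays.
stream = [(int(i * 1000 / 60), True) for i in range(60)]
print(dwell_click(stream, 700))  # 700 ms threshold (eye-control click mode)
print(dwell_click(stream, 200))  # 200 ms threshold (hand-control movement mode)
```

The reset on leaving the target is what distinguishes a deliberate dwell from a passing glance, which is the crux of mitigating the Midas Touch problem.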
Acknowledgements
This work was supported by the National Natural Science Foundation of China under Grant 31900768, the Scientific Research Starting Foundation of Zhejiang Sci-Tech University (16062022-Y), and the Open Fund of the State Key Laboratory of Nuclear Power Safety Monitoring Technology and Equipment (K-A2019.428).
Cite this article
Huang, W., Cheng, B., Zhang, G. et al. Ergonomics research on eye–hand control dual channel interaction. Multimed Tools Appl 80, 7833–7851 (2021). https://doi.org/10.1007/s11042-020-10097-z