
Ergonomics research on eye–hand control dual channel interaction

Published in Multimedia Tools and Applications

Abstract

Eye-control interfaces are human–computer interfaces in which interaction is mediated by the user's gaze. Using dwell time for target selection is hindered by the "Midas Touch" problem: because the eyes are used for both perception and selection, gaze that merely inspects a target cannot easily be distinguished from gaze intended to select it. To address this problem, we investigated the influence of different dwell times on task performance. Results suggest that the optimal dwell time for triggering a click is 700 ms in the eye-control movement plus eye-control click mode and 200 ms in the hand-control movement plus eye-control click mode. In addition, the eye-control movement plus eye-control click mode yields a lower task completion rate than the hand-control movement plus eye-control click mode.
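The dwell-time selection mechanism the abstract describes can be sketched in code. The following is an illustrative Python sketch, not the authors' implementation: a hypothetical `DwellSelector` helper fires a click once gaze has remained on the same target for the full dwell threshold (e.g., the 700 ms found optimal for the eye-control movement plus eye-control click mode) and resets its timer whenever gaze moves to a different target.

```python
# Illustrative dwell-time selection sketch (assumed design, not the paper's code).
# A click fires once gaze has stayed on one target for the full dwell threshold;
# moving gaze to another target (or off all targets) resets the timer.

class DwellSelector:
    def __init__(self, dwell_ms):
        self.dwell_ms = dwell_ms        # dwell threshold, e.g. 700 or 200 ms
        self.current_target = None      # target the gaze is currently on
        self.enter_time_ms = None       # timestamp when gaze entered that target
        self.fired = False              # ensures one click per dwell

    def update(self, target, now_ms):
        """Feed one gaze sample; return the target id when a click triggers."""
        if target != self.current_target:
            # Gaze moved to a new target (or off all targets): restart the dwell.
            self.current_target = target
            self.enter_time_ms = now_ms
            self.fired = False
            return None
        if target is None or self.fired:
            return None
        if now_ms - self.enter_time_ms >= self.dwell_ms:
            self.fired = True           # fire the click exactly once
            return target
        return None
```

In use, each gaze sample from the eye tracker is fed to `update` with its timestamp; with a 700 ms threshold, a click on a button is reported only after the gaze has rested on it continuously for 700 ms, which is how dwell-based designs separate deliberate selection from incidental viewing.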



Acknowledgements

This work was supported by the National Natural Science Foundation of China under Grant 31900768, the scientific research starting foundation of Zhejiang Sci-Tech University [16062022-Y], and the open fund of the State Key Laboratory of Nuclear Power Safety Monitoring Technology and Equipment (K-A2019.428).

Author information


Corresponding author

Correspondence to Zhen Yang.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Huang, W., Cheng, B., Zhang, G. et al. Ergonomics research on eye–hand control dual channel interaction. Multimed Tools Appl 80, 7833–7851 (2021). https://doi.org/10.1007/s11042-020-10097-z

