
From microscope to head-mounted display: integrating hand tracking into microsurgical augmented reality

  • Original Article
  • International Journal of Computer Assisted Radiology and Surgery

Abstract

Purpose

The operating microscope plays a central role in middle and inner ear procedures, which involve working within tightly confined spaces under limited exposure. Augmented reality (AR) may improve surgical guidance by combining preoperative computed tomography (CT) imaging, which provides precise anatomical information, with the intraoperative microscope video feed. With current technology, however, the operator must interact with the AR interface manually through a computer, which disrupts the surgical flow and compromises the sterility of the operating environment. The purpose of this study was to implement and evaluate free-hand interaction concepts that leverage hand tracking and gesture recognition to reduce this disruption and improve human-computer interaction.
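The core of such an overlay is a registration step that maps CT coordinates into the microscope camera frame, followed by a pinhole projection onto the video image. The following is a minimal sketch of that projection step, assuming known intrinsics and a CT-to-camera rigid transform; all values and names are placeholders for illustration, not the authors' implementation.

```python
# Minimal sketch: project a vertex of a CT-derived virtual model into the
# microscope video frame. Intrinsics and the CT-to-camera transform below
# are placeholder values, not the calibration from the paper.
import numpy as np
import cv2

K = np.array([[1200.0,    0.0, 640.0],   # camera matrix (pinhole intrinsics)
              [   0.0, 1200.0, 360.0],
              [   0.0,    0.0,   1.0]])

R = np.eye(3)                            # rotation: CT frame -> camera frame
t = np.array([[0.0], [0.0], [150.0]])    # translation (mm)

p_ct = np.array([[10.0, -5.0, 30.0]])    # one model vertex in CT coordinates (mm)

# cv2.projectPoints applies [R|t], then the pinhole model, to get pixel coords.
rvec, _ = cv2.Rodrigues(R)
uv, _ = cv2.projectPoints(p_ct, rvec, t, K, None)
print(uv.ravel())                        # (u, v): where to draw the vertex
```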

Methods

An electromagnetically tracked surgical microscope was calibrated using a custom 3D-printed calibration board, allowing the microscope feed to be augmented with virtual models segmented from preoperative CT. Ultraleap's Leap Motion Controller 2 was coupled to the microscope to provide hand-tracking capabilities. End-user feedback was gathered from a surgeon throughout development. Finally, users were asked to complete tasks that involved interacting with the virtual models, aligning them to physical targets, and adjusting the AR visualization.
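The calibration implied here has two parts: estimating the camera intrinsics from views of a planar board (Zhang-style calibration), and solving the hand-eye problem AX = XB to recover the fixed transform between the electromagnetic sensor and the camera. Below is a hedged sketch using OpenCV and a standard checkerboard as a stand-in for the custom 3D-printed board; the study's own stereo calibration procedure differs, and all names are illustrative.

```python
# Sketch of the two calibration steps, using OpenCV and a plain checkerboard
# as a stand-in for the custom 3D-printed board (illustrative only).
import numpy as np
import cv2

def calibrate_intrinsics(images, board_size=(9, 6), square_mm=4.0):
    # 3D corner positions in the board's own coordinate frame.
    objp = np.zeros((board_size[0] * board_size[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:board_size[0], 0:board_size[1]].T.reshape(-1, 2)
    objp *= square_mm

    obj_pts, img_pts = [], []
    for img in images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        found, corners = cv2.findChessboardCorners(gray, board_size)
        if found:
            obj_pts.append(objp)
            img_pts.append(corners)

    # Returns intrinsics K plus one board pose (rvec, tvec) per view.
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_pts, img_pts, gray.shape[::-1], None, None)
    return K, dist, rvecs, tvecs

# If the electromagnetic tracker pose of the microscope-mounted sensor is
# recorded at each board view, the fixed sensor-to-camera offset X solves
# AX = XB, e.g.:
#   R_cam2sensor, t_cam2sensor = cv2.calibrateHandEye(
#       R_sensor2base, t_sensor2base,   # tracker-reported sensor poses
#       rvecs, tvecs)                   # board-to-camera poses from above
```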

Results

Based on observations and user feedback gathered during development, we upgraded the functionalities of the hand interaction system. Users preferred the new interaction concepts, which minimized disruption of the surgical workflow and made interaction with the virtual content more intuitive.
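As one concrete example of the kind of interaction such a system can support, a pinch gesture detected from tracked fingertip positions (such as those reported by the Leap Motion Controller's hand skeleton) can be used to grab and translate a virtual model. The sketch below is purely illustrative; the thresholds, names, and hysteresis logic are assumptions, not the study's implementation.

```python
# Illustrative pinch-to-drag logic from tracked thumb and index fingertip
# positions (mm). Thresholds and names are assumptions, not the paper's.
import numpy as np

PINCH_ON_MM = 20.0   # fingertips closer than this: gesture engaged
PINCH_OFF_MM = 30.0  # hysteresis to avoid flicker at the boundary

class PinchDrag:
    def __init__(self):
        self.engaged = False
        self.anchor = None  # hand position when the pinch started

    def update(self, thumb_tip, index_tip, model_pos):
        """Return the (possibly translated) model position for this frame."""
        thumb, index = np.asarray(thumb_tip), np.asarray(index_tip)
        gap = np.linalg.norm(thumb - index)
        midpoint = (thumb + index) / 2.0

        if not self.engaged and gap < PINCH_ON_MM:
            self.engaged, self.anchor = True, midpoint
        elif self.engaged and gap > PINCH_OFF_MM:
            self.engaged = False

        if self.engaged:
            # Translate the model by the hand's displacement since last frame.
            model_pos = model_pos + (midpoint - self.anchor)
            self.anchor = midpoint
        return model_pos
```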

Conclusion

We integrated hand interaction concepts, typically used with head-mounted displays (HMDs), into a surgical stereo microscope system intended for AR guidance in otologic microsurgery. The concepts presented in this study demonstrated a more favorable approach to human-computer interaction in the surgical context and hold potential for more efficient execution of surgical tasks under microscopic AR guidance.



Data availability

Not applicable

Code availability

Not applicable


Funding

We gratefully acknowledge the Kaufer Family Fund for its philanthropic support of this project.

Author information


Corresponding author

Correspondence to Trishia El Chemaly.

Ethics declarations

Conflict of interest

Dr. Nikolas Blevins is a co-founder of and holds equity in ClaraSim, a startup company that aims to commercialize the technologies and methodologies developed through this research. The remaining authors have no conflicts of interest to declare.

Ethics approval

Not applicable

Consent to participate

Not applicable

Consent for publication

Not applicable

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Below is the link to the electronic supplementary material.

Supplementary file 1 (mp4 6193 KB)

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

El Chemaly, T., Athayde Neves, C., Fu, F. et al. From microscope to head-mounted display: integrating hand tracking into microsurgical augmented reality. Int J CARS 19, 2023–2029 (2024). https://doi.org/10.1007/s11548-024-03224-w


Keywords

Navigation