
USG-Net: Deep Learning-based Ultrasound Scanning-Guide for an Orthopedic Sonographer

  • Conference paper
Medical Image Computing and Computer Assisted Intervention – MICCAI 2022 (MICCAI 2022)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 13437)


Abstract

Ultrasound (US) imaging is widely used in medicine. US images containing pathological information are essential for accurate diagnosis. However, obtaining informative US images is challenging because of the anatomical complexity of the scanned region, so image quality depends heavily on the sonographer's expertise. Therefore, in this study, we propose a fully automatic scanning-guide algorithm that assists unskilled sonographers in acquiring informative US images by providing accurate probe-movement directions toward target disease regions. The main contributions of this study are: (1) proposing a new scanning-guide task that searches for a rotator cuff tear (RCT) region using a deep learning-based algorithm, the ultrasound scanning-guide network (USG-Net); and (2) constructing a dataset to optimize the corresponding deep learning algorithm. Multidimensional US images collected from 80 patients with RCT were processed to train the scanning-guide algorithm, which classifies whether an RCT is present. Furthermore, when the RCT is not in the current frame, the algorithm provides accurate directions toward it. The experimental results demonstrate that the fully optimized scanning-guide algorithm offers accurate directions for localizing the probe within target regions and helps acquire informative US images.
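The abstract frames the scanning guide as a two-part decision: classify whether the RCT is visible in the current frame, and if it is not, emit a probe-movement direction. A minimal sketch of that decision step, assuming (hypothetically) that the network outputs an RCT-visibility probability and one score per candidate direction; the direction set and threshold below are illustrative, not taken from the paper:

```python
# Hypothetical direction vocabulary; the paper's actual set is not given here.
DIRECTIONS = ["left", "right", "up", "down", "clockwise", "counterclockwise"]

def guidance_step(p_rct, direction_scores, threshold=0.5):
    """Turn one forward pass of the model into a scanning instruction."""
    if p_rct >= threshold:
        return "hold"  # target region is already in view; keep the probe still
    # Otherwise steer the probe toward the highest-scoring direction.
    best = max(range(len(direction_scores)), key=direction_scores.__getitem__)
    return DIRECTIONS[best]

print(guidance_step(0.9, [0.1] * 6))                       # RCT visible -> "hold"
print(guidance_step(0.2, [0.1, 0.7, 0.0, 0.0, 0.1, 0.1]))  # steer "right"
```

In practice such a loop would run per frame on the live US stream, so the instruction shown to the sonographer updates continuously as the probe moves.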

K. Lee and J. Yang contributed equally.




Acknowledgements

This work was supported by the Technology Innovation Program (No. 2001424) funded by the Ministry of Trade, Industry & Energy (MOTIE, Korea) and by the Korea Medical Device Development Fund grant funded by the Korean government (the Ministry of Science and ICT; the Ministry of Trade, Industry and Energy; the Ministry of Health & Welfare; the Ministry of Food and Drug Safety) (Project Number: RS-2020-KD000125, 9991006798).

Author information


Corresponding author

Correspondence to Jae Youn Hwang.


Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (pdf 226 KB)


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Lee, K., Yang, J., Lee, M.H., Chang, J.H., Kim, JY., Hwang, J.Y. (2022). USG-Net: Deep Learning-based Ultrasound Scanning-Guide for an Orthopedic Sonographer. In: Wang, L., Dou, Q., Fletcher, P.T., Speidel, S., Li, S. (eds) Medical Image Computing and Computer Assisted Intervention – MICCAI 2022. MICCAI 2022. Lecture Notes in Computer Science, vol 13437. Springer, Cham. https://doi.org/10.1007/978-3-031-16449-1_3

Download citation

  • DOI: https://doi.org/10.1007/978-3-031-16449-1_3


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-16448-4

  • Online ISBN: 978-3-031-16449-1

  • eBook Packages: Computer Science, Computer Science (R0)
