A Pathologist-Informed Workflow for Classification of Prostate Glands in Histopathology | SpringerLink

A Pathologist-Informed Workflow for Classification of Prostate Glands in Histopathology

  • Conference paper
  • First Online:
Medical Optical Imaging and Virtual Microscopy Image Analysis (MOVI 2022)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 13578)

Abstract

Pathologists diagnose and grade prostate cancer by examining tissue from needle biopsies on glass slides. The cancer’s severity and risk of metastasis are determined by the Gleason grade, a score based on the organization and morphology of prostate cancer glands. For diagnostic work-up, pathologists first locate glands in the whole biopsy core, and—if they detect cancer—they assign a Gleason grade. This time-consuming process is subject to errors and significant inter-observer variability, despite strict diagnostic criteria. This paper proposes an automated workflow that follows pathologists’ modus operandi, isolating and classifying multi-scale patches of individual glands in whole slide images (WSI) of biopsy tissues using distinct steps: (1) two fully convolutional networks segment epithelium versus stroma and gland boundaries, respectively; (2) a classifier network separates benign from cancer glands at high magnification; and (3) an additional classifier predicts the grade of each cancer gland at low magnification. Altogether, this process provides a gland-specific approach for prostate cancer grading that we compare against other machine-learning-based grading methods.
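To make the three-step workflow concrete, the following is a minimal, hypothetical sketch in PyTorch. It is not the authors' implementation: the network architectures, class counts, thresholds, and patch handling are placeholder assumptions; only the overall staging (segmentation, then benign-vs-cancer at high magnification, then grading at low magnification) follows the abstract.

# Minimal sketch (assumptions, not the paper's code) of the gland-level pipeline.
import torch
import torch.nn as nn

class TinyFCN(nn.Module):
    """Placeholder fully convolutional network producing a per-pixel mask."""
    def __init__(self, out_channels=1):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, out_channels, 1),
        )
    def forward(self, x):
        return torch.sigmoid(self.body(x))

class TinyClassifier(nn.Module):
    """Placeholder CNN classifier for a single gland patch."""
    def __init__(self, num_classes):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(16, num_classes)
    def forward(self, x):
        return self.head(self.features(x).flatten(1))

# Stage 1: segment epithelium vs. stroma and gland boundaries on the biopsy WSI.
epithelium_net = TinyFCN(out_channels=1)
boundary_net = TinyFCN(out_channels=1)

# Stage 2: benign vs. cancer classifier for high-magnification gland patches.
benign_vs_cancer = TinyClassifier(num_classes=2)

# Stage 3: grade classifier for low-magnification patches of cancer glands
# (three grade classes assumed here for illustration).
grade_net = TinyClassifier(num_classes=3)

def classify_gland(high_mag_patch, low_mag_patch):
    """Run stages 2-3 on one gland; patches are (1, 3, H, W) tensors."""
    # Class index 0 is assumed to mean "benign" in this sketch.
    if benign_vs_cancer(high_mag_patch).argmax(1).item() == 0:
        return "benign"
    grade = grade_net(low_mag_patch).argmax(1).item() + 3
    return f"Gleason grade {grade}"

# Illustrative usage with random tensors (untrained weights, random output):
region = torch.rand(1, 3, 512, 512)
epithelium_mask = epithelium_net(region) > 0.5
gland_boundary_mask = boundary_net(region) > 0.5
# Individual glands would then be isolated from these masks (e.g. via
# connected components) and cropped at high and low magnification.
print(classify_gland(torch.rand(1, 3, 256, 256), torch.rand(1, 3, 256, 256)))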



Acknowledgements

We acknowledge the generous support from the Department of Defense Prostate Cancer Program Population Science Award W81XWH-21-1-0725. We also acknowledge Cedars-Sinai Hospital in Los Angeles for providing the training data, and we thank Dr. Arkadiusz Gertych for his work on establishing the tiles. The results presented here are in part based upon data generated by the TCGA Research Network: https://www.cancer.gov/tcga.

Author information

Authors and Affiliations

Authors

Corresponding author

Correspondence to Alessandro Ferrero.

Editor information

Editors and Affiliations

Rights and permissions

Reprints and permissions

Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Ferrero, A., Knudsen, B., Sirohi, D., Whitaker, R. (2022). A Pathologist-Informed Workflow for Classification of Prostate Glands in Histopathology. In: Huo, Y., Millis, B.A., Zhou, Y., Wang, X., Harrison, A.P., Xu, Z. (eds) Medical Optical Imaging and Virtual Microscopy Image Analysis. MOVI 2022. Lecture Notes in Computer Science, vol 13578. Springer, Cham. https://doi.org/10.1007/978-3-031-16961-8_6


  • DOI: https://doi.org/10.1007/978-3-031-16961-8_6

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-16960-1

  • Online ISBN: 978-3-031-16961-8

  • eBook Packages: Computer Science, Computer Science (R0)
