Abstract
Accurate segmentation of metastatic lymph nodes is crucial for the staging and treatment of rectal cancer. However, existing segmentation approaches are hampered by the absence of pixel-level annotated datasets tailored to lymph nodes around the rectum. Moreover, metastatic lymph nodes are relatively small, irregularly shaped, and exhibit lower contrast against the background, further complicating the segmentation task. To address these challenges, we present Meply, the first large-scale CT image dataset of perirectal metastatic lymph nodes, which provides pixel-level annotations for 269 patients diagnosed with rectal cancer. We also introduce a novel lymph node segmentation model named CoSAM, which uses sequence-based detection to guide the segmentation of metastatic lymph nodes and thereby improves the localization performance of the segmentation model. CoSAM consists of three key components: a sequence-based detection module, a segmentation module, and a collaborative convergence unit. To evaluate its effectiveness, we systematically compare CoSAM with several popular segmentation methods on the Meply dataset. The code is available at: https://github.com/kanydao/CoSAM.
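The abstract only names CoSAM's three components, so the following is a minimal illustrative sketch of how a detection-guided segmentation model of this kind can be composed, written in PyTorch. All module names (DetectionModule, SegmentationModule, CollaborativeConvergenceUnit, CoSAMSketch), their internals, and the channel sizes are placeholder assumptions, not the authors' implementation; the real architecture is in the linked repository.

```python
import torch
import torch.nn as nn


class DetectionModule(nn.Module):
    """Hypothetical stand-in for the sequence-based detection branch:
    produces a coarse localization prior for lymph nodes in a CT slice."""
    def __init__(self, in_ch=1, feat_ch=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, feat_ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feat_ch, 1, 1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.net(x)  # (B, 1, H, W) localization prior in [0, 1]


class SegmentationModule(nn.Module):
    """Hypothetical segmentation branch producing per-pixel logits."""
    def __init__(self, in_ch=1, feat_ch=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, feat_ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feat_ch, 1, 1),
        )

    def forward(self, x):
        return self.net(x)  # (B, 1, H, W) segmentation logits


class CollaborativeConvergenceUnit(nn.Module):
    """Hypothetical fusion step: modulates the segmentation logits with the
    detection prior so small, low-contrast nodes are emphasized."""
    def __init__(self):
        super().__init__()
        self.fuse = nn.Conv2d(2, 1, 1)

    def forward(self, seg_logits, det_prior):
        gated = seg_logits * det_prior
        return self.fuse(torch.cat([gated, seg_logits], dim=1))


class CoSAMSketch(nn.Module):
    """Illustrative composition of the three components named in the abstract."""
    def __init__(self):
        super().__init__()
        self.detector = DetectionModule()
        self.segmenter = SegmentationModule()
        self.ccu = CollaborativeConvergenceUnit()

    def forward(self, ct_slice):
        det_prior = self.detector(ct_slice)     # where the nodes likely are
        seg_logits = self.segmenter(ct_slice)   # per-pixel predictions
        return self.ccu(seg_logits, det_prior)  # detection-guided refinement


if __name__ == "__main__":
    x = torch.randn(2, 1, 256, 256)      # batch of single-channel CT slices
    print(CoSAMSketch()(x).shape)        # torch.Size([2, 1, 256, 256])
```

Given the name CoSAM, the real segmentation branch is presumably built on a Segment Anything (SAM) style backbone and the detection branch operates on slice sequences rather than the toy convolutional stacks shown here; consult the released code for the actual architecture.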
Acknowledgment
This work was supported by the University Synergy Innovation Program of Anhui Province (Grant No. GXXT-2022-056).