Deep Convolutional Neural Network for HEp-2 Fluorescence Intensity Classification
Abstract
Featured Application
1. Introduction
2. Materials and Methods
2.1. Database
2.2. Statistics
2.3. Preprocessing
2.4. Deep CNN
2.5. Pre-Trained Networks Used
- AlexNet [20]: This network has been trained on 1.3 million high-resolution images from the LSVRC-2010 ImageNet training set to classify them into 1000 different object classes. Rectified Linear Units (ReLUs) are used as the non-linear activation function at each layer;
- GoogLeNet [21]: This architecture makes use of so-called inception blocks. An inception block can be interpreted as a network-in-a-network, in which the input is branched into several different convolutional sub-networks whose outputs are concatenated at the end of the block;
- VGG [22]: The main idea of this architecture is to increase depth while reducing the size of the convolution filters. The image is passed through a stack of convolutional (conv.) layers that use filters with a very small receptive field, 3 × 3 (the smallest size able to capture the notion of left/right, up/down, and center);
- ResNet [23]: This architecture is currently the best performing deep architecture, being the winner of the ImageNet challenge in 2015. The authors propose deeper CNNs that address the performance degradation caused by depth through a residual learning framework: instead of hoping that each stack of a few layers directly fits a desired underlying mapping, the authors explicitly let these layers fit a residual mapping;
- DenseNet [24]: This pre-trained CNN has an architecture that connects each layer, in a feed-forward fashion, to all the layers deeper than itself. One of its peculiarities is that the feature maps of all preceding layers are combined by concatenation rather than by summation;
- SqueezeNet [25]: The authors have built a smaller architecture with three main advantages: smaller CNNs require less communication across servers during distributed training; smaller CNNs require less bandwidth to export a new model from the cloud to an autonomous car; and smaller CNNs are more feasible to deploy on FPGAs and other hardware with limited memory. (A minimal sketch of how such pre-trained networks can serve as fixed feature extractors is given after this list.)
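The sketch below illustrates the general transfer-learning scheme underlying Sections 2.5 and 2.6: a pre-trained CNN is used as a fixed feature extractor and a linear SVM is trained on the extracted activations. It is a minimal example only; the choice of torchvision and scikit-learn, the ResNet-18 backbone, the layer used as feature output, and the channel-replication preprocessing are assumptions for illustration, not the authors' exact configuration.

```python
# Sketch: pre-trained CNN as a fixed feature extractor, followed by an SVM.
# Library choices, layer choice, and preprocessing values are illustrative assumptions.
import torch
import torchvision.models as models
import torchvision.transforms as T
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from PIL import Image

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Pre-trained network (here ResNet-18); weights are frozen, no fine-tuning.
net = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
net.fc = torch.nn.Identity()          # expose the global-average-pool features
net.eval().to(device)

# ImageNet-style preprocessing; fluorescence images are single-channel, so the
# channel is replicated to three channels (an assumption).
preprocess = T.Compose([
    T.Resize((224, 224)),
    T.Grayscale(num_output_channels=3),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def extract_features(image_paths):
    """Return one deep-feature vector per image."""
    feats = []
    with torch.no_grad():
        for path in image_paths:
            x = preprocess(Image.open(path)).unsqueeze(0).to(device)
            feats.append(net(x).squeeze(0).cpu().numpy())
    return feats

# Usage (X: list of image paths, y: fluorescence intensity labels):
# features = extract_features(X)
# svm = SVC(kernel="linear")
# print(cross_val_score(svm, features, y, cv=5).mean())
```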
2.6. SVM Classification
3. Results
4. Discussion
Author Contributions
Funding
Conflicts of Interest
References
1. Agmon-Levin, N.; Damoiseaux, J.; Kallenberg, C.; Sack, U.; Witte, T.; Herold, M.; Bossuyt, X.; Musset, L.; Cervera, R.; Plaza-Lopez, A.; et al. International recommendations for the assessment of autoantibodies to cellular antigens referred to as anti-nuclear antibodies. Ann. Rheum. Dis. 2014, 73, 17–23.
2. Vivona, L.; Cascio, D.; Taormina, V.; Raso, G. Automated approach for indirect immunofluorescence images classification based on unsupervised clustering method. IET Comput. Vis. 2018, 12, 989–995.
3. Hobson, P.; Lovell, B.C.; Percannella, G.; Saggese, A.; Vento, M.; Wiliem, A. Computer aided diagnosis for anti-nuclear antibodies HEp-2 images: Progress and challenges. Pattern Recognit. Lett. 2016, 82, 3–11.
4. Di Cataldo, S.; Tonti, S.; Bottino, A.; Ficarra, E. ANAlyte: A modular image analysis tool for ANA testing with indirect immunofluorescence. Comput. Methods Programs Biomed. 2016, 128, 86–99.
5. Elgaaied, A.B.; Cascio, D.; Bruno, S.; Ciaccio, M.C.; Cipolla, M.; Fauci, A.; Morgante, R.; Taormina, V.; Gorgi, Y.; Triki, R.M.; et al. Computer-assisted classification patterns in autoimmune diagnostics: The A.I.D.A. project. BioMed Res. Int. 2016, 2016, 1–9.
6. Cascio, D.; Taormina, V.; Raso, G. Automatic HEp-2 specimen analysis system based on active contours model and SVM classification. Appl. Sci. 2019, 9, 307.
7. Gupta, K.; Bhavsar, A.; Sao, A.K. CNN based mitotic HEp-2 cell image detection. In Proceedings of the 5th International Conference on Bioimaging, Funchal, Portugal, 19–21 January 2018; pp. 167–174.
8. Ciatto, S.; Cascio, D.; Fauci, F.; Magro, R.; Raso, G.; Ienzi, R.; Martinelli, F.; Simone, M.V. Computer-assisted diagnosis (CAD) in mammography: Comparison of diagnostic accuracy of a new algorithm (Cyclopus®, Medicad) with two commercial systems. La Radiol. Med. 2009, 114, 626–635.
9. Cascio, D.; Fauci, F.; Iacomi, M.; Raso, G.; Magro, R.; Castrogiovanni, D.; Filosto, G.; Ienzi, R.; Simone Vasile, M. Computer-aided diagnosis in digital mammography: Comparison of two commercial systems. Imaging Med. 2014, 6, 13–20.
10. Foggia, P.; Percannella, G.; Soda, P.; Vento, M. Benchmarking HEp-2 cells classification methods. IEEE Trans. Med. Imaging 2013, 32, 1878–1889.
11. Hobson, P.; Lovell, B.C.; Percannella, G.; Vento, M.; Wiliem, A. Benchmarking human epithelial type 2 interphase cells classification methods on a very large dataset. Artif. Intell. Med. 2015, 65, 239–250.
12. Hanley, J.A.; McNeil, B.J. The meaning and use of the area under a receiver operating characteristic (ROC) curve. Radiology 1982, 143, 29–36.
13. Shen, L.; Jia, X.; Li, Y. Deep cross residual network for HEp-2 cell staining pattern classification. Pattern Recognit. 2018, 82, 68–78.
14. Lu, L.; Zheng, Y.; Carneiro, G.; Yang, L. Deep learning and convolutional neural networks for medical image computing. In Advances in Computer Vision and Pattern Recognition; Springer: New York, NY, USA, 2017.
15. Zhang, Y.-D.; Dong, Z.; Chen, X.; Jia, W.; Du, S.; Muhammad, K.; Wang, S.-H. Image based fruit category classification by 13-layer deep convolutional neural network and data augmentation. Multimed. Tools Appl. 2017, 1–20.
16. Zhang, Y.-D.; Pan, C.; Chen, X.; Wang, F. Abnormal breast identification by nine-layer convolutional neural network with parametric rectified linear unit and rank-based stochastic pooling. J. Comput. Sci. 2018, 27, 57–68.
17. Masala, G.L.; Tangaro, S.; Golosio, B.; Oliva, P.; Stumbo, S.; Bellotti, R.; De Carlo, F.; Gargano, G.; Cascio, D.; Fauci, F.; et al. Comparative study of feature classification methods for mass lesion recognition in digitized mammograms. Nuovo Cimento Soc. Ital. Fis. Sez. C 2007, 30, 305–316.
18. Iacomi, M.; Cascio, D.; Fauci, F.; Raso, G. Mammographic images segmentation based on chaotic map clustering algorithm. BMC Med. Imaging 2014, 14, 1–11.
19. Fauci, F.; Manna, A.L.; Cascio, D.; Magro, R.; Raso, R.; Iacomi, M.; Vasile, M.S. A Fourier based algorithm for microcalcifications enhancement in mammographic images. In Proceedings of the IEEE Nuclear Science Symposium and Medical Imaging Conference, Anaheim, CA, USA, 27 October–3 November 2012; pp. 4388–4391.
20. Krizhevsky, A.; Sutskever, I.; Hinton, G.E. ImageNet classification with deep convolutional neural networks. Adv. Neural Inf. Process. Syst. 2012, 25, 1097–1105.
21. Szegedy, C.; Liu, W.; Jia, Y.; Sermanet, P.; Reed, S.; Anguelov, D.; Erhan, D.; Vanhoucke, V.; Rabinovich, A. Going deeper with convolutions. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 7–12 June 2015; pp. 1–9.
22. Simonyan, K.; Zisserman, A. Very deep convolutional networks for large-scale image recognition. arXiv 2014, arXiv:1409.1556.
23. He, K.; Zhang, X.; Ren, S.; Sun, J. Deep residual learning for image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 26 June–1 July 2016; pp. 770–778.
24. Huang, G.; Liu, Z.; Van der Maaten, L.; Weinberger, K.Q. Densely connected convolutional networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 4700–4708.
25. Iandola, F.N.; Han, S.; Moskewicz, M.W.; Ashraf, K.; Dally, W.J.; Keutzer, K. SqueezeNet: AlexNet-level accuracy with 50× fewer parameters and <0.5 MB model size. arXiv 2016, arXiv:1602.07360.
26. Cascio, D.; Taormina, V.; Cipolla, M.; Fauci, F.; Vasile, M.; Raso, G. HEp-2 cell classification with heterogeneous classes-processes based on K-nearest neighbours. In Proceedings of the 1st IEEE Workshop on Pattern Recognition Techniques for Indirect Immunofluorescence Images, ICPR, Washington, DC, USA, 24 August 2014; pp. 10–15.
27. Cascio, D.; Taormina, V.; Cipolla, M.; Bruno, S.; Fauci, F.; Raso, G. A multi-process system for HEp-2 cells classification based on SVM. Pattern Recognit. Lett. 2016, 82, 56–63.
| Pre-Trained CNN | Depth | No. of Layers | Best Layer | Accuracy |
|---|---|---|---|---|
| alexnet | 8 | 25 | conv3 | 90.3% |
| googlenet | 22 | 144 | incep_3a-output | 90.1% |
| vgg16 | 16 | 41 | drop7 | 90.3% |
| vgg19 | 19 | 47 | drop7 | 90.5% |
| resnet18 | 18 | 72 | res5b_relu | 92.3% |
| resnet50 | 50 | 177 | avg_pool | 92.2% |
| resnet101 | 101 | 347 | res5c_relu | 92.2% |
| squeezenet | 18 | 68 | drop9 | 89.2% |
| densenet201 | 201 | 709 | bn | 92.8% |
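The table reports, for each pre-trained network, the layer whose activations gave the highest classification accuracy. The sketch below shows how such a layer sweep could be set up for one network; it is an assumption-laden illustration, not the authors' code. The layer names are torchvision node names rather than the toolbox labels listed in the table ("res5b_relu", "drop7", ...), and the 5-fold cross-validation protocol is assumed.

```python
# Sketch of a per-layer sweep: score a linear SVM on activations taken at several
# candidate layers of one pre-trained network, and report the best-scoring layer.
import torch
from torchvision.models import resnet18, ResNet18_Weights
from torchvision.models.feature_extraction import create_feature_extractor
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

nodes = ["layer3", "layer4", "avgpool"]          # candidate layers (assumed names)
extractor = create_feature_extractor(
    resnet18(weights=ResNet18_Weights.IMAGENET1K_V1).eval(), return_nodes=nodes
)

def sweep_layers(batch, labels):
    """batch: preprocessed image tensor (N, 3, 224, 224); labels: intensity classes."""
    with torch.no_grad():
        acts = extractor(batch)                  # dict: node name -> activation tensor
    scores = {}
    for name, a in acts.items():
        X = a.flatten(start_dim=1).numpy()       # one feature vector per image
        scores[name] = cross_val_score(SVC(kernel="linear"), X, labels, cv=5).mean()
    return max(scores.items(), key=lambda kv: kv[1])   # (best layer, its accuracy)
```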
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).