A Weakly Supervised Approach for Disease Segmentation of Maize Northern Leaf Blight from UAV Images
Abstract
1. Introduction
1.1. Related Work
1.1.1. Plant Disease Segmentation
1.1.2. Weakly Supervised Segmentation
2. Materials and Methods
2.1. Materials
2.2. Methods
2.2.1. Pseudo-Label Extraction
- Auxiliary Branch Block (ABB)
- Feature Reuse Module (FRM)
- Loss Function for Pseudo-Label Extraction
- Pseudo-Label Generation
2.2.2. Segmentation Model
2.2.3. Evaluation Indicator
2.2.4. Three Comparison Methods
2.2.5. Implementation Details
3. Results
3.1. Pseudo-Labels Quality Comparison
3.2. Pseudo-Labels’ Generation Time
3.3. Segmentation Results
3.4. Ablation Studies
4. Discussion
4.1. Effect of the Proposed Modules for Label Generation
4.2. Analysis of Coefficient in Loss Function
4.3. Segmentation Results with Part of Full Labels
4.4. Comparison with Related Studies and Room for Improvements
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
Abbreviation | Definition |
---|---|
ABB | auxiliary branch block |
BP | backpropagation |
CAM | class activation map |
CNN | convolutional neural network |
FRM | feature reuse module |
GPS | global positioning system |
HSV | hue, saturation and value |
NLB | Northern Leaf Blight |
RGB | red, green and blue |
SIFT | scale-invariant feature transform |
SLIC | simple linear iterative clustering |
UAV | unmanned aerial vehicle |
VI | vegetation index |
References
1. Bayraktar, E.; Basarkan, M.E.; Celebi, N. A low-cost UAV framework towards ornamental plant detection and counting in the wild. ISPRS J. Photogramm. Remote Sens. 2020, 167, 1–11.
2. Wiesner-Hanks, T.; Wu, H.; Stewart, E.; DeChant, C.; Kaczmar, N.; Lipson, H.; Gore, M.A.; Nelson, R.J. Millimeter-level plant disease detection from aerial photographs via deep learning and crowdsourced data. Front. Plant Sci. 2019, 10, 1550.
3. Tetila, E.C.; Machado, B.B.; de Souza Belete, N.A.; Guimaraes, D.A.; Pistori, H. Identification of Soybean Foliar Diseases Using Unmanned Aerial Vehicle Images. IEEE Geosci. Remote Sens. Lett. 2017, 14, 2190–2194.
4. Yue, J.; Lei, T.; Li, C.; Zhu, J. The application of unmanned aerial vehicle remote sensing in quickly monitoring crop pests. Intell. Autom. Soft Comput. 2012, 18, 1043–1052.
5. Yang, N.; Yuan, M.; Wang, P.; Zhang, R.; Sun, J.; Mao, H. Tea diseases detection based on fast infrared thermal image processing technology. J. Sci. Food Agric. 2019, 99, 3459–3466.
6. Available online: www.deere.com/en/sprayers/see-spray-ultimate/ (accessed on 15 February 2023).
7. Dammer, K.H.; Garz, A.; Hobart, M.; Schirrmann, M. Combined UAV- and tractor-based stripe rust monitoring in winter wheat under field conditions. Agron. J. 2021, 114, 651–661.
8. Gillis, M.; Shafii, M.; Browne, G.T.; Upadhyaya, S.K.; Coates, R.W.; Udompetaikul, V. Tractor-mounted, GPS-based spot fumigation system manages Prunus replant disease. Calif. Agric. 2013, 67, 222–227.
9. Gnyp, M.; Panitzki, M.; Reusch, S.; Jasper, J.; Bolten, A.; Bareth, G. Comparison between tractor-based and UAV-based spectrometer measurements in winter wheat. In Proceedings of the 13th International Conference on Precision Agriculture, St. Louis, MO, USA, 31 July–3 August 2016.
10. León-Rueda, W.A.; León, C.; Caro, S.G.; Ramírez-Gil, J.G. Identification of diseases and physiological disorders in potato via multispectral drone imagery using machine learning tools. Trop. Plant Pathol. 2021, 47, 152–167.
11. Ye, H.; Huang, W.; Huang, S.; Cui, B.; Dong, Y.; Guo, A.; Ren, Y.; Jin, Y. Recognition of Banana Fusarium Wilt Based on UAV Remote Sensing. Remote Sens. 2020, 12, 938.
12. Bagheri, N. Application of aerial remote sensing technology for detection of fire blight infected pear trees. Comput. Electron. Agric. 2020, 168, 105147.
13. Su, J.; Liu, C.; Hu, X.; Xu, X.; Guo, L.; Chen, W.-H. Spatio-temporal monitoring of wheat yellow rust using UAV multispectral imagery. Comput. Electron. Agric. 2019, 167, 105035.
14. Bohnenkamp, D.; Behmann, J.; Mahlein, A.-K. In-Field Detection of Yellow Rust in Wheat on the Ground Canopy and UAV Scale. Remote Sens. 2019, 11, 2495.
15. Liu, L.; Dong, Y.; Huang, W.; Du, X.; Ma, H. Monitoring Wheat Fusarium Head Blight Using Unmanned Aerial Vehicle Hyperspectral Imagery. Remote Sens. 2020, 12, 3811.
16. Ahmadi, P.; Mansor, S.; Farjad, B.; Ghaderpour, E. Unmanned Aerial Vehicle (UAV)-Based Remote Sensing for Early-Stage Detection of Ganoderma. Remote Sens. 2022, 14, 1239.
17. Ha, J.G.; Moon, H.; Kwak, J.T.; Hassan, S.I.; Dang, M.; Lee, O.N.; Park, H.Y. Deep convolutional neural network for classifying Fusarium wilt of radish from unmanned aerial vehicles. J. Appl. Remote Sens. 2017, 11, 042621.
18. Kerkech, M.; Hafiane, A.; Canals, R. Vine disease detection in UAV multispectral images using optimized image registration and deep learning segmentation approach. Comput. Electron. Agric. 2020, 174, 105446.
19. Wang, Z.; Zhang, S. Segmentation of corn leaf disease based on fully convolution neural network. Acad. J. Comput. Inf. Sci. 2018, 1, 9–18.
20. Pan, Q.; Gao, M.; Wu, P.; Yan, J.; Li, S. A Deep-Learning-Based Approach for Wheat Yellow Rust Disease Recognition from Unmanned Aerial Vehicle Images. Sensors 2021, 21, 6540.
21. Wu, H.; Wiesner-Hanks, T.; Stewart, E.L.; DeChant, C.; Kaczmar, N.; Gore, M.A.; Nelson, R.J.; Lipson, H. Autonomous Detection of Plant Disease Symptoms Directly from Aerial Imagery. Plant Phenome J. 2019, 2, 190006.
22. Narkhede, M.V.; Bartakke, P.P.; Sutaone, M.S. A review on weight initialization strategies for neural networks. Artif. Intell. Rev. 2021, 55, 291–322.
23. Boulila, W.; Driss, M.; Alshanqiti, E.; Al-Sarem, M.; Saeed, F.; Krichen, M. Weight Initialization Techniques for Deep Learning Algorithms in Remote Sensing: Recent Trends and Future Perspectives. In Advances on Smart and Soft Computing; Advances in Intelligent Systems and Computing; Springer: Berlin/Heidelberg, Germany, 2022; pp. 477–484.
24. Krichen, M.; Mihoub, A.; Alzahrani, M.Y.; Adoni, W.Y.H.; Nahhal, T. Are Formal Methods Applicable to Machine Learning and Artificial Intelligence? In Proceedings of the 2022 2nd International Conference of Smart Systems and Emerging Technologies (SMARTTECH), Riyadh, Saudi Arabia, 9–11 May 2022; pp. 48–53.
25. Urban, C.; Miné, A. A review of formal methods applied to machine learning. arXiv 2021, arXiv:02466.
26. Seshia, S.A.; Sadigh, D.; Sastry, S.S. Toward verified artificial intelligence. Commun. ACM 2022, 65, 46–55.
27. Katz, G.; Barrett, C.; Dill, D.L.; Julian, K.; Kochenderfer, M.J. Reluplex: An efficient SMT solver for verifying deep neural networks. In Proceedings of the Computer Aided Verification: 29th International Conference, CAV 2017, Heidelberg, Germany, 24–28 July 2017; Springer: Berlin/Heidelberg, Germany, 2017; pp. 97–117.
28. Zhang, T.; Xu, Z.; Su, J.; Yang, Z.; Liu, C.; Chen, W.-H.; Li, J. Ir-UNet: Irregular Segmentation U-Shape Network for Wheat Yellow Rust Detection by UAV Multispectral Imagery. Remote Sens. 2021, 13, 3892.
29. Qin, J.; Wang, B.; Wu, Y.; Lu, Q.; Zhu, H. Identifying Pine Wood Nematode Disease Using UAV Images and Deep Learning Algorithms. Remote Sens. 2021, 13, 162.
30. Hu, G.; Wu, H.; Zhang, Y.; Wan, M. A low shot learning method for tea leaf’s disease identification. Comput. Electron. Agric. 2019, 163, 104852.
31. Huang, M.; Xu, G.; Li, J.; Huang, J. A Method for Segmenting Disease Lesions of Maize Leaves in Real Time Using Attention YOLACT++. Agriculture 2021, 11, 1216.
32. Wu, B.; Liang, A.; Zhang, H.; Zhu, T.; Zou, Z.; Yang, D.; Tang, W.; Li, J.; Su, J. Application of conventional UAV-based high-throughput object detection to the early diagnosis of pine wilt disease by deep learning. For. Ecol. Manag. 2021, 486, 118986.
33. Stewart, E.L.; Wiesner-Hanks, T.; Kaczmar, N.; DeChant, C.; Wu, H.; Lipson, H.; Nelson, R.J.; Gore, M.A. Quantitative Phenotyping of Northern Leaf Blight in UAV Images Using Deep Learning. Remote Sens. 2019, 11, 2209.
34. Zhou, Y.; Tang, Y.; Zou, X.; Wu, M.; Tang, W.; Meng, F.; Zhang, Y.; Kang, H. Adaptive Active Positioning of Camellia oleifera Fruit Picking Points: Classical Image Processing and YOLOv7 Fusion Algorithm. Appl. Sci. 2022, 12, 12959.
35. Tang, Y.; Zhou, H.; Wang, H.; Zhang, Y. Fruit detection and positioning technology for a Camellia oleifera C. Abel orchard based on improved YOLOv4-tiny model and binocular stereo vision. Expert Syst. Appl. 2023, 211, 118573.
36. Zhou, B.; Khosla, A.; Lapedriza, A.; Oliva, A.; Torralba, A. Learning deep features for discriminative localization. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 2921–2929.
37. Kumar Singh, K.; Jae Lee, Y. Hide-and-seek: Forcing a network to be meticulous for weakly-supervised object and action localization. In Proceedings of the IEEE International Conference on Computer Vision, Venice, Italy, 22–29 October 2017; pp. 3524–3533.
38. Zhang, X.; Wei, Y.; Feng, J.; Yang, Y.; Huang, T.S. Adversarial complementary learning for weakly supervised object localization. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–22 June 2018; pp. 1325–1334.
39. Wei, Y.; Xiao, H.; Shi, H.; Jie, Z.; Feng, J.; Huang, T.S. Revisiting dilated convolution: A simple approach for weakly- and semi-supervised semantic segmentation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–22 June 2018; pp. 7268–7277.
40. Lee, J.; Kim, E.; Lee, S.; Lee, J.; Yoon, S. FickleNet: Weakly and semi-supervised semantic image segmentation using stochastic inference. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Long Beach, CA, USA, 16–20 June 2019; pp. 5267–5276.
41. Zhou, T.; Zhang, M.; Zhao, F.; Li, J. Regional semantic contrast and aggregation for weakly supervised semantic segmentation. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, New Orleans, LA, USA, 19–24 June 2022; pp. 4299–4309.
42. Yi, R.; Weng, Y.; Yu, M.; Lai, Y.-K.; Liu, Y.-J. Lesion region segmentation via weakly supervised learning. Quant. Biol. 2021, 10, 239–252.
43. Kim, W.-S.; Lee, D.-H.; Kim, T.; Kim, H.; Sim, T.; Kim, Y.-J. Weakly Supervised Crop Area Segmentation for an Autonomous Combine Harvester. Sensors 2021, 21, 4801.
44. Wiesner-Hanks, T.; Stewart, E.L.; Kaczmar, N.; DeChant, C.; Wu, H.; Nelson, R.J.; Lipson, H.; Gore, M.A. Image set for deep learning: Field images of maize annotated with disease symptoms. BMC Res. Notes 2018, 11, 440.
45. Russell, B.C.; Torralba, A.; Murphy, K.P.; Freeman, W.T. LabelMe: A database and web-based tool for image annotation. Int. J. Comput. Vis. 2008, 77, 157–173.
46. Simonyan, K.; Zisserman, A. Very deep convolutional networks for large-scale image recognition. arXiv 2014, arXiv:1409.1556.
47. Zhang, Z.; Zhang, X.; Peng, C.; Xue, X.; Sun, J. ExFuse: Enhancing feature fusion for semantic segmentation. In Proceedings of the European Conference on Computer Vision (ECCV), Munich, Germany, 8–14 September 2018; pp. 269–284.
48. Ronneberger, O.; Fischer, P.; Brox, T. U-Net: Convolutional networks for biomedical image segmentation. In Proceedings of the International Conference on Medical Image Computing and Computer-Assisted Intervention, Munich, Germany, 5–9 October 2015; pp. 234–241.
49. Wang, C.; Du, P.; Wu, H.; Li, J.; Zhao, C.; Zhu, H. A cucumber leaf disease severity classification method based on the fusion of DeepLabV3+ and U-Net. Comput. Electron. Agric. 2021, 189, 106373.
Ref. | Crop | Disease | Data Type | Method | Performance |
---|---|---|---|---|---|
[3] | Soybean | Foliar diseases | RGB UAV images | SLIC | 98.3% for height between 1 and 2 m |
[5] | Tea | Diseased yellow leaves | Infrared thermal canopy UAV Images | HSV transform and thresholding | R2 of 0.97 |
[10] | Potato | Vascular wilt | Multispectral UAV images | Generalized linear model and supervised random forest classification | Classification accuracy of 73.5–82.5% in Plot 1 |
[11] | Banana | Fusarium wilt | Multispectral images | Binary logistic regression | More than 80% |
[12] | Pear | Fire Blight | Aerial multispectral imagery of trees crown | Support vector machine | Classification accuracy of 95% |
[15] | Winter wheat | Fusarium head blight | Hyperspectral UAV image | BP neural network | Accuracy of 98% |
[16] | Oil palms | Basal stem rot | Infrared UAV images | Artificial neural network | Classification accuracy of 97.5% |
[17] | Radish | Fusarium | RGB UAV images | CNN | Accuracy of 93.3% |
[20] | Winter wheat | Yellow rust disease | RGB UAV images | PSPNet | Accuracy of 98% |
[21] | Maize | NLB | RGB UAV images | CNN | Accuracy of 95.1% |
[28] | Winter wheat | Yellow rust disease | Red-Edge multispectral UAV images | Ir-UNet | Accuracy of 97.1% |
[30] | Tea | Tea red scab, red leaf spot, and leaf blight | Hand-held digital camera images and UAV images | Generative adversarial networks | Accuracy of 90% |
[33] | Maize | NLB | RGB UAV images | Mask R-CNN model | IoU of 0.73 |
Model | Pseudo-Label IoU (%) | Precision (%) | Recall (%) | F1-Score (%) |
---|---|---|---|---|
MDC | 28.8 | 48.5 | 41.6 | 44.8 |
ACoL | 41.6 | 64.7 | 53.8 | 58.7 |
RCA | 33.0 | 42.3 | 60.0 | 49.6 |
Proposed Model | 43.1 | 71.4 | 52.1 | 60.2 |
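The IoU, precision, recall, and F1 values in this and the following tables are pixel-wise percentages computed between predicted and ground-truth disease masks. As a minimal illustrative sketch (not the authors' evaluation code), assuming binary NumPy masks in which 1 marks NLB-lesion pixels:

```python
import numpy as np

def pixel_metrics(pred: np.ndarray, gt: np.ndarray) -> dict:
    """Pixel-wise IoU, precision, recall, and F1 (in %) for binary masks (1 = disease)."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    tp = np.logical_and(pred, gt).sum()    # true-positive pixels
    fp = np.logical_and(pred, ~gt).sum()   # false-positive pixels
    fn = np.logical_and(~pred, gt).sum()   # false-negative pixels
    eps = 1e-8                             # guards against division by zero
    iou = tp / (tp + fp + fn + eps)
    precision = tp / (tp + fp + eps)
    recall = tp / (tp + fn + eps)
    f1 = 2 * precision * recall / (precision + recall + eps)
    return {k: round(100 * v, 1) for k, v in
            {"IoU": iou, "Precision": precision, "Recall": recall, "F1": f1}.items()}
```

Whether scores are averaged per image or accumulated over all test pixels follows the paper's evaluation protocol (Section 2.2.3); the sketch only illustrates the per-mask definitions behind the reported percentages.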
Model | Running Time (s) |
---|---|
ACoL | 0.02 |
MDC | 1.06 |
RCA | 0.36 |
Proposed Model | 0.08 |
Model | Disease IoU (%) | Precision (%) | Recall (%) | F1-Score (%) |
---|---|---|---|---|
MDC | 34.0 | 43.8 | 60.4 | 50.8 |
ACoL | 45.5 | 81.4 | 50.7 | 62.5 |
RCA | 36.5 | 40.6 | 78.2 | 53.5 |
Proposed Model | 50.8 | 84.5 | 56.1 | 67.4 |
Baseline | ABB | FRM | IoU (%) | Precision (%) | Recall (%) | F1 (%) |
---|---|---|---|---|---|---|
√ | | | 14.2 | 94.8 | 14.3 | 24.9 |
√ | √ | | 25.7 | 89.4 | 26.5 | 40.9 |
√ | | √ | 44.9 | 85.3 | 48.7 | 62.0 |
√ | √ | √ | 50.9 | 84.6 | 56.1 | 67.4 |
Coefficient | IoU (%) | Precision (%) | Recall (%) | F1 (%) |
---|---|---|---|---|
0.3 | 36.6 | 45.2 | 65.9 | 53.6 |
0.5 | 43.1 | 71.4 | 52.1 | 60.2 |
0.7 | 34.3 | 38.1 | 77.4 | 51.1 |
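The coefficient varied above weights the terms of the pseudo-label extraction loss described in Section 2.2.1; the exact formulation is given in the paper and is not reproduced here. Purely as a hedged sketch of such a weighted two-term objective (the names `lambda_coef`, `main_logits`, and `aux_logits` are illustrative placeholders, not the authors' definitions):

```python
import torch
import torch.nn.functional as F

def weighted_two_term_loss(main_logits: torch.Tensor,
                           aux_logits: torch.Tensor,
                           labels: torch.Tensor,
                           lambda_coef: float = 0.5) -> torch.Tensor:
    """Generic convex combination of a main-branch and an auxiliary-branch
    classification loss; lambda_coef plays the role of the coefficient
    swept in the table above (0.3, 0.5, 0.7)."""
    loss_main = F.binary_cross_entropy_with_logits(main_logits, labels.float())
    loss_aux = F.binary_cross_entropy_with_logits(aux_logits, labels.float())
    return lambda_coef * loss_main + (1.0 - lambda_coef) * loss_aux
```

Under this reading, 0.5 (the best-performing setting above) balances the two terms equally; the paper's actual loss terms may differ from this generic stand-in.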
No. of Pseudo-Labels | No. of Ground-Truth Labels | IoU (%) | Precision (%) | Recall (%) | F1 (%) |
---|---|---|---|---|---|
750 | 100 | 52.8 | 86.2 | 57.7 | 69.1 |
700 | 150 | 55.2 | 86.9 | 60.2 | 71.1 |
650 | 200 | 56.4 | 87.4 | 61.4 | 72.1 |
450 | 400 | 64.9 | 87.9 | 71.3 | 78.7 |
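The rows above trade pseudo-labeled images for manually labeled ones within a fixed-size training set. A minimal sketch of assembling such a mixed set, with hypothetical helper and variable names not taken from the paper:

```python
import random

def build_mixed_training_set(pseudo_items, full_items, n_pseudo, n_full, seed=0):
    """Sample n_pseudo pseudo-labeled and n_full manually labeled
    (image, mask) pairs, then shuffle them into one training list."""
    rng = random.Random(seed)
    mixed = rng.sample(pseudo_items, n_pseudo) + rng.sample(full_items, n_full)
    rng.shuffle(mixed)
    return mixed

# e.g., the first row above: 750 pseudo-labels plus 100 ground-truth labels
# train_set = build_mixed_training_set(pseudo_items, full_items, 750, 100)
```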
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content. |
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).