{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2024,10,8]],"date-time":"2024-10-08T04:14:13Z","timestamp":1728360853457},"reference-count":41,"publisher":"MDPI AG","issue":"19","license":[{"start":{"date-parts":[[2024,10,7]],"date-time":"2024-10-07T00:00:00Z","timestamp":1728259200000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"name":"NRT EmPowerment Fellowship and Ohio Agricultural Research and Development Center","award":["2022-017"]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Sensors"],"abstract":"Plant counting is a critical aspect of crop management, providing farmers with valuable insights into seed germination success and within-field variation in crop population density, both of which are key indicators of crop yield and quality. Recent advancements in Unmanned Aerial System (UAS) technology, coupled with deep learning techniques, have facilitated the development of automated plant counting methods. Various computer vision models based on UAS images are available for detecting and classifying crop plants. However, their accuracy relies largely on the availability of substantial manually labeled training datasets. The objective of this study was to develop a robust corn counting model by developing and integrating an automatic image annotation framework. This study used high-spatial-resolution images collected with a DJI Mavic Pro 2 at the V2\u2013V4 growth stage of corn plants from a field in Wooster, Ohio. The automated image annotation process involved extracting corn rows and applying image enhancement techniques to automatically annotate images as either corn or non-corn, resulting in 80% accuracy in identifying corn plants. The accuracy of corn stand identification was further improved by training four deep learning (DL) models, including InceptionV3, VGG16, VGG19, and Vision Transformer (ViT), with annotated images across various datasets. Notably, VGG16 outperformed the other three models, achieving an F1 score of 0.955. When the corn counts were compared to ground truth data across five test regions, VGG achieved an R2 of 0.94 and an RMSE of 9.95. The integration of an automated image annotation process into the training of the DL models provided notable benefits in terms of model scaling and consistency. 
The developed framework can efficiently manage large-scale data generation, streamlining the process for the rapid development and deployment of corn counting DL models.<\/jats:p>","DOI":"10.3390\/s24196467","type":"journal-article","created":{"date-parts":[[2024,10,7]],"date-time":"2024-10-07T11:30:18Z","timestamp":1728300618000},"page":"6467","source":"Crossref","is-referenced-by-count":0,"title":["Integrating Automated Labeling Framework for Enhancing Deep Learning Models to Count Corn Plants Using UAS Imagery"],"prefix":"10.3390","volume":"24","author":[{"given":"Sushma","family":"Katari","sequence":"first","affiliation":[{"name":"Department of Food, Agricultural, and Biological Engineering, Ohio State University, 590 Woody Hayes Dr, Columbus, OH 43210, USA"}]},{"given":"Sandeep","family":"Venkatesh","sequence":"additional","affiliation":[{"name":"Google, Kirkland, WA 98033, USA"}]},{"given":"Christopher","family":"Stewart","sequence":"additional","affiliation":[{"name":"Department of Computer Science and Engineering, Ohio State University, 590 Woody Hayes Dr, Columbus, OH 43210, USA"}]},{"ORCID":"http:\/\/orcid.org\/0000-0003-3875-4054","authenticated-orcid":false,"given":"Sami","family":"Khanal","sequence":"additional","affiliation":[{"name":"Department of Food, Agricultural, and Biological Engineering, Ohio State University, 590 Woody Hayes Dr, Columbus, OH 43210, USA"}]}],"member":"1968","published-online":{"date-parts":[[2024,10,7]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"1611","DOI":"10.13031\/2013.27676","article-title":"Soil Management Effects on Planting and Emergence of No-till Corn","volume":"39","author":"Perfect","year":"1996","journal-title":"Trans. ASAE"},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"399","DOI":"10.13031\/aea.12408","article-title":"Corn Emergence and Yield Response to Row-Unit Depth and Downforce for Varying Field Conditions","volume":"35","author":"Poncet","year":"2019","journal-title":"Appl. Eng. Agric."},{"key":"ref_3","first-page":"300","article-title":"Effects of Seed Priming, Planting Density and Row Spacing on Seedling Emergence and Some Phenological Indices of Corn (Zea mays L.)","volume":"97","author":"Mohammadi","year":"2014","journal-title":"Philipp. Agric. Sci."},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"480","DOI":"10.1080\/01904167.2012.639926","article-title":"Effect of Delayed Emergence on Corn Grain Yields","volume":"35","author":"Lawles","year":"2012","journal-title":"J. Plant Nutr."},{"key":"ref_5","first-page":"366","article-title":"Plant Density and Tillage Effects on Forage Corn Quality","volume":"10","author":"Baghdadi","year":"2012","journal-title":"J. Food Agric. Environ."},{"key":"ref_6","doi-asserted-by":"crossref","unstructured":"Yang, T., Zhu, S., Zhang, W., Zhao, Y., Song, X., Yang, G., Yao, Z., Wu, W., Liu, T., and Sun, C. (2024). Unmanned Aerial Vehicle-Scale Weed Segmentation Method Based on Image Analysis Technology for Enhanced Accuracy of Maize Seedling Counting. Agriculture, 14.","DOI":"10.3390\/agriculture14020175"},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"685","DOI":"10.1007\/s10980-020-01180-9","article-title":"Drones Provide Spatial and Volumetric Data to Deliver New Insights into Microclimate Modelling","volume":"36","author":"Duffy","year":"2021","journal-title":"Landsc. 
Ecol."},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"107064","DOI":"10.1016\/j.compag.2022.107064","article-title":"A Review of Unmanned Aerial Vehicle-Based Methods for Plant Stand Count Evaluation in Row Crops","volume":"198","author":"Pathak","year":"2022","journal-title":"Comput. Electron. Agric."},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"106273","DOI":"10.1016\/j.compag.2021.106273","article-title":"Computer Vision-Based Citrus Tree Detection in a Cultivated Environment Using UAV Imagery","volume":"187","author":"Donmez","year":"2021","journal-title":"Comput. Electron. Agric."},{"key":"ref_10","doi-asserted-by":"crossref","unstructured":"Xia, L., Zhang, R., Chen, L., Huang, Y., Xu, G., Wen, Y., and Yi, T. (2019). Monitor Cotton Budding Using SVM and UAV Images. Appl. Sci., 9.","DOI":"10.3390\/app9204312"},{"key":"ref_11","doi-asserted-by":"crossref","unstructured":"Banerjee, B.P., Sharma, V., Spangenberg, G., and Kant, S. (2021). Machine Learning Regression Analysis for Estimation of Crop Emergence Using Multispectral UAV Imagery. Remote Sens., 13.","DOI":"10.3390\/rs13152918"},{"key":"ref_12","doi-asserted-by":"crossref","unstructured":"Tavus, M.R., Eker, M.E., Senyer, N., and Karabulut, B. Plant Counting By Using k-NN Classification on UAVs Images. Proceedings of the 2015 23rd Signal Processing and Communications Applications Conference (SIU).","DOI":"10.1109\/SIU.2015.7130015"},{"key":"ref_13","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1016\/j.isprsjprs.2021.01.024","article-title":"A CNN Approach to Simultaneously Count Plants and Detect Plantation-Rows from UAV Imagery","volume":"174","author":"Osco","year":"2021","journal-title":"ISPRS J. Photogramm. Remote Sens."},{"key":"ref_14","doi-asserted-by":"crossref","unstructured":"Zhang, J., Zhao, B., Yang, C., Shi, Y., Liao, Q., Zhou, G., Wang, C., Xie, T., Jiang, Z., and Zhang, D. (2020). Rapeseed Stand Count Estimation at Leaf Development Stages With UAV Imagery and Convolutional Neural Networks. Front. Plant Sci., 11.","DOI":"10.3389\/fpls.2020.00617"},{"key":"ref_15","doi-asserted-by":"crossref","first-page":"280","DOI":"10.1016\/j.isprsjprs.2020.09.025","article-title":"Identifying and Mapping Individual Plants in a Highly Diverse High-Elevation Ecosystem Using UAV Imagery and Deep Learning","volume":"169","author":"Zhang","year":"2020","journal-title":"ISPRS J. Photogramm. Remote Sens."},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"106214","DOI":"10.1016\/j.compag.2021.106214","article-title":"Early Corn Stand Count of Different Cropping Systems Using UAV-Imagery and Deep Learning","volume":"186","author":"Vong","year":"2021","journal-title":"Comput. Electron. Agric."},{"key":"ref_17","doi-asserted-by":"crossref","unstructured":"Wang, L., Xiang, L., Tang, L., and Jiang, H. (2021). A Convolutional Neural Network-Based Method for Corn Stand Counting in the Field. Sensors, 21.","DOI":"10.3390\/s21020507"},{"key":"ref_18","doi-asserted-by":"crossref","unstructured":"Machefer, M., Lemarchand, F., Bonnefond, V., Hitchins, A., and Sidiropoulos, P. (2020). Mask R-CNN Refitting Strategy for Plant Counting and Sizing in UAV Imagery. Remote Sens., 12.","DOI":"10.3390\/rs12183015"},{"key":"ref_19","doi-asserted-by":"crossref","unstructured":"Lu, H., and Cao, Z. (2020). TasselNetV2+: A Fast Implementation for High-Throughput Plant Counting From High-Resolution RGB Imagery. Front. 
Plant Sci., 11.","DOI":"10.3389\/fpls.2020.541960"},{"key":"ref_20","doi-asserted-by":"crossref","unstructured":"Li, H., Wang, P., and Huang, C. (2022). Comparison of Deep Learning Methods for Detecting and Counting Sorghum Heads in UAV Imagery. Remote Sens., 14.","DOI":"10.3390\/rs14133143"},{"key":"ref_21","unstructured":"Robey, A., Hassani, H., and Pappas, G.J. (2020). Model-Based Robust Deep Learning: Generalizing to Natural, Out-of-Distribution Data. arXiv."},{"key":"ref_22","doi-asserted-by":"crossref","unstructured":"Wang, Y., and Bansal, M. (2018). Robust Machine Comprehension Models via Adversarial Training. arXiv.","DOI":"10.18653\/v1\/N18-2091"},{"key":"ref_23","doi-asserted-by":"crossref","unstructured":"Alnaasan, N., Lieber, M., Shafi, A., Subramoni, H., Shearer, S., and Panda, D.K. (2023, January 15\u201318). HARVEST: High-Performance Artificial Vision Framework for Expert Labeling Using Semi-Supervised Training. Proceedings of the 2023 IEEE International Conference on Big Data (BigData), Sorrento, Italy.","DOI":"10.1109\/BigData59044.2023.10386339"},{"key":"ref_24","doi-asserted-by":"crossref","first-page":"1503","DOI":"10.1007\/s11760-022-02359-0","article-title":"Unsupervised Adversarial Domain Adaptation Leaf Counting with Bayesian Loss Density Estimation","volume":"17","author":"Mei","year":"2023","journal-title":"Signal Image Video Process."},{"key":"ref_25","doi-asserted-by":"crossref","unstructured":"Rodriguez-Vazquez, J., Fernandez-Cortizas, M., Perez-Saura, D., Molina, M., and Campoy, P. (2023). Overcoming Domain Shift in Neural Networks for Accurate Plant Counting in Aerial Images. Remote Sens., 15.","DOI":"10.20944\/preprints202302.0070.v1"},{"key":"ref_26","doi-asserted-by":"crossref","unstructured":"Shi, M., Li, X.-Y., Lu, H., and Cao, Z.-G. (2022). Background-Aware Domain Adaptation for Plant Counting. Front. Plant Sci., 13.","DOI":"10.3389\/fpls.2022.731816"},{"key":"ref_27","doi-asserted-by":"crossref","first-page":"1720","DOI":"10.1007\/s11119-022-09907-1","article-title":"A Fast and Robust Method for Plant Count in Sunflower and Maize at Different Seedling Stages Using High-Resolution UAV RGB Imagery","volume":"23","author":"Bai","year":"2022","journal-title":"Precis. Agric."},{"key":"ref_28","doi-asserted-by":"crossref","unstructured":"Wu, J., Yang, G., Yang, X., Xu, B., Han, L., and Zhu, Y. (2019). Automatic Counting of in Situ Rice Seedlings from UAV Images Based on a Deep Fully Convolutional Neural Network. Remote Sens., 11.","DOI":"10.3390\/rs11060691"},{"key":"ref_29","unstructured":"(2024, September 05). Corn Growth and Development: Crop Staging|Agronomic Crops Network. Available online: https:\/\/agcrops.osu.edu\/newsletter\/corn-newsletter\/2022-18\/corn-growth-and-development-crop-staging."},{"key":"ref_30","unstructured":"Simonyan, K., and Zisserman, A. (2015, January 7\u20139). Very Deep Convolutional Networks for Large-Scale Image Recognition. Proceedings of the 3rd International Conference on Learning Representations (ICLR 2015), San Diego, CA, USA."},{"key":"ref_31","doi-asserted-by":"crossref","unstructured":"He, K., Zhang, X., Ren, S., and Sun, J. (2015, January 7\u201312). Deep Residual Learning for Image Recognition. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA.","DOI":"10.1109\/CVPR.2016.90"},{"key":"ref_32","doi-asserted-by":"crossref","unstructured":"Szegedy, C., Liu, W., Jia, Y., Sermanet, P., Reed, S., Anguelov, D., Erhan, D., Vanhoucke, V., and Rabinovich, A. 
(2015, January 7\u201312). Going Deeper with Convolutions. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA.","DOI":"10.1109\/CVPR.2015.7298594"},{"key":"ref_33","unstructured":"Dosovitskiy, A., Beyer, L., Kolesnikov, A., Weissenborn, D., Zhai, X., Unterthiner, T., Dehghani, M., Minderer, M., Heigold, G., and Gelly, S. (2021). An Image Is Worth 16 \u00d7 16 Words: Transformers for Image Recognition at Scale. arXiv."},{"key":"ref_34","doi-asserted-by":"crossref","first-page":"27","DOI":"10.1186\/s40537-019-0192-5","article-title":"Survey on Deep Learning with Class Imbalance","volume":"6","author":"Johnson","year":"2019","journal-title":"J. Big Data"},{"key":"ref_35","doi-asserted-by":"crossref","unstructured":"Ahmed, W., and Karim, A. (2020, January 16\u201318). The Impact of Filter Size and Number of Filters on Classification Accuracy in CNN. Proceedings of the 2020 International Conference on Computer Science and Software Engineering (CSASE), Duhok, Iraq.","DOI":"10.1109\/CSASE48920.2020.9142089"},{"key":"ref_36","doi-asserted-by":"crossref","unstructured":"Chen, P., Ma, X., Wang, F., and Li, J. (2021). A New Method for Crop Row Detection Using Unmanned Aerial Vehicle Images. Remote Sens., 13.","DOI":"10.3390\/rs13173526"},{"key":"ref_37","first-page":"1079203","article-title":"Automatic Palm Trees Detection from Multispectral UAV Data Using Normalized Difference Vegetation Index and Circular Hough Transform","volume":"Volume 10792","author":"Huang","year":"2018","journal-title":"Proceedings of the High-Performance Computing in Geoscience and Remote Sensing Viii"},{"key":"ref_38","doi-asserted-by":"crossref","first-page":"13","DOI":"10.1080\/08839514.2020.1831226","article-title":"Automatic Detection of Oil Palm Tree from UAV Images Based on the Deep Learning Method","volume":"35","author":"Liu","year":"2021","journal-title":"Appl. Artif. Intell."},{"key":"ref_39","doi-asserted-by":"crossref","unstructured":"Feng, Y., Chen, W., Ma, Y., Zhang, Z., Gao, P., and Lv, X. (2023). Cotton Seedling Detection and Counting Based on UAV Multispectral Images and Deep Learning Methods. Remote Sens., 15.","DOI":"10.3390\/rs15102680"},{"key":"ref_40","doi-asserted-by":"crossref","first-page":"64","DOI":"10.1186\/s13007-019-0449-1","article-title":"Estimation of Crop Plant Density at Early Mixed Growth Stages Using UAV Imagery","volume":"15","author":"Koh","year":"2019","journal-title":"Plant Methods"},{"key":"ref_41","unstructured":"Poncet, A., Fulton, J., Port, K., McDonald, T., and Pate, G. (2024, September 19). Optimizing Field Traffic Patterns to Improve Machinery Efficiency: Path Planning Using Guidance Lines. 
Available online: https:\/\/ohioline.osu.edu\/factsheet\/fabe-5531."}],"container-title":["Sensors"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1424-8220\/24\/19\/6467\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2024,10,7]],"date-time":"2024-10-07T11:52:00Z","timestamp":1728301920000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1424-8220\/24\/19\/6467"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2024,10,7]]},"references-count":41,"journal-issue":{"issue":"19","published-online":{"date-parts":[[2024,10]]}},"alternative-id":["s24196467"],"URL":"https:\/\/doi.org\/10.3390\/s24196467","relation":{},"ISSN":["1424-8220"],"issn-type":[{"value":"1424-8220","type":"electronic"}],"subject":[],"published":{"date-parts":[[2024,10,7]]}}}
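
The abstract above describes training pretrained CNNs (VGG16 among them) as corn / non-corn classifiers on automatically annotated UAS image patches. The sketch below is a minimal, hypothetical illustration of that step in TensorFlow/Keras, not the authors' implementation: the directory layout (annotated_patches/train, annotated_patches/val with corn and non_corn subfolders), patch size, and training settings are assumptions introduced here for demonstration.

```python
# Hypothetical sketch (not the authors' code): fine-tune an ImageNet-pretrained
# VGG16 backbone as a binary corn / non-corn classifier on annotated patches.
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications.vgg16 import VGG16, preprocess_input

IMG_SIZE = (224, 224)  # assumed patch size

# Assumed directory layout: annotated_patches/{train,val}/{corn,non_corn}/
train_ds = tf.keras.utils.image_dataset_from_directory(
    "annotated_patches/train", image_size=IMG_SIZE, batch_size=32, label_mode="binary")
val_ds = tf.keras.utils.image_dataset_from_directory(
    "annotated_patches/val", image_size=IMG_SIZE, batch_size=32, label_mode="binary")

# Apply VGG16's expected preprocessing (BGR conversion, mean subtraction).
train_ds = train_ds.map(lambda x, y: (preprocess_input(x), y))
val_ds = val_ds.map(lambda x, y: (preprocess_input(x), y))

# Freeze the convolutional backbone; train only a small classification head.
base = VGG16(weights="imagenet", include_top=False, input_shape=IMG_SIZE + (3,))
base.trainable = False

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(1, activation="sigmoid"),  # probability that the patch contains a corn plant
])

model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(train_ds, validation_data=val_ds, epochs=10)
```

Patch-level predictions from such a classifier could then be aggregated per row or per test region and compared against ground-truth stand counts, which is the kind of comparison the abstract reports as R2 and RMSE.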