Abstract
Image-based plant phenotyping has received considerable attention with advances in computer vision technologies. In recent years, deep neural networks (DNNs) have been widely used for segmentation and detection tasks. However, most DNN-based methods require substantial computational resources, making them unsuitable for real-time decision-making. This study presents a real-time plant phenotyping system that counts leaves and tracks the growth of individual leaves. For leaf localization and counting, a Tiny-YOLOv4 network is used, which offers fast processing and is easily deployable on low-end hardware. Leaf growth is tracked by applying active contour segmentation to the leaves localized by the Tiny-YOLOv4 network. The proposed system is implemented for top-view RGB images of Arabidopsis thaliana plants, and its leaf-counting performance is evaluated against Tiny-YOLOv3 and Faster R-CNN using the difference in count (DiC), accuracy, and F1-score measures. The model achieves an accuracy of 90%, an absolute DiC of 0.42, an F1-score of 96%, and an inference time of 15 ms. Furthermore, the segmentation accuracy measured by the Dice and Jaccard scores is 0.91 and 0.86, respectively, with a computing time of 0.96 s. These results demonstrate the effectiveness of the proposed system for real-time plant phenotyping.
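The two-stage pipeline described above (Tiny-YOLOv4 detection for leaf localization and counting, followed by active-contour segmentation of each localized leaf) can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the config/weights file names, thresholds, and iteration count are placeholders, OpenCV's DNN module is only one possible way to run Darknet Tiny-YOLOv4 weights, and scikit-image's morphological Chan-Vese is used here as a stand-in for the active-contour step.

```python
# Hedged sketch: Tiny-YOLOv4 leaf detection + per-leaf active-contour segmentation.
# File names, thresholds, and iteration counts are illustrative, not from the paper.
import cv2
import numpy as np
from skimage.segmentation import morphological_chan_vese

# Load a Darknet Tiny-YOLOv4 model assumed to be trained on a single "leaf" class
# (the file names below are hypothetical placeholders).
net = cv2.dnn.readNetFromDarknet("tiny-yolov4-leaf.cfg", "tiny-yolov4-leaf.weights")
model = cv2.dnn_DetectionModel(net)
model.setInputParams(size=(416, 416), scale=1 / 255.0, swapRB=True)


def count_and_segment(image, conf_thr=0.4, nms_thr=0.4):
    """Return the leaf count and a list of per-leaf binary masks for a BGR image."""
    class_ids, scores, boxes = model.detect(image, confThreshold=conf_thr, nmsThreshold=nms_thr)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY).astype(float) / 255.0
    masks = []
    for (x, y, w, h) in boxes:
        crop = gray[y:y + h, x:x + w]
        if crop.size == 0:
            continue
        # Active-contour (morphological Chan-Vese) segmentation inside the detected box;
        # 35 iterations is an arbitrary illustrative choice, and a real system would
        # pick whichever of the two level-set regions corresponds to the leaf.
        leaf = morphological_chan_vese(crop, 35, init_level_set="checkerboard")
        full = np.zeros(gray.shape, dtype=np.uint8)
        full[y:y + h, x:x + w] = leaf.astype(np.uint8)
        masks.append(full)
    # Leaf count is simply the number of detected boxes; growth can then be tracked
    # as the change in each mask's pixel area across frames.
    return len(boxes), masks
```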
Acknowledgements
This work is supported by a grant from DST, Govt. of India, for the Technology Innovation Hub at IIT Ropar in the framework of the National Mission on Interdisciplinary Cyber-Physical Systems.
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Jain, S., Mahapatra, D., Saini, M. (2023). Real-Time Image Based Plant Phenotyping Using Tiny-YOLOv4. In: Zaynidinov, H., Singh, M., Tiwary, U.S., Singh, D. (eds) Intelligent Human Computer Interaction. IHCI 2022. Lecture Notes in Computer Science, vol 13741. Springer, Cham. https://doi.org/10.1007/978-3-031-27199-1_28
DOI: https://doi.org/10.1007/978-3-031-27199-1_28
Published:
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-27198-4
Online ISBN: 978-3-031-27199-1
eBook Packages: Computer Science, Computer Science (R0)