{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,2,21]],"date-time":"2025-02-21T14:49:24Z","timestamp":1740149364939,"version":"3.37.3"},"reference-count":26,"publisher":"MDPI AG","issue":"20","license":[{"start":{"date-parts":[[2020,10,20]],"date-time":"2020-10-20T00:00:00Z","timestamp":1603152000000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"DOI":"10.13039\/501100000266","name":"Engineering and Physical Sciences Research Council","doi-asserted-by":"publisher","award":["EP\/R02572X\/1"],"id":[{"id":"10.13039\/501100000266","id-type":"DOI","asserted-by":"publisher"}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Sensors"],"abstract":"Autonomous analysis of plants, such as for phenotyping and health monitoring etc., often requires the reliable identification and localization of single leaves, a task complicated by their complex and variable shape. Robotic sensor platforms commonly use depth sensors that rely on either infrared light or ultrasound, in addition to imaging. However, infrared methods have the disadvantage of being affected by the presence of ambient light, and ultrasound methods generally have too wide a field of view, making them ineffective for measuring complex and intricate structures. Alternatives may include stereoscopic or structured light scanners, but these can be costly and overly complex to implement. This article presents a fully computer-vision based solution capable of estimating the three-dimensional location of all leaves of a subject plant with the use of a single digital camera autonomously positioned by a three-axis linear robot. A custom trained neural network was used to classify leaves captured in multiple images taken of a subject plant. Parallax calculations were applied to predict leaf depth, and from this, the three-dimensional position. This article demonstrates proof of concept of the method, and initial tests with positioned leaves suggest an expected error of 20 mm. Future modifications are identified to further improve accuracy and utility across different plant canopies.<\/jats:p>","DOI":"10.3390\/s20205933","type":"journal-article","created":{"date-parts":[[2020,10,21]],"date-time":"2020-10-21T00:50:07Z","timestamp":1603241407000},"page":"5933","source":"Crossref","is-referenced-by-count":0,"title":["Plant Leaf Position Estimation with Computer Vision"],"prefix":"10.3390","volume":"20","author":[{"given":"James","family":"Beadle","sequence":"first","affiliation":[{"name":"Engineering Department, Lancaster University, Lancaster LA1 4YW, UK"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-5247-5193","authenticated-orcid":false,"given":"C. 
James","family":"Taylor","sequence":"additional","affiliation":[{"name":"Engineering Department, Lancaster University, Lancaster LA1 4YW, UK"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-5627-3014","authenticated-orcid":false,"given":"Kirsti","family":"Ashworth","sequence":"additional","affiliation":[{"name":"Lancaster Environment Centre, Lancaster University, Lancaster LA1 4YW, UK"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-1353-0329","authenticated-orcid":false,"given":"David","family":"Cheneler","sequence":"additional","affiliation":[{"name":"Engineering Department, Lancaster University, Lancaster LA1 4YW, UK"}]}],"member":"1968","published-online":{"date-parts":[[2020,10,20]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1186\/s13007-017-0173-7","article-title":"A real-time phenotyping framework using machine learning for plant stress severity rating in soybean","volume":"13","author":"Naik","year":"2017","journal-title":"Plant Methods"},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"187","DOI":"10.1016\/j.molp.2020.01.008","article-title":"Crop Phenomics and High-Throughput Phenotyping: Past Decades, Current Challenges, and Future Perspectives","volume":"13","author":"Yang","year":"2020","journal-title":"Mol. Plant"},{"key":"ref_3","doi-asserted-by":"crossref","unstructured":"Erahaman, M., Echen, D., Egillani, Z., Eklukas, C., and Chen, M. (2015). Advanced phenotyping and phenotype data analysis for the study of plant growth and development. Front. Plant Sci., 6.","DOI":"10.3389\/fpls.2015.00619"},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"179","DOI":"10.1007\/s11633-019-1212-9","article-title":"Electronic Nose and Its Applications: A Survey","volume":"17","author":"Karakaya","year":"2019","journal-title":"Int. J. Autom. Comput."},{"key":"ref_5","doi-asserted-by":"crossref","unstructured":"Webster, J., Shakya, P., Kennedy, E., Caplan, M., Rose, C., and Rosenstein, J.K. (2018, January 17\u201319). TruffleBot: Low-Cost Multi-Parametric Machine Olfaction. Proceedings of the 2018 IEEE Biomedical Circuits and Systems Conference (BioCAS), Cleveland, OH, USA.","DOI":"10.1109\/BIOCAS.2018.8584767"},{"key":"ref_6","doi-asserted-by":"crossref","unstructured":"Burrell, T., Fozard, S., Holroyd, G.H., French, A.P., Pound, M.P., Bigley, C.J., Taylor, C., and Forde, B.G. (2017). The Microphenotron: A robotic miniaturized plant phenotyping platform with diverse applications in chemical biology. Plant Methods, 13.","DOI":"10.1186\/s13007-017-0158-6"},{"key":"ref_7","doi-asserted-by":"crossref","unstructured":"Yalcin, H. (2017, January 7\u201310). Plant phenology recognition using deep learning: Deep-Pheno. Proceedings of the 2017 6th International Conference on Agro-Geoinformatics, Fairfax, VA, USA.","DOI":"10.1109\/Agro-Geoinformatics.2017.8046996"},{"key":"ref_8","doi-asserted-by":"crossref","unstructured":"Namin, S.T., Esmaeilzadeh, M., Najafi, M., Brown, T.B., and Borevitz, J.O. (2018). Deep phenotyping: Deep learning for temporal phenotype\/genotype classification. Plant Methods, 14.","DOI":"10.1186\/s13007-018-0333-4"},{"key":"ref_9","doi-asserted-by":"crossref","unstructured":"Hall, D., McCool, C., Dayoub, F., Sunderhauf, N., and Upcroft, B. (2015, January 5\u20139). Evaluation of Features for Leaf Classification in Challenging Conditions. 
Proceedings of the 2015 IEEE Winter Conference on Applications of Computer Vision, Waikoloa, HI, USA.","DOI":"10.1109\/WACV.2015.111"},{"key":"ref_10","doi-asserted-by":"crossref","unstructured":"Sa, I., Ge, Z., Dayoub, F., Upcroft, B., Perez, T., and McCool, C. (2016). DeepFruits: A Fruit Detection System Using Deep Neural Networks. Sensors, 16.","DOI":"10.3390\/s16081222"},{"key":"ref_11","doi-asserted-by":"crossref","first-page":"6901","DOI":"10.1007\/s13369-018-03695-5","article-title":"Ripeness Classification of Bananas Using an Artificial Neural Network","volume":"44","author":"Mazen","year":"2019","journal-title":"Arab. J. Sci. Eng."},{"key":"ref_12","doi-asserted-by":"crossref","unstructured":"Fuentes, A., Yoon, S., Kim, S.C., and Park, D.S. (2017). A Robust Deep-Learning-Based Detector for Real-Time Tomato Plant Diseases and Pests Recognition. Sensors, 17.","DOI":"10.3390\/s17092022"},{"key":"ref_13","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1155\/2016\/3289801","article-title":"Deep Neural Networks Based Recognition of Plant Diseases by Leaf Image Classification","volume":"2016","author":"Sladojevic","year":"2016","journal-title":"Comput. Intell. Neurosci."},{"key":"ref_14","doi-asserted-by":"crossref","first-page":"45","DOI":"10.26564\/19001355.17","article-title":"Understanding Microsoft Kinect for application in 3-D vision processes","volume":"10","author":"Moreno","year":"2014","journal-title":"Rev. Clepsidra"},{"key":"ref_15","doi-asserted-by":"crossref","unstructured":"El-Laithy, R.A., Huang, J., and Yeh, M. (2012, January 23\u201326). Study on the use of Microsoft Kinect for robotics applications. Proceedings of the Proceedings of the 2012 IEEE\/ION Position, Location and Navigation Symposium, Myrtle Beach, SC, USA.","DOI":"10.1109\/PLANS.2012.6236985"},{"key":"ref_16","doi-asserted-by":"crossref","unstructured":"A Nair, S.K., Joladarashi, S., and Ganesh, N. (2019, January 23\u201325). Evaluation of ultrasonic sensor in robot mapping. Proceedings of the 2019 3rd International Conference on Trends in Electronics and Informatics (ICOEI), Tirunelveli, India.","DOI":"10.1109\/ICOEI.2019.8862659"},{"key":"ref_17","doi-asserted-by":"crossref","unstructured":"Lakovic, N., Brkic, M., Batinic, B., Bajic, J., Rajs, V., and Kulundzic, N. (2019, January 20\u201322). Application of low-cost VL53L0X ToF sensor for robot environment detection. Proceedings of the 2019 18th International Symposium INFOTEH-JAHORINA (INFOTEH), East Sarajevo, Bosnia and Herzegovina.","DOI":"10.1109\/INFOTEH.2019.8717779"},{"key":"ref_18","doi-asserted-by":"crossref","unstructured":"Falkenhagen, L. (1995). Depth Estimation from Stereoscopic Image Pairs Assuming Piecewise Continuos Surfaces. Workshops in Computing, Springer Science and Business Media LLC.","DOI":"10.1007\/978-1-4471-3035-2_9"},{"key":"ref_19","doi-asserted-by":"crossref","unstructured":"Yaguchi, H., Nagahama, K., Hasegawa, T., and Inaba, M. (2016, January 9\u201314). Development of an autonomous tomato harvesting robot with rotational plucking gripper. Proceedings of the 2016 IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Korea.","DOI":"10.1109\/IROS.2016.7759122"},{"key":"ref_20","doi-asserted-by":"crossref","unstructured":"Kaczmarek, A.L. (2015). Improving depth maps of plants by using a set of five cameras. J. Electron. Imaging, 24.","DOI":"10.1117\/1.JEI.24.2.023018"},{"key":"ref_21","unstructured":"Laga, H., and Miklavcic, S.J. (2013, January 1\u20136). 
Curve-based stereo matching for 3D modeling of plants. Proceedings of the 20th International Congress on Modelling and Simulation, Adelaide, SA, Australia."},{"key":"ref_22","first-page":"599","article-title":"Image-based plant modelling","volume":"Volume 25","author":"Quan","year":"2006","journal-title":"ACM SIGGRAPH 2006 Research Posters on SIGGRAPH \u201906"},{"key":"ref_23","doi-asserted-by":"crossref","first-page":"177","DOI":"10.1016\/j.ifacol.2016.10.033","article-title":"Autonomous Leaf Picking Using Deep Learning and Visual-Servoing","volume":"49","author":"Ahlin","year":"2016","journal-title":"IFAC-PapersOnLine"},{"key":"ref_24","unstructured":"Abadi, M., Agarwal, A., Barham, P., Brevdo, E., Chen, Z., Citro, C., Corrado, G.S., Davis, A., Dean, J., and Devin, M. (2016). Tensorflow: Large-scale machine learning on heterogeneous systems. arXiv."},{"key":"ref_25","unstructured":"(2020, October 03). How to Train an Object Detection Classifier for Multiple Objects Using TensorFlow (GPU) on Windows 10. Available online: https:\/\/github.com\/EdjeElectronics\/TensorFlow-Object-Detection-API-Tutorial-Train-Multiple-Objects-Windows-10."},{"key":"ref_26","unstructured":"(2020, October 03). TensorFlow 1 Detection Model Zoo. Available online: https:\/\/github.com\/tensorflow\/models\/blob\/master\/research\/object_detection\/g3doc\/tf1_detection_zoo.md."}],"container-title":["Sensors"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1424-8220\/20\/20\/5933\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2024,7,4]],"date-time":"2024-07-04T11:30:29Z","timestamp":1720092629000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1424-8220\/20\/20\/5933"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2020,10,20]]},"references-count":26,"journal-issue":{"issue":"20","published-online":{"date-parts":[[2020,10]]}},"alternative-id":["s20205933"],"URL":"https:\/\/doi.org\/10.3390\/s20205933","relation":{},"ISSN":["1424-8220"],"issn-type":[{"type":"electronic","value":"1424-8220"}],"subject":[],"published":{"date-parts":[[2020,10,20]]}}}
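
The record above is the standard envelope returned by the Crossref REST API: everything of interest sits under the top-level "message" key. As a minimal sketch of how such a record can be consumed programmatically (assuming only the third-party requests package, which is not part of the record itself), the following fetches this work by its DOI and prints a short citation. All keys accessed below appear in the record above.

import requests

DOI = "10.3390/s20205933"  # DOI taken from the record above

# Crossref returns the same envelope as shown: {"status": ..., "message": {...}}
resp = requests.get(f"https://api.crossref.org/works/{DOI}", timeout=30)
resp.raise_for_status()
work = resp.json()["message"]

# Extract a few bibliographic fields; "title" and "container-title" are lists,
# and "issued" holds a nested date-parts array, e.g. [[2020, 10, 20]].
title = work["title"][0]
year = work["issued"]["date-parts"][0][0]
authors = ", ".join(
    " ".join(filter(None, (a.get("given"), a.get("family"))))
    for a in work.get("author", [])
)

print(f"{authors} ({year}). {title}. "
      f"{work['container-title'][0]} {work['volume']}, {work['page']}.")

Note that Crossref asks regular users of the API to identify themselves (for example via a mailto query parameter or User-Agent header) so requests are routed to the polite pool; that detail is omitted here for brevity.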