Sensors (Basel). 2018 Aug 17;18(8):2711. doi: 10.3390/s18082711.

Improving High-Throughput Phenotyping Using Fusion of Close-Range Hyperspectral Camera and Low-Cost Depth Sensor

Peikui Huang et al. Sensors (Basel). 2018.

Abstract

Hyperspectral sensors, especially close-range hyperspectral cameras, have been widely introduced to detect biological processes of plants in high-throughput phenotyping platforms, supporting the identification of biotic and abiotic stress reactions at an early stage. However, the complex geometry of plants and their interaction with the illumination severely affect the spectral information obtained. Furthermore, plant structure, leaf area, and leaf inclination distribution are critical indexes that have been widely used in multiple plant models. Combining hyperspectral images with 3D point clouds is therefore a promising approach to solving these problems and improving the high-throughput phenotyping technique. We proposed a novel approach fusing a low-cost depth sensor and a close-range hyperspectral camera, which extends the hyperspectral camera with 3D information as a potential tool for high-throughput phenotyping. An exemplary new calibration and analysis method was demonstrated in soybean leaf experiments. The results showed that a 0.99 pixel resolution for the hyperspectral camera and a 3.3 millimeter accuracy for the depth sensor could be achieved in a controlled environment using the method proposed in this paper. We also discussed the new capabilities gained with this method to quantify and model the effects of plant geometry and sensor configuration. The resulting 3D reflectance models can be used to minimize geometry-related effects in hyperspectral images and to significantly improve high-throughput phenotyping. Overall, the results of this research indicated that the proposed method provides more accurate spatial and spectral plant information, which helps to enhance the precision of biological-process measurements in high-throughput phenotyping.
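
The abstract states that 3D reflectance models can minimize geometry-related effects in hyperspectral images. As one concrete illustration (an assumption on our part, not the authors' published procedure), a simple Lambertian cosine correction uses per-pixel surface normals from a fused 3D model to normalize reflectance for the local incidence angle:

```python
import numpy as np

def cosine_correct_reflectance(reflectance, normals, light_dir):
    """Lambertian cosine correction of a hyperspectral cube.

    reflectance: (H, W, B) per-pixel spectra.
    normals:     (H, W, 3) unit surface normals from the fused 3D model.
    light_dir:   (3,) unit vector pointing toward the light source.

    Illustrative sketch only; the paper states that 3D reflectance
    models can reduce geometry effects, and this is one common way
    to apply such a model.
    """
    cos_i = normals @ light_dir            # cosine of the incidence angle, (H, W)
    cos_i = np.clip(cos_i, 1e-3, 1.0)      # avoid blow-up at grazing angles
    return reflectance / cos_i[..., None]  # normalize every band
```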

Keywords: close-range hyperspectral camera; fusion; high-throughput phenotyping; low-cost depth sensor; plant 3D model.


Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1. (a) Middleton Spectral Vision MSV 101 hyperspectral camera. (b) Kinect V2 depth sensor.
Figure 2. (a) Reference gauge with a regular chessboard pattern of 1 cm edge length; red circles mark the selected reference points, numbered 1 to 8. (b) Technical drawing with size specifications.
Figure 3. The structure of the imaging station.
Figure 4. The imaging principle of the pushbroom camera model. The hyperspectral camera has focal length f and camera center C. The image coordinate system is o-uv, while the system 3D coordinate system is O-XYZ. An object point along the X axis is projected to the image coordinate v, while an object point along the Y axis is projected to the image coordinate u.
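
The pushbroom geometry of Figure 4 can be written compactly. The sketch below is a minimal model under assumed parameters the caption does not specify (scan origin and scan step, a distortion-free lens): the along-track coordinate v comes from the scan position in X, and the across-track coordinate u from a perspective projection of Y over depth Z.

```python
import numpy as np

def pushbroom_project(point_xyz, f, p_u, scan_origin, scan_step):
    """Project a 3D point in the system frame O-XYZ to pushbroom
    image coordinates (u, v), following the model of Figure 4.

    Assumptions (not given in the caption): the line camera scans
    along X at a constant step `scan_step` starting at `scan_origin`,
    with focal length f and principal point p_u in pixels.
    """
    X, Y, Z = point_xyz
    v = (X - scan_origin) / scan_step  # along-track: scan line index
    u = f * Y / Z + p_u                # across-track: perspective projection
    return u, v
```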
Figure 5. The fusion principle of the pushbroom camera and depth sensor model. Starting from T, the line camera with focal length f and principal point pv is moved along V. The object point x is projected to the image point x', with coordinates (u, v). The coordinate systems of the base platform, hyperspectral camera, and Kinect V2 are O-X1Y1Z1, H-X2Y2Z2, and K-X3Y3Z3, respectively, and are aligned with each other.
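
A minimal sketch of the Figure 5 fusion step, assuming a calibrated rigid transform (R, t) from the Kinect frame K-X3Y3Z3 to the hyperspectral camera frame H-X2Y2Z2 and reusing the pushbroom projection sketched above; this is illustrative, not the authors' exact formulation.

```python
import numpy as np

def fuse_depth_to_hyperspectral(points_k, R, t, f, p_u, scan_origin, scan_step):
    """Map Kinect V2 points into hyperspectral image coordinates.

    points_k: (N, 3) points in the Kinect frame K-X3Y3Z3.
    R, t:     assumed rigid transform into the camera frame H-X2Y2Z2,
              obtained from calibration.
    Returns an (N, 2) array of (u, v) pushbroom coordinates.
    """
    points_h = points_k @ R.T + t                   # Kinect -> camera frame
    u = f * points_h[:, 1] / points_h[:, 2] + p_u   # across-track
    v = (points_h[:, 0] - scan_origin) / scan_step  # along-track
    return np.stack([u, v], axis=1)
```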
Figure 6. (a) The 3D coordinates of the selected reference points from the Kinect V2 3D point cloud. (b) The hyperspectral image coordinates of the selected reference points, obtained automatically using the corner detection method (red marks).
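
Figure 6 pairs the eight 3D reference points from the Kinect with their detected corners in the hyperspectral image. One simple way to fit a mapping between the two (a stand-in assumption; the paper's calibration is more specific) is a 2x4 affine camera solved by linear least squares, with the RMS reprojection error as the kind of quality figure behind the reported 0.99 pixel resolution.

```python
import numpy as np

def fit_affine_camera(ref_xyz, ref_uv):
    """Least-squares fit of an affine camera A so that
    [u, v] ~= A @ [X, Y, Z, 1].

    ref_xyz: (N, 3) Kinect coordinates of the reference points.
    ref_uv:  (N, 2) matching hyperspectral image coordinates.
    An affine model is a common first approximation for pushbroom
    geometry at near-constant depth (assumption, not the authors'
    exact method).
    """
    homog = np.hstack([ref_xyz, np.ones((len(ref_xyz), 1))])  # (N, 4)
    A, *_ = np.linalg.lstsq(homog, ref_uv, rcond=None)        # (4, 2)
    return A.T                                                # (2, 4)

def rms_reprojection_error(A, ref_xyz, ref_uv):
    """Root-mean-square pixel error of the fitted mapping."""
    homog = np.hstack([ref_xyz, np.ones((len(ref_xyz), 1))])
    residual = homog @ A.T - ref_uv
    return float(np.sqrt(np.mean(np.sum(residual**2, axis=1))))
```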
Figure 7. The sample soybean leaf attached to the base platform, covered by a sheet of black cloth.
Figure 8. The segmented 3D point cloud of the soybean leaf from the Kinect V2.
Figure 9. Using the fusion model proposed in this paper to map Kinect V2 depth information into hyperspectral image coordinates.
Figure 10. Soybean leaf hyperspectral 3D model, based on the fusion model proposed in this paper.
