Imaging Science Ph.D. Defense: Fei Zhang
Ph.D. Dissertation Defense
Toward Structural Characterization of Broadacre Crops Using UAS-based 3D Point Clouds
Fei Zhang
Imaging Science Ph.D. Candidate
Chester F. Carlson Center for Imaging Science, RIT
Register Here for Zoom Link
Abstract:
The use of unmanned aerial systems (UAS)-based remote sensing methods in precision agriculture (PA) has developed rapidly in recent years. These technologies are expected to revolutionize crop management by capturing imagery with high spatial, temporal, and spectral resolution, thereby enabling decisions on farm inputs at the sub-field level and on an almost daily basis. In real-world operational applications, however, the potential of UAS-based remote sensing methods has not yet been fully exploited. One of the main research avenues is structural characterization of crops to assess plant density and leaf density (i.e., overall crop health) and, ultimately, crop yield. Using a UAS-based imagery system, we concurrently collected multi-source imagery data. We used structure-from-motion (SfM; photogrammetry) and light detection and ranging (LiDAR) point clouds to observe snap bean fields across two years. We hypothesized that the 3D point clouds represent essential structural information of the crop and that, by extracting various features from the oversampled (dense) 3D data, we could retrieve critical structural characteristics of the crops and eventually relate them to high-level objectives, including disease risk and yield modeling. We further explored the effectiveness of feature-level data fusion between LiDAR point clouds and multispectral imagery, coupled with machine learning algorithms, for yield modeling and disease detection applications. We found that both SfM and LiDAR point clouds achieved similarly high accuracies for assessment of crop height (CH) and row width (RW) (RMSE of ~0.02 m for CH and ~0.05 m for RW). For measuring leaf area index (LAI), the LiDAR-derived models achieved the highest accuracy (R² = 0.61, nRMSE = 19%), while the SfM-derived models exhibited slightly lower values (predicted R² ≈ 0.5, nRMSE ≈ 22%).
We found that the fusion of LiDAR and multispectral imagery (MSI) data yielded good results for prediction of snap bean yield, with an Adj. R² = 0.827 and nRMSE = 9.4%. Finally, for early detection of white mold, a common disease in snap beans, our preliminary results showed that a custom neural network achieved an overall accuracy of 75% in classifying white mold-contaminated plants versus healthy plants, as early as three weeks before harvest. This work demonstrated the potential of 3D point cloud data in PA applications and the performance of a UAS-based remote sensing system in monitoring short broadacre crops, such as snap bean.
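For readers unfamiliar with the accuracy metrics cited above, the following is a minimal sketch of how RMSE, normalized RMSE (nRMSE), and adjusted R² are conventionally computed for a regression model. This is an illustration of the standard definitions only, not code from the dissertation; note that nRMSE is normalized here by the observed mean, while normalization by the observed range is also common and the abstract does not specify which was used.

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root-mean-square error between observations and predictions."""
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def nrmse(y_true, y_pred):
    """RMSE normalized by the observed mean, as a percentage.
    (Normalizing by the observed range is an equally common convention.)"""
    return 100.0 * rmse(y_true, y_pred) / float(np.mean(y_true))

def adjusted_r2(y_true, y_pred, n_features):
    """Coefficient of determination adjusted for the number of
    predictors, which penalizes larger models."""
    ss_res = float(np.sum((y_true - y_pred) ** 2))
    ss_tot = float(np.sum((y_true - np.mean(y_true)) ** 2))
    r2 = 1.0 - ss_res / ss_tot
    n = len(y_true)
    return 1.0 - (1.0 - r2) * (n - 1) / (n - n_features - 1)
```

A yield model evaluated on held-out plots would report, e.g., `adjusted_r2(observed_yield, predicted_yield, n_features)` and `nrmse(observed_yield, predicted_yield)`, matching the Adj. R² and nRMSE figures quoted above.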
Intended Audience:
Undergraduates, graduate students, experts, and anyone with an interest in the topic.
To request an interpreter, please visit https://myaccess.rit.edu
Event Snapshot
Who
This is an RIT Only Event
Interpreter Requested?
No