Article

Estimation of Millet Aboveground Biomass Utilizing Multi-Source UAV Image Feature Fusion

1 College of Agriculture, Shanxi Agricultural University, Taigu, Jinzhong 030801, China
2 State Key Laboratory of Sustainable Dryland Agriculture (In Preparation), Shanxi Agricultural University, Taiyuan 030031, China
3 Department of Basic Sciences, Shanxi Agricultural University, Taigu, Jinzhong 030801, China
* Authors to whom correspondence should be addressed.
Agronomy 2024, 14(4), 701; https://doi.org/10.3390/agronomy14040701
Submission received: 20 February 2024 / Revised: 22 March 2024 / Accepted: 26 March 2024 / Published: 28 March 2024
(This article belongs to the Section Precision and Digital Agriculture)

Abstract

Aboveground biomass (AGB) is a key parameter reflecting crop growth which plays a vital role in agricultural management and ecosystem assessment. Real-time and non-destructive biomass monitoring is essential for accurate field management and crop yield prediction. This study utilizes a multi-sensor-equipped unmanned aerial vehicle (UAV) to collect remote sensing data during critical growth stages of millet, including spectral, textural, thermal, and point cloud information. The use of RGB point cloud data facilitated plant height extraction, enabling subsequent analysis to discern correlations between spectral parameters, textural indices, canopy temperatures, plant height, and biomass. Multiple linear regression (MLR), support vector machine (SVM), and random forest (RF) models were constructed to evaluate the capability of different features and integrated multi-source features in estimating the AGB. Findings demonstrated a strong correlation between the plant height derived from point cloud data and the directly measured plant height, with the most accurate estimation of millet plant height achieving an R2 of 0.873 and RMSE of 7.511 cm. Spectral parameters, canopy temperature, and plant height showed a high correlation with the AGB, and the correlation with the AGB was significantly improved after texture features were linearly transformed. Among single-factor features, the RF model based on textural indices showcased the highest accuracy in estimating the AGB (R2 = 0.698, RMSE = 0.323 kg m−2, and RPD = 1.821). When integrating two features, the RF model incorporating textural indices and canopy temperature data demonstrated optimal performance (R2 = 0.801, RMSE = 0.253 kg m−2, and RPD = 2.244). When the three features were fused, the RF model constructed by fusing spectral parameters, texture indices, and canopy temperature data was the best (R2 = 0.869, RMSE = 0.217 kg m−2, and RPD = 2.766). The RF model based on spectral parameters, texture indices, canopy temperature, and plant height had the highest accuracy (R2 = 0.877, RMSE = 0.207 kg m−2, and RPD = 2.847). In this study, the complementary and synergistic effects of multi-source remote sensing data were leveraged to enhance the accuracy and stability of the biomass estimation model.

1. Introduction

Millet is among the most significant grain crops in northern China [1], known for its high nutritional value. Understanding millet’s growth conditions and implementing timely, rational field management can enhance both yield and quality. Aboveground biomass (AGB) [2] is a critical metric that reflects crop growth [3] and significantly influences the eventual formation of crop yield [4,5]. Conventional AGB monitoring relies primarily on field sampling and laboratory measurements, which are time-consuming, labor-intensive, expensive, and destructive. Thus, rapid and accurate AGB monitoring is essential for acquiring real-time growth dynamics and yield progress in millet.
The swift advancement of remote sensing technology has enabled non-destructive access to real-time crop growth conditions across vast areas, and remote sensing monitoring across diverse platforms has shown promising initial outcomes. Satellite remote sensing, adept at swift crop monitoring over large areas, faces limitations related to resolution and cloud cover [6]. Data obtained from ground platforms offer insight only into localized crop growth and are suitable for small-scale monitoring; however, constraints of cost and coverage pose challenges for large-scale monitoring. The unmanned aerial vehicle (UAV) remote sensing platform offers numerous advantages, including cost-effectiveness, structural simplicity, high mobility, and superior spatial and temporal resolution. These advantages compensate for the constraints of ground-based and satellite platforms, enabling swift and efficient non-destructive crop monitoring [7,8]. With an expanding variety of sensor types, the UAV platform can combine different sensors to obtain crucial information such as spectra, textures, temperatures, and point clouds [9] and to develop models for monitoring crop growth parameters and estimating yields [10,11].
Several studies have been conducted to monitor crop AGB using UAV platforms. For example, Lu [12] and Gnyp [13] constructed models to estimate the dry matter of the AGB and of wheat leaves, respectively, by utilizing spectral information, achieving high model accuracy. However, relying solely on spectral information to estimate crop traits has limitations, especially when evaluating the AGB of high-density or environmentally susceptible crops [14]. To address these limitations, researchers have integrated texture information, temperature information, and plant height to optimize AGB remote sensing monitoring methods [15]. For example, Zheng et al. [16] developed a model for monitoring rice AGB by combining texture and vegetation indices, significantly improving the estimation accuracy. Similarly, Yue et al. [17] estimated winter wheat AGB using spectral and texture information from UAV imagery, achieving an R2 of 0.89. Other scholars have also used plant height, canopy temperature, and related factors to monitor crop AGB. Jiang et al. [18] improved the accuracy of rice AGB estimation by combining spectral, plant height, and meteorological temperature features, achieving an R2 of 0.86. Bendig et al. [19] captured high-resolution digital images of the barley canopy with a UAV and estimated the aboveground fresh and dry biomass of barley from plant height, achieving R2 values of 0.81 and 0.82, respectively. Yue et al. [20] improved the accuracy of a winter wheat AGB model by combining spectral parameters and crop height. Texture information, temperature information, and plant height are therefore important factors affecting the accuracy of crop AGB monitoring. The information content of a single UAV image is limited, and exclusive reliance on spectral parameters has inherent limitations, complicating the concurrent consideration of spectral, temporal, and spatial resolutions. This may lead to phenomena such as “same spectrum, different objects” or “same object, different spectra”. Therefore, the AGB should be assessed across multiple data dimensions, incorporating the crop canopy’s structural features, canopy temperature, spectral information, point cloud data, and texture characteristics. Integrating remote sensing data from various sources can enhance the accuracy of AGB monitoring. However, different remote sensing features may respond differently to the AGB, necessitating further research into AGB monitoring models based on multi-source remote sensing data.
This research targets millet across different locations and varieties, leveraging a UAV remote sensing platform equipped with a RedEdge-MX multispectral imaging system and a Zenmuse XT2 gimbal camera. It gathers multi-source remote sensing data on the millet canopy, including spectral, texture, temperature, and structural information, at the field scale. By integrating ground-measured millet AGB, the study identifies image features sensitive to millet AGB as predictors. Employing multiple linear regression, support vector regression, and random forest regression algorithms, the research aims to achieve non-destructive, efficient, and precise monitoring of millet AGB.
This study has four main objectives: (1) to evaluate the capability of estimating millet plant height using UAV-obtained RGB image point clouds; (2) to investigate the impact of different remote sensing features on AGB estimation; (3) to analyze the effects of different feature combinations on millet AGB estimation; (4) to assess the potential of integrating multi-source remote sensing data with machine-learning algorithms for estimating millet AGB.

2. Materials and Methods

2.1. Experimental Design

The experiment took place in the Taigu District (37°25′ N, 112°29′ E), Jinzhong City, Shanxi Province, during 2022 and 2023 (Figure 1). The region has a temperate continental monsoon climate, with an average annual temperature of 6–10 °C and an elevation of approximately 780 m. The average annual precipitation ranges between 410 and 450 mm.
Experiment 1: The study was conducted at the Wujiapu experimental field (Figure 1a). A split-plot design was adopted, with cultivar as the main plot and nitrogen treatment as the subplot. The tested cultivars were Changsheng 13 and Qinzhouhuang. Four nitrogen levels were established (113, 149, 185, and 221 kg ha−1). At sowing, the row spacing was set at 0.4 m, and a base fertilizer of N (113 kg ha−1), P2O5 (68 kg ha−1), and K2O (58 kg ha−1) was applied, followed by topdressing at the jointing stage (0, 36, 72, and 108 kg ha−1), with 3 replicates of 24 m2 per plot, totaling 24 plots.
Experiment 2: At the Shenfeng experimental field (Figure 1b), a similar split-plot design was used, with variety as the main plot and nitrogen treatment as the subplot. The tested varieties were Jingu 21 and Jingu 28. Four nitrogen levels were established (113, 149, 185, and 221 kg ha−1). At sowing, the row spacing was set at 0.4 m, and a base fertilizer of N (113 kg ha−1), P2O5 (68 kg ha−1), and K2O (58 kg ha−1) was applied, followed by topdressing at the jointing stage (0, 36, 72, and 108 kg ha−1), with 3 replicates of 24 m2 per plot, totaling 24 plots.
Experiment 3: At the Taoyuanbao experimental field (Figure 1c), a split-plot design was also adopted, with variety as the main plot and nitrogen treatment as the subplot. The tested varieties were Changsheng 13, Jingu 21, Zhangzagu 10, Zhangzagu 13, Jingu 56, and Changnong 48. Four nitrogen fertilizer levels (0, 120, 180, and 240 kg ha−1) were applied as base fertilizer. At sowing, the row spacing was set at 0.4 m, with additional base fertilizers of P2O5 (45 kg ha−1) and K2O (45 kg ha−1), using three replicates with a plot area of 20 m2, totaling 72 plots. Fertilization and irrigation followed local high-standard farmland management practices.

2.2. Data Acquisition

2.2.1. Ground Data Acquisition

Plant height and AGB were measured in areas exhibiting high vegetation coverage and uniform growth during the four significant growth stages of millet (Table 1), namely, the jointing, booting, heading, and filling stages. Three plants were selected, and their natural height was measured using a tape measure. Within each plot’s sampling area, plants within a 0.1 m row length were chosen for destructive sampling. Samples were oven-dried at 80 °C until reaching a constant weight, and the dry biomass was obtained by weighing the dried samples in the laboratory. In total, 456 plant samples were collected across the four growth stages. The AGB was calculated using Formula (1).
AGB = W / (0.1 × 0.4)  (1)
In the formula, AGB represents biomass (kg m−2) and W is the dry weight (kg) of the aboveground biomass sampled from the 0.04 m2 area (0.1 m row length × 0.4 m row spacing).

2.2.2. UAV Data Acquisition and Processing

UAV remote sensing data acquisition was synchronized with the ground data acquisition. In this study, a four-rotor DJI M210 V2 UAV (SZ DJI Technology Co., Shenzhen, China) equipped with a RedEdge-MX multi-spectral imaging system (MicaSense, Seattle, WA, USA) and a Zenmuse XT2 gimbal camera (SZ DJI Technology Co., Shenzhen, China) served as the UAV high-throughput remote sensing platform (Figure 2). Multi-spectral, thermal infrared, and visible-light images of millet at the major growth stages were obtained (Table 1). The RedEdge-MX dual-camera imaging system features 10 spectral channels covering a range of 444–842 nm and can capture 10 multi-spectral images simultaneously, each with a resolution of 1280 × 960 pixels. The band parameters were as follows: coastal blue (B444), 444 ± 28 nm; blue (B475), 475 ± 32 nm; green (G531), 531 ± 14 nm; green (G560), 560 ± 27 nm; red (R650), 650 ± 16 nm; red (R668), 668 ± 14 nm; red edge (RE705), 705 ± 10 nm; red edge (RE717), 717 ± 12 nm; red edge (RE740), 740 ± 18 nm; near-infrared (NIR842), 842 ± 57 nm. The Zenmuse XT2 gimbal camera integrates high-resolution thermal infrared and visible-light lenses to capture high-definition thermal infrared and visible-light images simultaneously.
The flight parameters were specified as follows: flights were conducted between 10:00 and 14:00, the flight altitude was maintained at 30 m, and the flight speed was set at 3 m/s with 80% forward and side overlap. Images were captured at equal time intervals, and the lenses of the multi-spectral and thermal infrared cameras remained vertically downward throughout the task. Images of the standard reflectance gray plate were collected manually before each take-off.
The UAV acquired multi-spectral, thermal infrared, and visible-light images, along with images of the standard reflectance gray plate. These images then underwent image stitching, radiometric calibration, and point cloud extraction in Pix4Dmapper (Pix4D S.A., Lausanne, Switzerland). The resulting output comprised orthophotos in the multi-spectral, thermal infrared, and visible-light domains.

2.3. Data Processing and Analysis

2.3.1. Spectral Information and Canopy Temperature Acquisition

The multi-spectral images and the designated regions of interest (ROIs) were imported into ENVI 5.5 software (Harris Geospatial Solutions, Inc., Broomfield, CO, USA) for data extraction and analysis. Spectral reflectance was extracted for each plot, and the mean value was computed as the plot’s multi-spectral reflectance. Similarly, per-plot canopy temperature details, including the maximum temperature (Tmax), minimum temperature (Tmin), and average temperature (Tmean), were extracted from the thermal infrared images and their ROIs in ENVI 5.5.
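For readers reproducing this extraction step outside ENVI, the following is a minimal sketch using rasterio; the file names and the plot polygon are hypothetical placeholders, and the per-plot statistics simply follow the definitions above (mean reflectance per band, and Tmax, Tmin, and Tmean from the thermal orthophoto).
```python
# Minimal sketch of per-plot zonal statistics (not the authors' ENVI workflow).
import numpy as np
import rasterio
from rasterio.mask import mask

def zonal_stats(raster_path, plot_polygon):
    """Per-band mean plus overall min/max inside one plot polygon (GeoJSON-like geometry)."""
    with rasterio.open(raster_path) as src:
        data, _ = mask(src, [plot_polygon], crop=True, filled=False)  # masked array, (bands, rows, cols)
    flat = data.reshape(data.shape[0], -1)
    return flat.mean(axis=1), float(data.min()), float(data.max())

# Multi-spectral mean reflectance per plot, and canopy temperature statistics:
# refl_means, _, _ = zonal_stats("multispectral_ortho.tif", plot_polygon)
# t_means, t_min, t_max = zonal_stats("thermal_ortho.tif", plot_polygon)  # Tmean = t_means[0]
```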

2.3.2. Plant Height Acquisition

The point cloud method was used to obtain the plant height (PH) of the millet. Pix4Dmapper was used to extract the point cloud data, which were processed following the workflow of Malambo et al. [21]. The point cloud of each period was preprocessed by filtering, denoising, and smoothing and was then classified into ground and vegetation point clouds. The heights were normalized relative to the ground point cloud; through this relative normalization, the calculation for a specific date was independent of other dates, avoiding further registration and minimizing deviations. To avoid the influence of local depressions on PH extraction, the 95th and 99th percentiles and the maximum (Max) of the point cloud height (Figure 3) were selected to estimate the plant height of each plot.
In this study, the Canopy Height Model (CHM) was extracted from the point cloud to predict the plant height. To obtain the CHM, the Digital Surface Model (DSM) and Digital Elevation Model (DEM) generated by the point cloud were first acquired [22]. Since the DSM represents the height of all objects on the ground, and the DEM only represents the height of the ground, the CHM can be obtained by subtracting the DEM from the DSM as the plant height of each plot. The specific calculation formula is as follows
CHM = DSM − DEM
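As an illustration of this step, below is a minimal sketch assuming the DSM and DEM have already been exported as co-registered GeoTIFFs (file names and the plot window are hypothetical): the CHM is obtained by differencing, and the 95th/99th percentiles and maximum of the canopy heights within a plot serve as candidate PH estimates.
```python
# Minimal sketch: CHM = DSM - DEM, then percentile-based plant height per plot.
import numpy as np
import rasterio

with rasterio.open("dsm.tif") as dsm_src, rasterio.open("dem.tif") as dem_src:
    dsm = dsm_src.read(1, masked=True)
    dem = dem_src.read(1, masked=True)

chm = dsm - dem                      # canopy height model (m)
plot = chm[100:150, 200:260]         # hypothetical plot window (row/col slice)
heights = plot.compressed()          # drop masked (nodata) cells

ph_p95 = np.percentile(heights, 95)  # 95th percentile plant height
ph_p99 = np.percentile(heights, 99)  # 99th percentile (best performer in this study)
ph_max = heights.max()               # maximum point height
```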

2.3.3. Construction of Vegetation Index

In this study, 10 commonly used vegetation indices were selected to estimate the millet biomass. Their calculation formulas are shown in Table 2.
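Table 2 gives the exact definitions used; as an illustration, the sketch below computes a few widely published indices referenced in this study (NDVI, NDRE, CIRE, and MTCI) from per-plot mean reflectances. Which RedEdge-MX bands map to each index is an assumption here, not taken from Table 2.
```python
# Minimal sketch of a few standard vegetation indices (band mapping is an assumption).
def ndvi(nir, red):
    return (nir - red) / (nir + red)

def ndre(nir, red_edge):
    return (nir - red_edge) / (nir + red_edge)

def ci_red_edge(nir, red_edge):
    return nir / red_edge - 1.0

def mtci(nir, red_edge, red):
    return (nir - red_edge) / (red_edge - red)

# Example with hypothetical per-plot mean reflectances:
# b842, b717, b668 = 0.45, 0.22, 0.05
# print(ndvi(b842, b668), mtci(b842, b717, b668))
```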

2.3.4. Texture Feature Extraction and Transformation

Texture is an essential feature of UAV remote sensing images, representing the spatial correlation of image grayscale levels. We extracted texture features from the 10 image bands using the Gray Level Co-occurrence Matrix (GLCM), a widely used method for texture feature extraction [32] that has found applications in machine vision, image classification, and image recognition [33,34]. Following radiometric correction and image fusion, the following eight texture features were derived from the multi-spectral images using the GLCM: mean (mean), variance (var), homogeneity (hom), contrast (con), dissimilarity (dis), entropy (ent), second moment (sm), and correlation (cor). The average value of all pixels within the region of interest was computed as the texture feature value for the corresponding image.
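A minimal sketch of this extraction with scikit-image is given below; it is not the authors’ exact software pipeline, and the quantization to 32 gray levels and the single offset (distance 1, angle 0°) are assumptions. The mean, variance, and entropy are computed directly from the normalized GLCM, and the remaining features come from graycoprops.
```python
# Minimal sketch of per-band GLCM texture feature extraction.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(band, levels=32):
    """Return the eight GLCM texture features for one band (2-D reflectance array)."""
    # Quantize reflectance values to integer gray levels in [0, levels - 1]
    q = np.digitize(band, np.linspace(band.min(), band.max(), levels)) - 1
    glcm = graycomatrix(q.astype(np.uint8), distances=[1], angles=[0],
                        levels=levels, symmetric=True, normed=True)
    p = glcm[:, :, 0, 0]                      # normalized co-occurrence probabilities
    i = np.arange(levels).reshape(-1, 1)      # row (gray level) index
    mean = float((i * p).sum())
    var = float(((i - mean) ** 2 * p).sum())
    ent = float(-(p[p > 0] * np.log2(p[p > 0])).sum())
    return {
        "mean": mean,
        "var": var,
        "hom": graycoprops(glcm, "homogeneity")[0, 0],
        "con": graycoprops(glcm, "contrast")[0, 0],
        "dis": graycoprops(glcm, "dissimilarity")[0, 0],
        "ent": ent,
        "sm": graycoprops(glcm, "ASM")[0, 0],
        "cor": graycoprops(glcm, "correlation")[0, 0],
    }
```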
To explore the potential application of texture features in UAV multi-spectral images for estimating millet aboveground biomass (AGB), this study randomly combined the extracted texture features. Subsequently, we computed three types of texture indices (TIS) based on prior research experience. They encompassed the Normalized Difference Texture Index (NDTI) [35], Ratio Texture Index (RTI), and Difference Texture Index (DTI) [36]. The specific calculation formulas were as follows
NDTI = (T1 − T2) / (T1 + T2)
RTI = T1 / T2
DTI = T1 − T2
In the formulas, T1 and T2 are the texture feature values of two arbitrary bands.
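The sketch below illustrates this screening step, assuming the per-plot texture feature values are stored in a pandas DataFrame with hypothetical column names such as mean740 or sm842: all pairwise NDTI, RTI, and DTI combinations are formed and ranked by their absolute Pearson correlation with the measured AGB.
```python
# Minimal sketch of texture index construction and correlation-based screening.
from itertools import combinations
import numpy as np
import pandas as pd

def screen_texture_indices(tex: pd.DataFrame, agb: pd.Series, top_k=6):
    rows = []
    for t1, t2 in combinations(tex.columns, 2):
        candidates = {
            f"NDTI({t1},{t2})": (tex[t1] - tex[t2]) / (tex[t1] + tex[t2]),
            f"RTI({t1},{t2})": tex[t1] / tex[t2],
            f"DTI({t1},{t2})": tex[t1] - tex[t2],
        }
        for name, values in candidates.items():
            r = np.corrcoef(values, agb)[0, 1]   # Pearson correlation with measured AGB
            rows.append((name, r))
    ranked = sorted(rows, key=lambda item: abs(item[1]), reverse=True)
    return ranked[:top_k]                        # e.g. the six indices retained in this study
```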

2.4. Model Construction and Evaluation

The AGB was modeled using linear and machine-learning algorithms, including multiple linear regression (MLR), support vector machine (SVM), and random forest (RF) [37,38,39]. In the modeling process, all sample data were randomly divided into 5 parts using 5-fold cross-validation. In each fold, 80% of the data (365 samples) were used for model development, and the remaining 20% (91 samples) served as the test set for model verification. Each model underwent 10 rounds of training to ensure robustness.
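A minimal sketch of this modeling setup with scikit-learn is shown below; the hyperparameters are defaults chosen for illustration rather than the authors’ exact configuration, and X and y denote the fused feature matrix and the measured AGB as NumPy arrays.
```python
# Minimal sketch of 5-fold cross-validation with MLR, SVM, and RF regressors.
import numpy as np
from sklearn.model_selection import KFold
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score, mean_squared_error

models = {
    "MLR": LinearRegression(),
    "SVM": SVR(kernel="rbf"),
    "RF": RandomForestRegressor(n_estimators=500, random_state=0),
}

def cross_validate(X, y, model, n_splits=5):
    r2s, rmses = [], []
    for train_idx, test_idx in KFold(n_splits=n_splits, shuffle=True, random_state=0).split(X):
        model.fit(X[train_idx], y[train_idx])
        pred = model.predict(X[test_idx])
        r2s.append(r2_score(y[test_idx], pred))
        rmses.append(np.sqrt(mean_squared_error(y[test_idx], pred)))
    return np.mean(r2s), np.mean(rmses)

# for name, m in models.items():
#     print(name, cross_validate(X, y, m))
```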
The coefficient of determination (R2), root mean square error (RMSE), and relative percent deviation (RPD) served as evaluation indices for the millet AGB monitoring model. Higher R2 and RPD values indicate greater accuracy, while a smaller RMSE value suggests higher model accuracy. Their calculation formulae are as follows:
R² = 1 − ∑ᵢ₌₁ⁿ (yᵢ − xᵢ)² / ∑ᵢ₌₁ⁿ (yᵢ − ȳ)²
RMSE = √( ∑ᵢ₌₁ⁿ (xᵢ − yᵢ)² / n )
RPD = SD / RMSE
In the formulae, xi and yi represent the predicted value and the measured value, respectively, y ¯ is the average value of the measured value, n is the number of samples, SD is the standard deviation of the measured value, and RMSE is the root mean square error of the predicted value.
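The three metrics can be computed directly from these definitions; the short sketch below is an illustration, with y denoting the measured values and x the predictions.
```python
# Minimal sketch of the R2, RMSE, and RPD evaluation metrics defined above.
import numpy as np

def evaluate(y_measured, y_predicted):
    y = np.asarray(y_measured, dtype=float)
    x = np.asarray(y_predicted, dtype=float)
    ss_res = np.sum((y - x) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    rmse = np.sqrt(np.mean((x - y) ** 2))
    rpd = y.std(ddof=1) / rmse        # SD of measured values divided by RMSE
    return r2, rmse, rpd
```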

3. Results

3.1. The Change Rule and Statistical Analysis of Millet Plant Height and AGB

As depicted in Figure 4, the AGB of the millet increased progressively throughout the growth stages, whereas the millet plant height exhibited an initial rise followed by a decline, depicting the growth dynamics during distinct growth phases. Table 3 shows that the AGB values ranged from 0.651 to 3.483 kg m−2, a wide range, with both the maximum and minimum values present in the training set, which is beneficial for model training. With kurtosis and skewness values approximating zero, the datasets tended toward a normal distribution, meeting statistical requirements and making them suitable for further analysis.

3.2. Comparison of Millet Plant Height Obtained by UAV with the Real Value on the Ground

In this study, point cloud data were used to obtain the PH value of the millet. The 95th and 99th percentiles and the maximum of the RGB-image point cloud height, as well as the CHM, were employed to estimate the PH. The plant heights obtained by the different methods were significantly correlated with the measured plant height (p < 0.01) (Table 4), and the 99th percentile of the point cloud height had the strongest correlation (r = 0.935). The R2 values between the estimated and measured plant heights for the 95th percentile, 99th percentile, maximum point cloud height, and CHM (Figure 5) were 0.805, 0.873, 0.808, and 0.668, respectively, and the RMSEs were 11.241 cm, 7.511 cm, 11.029 cm, and 13.451 cm, respectively. Therefore, the 99th percentile of the point cloud height can accurately estimate the PH value of millet.

3.3. Correlation Analysis between Multi-Source Remote Sensing Features and AGB of Millet

Correlation analysis between the spectral parameters, canopy temperature, plant height, and the AGB of the millet was performed (Figure 6). The AGB exhibited a significant negative correlation with the canopy reflectance within the 444–740 nm range (p < 0.01) and a positive correlation at 842 nm (p < 0.01). The selected MCARI showed a significant negative correlation with the AGB (p < 0.01), while other vegetation indices displayed a significant positive correlation with the AGB (p < 0.01). Notably, the MTCI exhibited the strongest correlation with the AGB (r = 0.660). Canopy temperature displayed a significant negative correlation with the AGB (p < 0.01), with Tmean showing the highest correlation (r = −0.425). Additionally, there was a significant positive correlation between the plant height and the AGB (p < 0.01), with the 99th percentile exhibiting the highest correlation (r = 0.534).
Correlation analysis between the texture features and the AGB (Table 5) revealed varied correlations, with absolute correlation coefficients ranging from 0.059 to 0.662. Except for the mean texture feature in the NIR842 band, all texture features were significantly correlated with the millet AGB at the 0.01 level (p < 0.01). In particular, the mean of the RE717 band exhibited a substantial correlation with the AGB (r = −0.662).
To enhance the utility of texture features in the AGB monitoring of foxtail millet, three texture indices combining different features were developed (Figure 7). The correlation between the texture features and AGB notably improved after linear transformation. Particularly, the RTI (mean740, mean842) exhibited the strongest correlation with the AGB (r = 0.703), with its correlation coefficient being 52.21% higher than that of mean740. Six texture indices were selected, including the NDTI (mean740, mean842), NDTI (mean717, sm842), DTI (mean717, dis705), DTI (mean717, corr444), RTI (mean740, mean842), and RTI (mean717, sm842), all displaying correlation coefficients above 0.675, showcasing their potential as effective indices.

3.4. Results of AGB Estimation Model Based on Different Remote Sensing Features

In order to compare the estimation effects of the spectral parameters, texture indices, canopy temperature, and plant height from the UAV remote sensing images on millet’s AGB, correlation analysis was conducted to identify the best predictors. Five spectral parameters, including B444, MCARI, NDRE, CIRE, and MTCI, along with six texture indices, including NDTI (mean740, mean842), NDTI (mean717, sm842), DTI (mean717, dis705), DTI (mean717, corr444), RTI (mean740, mean842), and RTI (mean717, sm842), were selected. Additionally, three temperature parameters (Tmin, Tmax, and Tmean) and the 99th percentile of the point cloud height (PH) were chosen to construct the AGB estimation models using the MLR, SVM, and RF methods, respectively.
The results of the AGB estimation models based on single-factor input variables were analyzed (Table 6). Among the modeling algorithms, the RF model performed best (Train R2 = 0.535–0.864; Test R2 = 0.323–0.698), followed by the SVM model (Train R2 = 0.273–0.756; Test R2 = 0.282–0.671), while the MLR model performed the worst (Train R2 = 0.172–0.631; Test R2 = 0.206–0.611). Regarding the different data types, the models based on the texture indices performed best (Train R2 = 0.631–0.864; Test R2 = 0.611–0.698). Among them, the RF model based on the texture indices performed best (R2 = 0.698, RMSE = 0.323 kg m−2, and RPD = 1.821), followed by the SVM model (R2 = 0.671, RMSE = 0.312 kg m−2, and RPD = 1.546). The performance of the models based on the spectral parameters was also good (Train R2 = 0.604–0.848; Test R2 = 0.583–0.665). However, the performance of the estimation models constructed from the canopy temperature and plant height was relatively lower, with training R2 ranges of 0.172–0.698 and 0.293–0.535 and test R2 ranges of 0.206–0.571 and 0.255–0.323, respectively.

3.5. Results of AGB Estimation Model Based on Multi-Source Remote Sensing Features

The estimation results of the millet AGB based on multi-source remote sensing feature fusion are presented in Table 7. Compared with the single-factor features, the model estimation accuracy improved significantly when the spectral parameters, texture indices, canopy temperature, and plant height were fused simultaneously. Among these, the RF model exhibited the highest accuracy (Train: R2 = 0.937, RMSE = 0.149 kg m−2, and RPD = 3.981; Test: R2 = 0.877, RMSE = 0.207 kg m−2, and RPD = 2.847). When two features were fused, the RF model constructed with the TIS + T combination exhibited the highest estimation accuracy (Train: R2 = 0.915, RMSE = 0.173 kg m−2, and RPD = 3.439; Test: R2 = 0.801, RMSE = 0.253 kg m−2, and RPD = 2.244), surpassing the TIS-RF model, with an increase in the R2 from 0.698 to 0.801. The next best performers were the RF models constructed with SPs + TIS and TIS + PH, achieving R2 values of 0.796 and 0.787, respectively. The accuracy of the RF model constructed with T + PH was moderate, yet it improved the R2 from 0.323 to 0.704 compared to the PH-RF model. Moreover, the model accuracy improved further when three characteristic variables were combined. The RF model based on SPs + TIS + T showed the highest estimation accuracy (Train: R2 = 0.932, RMSE = 0.153 kg m−2, and RPD = 3.850; Test: R2 = 0.869, RMSE = 0.217 kg m−2, and RPD = 2.766). Additionally, the RF models based on TIS + T + PH and SPs + TIS + PH also demonstrated good accuracy, achieving R2 values of 0.855 and 0.835, respectively. Although the accuracy of the RF model based on SPs + T + PH was moderate, its R2 increased from 0.748 to 0.808 compared to the SPs + T RF model. These results indicate that models based on multi-source remote sensing information fusion significantly enhance the estimation performance. Furthermore, considering the modeling algorithms, the RF and SVM models generally performed better than the MLR model under the same feature inputs (Train: R2 = 0.793–0.937; Test: R2 = 0.704–0.877), suggesting that the RF model exhibited better stability. As the number of input features increased, the R2 and RPD improved and the RMSE decreased, so the estimation accuracy and the stability of the estimation model gradually increased.
To understand the spatial and temporal changes in the AGB, AGB inversion maps of the key growth stages were created using the optimal estimation model. As shown in Figure 8, with the growth of the millet, the AGB in most plots increased significantly, while in some plots it increased first and then decreased, which was in line with the growth status of the millet, indicating that using UAV multi-source remote sensing information to monitor the growth and development of millet has certain applicability.

4. Discussion

4.1. Estimation of Millet Plant Height

Plant height is a pivotal aspect of the crop structure and plays a crucial role in biomass monitoring. Some scholars contend that plant height is the primary determinant of biomass [40,41], indirectly mirroring growth variations across crop cycles. The UAV remote sensing platform serves as a vital technological tool for rapid plant height acquisition [42] and has been applied successfully to estimate plant height across diverse crop types [43,44,45]. Prior studies employed crop surface models derived from RGB images to estimate plant height, disregarding information beneath the crop canopy and leading to substantial errors in height estimation [46,47,48]. Conversely, point clouds enable the precise capture of vertical plant information, yielding more reliable and accurate plant height estimations when suitable plant height characteristics are selected for representation. The findings demonstrated that the 99th percentile of the point cloud height exhibited the strongest correlation with the measured plant height, delivering the most accurate estimation of the millet’s plant height (R2 = 0.873; RMSE = 7.511 cm). Niu et al. [42] likewise reported the highest correlation between the plant height derived from the point cloud’s 99th percentile and the directly measured plant height. The comparatively lower accuracy of the other plant height estimates may have stemmed from a higher occurrence of outliers in the maximum value and from variations in plant height due to millet ear drooping during the filling stage, which reduced the estimation precision.

4.2. Correlation Analysis between Multi-Source Remote Sensing Features and AGB of Millet

The vegetation index combines the information of different bands, reduces the influence of noise, and improves the sensitivity to target traits; it is therefore often used for monitoring crop AGB, leaf area index, chlorophyll content, and yield [49,50,51,52]. Initially, this study examined the relationship between the spectral bands, the vegetation indices, and the AGB. The findings indicated a notable correlation between the vegetation indices and the AGB, with the MTCI demonstrating the highest correlation with the AGB (r = 0.660), signifying its robust sensitivity for assessing foxtail millet biomass. Numerous studies have demonstrated a strong correlation between vegetation indices derived from the red and near-infrared bands and the AGB [53,54], facilitating the estimation of the foxtail millet AGB. Crop canopy temperature serves as a comprehensive reflection of environmental and physiological factors that directly influence the crop growth rate, photosynthesis, and biomass accumulation, thereby serving as an indicator for assessing the status of crop AGB [55,56]. Moreover, a noteworthy correlation existed between the plant height and the AGB of foxtail millet (r = 0.534), as plant height is a crucial characterization of crop structure and morphology, reflecting crop development and playing a pivotal role in biomass monitoring [57].
Texture features facilitate the extraction of local structure, surface details, and texture information from images, aiding in acquiring spatial distribution information of the vegetation [58]. Prior research indicated the feasibility of estimating the AGB using texture features [59]. Consequently, in this study the texture features underwent linear transformation to construct three texture indices (NDTI, DTI, and RTI), which markedly improved their correlation with the AGB. This improvement stemmed from the texture indices’ capability to emphasize ground object characteristics while diminishing the impact of the soil background, terrain, shadow, illumination angle, and sensor perspective [36,60], thereby enhancing the accuracy of the AGB estimation.

4.3. Monitoring Millet AGB with Different Remote Sensing Characteristics

Utilizing canopy spectral information acquired from UAV multi-spectral and hyperspectral images for estimating crop traits represented the predominant method for monitoring crop growth [61,62]. However, findings from this study indicated that the spectral parameter construction model’s accuracy was inferior to that of the texture index construction model. This disparity could be attributed to the spectral information primarily reflecting the interaction between light and plants or soil. The dense planting of millet could have led to saturation, as the plants grew vigorously, thereby diminishing the accuracy of the spectral information estimation. Previous studies suggested that texture information was more accurate than the vegetation index in assessing rice AGB. Zheng et al. [16] employed the texture index to assess the accuracy of the rice AGB model throughout the growth stages and before heading, demonstrating higher accuracy compared to the vegetation index model. In this study, the RF model based on the texture index achieved the highest estimation accuracy (R2 = 0.698; RMSE = 0.323 kg m−2), surpassing the accuracy of the models constructed by other single-factor features. This superiority might have stemmed from the high planting density of the millet. Texture information enabled the acquisition of crop surface structure details and spatial changes, mitigating soil interference, vegetation shadow, and other factors, thereby enhancing the accuracy of the biomass estimation. The performance of the estimation model based on canopy temperature in this study was moderate. This could be attributed to the close relationship between the crop canopy temperature, plant water content, and environmental factors. The relatively lower performance of the plant height construction estimation model might have been attributed to the morphological changes in the millet ears during the filling stage. Increased ear weight caused the ear shape to transition from upright to drooping, thereby affecting the biomass estimation accuracy [63].

4.4. AGB Estimation Model of Multi-Source Remote Sensing Features

This study integrated multiple remote sensing data sources to explore the potential of different feature combinations for estimating millet AGB. When two features were combined, the RF model constructed from the TIS + T data fusion demonstrated superior performance, likely attributable to the inclusion of texture information. Texture information captured the diverse surface structures of the millet across various growth stages, while the temperature data provided essential environmental information. This fusion offered a more comprehensive overview of millet growth characteristics, significantly enhancing the accuracy of the AGB assessment. The estimation accuracy of the SPs + PH fusion surpassed that of the SPs + T fusion. This outcome aligned with findings from Yue et al. [17] and Wang et al. [35], indicating that supplementing spectral information with complementary features markedly enhances crop AGB estimation capabilities. Spectral information, primarily from optical sensors, encounters limitations due to asymptotic saturation, especially within dense millet canopies. Integrating structural features compensated for these limitations to some extent by providing insights into canopy growth and structure [64,65], mitigating the inherent issues associated with spectral features [66]. When all three features were combined, the RF model of the SPs + TIS + T data fusion attained the highest estimation accuracy, likely owing to the unique and complementary information contributed by the spectral, texture, and thermal features. Zhang et al. [67] also showcased the synergistic relationship between spectral, texture, and thermal information fusion, notably improving wheat AGB prediction accuracy and further validating the complementary nature of these three sources. Moreover, the TIS + T + PH fusion outperformed the SPs + T + PH fusion, implying that merging texture, temperature, and structure information could surmount limitations within dense millet canopies. Texture information supplemented the spectral features, while the temperature and plant height data reflected vegetation morphology, structure, and growth states, offering multifaceted insights for AGB estimation. In this research, a millet AGB estimation model was formulated by fusing multi-source remote sensing information, including spectra, texture, canopy temperature, and plant height. Compared to single-feature estimation models, the RF model constructed via multi-source remote sensing data fusion exhibited higher accuracy (R2 = 0.877, RMSE = 0.207 kg m−2, and RPD = 2.847). Maimaitijiang et al. [68] supported these findings, highlighting that multi-source data fusion enables a more comprehensive and accurate monitoring of crop growth status, significantly enhancing estimation precision.
Machine-learning algorithms combined with remote sensing data have become widely utilized in crop monitoring and other domains owing to their efficacy in handling complex, high-dimensional data, thereby enhancing model accuracy [69]. Accordingly, this study employed the MLR, SVM, and RF algorithms to construct the models. The RF model demonstrated the highest accuracy in estimating millet AGB (R2 = 0.877, RMSE = 0.207 kg m−2, and RPD = 2.847), in line with the findings of Han et al. [70] in maize AGB monitoring. The RF model, leveraging an ensemble of decision trees, adeptly processes substantial datasets, accurately assesses feature importance, mitigates overfitting, and is robust to outliers, making it an apt solution for inversion problems [64].

5. Conclusions

In this study, UAV visible-light, multi-spectral, and thermal infrared image data were leveraged to acquire spectral parameters, texture indices, canopy temperature, and plant height, and the MLR, SVM, and RF algorithms were then applied to construct AGB estimation models for millet. The results indicated a strong correlation between the plant height derived from the UAV RGB point cloud and the directly measured plant height. The height at the 99th percentile of the point cloud offered the most accurate estimation of the PH value of the millet (R2 = 0.873; RMSE = 7.511 cm). The vegetation indices demonstrated a good correlation with the AGB, particularly the MTCI (r = 0.660). Canopy temperature and plant height exhibited significant correlations with the AGB. The correlation between most texture features and the AGB was limited; nevertheless, after applying a linear transformation, the correlation between the texture indices and the AGB markedly improved. The capability to estimate the AGB using single-factor features ranked as TIS > SPs > T > PH, and the RF model based on the texture indices demonstrated the highest estimation accuracy (R2 = 0.698). Upon fusing two features, the RF model constructed using the TIS + T data fusion achieved the highest accuracy (R2 = 0.801); with three features, the RF model constructed by the SPs + TIS + T data fusion attained the highest accuracy (R2 = 0.869). The RF model incorporating spectral parameters, texture indices, canopy temperature, and plant height demonstrated the highest overall accuracy (R2 = 0.877, RMSE = 0.207 kg m−2, and RPD = 2.847). These findings indicate that millet AGB estimation based on UAV remote sensing images and multi-source data fusion has significant application potential for the accurate monitoring of millet growth and holds reference value for popularizing and applying precision agriculture.

Author Contributions

M.F., J.S. and Z.Y. (Zhongyu Yang) conceived and designed the experiments; Z.Y. (Zhongyu Yang), Z.Y. (Zirui Yu), X.W., W.Y. (Wugeng Yan), S.S., P.S., X.S. (Xinkai Sun), and Z.W. performed the experiments; Z.Y. (Zhongyu Yang) analyzed the data and wrote the original manuscript; M.F., J.S., C.Y., C.W., Y.Z., X.S. (Xiaoyan Song), M.Z., L.X. and W.Y. (Wude Yang) reviewed and revised the manuscript. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Research Program Sponsored by the State Key Laboratory of Sustainable Dryland Agriculture (in preparation), Shanxi Agricultural University (No. 202003-6), the Key Research and Development Program of Shanxi Province, China (201903D211002-01), the Shanxi Province Basic Research Project (20210302124236), the Shanxi Agricultural University Doctoral Research Project (2021BQ99), and the Shanxi Province Graduate Education Innovation Project (2023KY330).

Data Availability Statement

This research project is ongoing, and a part of the data is available upon request.

Acknowledgments

We are grateful to the Shanxi Agricultural University for providing the trial site. We are also grateful to the editor and reviewers.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Diao, X.; Jia, G. Origin and domestication of foxtail millet. In Genetics and Genomics of Setaria; Plant Genetics and Genomics: Crops and Models; Springer: Cham, Switzerland, 2017; pp. 61–72. [Google Scholar] [CrossRef]
  2. Zhou, X.; Zhu, X.; Dong, Z.; Guo, W. Estimation of biomass in wheat using random forest regression algorithm and remote sensing data. Crop J. 2016, 4, 212–219. [Google Scholar] [CrossRef]
  3. Serrano, L.; Filella, I.; Penuelas, J. Remote sensing of biomass and yield of winter wheat under different nitrogen supplies. Crop Sci. 2000, 4, 723–731. [Google Scholar] [CrossRef]
  4. Hensgen, F.; Bühle, L.; Wachendorf, M. The effect of harvest, mulching and low-dose fertilization of liquid digestate on above ground biomass yield and diversity of lower mountain semi-natural grasslands. Agric. Ecosyst. Environ. 2016, 216, 283–292. [Google Scholar] [CrossRef]
  5. Huang, J.; Sedano, F.; Huang, Y.; Ma, H.; Li, X.; Liang, S.; Tian, L.; Zhang, X.; Fan, J.; Wu, W. Assimilating a synthetic Kalman filter leaf area index series into the WOFOST model to improve regional winter wheat yield estimation. Agric. For. Meteorol. 2016, 216, 188–202. [Google Scholar] [CrossRef]
  6. Schirrmann, M.; Giebel, A.; Gleiniger, F.; Pflanz, M.; Lentschke, J.; Dammer, K.H. Monitoring agronomic parameters of winter wheat crops with low-cost UAV imagery. Remote Sens. 2016, 8, 706. [Google Scholar] [CrossRef]
  7. Kanning, M.; Kühling, I.; Trautz, D.; Jarmer, T. High-Resolution UAV-Based Hyperspectral Imagery for LAI and Chlorophyll Estimations from Wheat for Yield Prediction. Remote Sens. 2018, 10, 2000. [Google Scholar] [CrossRef]
  8. Khan, Z.; Rahimi-Eichi, V.; Haefele, S.; Garnett, T.; Miklavcic, S.J. Estimation of vegetation indices for high-throughput phenotyping of wheat using aerial imaging. Plant Methods 2018, 14, 20. [Google Scholar] [CrossRef] [PubMed]
  9. Olson, D.; Anderson, J. Review on unmanned aerial vehicles, remote sensors, imagery processing, and their applications in agriculture. Agron. J. 2021, 113, 971–992. [Google Scholar] [CrossRef]
  10. Li, S.; Yuan, F.; Ata-UI-Karim, S.T.; Zheng, H.; Cheng, T.; Liu, X.; Tian, Y.; Zhu, Y.; Cao, W.; Cao, Q. Combining color indices and textures of UAV-based digital imagery for rice LAI estimation. Remote Sens. 2019, 11, 1763. [Google Scholar] [CrossRef]
  11. Feng, A.; Zhou, J.; Vories, E.D.; Sudduth, K.A.; Zhang, M. Yield estimation in cotton using UAV-based multi-sensor imagery. Biosyst. Eng. 2020, 193, 101–114. [Google Scholar] [CrossRef]
  12. Lu, N.; Wang, W.; Zhang, Q.; Li, D.; Yao, X.; Tian, Y.; Zhu, Y.; Cao, W.; Baret, F.; Liu, S. Estimation of nitrogen nutrition status in winter wheat from unmanned aerial vehicle based multi-angular multispectral imagery. Front. Plant Sci. 2019, 10, 1601. [Google Scholar] [CrossRef] [PubMed]
  13. Gnyp, M.L.; Miao, Y.; Yuan, F.; Ustin, S.L.; Yu, K.; Yao, Y.; Huang, S.; Bareth, G. Hyperspectral canopy sensing of paddy rice aboveground biomass at different growth stages. Field Crop Res. 2014, 155, 42–55. [Google Scholar] [CrossRef]
  14. Qiao, L.; Gao, D.; Zhao, R.; Tang, W.; An, L.; Li, M.; Sun, H. Improving estimation of LAI dynamic by fusion of morphological and vegetation indices based on UAV imagery. Comput. Electron. Agric. 2022, 192, 106603. [Google Scholar] [CrossRef]
  15. Zhu, W.; Sun, Z.; Peng, J.; Huang, Y.; Li, J.; Zhang, J.; Yang, B.; Liao, X. Estimating maize above-ground biomass using 3D point clouds of multi-source unmanned aerial vehicle data at multi-spatial scales. Remote Sens. 2019, 11, 2678. [Google Scholar] [CrossRef]
  16. Zheng, H.; Cheng, T.; Zhou, M.; Li, D.; Yao, X.; Tian, Y.; Cao, W.; Zhu, Y. Improved estimation of rice aboveground biomass combining textural and spectral analysis of UAV imagery. Precis. Agric. 2019, 20, 611–629. [Google Scholar] [CrossRef]
  17. Yue, J.; Yang, G.; Tian, Q.; Feng, H.; Xu, K.; Zhou, C. Estimate of winter-wheat above-ground biomass based on UAV ultrahigh-ground-resolution image textures and vegetation indices. ISPRS J. Photogramm. Remote Sens. 2019, 150, 226–244. [Google Scholar] [CrossRef]
  18. Jiang, Q.; Fang, S.; Peng, Y.; Gong, Y.; Zhu, R.; Wu, X.; Ma, Y.; Duan, B.; Liu, J. UAV-based biomass estimation for rice-combining spectral, TIN-based structural and meteorological features. Remote Sens. 2019, 11, 890. [Google Scholar] [CrossRef]
  19. Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley. Int. J. Appl. Earth Obs. 2015, 39, 79–87. [Google Scholar] [CrossRef]
  20. Yue, J.; Yang, G.; Li, C.; Li, Z.; Wang, Y.; Feng, H.; Xu, B. Estimation of winter wheat above-ground biomass using unmanned aerial vehicle-based snapshot hyperspectral sensor and crop height improved models. Remote Sens. 2017, 9, 708. [Google Scholar] [CrossRef]
  21. Malambo, L.; Popescu, S.C.; Murray, S.C.; Putman, E.; Pugh, N.A.; Horne, D.W.; Richardson, G.; Sheridan, R.; Rooney, W.L.; Avant, R. Multitemporal field-based plant height estimation using 3D point clouds generated from small unmanned aerial systems high-resolution imagery. Int. J. Appl. Earth Obs. 2018, 64, 31–42. [Google Scholar] [CrossRef]
  22. Oehme, L.H.; Reineke, A.-J.; Weiß, T.M.; Würschum, T.; He, X.; Müller, J. Remote Sensing of Maize Plant Height at Different Growth Stages Using UAV-Based Digital Surface Models (DSM). Agronomy 2022, 12, 958. [Google Scholar] [CrossRef]
  23. Tucker, C.J. Red and photographic infrared linear combinations for monitoring vegetation. Remote Sens. Environ. 1979, 8, 127–150. [Google Scholar] [CrossRef]
  24. Pearson, R.L.; Miller, L.D. Remote Mapping of Standing Crop Biomass for Estimation of the Productivity of the Shortgrass Prairie; Department of Watershed Sciences, College of Forestry and Natural Resources, Colorado State University: Fort Collins, CO, USA, 1972; Volume 1355, Available online: https://ui.adsabs.harvard.edu/abs/1972rse..conf.1355P (accessed on 5 December 2023).
  25. Gitelson, A.A.; Kaufman, Y.J.; Merzlyak, M.N. Use of a green channel in remote sensing of global vegetation from EOS-MODIS. Remote Sens. Environ. 1996, 58, 289–298. [Google Scholar] [CrossRef]
  26. Sims, D.A.; Gamon, J.A. Relationships between leaf pigment content and spectral reflectance across a wide range of species, leaf structures and developmental stages. Remote Sens. Environ 2002, 81, 337–354. [Google Scholar] [CrossRef]
  27. Motohka, T.; Nasahara, K.N.; Oguma, H.; Tsuchida, S. Applicability of green-red vegetation index for remote sensing of vegetation phenology. Remote Sens. 2010, 2, 2369–2387. [Google Scholar] [CrossRef]
  28. Lu, J.; Ehsani, R.; Shi, Y.; Abdulridha, J.; de Castro, A.I.; Xu, Y. Field detection of anthracnose crown rot in strawberry using spectroscopy technology. Comput. Electron. Agric. 2017, 135, 289–299. [Google Scholar] [CrossRef]
  29. Gitelson, A.A.; Gritz, Y.; Merzlyak, M.N. Relationships between leaf chlorophyll content and spectral reflectance and algorithms for non-destructive chlorophyll assessment in higher plant leaves. J. Plant Physiol. 2003, 160, 271–282. [Google Scholar] [CrossRef]
  30. Gitelson, A.; Merzlyak, M.N. Quantitative estimation of chlorophyll-a using reflectance spectra: Experiments with autumn chestnut and maple leaves. J. Photochem. Photobiol. B 1994, 22, 247–252. [Google Scholar] [CrossRef]
  31. Dash, J.; Curran, P. The MERIS Terrestrial Chlorophyll Index. Doctoral Thesis, University of Southampton, Southampton, UK, 2004. Available online: http://eprints.soton.ac.uk/id/eprint/465751 (accessed on 10 December 2023).
  32. Haralick, R.M.; Shanmugham, K.; Dinstein, I. Textural features for image classification. IEEE Trans. Syst. Man Cybern. 1973, 6, 610–621. [Google Scholar] [CrossRef]
  33. Kavdır, I.; Guyer, D. Comparison of artificial neural networks and statistical classifiers in apple sorting using textural features. Biosyst. Eng. 2004, 89, 331–344. [Google Scholar] [CrossRef]
  34. Adjed, F.; Safdar Gardezi, S.J.; Ababsa, F.; Faye, I.; Chandra Dass, S. Fusion of structural and textural features for melanoma recognition. IET Comput. Vis. 2018, 12, 185–195. [Google Scholar] [CrossRef]
  35. Wang, F.; Yi, Q.; Hu, J.; Xie, L.; Yao, X.; Xu, T.; Zheng, J. Combining spectral and textural information in UAV hyperspectral images to estimate rice grain yield. Int. J. Appl. Earth Obs. 2021, 102, 102397. [Google Scholar] [CrossRef]
  36. Xu, L.; Zhou, L.; Meng, R.; Zhao, F.; Lv, Z.; Xu, B.; Zeng, L.; Yu, X.; Peng, S. An improved approach to estimate ratoon rice aboveground biomass by integrating UAV-based spectral, textural and structural features. Precis. Agric. 2022, 23, 1276–1301. [Google Scholar] [CrossRef]
  37. Berger, K.; Verrelst, J.; Féret, J.-B.; Wang, Z.; Wocher, M.; Strathmann, M.; Danner, M.; Mauser, W.; Hank, T. Crop nitrogen monitoring: Recent progress and principal developments in the context of imaging spectroscopy missions. Remote Sens. Environ. 2020, 242, 111758. [Google Scholar] [CrossRef] [PubMed]
  38. Cortes, C.; Vapnik, V. Support-vector networks. Mach. Learn. 1995, 20, 273–297. [Google Scholar] [CrossRef]
  39. Breiman, L. Random forests. Mach. Learn. 2001, 45, 5–32. [Google Scholar] [CrossRef]
  40. Fernandez, M.G.S.; Becraft, P.W.; Yin, Y.; Lübberstedt, T. From dwarves to giants? Plant height manipulation for biomass yield. Trends Plant Sci. 2009, 14, 454–461. [Google Scholar] [CrossRef] [PubMed]
  41. Montes, J.M.; Technow, F.; Dhillon, B.S.; Mauch, F.; Melchinger, A.E. High-throughput non-destructive biomass determination during early plant development in maize under field conditions. Field Crop Res. 2011, 121, 268–273. [Google Scholar] [CrossRef]
  42. Niu, Y.; Zhang, L.; Zhang, H.; Han, W.; Peng, X. Estimating above-ground biomass of maize using features derived from UAV-based RGB imagery. Remote Sens. 2019, 11, 1261. [Google Scholar] [CrossRef]
  43. Li, W.; Niu, Z.; Chen, H.; Li, D.; Wu, M.; Zhao, W. Remote estimation of canopy height and aboveground biomass of maize using high-resolution stereo images from a low-cost unmanned aerial vehicle system. Ecol. Indic. 2016, 67, 637–648. [Google Scholar] [CrossRef]
  44. Holman, F.H.; Riche, A.B.; Michalski, A.; Castle, M.; Wooster, M.J.; Hawkesford, M.J. High throughput field phenotyping of wheat plant height and growth rate in field plot trials using UAV based remote sensing. Remote Sens. 2016, 8, 1031. [Google Scholar] [CrossRef]
  45. Tao, H.; Feng, H.; Xu, L.; Miao, M.; Yang, G.; Yang, X.; Fan, L. Estimation of the Yield and Plant Height of Winter Wheat Using UAV-Based Hyperspectral Images. Sensors 2020, 20, 1231. [Google Scholar] [CrossRef] [PubMed]
  46. Tirado, S.B.; Hirsch, C.N.; Springer, N.M. UAV-based imaging platform for monitoring maize growth throughout development. Plant Direct 2020, 4, e00230. [Google Scholar] [CrossRef] [PubMed]
  47. Volpato, L.; Pinto, F.; González-Pérez, L.; Thompson, I.G.; Borém, A.; Reynolds, M.; Gérard, B.; Molero, G.; Rodrigues, F.A., Jr. High throughput field phenotyping for plant height using UAV-based RGB imagery in wheat breeding lines: Feasibility and validation. Front. Plant Sci. 2021, 12, 591587. [Google Scholar] [CrossRef] [PubMed]
  48. Madec, S.; Baret, F.; De Solan, B.; Thomas, S.; Dutartre, D.; Jezequel, S.; Hemmerlé, M.; Colombeau, G.; Comar, A. High-throughput phenotyping of plant height: Comparing unmanned aerial vehicles and ground LiDAR estimates. Front. Plant Sci. 2017, 8, 2002. [Google Scholar] [CrossRef] [PubMed]
  49. Zhai, W.; Li, C.; Cheng, Q.; Mao, B.; Li, Z.; Li, Y.; Ding, F.; Qin, S.; Fei, S.; Chen, Z. Enhancing Wheat Above-Ground Biomass Estimation Using UAV RGB Images and Machine Learning: Multi-Feature Combinations, Flight Height, and Algorithm Implications. Remote Sens. 2023, 15, 3653. [Google Scholar] [CrossRef]
  50. Wu, S.; Deng, L.; Guo, L.; Wu, Y. Wheat leaf area index prediction using data fusion based on high-resolution unmanned aerial vehicle imagery. Plant Methods 2022, 18, 1–16. [Google Scholar] [CrossRef] [PubMed]
  51. Qi, H.; Wu, Z.; Zhang, L.; Li, J.; Zhou, J.; Jun, Z.; Zhu, B. Monitoring of peanut leaves chlorophyll content based on drone-based multispectral image feature extraction. Comput. Electron. Agric. 2021, 187, 106292. [Google Scholar] [CrossRef]
  52. Liu, J.; Zhu, Y.; Tao, X.; Chen, X.; Li, X. Rapid prediction of winter wheat yield and nitrogen use efficiency using consumer-grade unmanned aerial vehicles multispectral imagery. Front. Plant Sci. 2022, 13, 1032170. [Google Scholar] [CrossRef] [PubMed]
  53. Shi, Y.; Gao, Y.; Wang, Y.; Luo, D.; Chen, S.; Ding, Z.; Fan, K. Using unmanned aerial vehicle-based multispectral image data to monitor the growth of intercropping crops in tea plantation. Front. Plant Sci. 2022, 13, 820585. [Google Scholar] [CrossRef]
  54. Tang, Z.; Guo, J.; Xiang, Y.; Lu, X.; Wang, Q.; Wang, H.; Cheng, M.; Wang, H.; Wang, X.; An, J. Estimation of Leaf Area Index and Above-Ground Biomass of Winter Wheat Based on Optimal Spectral Index. Agronomy 2022, 12, 1729. [Google Scholar] [CrossRef]
  55. Brewer, K.; Clulow, A.; Sibanda, M.; Gokool, S.; Odindi, J.; Mutanga, O.; Naiken, V.; Chimonyo, V.G.; Mabhaudhi, T. Estimation of maize foliar temperature and stomatal conductance as indicators of water stress based on optical and thermal imagery acquired using an unmanned aerial vehicle (UAV) platform. Drones 2022, 6, 169. [Google Scholar] [CrossRef]
  56. Zarco-Tejada, P.J.; González-Dugo, V.; Berni, J.A. Fluorescence, temperature and narrow-band indices acquired from a UAV platform for water stress detection using a micro-hyperspectral imager and a thermal camera. Remote Sens. Environ. 2012, 117, 322–337. [Google Scholar] [CrossRef]
  57. Bendig, J.; Bolten, A.; Bennertz, S.; Broscheit, J.; Eichfuss, S.; Bareth, G. Estimating biomass of barley using crop surface models (CSMs) derived from UAV-based RGB imaging. Remote Sens. 2014, 6, 10395–10412. [Google Scholar] [CrossRef]
  58. Zheng, H.; Ma, J.; Zhou, M.; Li, D.; Yao, X.; Cao, W.; Zhu, Y.; Cheng, T. Enhancing the nitrogen signals of rice canopies across critical growth stages through the integration of textural and spectral information from unmanned aerial vehicle (UAV) multispectral imagery. Remote Sens. 2020, 12, 957. [Google Scholar] [CrossRef]
  59. Liu, Y.; Liu, S.; Li, J.; Guo, X.; Wang, S.; Lu, J. Estimating biomass of winter oilseed rape using vegetation indices and texture metrics derived from UAV multispectral images. Comput. Electron. Agric. 2019, 166, 105026. [Google Scholar] [CrossRef]
  60. Huete, A.R.; Jackson, R.D.; Post, D. Spectral response of a plant canopy with different soil backgrounds. Remote Sens. Environ. 1985, 17, 37–53. [Google Scholar] [CrossRef]
  61. Hassan, M.A.; Yang, M.; Rasheed, A.; Jin, X.; Xia, X.; Xiao, Y.; He, Z. Time-series multispectral indices from unmanned aerial vehicle imagery reveal senescence rate in bread wheat. Remote Sens. 2018, 10, 809. [Google Scholar] [CrossRef]
  62. Li, B.; Xu, X.; Zhang, L.; Han, J.; Bian, C.; Li, G.; Liu, J.; Jin, L. Above-ground biomass estimation and yield prediction in potato by using UAV-based RGB and hyperspectral imaging. ISPRS J. Photogramm. Remote Sens. 2020, 162, 161–172. [Google Scholar] [CrossRef]
  63. Duan, L.; Huang, C.; Chen, G.; Xiong, L.; Liu, Q.; Yang, W. Determination of rice panicle numbers during heading by multi-angle imaging. Crop J. 2015, 3, 211–219. [Google Scholar] [CrossRef]
  64. Aghighi, H.; Azadbakht, M.; Ashourloo, D.; Shahrabi, H.S.; Radiom, S. Machine learning regression techniques for the silage maize yield prediction using time-series images of Landsat 8 OLI. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2018, 11, 4563–4577. [Google Scholar] [CrossRef]
  65. Stanton, C.; Starek, M.J.; Elliott, N.; Brewer, M.; Maeda, M.M.; Chu, T. Unmanned aircraft system-derived crop height and normalized difference vegetation index metrics for sorghum yield and aphid stress assessment. J. Appl. Remote Sens. 2017, 11, 026035. [Google Scholar] [CrossRef]
  66. Maimaitijiang, M.; Ghulam, A.; Sidike, P.; Hartling, S.; Maimaitiyiming, M.; Peterson, K.; Shavers, E.; Fishman, J.; Peterson, J.; Kadam, S. Unmanned Aerial System (UAS)-based phenotyping of soybean using multi-sensor data fusion and extreme learning machine. ISPRS J. Photogramm. Remote Sens. 2017, 134, 43–58. [Google Scholar] [CrossRef]
  67. Zhang, S.-H.; He, L.; Duan, J.-Z.; Zang, S.-L.; Yang, T.-C.; Schulthess, U.; Guo, T.-C.; Wang, C.-Y.; Feng, W. Aboveground wheat biomass estimation from a low-altitude UAV platform based on multimodal remote sensing data fusion with the introduction of terrain factors. Precis. Agric. 2023, 25, 119–145. [Google Scholar] [CrossRef]
  68. Maimaitijiang, M.; Sagan, V.; Sidike, P.; Hartling, S.; Esposito, F.; Fritschi, F.B. Soybean yield prediction from UAV using multimodal data fusion and deep learning. Remote Sens. Environ. 2020, 237, 111599. [Google Scholar] [CrossRef]
  69. Holloway, J.; Mengersen, K. Statistical machine learning methods and remote sensing for sustainable development goals: A review. Remote Sens. 2018, 10, 1365. [Google Scholar] [CrossRef]
  70. Han, L.; Yang, G.; Dai, H.; Xu, B.; Yang, H.; Feng, H.; Li, Z.; Yang, X. Modeling maize above-ground biomass based on machine learning approaches using UAV remote-sensing data. Plant Methods 2019, 15, 10. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Geographic locations and drone imagery of the experimental sites in 2022 and 2023: (a) the Wujiabao experimental field; (b) the Shenfeng experimental field; (c) the Taoyuanbao experimental field.
Figure 2. UAV near-ground remote sensing platform. (a) Four-rotor UAV; (b) Multi-spectral imaging system; (c) Thermal imaging camera.
Figure 3. The cross-sectional perspective of image-based point clouds in millet plots.
Figure 4. The boxplots of millet plant height and AGB at different growth stages. (a) AGB for the year 2022; (b) AGB for the year 2023; (c) Plant height for the year 2022; (d) Plant height for the year 2023.
Figure 5. Relationships between PH values predicted from UAV-based RGB image point clouds and the measured PH values: (a) maximum point cloud height; (b) 99th percentile of point cloud height; (c) 95th percentile of point cloud height; (d) CHM.
Figure 6. Correlation analysis of spectral parameters, canopy temperature, plant height, and foxtail millet AGB; * and ** indicate significance at the 0.05 and 0.01 levels, respectively. Red areas indicate positive correlations and blue areas negative correlations; darker colors and larger circles denote stronger correlations.
Figure 7. Correlation coefficients between AGB and texture indices: (a) Normalized Difference Texture Index; (b) Difference Texture Index; (c) Ratio Texture Index. Each point represents the correlation coefficient between AGB and the texture index obtained by taking the normalized difference, difference, or ratio of the two texture feature values given by that point's horizontal and vertical coordinates.
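To make the screening behind Figure 7 concrete, the sketch below pairs every two texture feature columns, forms the normalized difference, difference, and ratio texture indices, and correlates each index with AGB. The inputs `tex` (a samples × features array of texture feature values) and `agb` (the measured biomass vector) are hypothetical placeholders; the names and the brute-force looping strategy are illustrative, not the authors' implementation.

```python
import numpy as np

def texture_index_correlations(tex, agb):
    """Correlate AGB with pairwise texture indices (NDTI, DTI, RTI).

    tex : (n_samples, n_features) array of texture feature values
    agb : (n_samples,) array of measured above-ground biomass
    Returns three (n_features, n_features) matrices of correlation coefficients,
    one per index type, mirroring the panels of Figure 7.
    """
    n = tex.shape[1]
    ndti = np.full((n, n), np.nan)
    dti = np.full((n, n), np.nan)
    rti = np.full((n, n), np.nan)
    for i in range(n):
        for j in range(n):
            t1, t2 = tex[:, i], tex[:, j]
            # In practice a small epsilon may be needed to guard against zero denominators.
            ndti[i, j] = np.corrcoef((t1 - t2) / (t1 + t2), agb)[0, 1]  # normalized difference
            dti[i, j] = np.corrcoef(t1 - t2, agb)[0, 1]                 # difference
            rti[i, j] = np.corrcoef(t1 / t2, agb)[0, 1]                 # ratio
    return ndti, dti, rti
```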
Figure 8. AGB estimated at different growth stages using the RF model incorporating SPs + TIS + T + PH. Experiment 1 (a–d); Experiment 2 (e–g); Experiment 3 (h–k); jointing (a,h); booting (b,e,i); heading (c,f,j); grain filling (d,g,k).
Table 1. Summary of field campaigns for the millet experiments.
Experiment | Date of UAV Flights | Date of Field Sampling | Numbers | Growth Stage
1 | 3 August 2022 | 3 August 2022 | 24 | Jointing
1 | 12 August 2022 | 12 August 2022 | 24 | Booting
1 | 19 August 2022 | 19 August 2022 | 24 | Heading
1 | 16 September 2022 | 16 September 2022 | 24 | Filling
2 | 28 July 2022 | 28 July 2022 | 24 | Booting
2 | 4 August 2022 | 4 August 2022 | 24 | Heading
2 | 30 August 2022 | 30 August 2022 | 24 | Filling
3 | 19 July 2023 | 19 July 2023 | 72 | Jointing
3 | 27 July 2023 | 27 July 2023 | 72 | Booting
3 | 8 August 2023 | 8 August 2023 | 72 | Heading
3 | 24 August 2023 | 24 August 2023 | 72 | Filling
Table 2. Formula for calculating vegetation indices.
Description | Formula | Reference
Normalized Difference Vegetation Index (NDVI) | (NIR842 − R668) / (NIR842 + R668) | Tucker et al. [23]
Ratio Vegetation Index (RVI) | NIR842 / R668 | Pearson et al. [24]
Green Normalized Difference Vegetation Index (GNDVI) | (NIR842 − G531) / (NIR842 + G531) | Gitelson et al. [25]
Modified Simple Ratio (MSR) | (NIR842 / R668 − 1) / (√(NIR842 / R668) + 1) | Sims et al. [26]
Green Ratio Vegetation Index (GRVI) | NIR842 / G531 | Motohka et al. [27]
Modified Chlorophyll Absorption in the Reflectance Index (MCARI) | [(NIR842 − RE717) − 0.2 × (RE717 − G531)] × (RE717 / R668) | Lu et al. [28]
Green Chlorophyll Index (CIg) | NIR842 / G531 − 1 | Gitelson et al. [29]
Red Edge Normalized Difference Vegetation Index (NDRE) | (NIR842 − RE740) / (NIR842 + RE740) | Gitelson et al. [30]
Red Edge Chlorophyll Index (CIRE) | NIR842 / RE740 − 1 | Gitelson et al. [29]
Modified Triangular Vegetation Index (MTCI) | (NIR842 − RE740) / (RE740 + R650) | Dash et al. [31]
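The band ratios in Table 2 can be reproduced directly from plot-averaged reflectance values. The following is a minimal sketch, assuming hypothetical NumPy arrays (g531, r650, r668, re717, re740, nir842) holding the plot-mean reflectance of the corresponding bands; the variable names are illustrative and not taken from the study's processing chain.

```python
import numpy as np

def vegetation_indices(g531, r650, r668, re717, re740, nir842):
    """Spectral indices of Table 2 computed from plot-mean band reflectances.

    All inputs are NumPy arrays of identical shape; names follow the band
    centre wavelengths (nm) used in the table.
    """
    sr = nir842 / r668  # simple ratio, reused by MSR below
    return {
        "NDVI": (nir842 - r668) / (nir842 + r668),
        "RVI": sr,
        "GNDVI": (nir842 - g531) / (nir842 + g531),
        "MSR": (sr - 1.0) / (np.sqrt(sr) + 1.0),
        "GRVI": nir842 / g531,
        "MCARI": ((nir842 - re717) - 0.2 * (re717 - g531)) * (re717 / r668),
        "CIg": nir842 / g531 - 1.0,
        "NDRE": (nir842 - re740) / (nir842 + re740),
        "CIRE": nir842 / re740 - 1.0,
        "MTCI": (nir842 - re740) / (re740 + r650),
    }
```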
Table 3. Descriptive statistics of millet AGB (kg m−2).
Dataset | Number | Minimum | Maximum | Mean | Kurtosis | Skewness
Train | 365 | 0.651 | 3.483 | 1.856 | −0.185 | 0.451
Test | 91 | 0.662 | 3.353 | 1.774 | −0.463 | 0.442
All datasets | 456 | 0.651 | 3.483 | 1.840 | −0.259 | 0.432
Table 4. The correlation coefficients between plant heights obtained from different methods and the measured plant height.
Plant Height | Maximum | 99th Percentile | 95th Percentile | CHM
Correlation coefficient | 0.893 ** | 0.935 ** | 0.900 ** | 0.829 **
Notes: ** significant at the 0.01 level.
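The first three height metrics compared in Table 4 (maximum and the 99th/95th percentiles of the ground-normalized point cloud) can be summarized per plot from point elevations. The sketch below is illustrative only: the inputs `z_points` (elevations of canopy points inside one plot) and `z_ground` (the plot's ground elevation) are hypothetical placeholders, and the CHM column of Table 4 would instead be read from a DSM − DTM raster.

```python
import numpy as np

def plot_height_metrics(z_points, z_ground):
    """Candidate plant-height metrics for one plot (first three columns of Table 4)."""
    # Normalize point elevations to heights above the ground surface.
    h = np.asarray(z_points, float) - float(z_ground)
    return {
        "max": float(h.max()),               # maximum point cloud height
        "p99": float(np.percentile(h, 99)),  # 99th percentile of point cloud height
        "p95": float(np.percentile(h, 95)),  # 95th percentile of point cloud height
    }
```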
Table 5. The correlation coefficients between texture features and AGB.
Texture Feature | B444 | B475 | G531 | G560 | R650 | R668 | RE705 | RE717 | RE740 | NIR842
Mean (mean) | −0.329 ** | −0.373 ** | −0.411 ** | −0.506 ** | −0.377 ** | −0.403 ** | −0.452 ** | −0.662 ** | −0.336 ** | −0.059
Variance (var) | −0.121 ** | −0.120 * | −0.199 ** | −0.262 ** | −0.147 ** | −0.156 ** | −0.180 ** | −0.348 ** | −0.316 ** | −0.285 **
Homogeneity (hom) | 0.273 ** | 0.329 ** | 0.391 ** | 0.408 ** | 0.261 ** | 0.272 ** | 0.307 ** | 0.442 ** | 0.353 ** | 0.305 **
Contrast (con) | −0.161 ** | −0.157 ** | −0.242 ** | −0.298 ** | −0.164 ** | −0.170 ** | −0.226 ** | −0.375 ** | −0.303 ** | −0.280 **
Dissimilarity (dis) | −0.218 ** | −0.237 ** | −0.302 ** | −0.341 ** | −0.202 ** | −0.216 ** | −0.259 ** | −0.401 ** | −0.319 ** | −0.289 **
Entropy (ent) | −0.318 ** | −0.395 ** | −0.432 ** | −0.479 ** | −0.346 ** | −0.339 ** | −0.400 ** | −0.187 ** | −0.445 ** | −0.382 **
Second moment (sm) | 0.341 ** | 0.421 ** | 0.493 ** | 0.495 ** | 0.393 ** | 0.385 ** | 0.439 ** | 0.545 ** | 0.455 ** | 0.367 **
Correlation (corr) | 0.285 ** | 0.297 ** | 0.289 ** | 0.273 ** | 0.281 ** | 0.290 ** | 0.285 ** | 0.294 ** | 0.296 ** | 0.300 **
Notes: * and ** significant at the 0.05 and 0.01 levels, respectively.
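The eight texture features in Table 5 are standard grey-level co-occurrence matrix (GLCM) statistics. The per-band sketch below uses scikit-image; the quantization level, offset, and angle are illustrative assumptions rather than the study's exact configuration, and mean, variance, and entropy are computed directly from the GLCM because graycoprops does not expose them.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(band, levels=32):
    """GLCM texture features (as in Table 5) for one single-band image.

    band   : 2-D array of reflectance values for one band
    levels : number of grey levels used for quantization (illustrative choice)
    """
    # Quantize reflectance to integer grey levels in [0, levels - 1].
    q = np.digitize(band, np.linspace(band.min(), band.max(), levels)) - 1
    glcm = graycomatrix(q.astype(np.uint8), distances=[1], angles=[0],
                        levels=levels, symmetric=True, normed=True)
    p = glcm[:, :, 0, 0]                      # normalized co-occurrence probabilities
    i = np.arange(levels)
    marginal = p.sum(axis=1)
    mean = float((i * marginal).sum())                        # GLCM mean
    var = float((((i - mean) ** 2) * marginal).sum())         # GLCM variance
    ent = float(-(p[p > 0] * np.log(p[p > 0])).sum())         # entropy
    return {
        "mean": mean, "var": var, "ent": ent,
        "hom": float(graycoprops(glcm, "homogeneity")[0, 0]),
        "con": float(graycoprops(glcm, "contrast")[0, 0]),
        "dis": float(graycoprops(glcm, "dissimilarity")[0, 0]),
        "sm": float(graycoprops(glcm, "ASM")[0, 0]),          # second (angular) moment
        "corr": float(graycoprops(glcm, "correlation")[0, 0]),
    }
```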
Table 6. The results of the AGB estimation model based on the single-input feature.
Input Variable | Dataset | MLR (R2 / RMSE / RPD) | SVM (R2 / RMSE / RPD) | RF (R2 / RMSE / RPD)
SPs | Train | 0.604 / 0.357 / 1.589 | 0.756 / 0.283 / 1.800 | 0.848 / 0.233 / 2.564
SPs | Test | 0.583 / 0.384 / 1.549 | 0.654 / 0.332 / 1.419 | 0.665 / 0.334 / 1.729
TIS | Train | 0.631 / 0.350 / 1.646 | 0.723 / 0.308 / 1.665 | 0.864 / 0.216 / 2.712
TIS | Test | 0.611 / 0.342 / 1.604 | 0.671 / 0.312 / 1.546 | 0.698 / 0.323 / 1.821
T | Train | 0.172 / 0.523 / 1.099 | 0.595 / 0.362 / 1.187 | 0.698 / 0.323 / 1.820
T | Test | 0.206 / 0.498 / 1.122 | 0.448 / 0.436 / 0.884 | 0.571 / 0.413 / 1.526
PH | Train | 0.293 / 0.482 / 1.189 | 0.273 / 0.480 / 0.554 | 0.535 / 0.397 / 1.467
PH | Test | 0.255 / 0.491 / 1.159 | 0.282 / 0.523 / 0.498 | 0.323 / 0.462 / 1.215
Notes: SPs, TIS, T, and PH denote the spectral parameters, texture indices, canopy temperature, and plant height, respectively. RMSE is given in kg m−2.
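The accuracy statistics in Tables 6 and 7 follow standard definitions; in particular, RPD is the ratio of the standard deviation of the observed values to the RMSE. A minimal sketch is given below, assuming hypothetical vectors of observed and predicted AGB; the exact degrees-of-freedom convention used by the authors is not stated.

```python
import numpy as np

def regression_metrics(y_obs, y_pred):
    """R2, RMSE (kg m^-2), and RPD as reported in Tables 6 and 7."""
    y_obs, y_pred = np.asarray(y_obs, float), np.asarray(y_pred, float)
    ss_res = np.sum((y_obs - y_pred) ** 2)
    ss_tot = np.sum((y_obs - y_obs.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot                      # coefficient of determination
    rmse = np.sqrt(np.mean((y_obs - y_pred) ** 2))  # root mean square error
    rpd = np.std(y_obs, ddof=1) / rmse              # residual predictive deviation
    return r2, rmse, rpd
```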
Table 7. Results of the AGB estimation model based on multi-source features.
Input Variable | Dataset | MLR (R2 / RMSE / RPD) | SVM (R2 / RMSE / RPD) | RF (R2 / RMSE / RPD)
SPs + TIS | Train | 0.729 / 0.293 / 1.923 | 0.920 / 0.162 / 3.246 | 0.922 / 0.162 / 3.591
SPs + TIS | Test | 0.666 / 0.351 / 1.731 | 0.780 / 0.279 / 1.906 | 0.796 / 0.274 / 2.214
SPs + T | Train | 0.735 / 0.306 / 1.943 | 0.898 / 0.185 / 2.956 | 0.914 / 0.175 / 3.419
SPs + T | Test | 0.636 / 0.298 / 1.657 | 0.712 / 0.300 / 1.602 | 0.748 / 0.275 / 1.993
SPs + PH | Train | 0.651 / 0.344 / 1.692 | 0.855 / 0.214 / 2.322 | 0.863 / 0.217 / 2.705
SPs + PH | Test | 0.648 / 0.319 / 1.685 | 0.744 / 0.315 / 1.702 | 0.760 / 0.299 / 2.042
TIS + T | Train | 0.643 / 0.349 / 1.688 | 0.910 / 0.178 / 3.061 | 0.915 / 0.173 / 3.439
TIS + T | Test | 0.634 / 0.316 / 1.591 | 0.790 / 0.225 / 2.050 | 0.801 / 0.253 / 2.244
TIS + PH | Train | 0.656 / 0.338 / 1.705 | 0.800 / 0.255 / 1.998 | 0.886 / 0.196 / 2.958
TIS + PH | Test | 0.644 / 0.330 / 1.676 | 0.765 / 0.284 / 1.862 | 0.787 / 0.280 / 2.168
T + PH | Train | 0.360 / 0.459 / 1.250 | 0.761 / 0.282 / 1.643 | 0.793 / 0.266 / 2.197
T + PH | Test | 0.383 / 0.447 / 1.273 | 0.620 / 0.352 / 1.357 | 0.704 / 0.372 / 1.840
SPs + TIS + T | Train | 0.748 / 0.289 / 1.992 | 0.914 / 0.166 / 3.275 | 0.932 / 0.153 / 3.850
SPs + TIS + T | Test | 0.737 / 0.289 / 1.951 | 0.855 / 0.227 / 2.525 | 0.869 / 0.217 / 2.766
SPs + TIS + PH | Train | 0.745 / 0.296 / 1.982 | 0.893 / 0.188 / 2.837 | 0.923 / 0.163 / 3.608
SPs + TIS + PH | Test | 0.648 / 0.306 / 1.687 | 0.829 / 0.232 / 2.176 | 0.835 / 0.241 / 2.462
SPs + T + PH | Train | 0.738 / 0.298 / 1.955 | 0.908 / 0.173 / 3.102 | 0.917 / 0.169 / 3.470
SPs + T + PH | Test | 0.709 / 0.289 / 1.854 | 0.801 / 0.261 / 1.894 | 0.808 / 0.278 / 2.279
TIS + T + PH | Train | 0.641 / 0.340 / 1.669 | 0.903 / 0.178 / 2.941 | 0.920 / 0.166 / 3.538
TIS + T + PH | Test | 0.654 / 0.354 / 1.700 | 0.836 / 0.235 / 2.342 | 0.855 / 0.230 / 2.626
SPs + TIS + T + PH | Train | 0.763 / 0.277 / 2.055 | 0.922 / 0.162 / 3.401 | 0.937 / 0.149 / 3.981
SPs + TIS + T + PH | Test | 0.745 / 0.305 / 1.982 | 0.867 / 0.208 / 2.479 | 0.877 / 0.207 / 2.847
Notes: SPs, TIS, T, and PH denote spectral parameters, texture indices, canopy temperature, and plant height, respectively. RMSE is given in kg m−2.
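Table 7 contrasts MLR, SVM, and RF models trained on concatenated (feature-level fused) feature sets. Below is a minimal scikit-learn sketch of the best-performing configuration, RF on SPs + TIS + T + PH; the random placeholder data, feature dimensions, train/test split, and hyperparameters are illustrative assumptions, not the study's settings.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Hypothetical per-plot feature blocks: spectral parameters, texture indices,
# canopy temperature, and plant height, plus measured AGB (kg m^-2).
rng = np.random.default_rng(0)
sps, tis = rng.random((456, 10)), rng.random((456, 6))
temp, ph = rng.random((456, 1)), rng.random((456, 1))
agb = rng.random(456)

X = np.hstack([sps, tis, temp, ph])  # feature-level fusion of the four sources
X_tr, X_te, y_tr, y_te = train_test_split(X, agb, test_size=0.2, random_state=0)

rf = RandomForestRegressor(n_estimators=500, random_state=0)
rf.fit(X_tr, y_tr)
print("test R2:", rf.score(X_te, y_te))  # evaluate on the held-out plots
```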
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
