Article

Sugarcane Nitrogen Concentration and Irrigation Level Prediction Based on UAV Multispectral Imagery

1 Guangxi Key Laboratory of Sugarcane Biology, Guangxi University, Nanning 530004, China
2 School of Electrical Engineering, Guangxi University, Nanning 530004, China
3 Beijing Institute of Remote Sensing Equipment, Beijing 100854, China
4 IRREC-IFAS, University of Florida, Fort Pierce, FL 34945, USA
5 School of Agriculture, Guangxi University, Nanning 530004, China
6 Department of Bioproducts and Biosystems Engineering, University of Minnesota, Saint Paul, MN 55108, USA
* Author to whom correspondence should be addressed.
Submission received: 25 February 2022 / Revised: 29 March 2022 / Accepted: 30 March 2022 / Published: 1 April 2022
(This article belongs to the Special Issue AI-Based Sensors and Sensing Systems for Smart Agriculture)

Abstract

Sugarcane is the main industrial crop for sugar production, and its growth status is closely related to fertilizer, water, and light input. Unmanned aerial vehicle (UAV)-based multispectral imagery is widely used for high-throughput phenotyping, since it can rapidly assess crop vigor at field scale. This study focused on the potential of drone multispectral images in predicting canopy nitrogen concentration (CNC) and irrigation levels for sugarcane. An experiment was carried out in a sugarcane field with three irrigation levels and five fertilizer levels. Multispectral images were acquired at an altitude of 40 m during the elongating stage. Partial least squares (PLS), backpropagation neural network (BPNN), and extreme learning machine (ELM) models were adopted to establish CNC predictions based on various combinations of band reflectance and vegetation indices. The simple ratio pigment index (SRPI), normalized pigment chlorophyll index (NPCI), and normalized green-blue difference index (NGBDI) were selected as model inputs due to their higher grey relational degree with the CNC and lower correlation with one another. The PLS model based on the five-band reflectance and the three vegetation indices achieved the best accuracy (Rv = 0.79, RMSEv = 0.11). Support vector machine (SVM) and BPNN were then used to classify the irrigation levels based on five spectral features that had high correlations with irrigation levels. SVM reached a higher accuracy of 80.6%. The results of this study demonstrated that high resolution multispectral images can provide effective information for CNC prediction and irrigation level recognition for the sugarcane crop.

1. Introduction

As the most important sugar crop, sugarcane is mainly grown in tropical and subtropical areas and provides approximately 80% of the world’s sugar [1,2]. The growth of sugarcane is closely related to fertilizer, water, and radiation intensity. Evaluating the growth status of sugarcane in a timely manner, and adjusting the field management strategy accordingly, is of great significance to the yield and quality of sugarcane. In recent years, remote sensing with spectral images at different scales has been considered an effective high-throughput phenotyping solution for predicting the growth and yield of crops.
Large-scale spectral imagery can cover an area ranging from 25 to 3600 km2 per image. Spatial resolutions generally range from about 1 m to tens of meters, although some data can reach 0.3 m [3,4]. Such images are mainly obtained by satellites, such as Sentinel [5], Gaofen (GF) [6], Landsat [7,8], GeoEye [9], and QuickBird [10], which are normally provided by governments or commercial companies. Their main agricultural applications include land cover and land use investigation, vegetation classification, and crop yield forecasting. However, due to the limitations of low spatial resolution and fixed revisit cycles, satellite imagery has significant deficiencies for small-scale applications, which usually need more subtle and frequent data acquisition for crop growth monitoring [11].
Middle-scale spectral imagery can provide data with spatial resolutions ranging from less than 1 m to a few meters [4,12]. These images are mainly acquired by manned aircraft platforms, integrated with multispectral or hyperspectral imaging sensors, at altitudes of several kilometers, offering fairly large coverage and high spatial resolutions [13,14]. However, such platforms are not widely used due to their high costs.
Small-scale spectral imagery is usually acquired by unmanned aerial vehicles (UAVs) and can normally cover up to hundreds of hectares [3,4]. With the rapid development of UAVs, combined with the increasing availability and decreasing cost of spectral imaging sensors, opportunities to capture spectral images with high spatial and spectral resolutions have abounded. UAV-based remote sensing systems can easily reach centimeter-level spatial resolution, which makes them more sensitive to spatially heterogeneous information. Over the past 10 years, UAVs, especially small multirotor drones, have been rapidly accepted and popularized as a means to acquire reliable field crop information, weather permitting [15]. They can provide subtle information about the crop canopies in every inch of a field, which is difficult to acquire via ground-based scouting, especially for tall plants. As such, these systems save labor and time [16,17].
Many previous studies have shown that crop yield [18,19], nitrogen (N) status [20,21,22], protein content [23,24], and water stress [25,26] can be predicted from drone-based multispectral and RGB imagery. When establishing models for different crops, various spectral features, including spectral reflectance, existing vegetation indices (VIs), and newly proposed VIs, can be used as input variables. Taking N prediction as an example, Peng et al. used the ratio vegetation index (RVI), the normalized difference red-edge index (NDRE), and the terrestrial chlorophyll index (TCI) to predict potato N status [27]. Zhang et al. used RVI and the normalized difference vegetation index (NDVI) to predict rice N status [28]. Osco et al. used NDVI, NDRE, the green normalized difference vegetation index (GNDVI), and the soil-adjusted vegetation index (SAVI) to predict maize leaf nitrogen concentration (LNC) [29]. For water status prediction, SAVI [25], the normalized green-red difference index (NGRDI) [26], NDVI [30], NDRE [31], and others were reported in different studies for different crops. One of the main reasons why different spectral features are used for different crops is that crops differ in their physiological and canopy distribution characteristics. Sugarcane is a tall and dense sugar crop. Unlike other crops, its stalk is the main raw material for sugar production and an important organ for accumulating nutrients. Sugarcane has a long growing season, blooms late, and its canopy contains only leaves most of the time. Therefore, it is of practical significance to find suitable spectral features and establish corresponding growth prediction models for sugarcane.
Preliminary studies of remote sensing for sugarcane have also been conducted in recent years. Classification of sugarcane planting areas [32] and large-scale yield prediction [33,34] based on satellite images have been reported. Predictions of sugarcane canopy nitrogen concentration (CNC) or LNC based on hyperspectral data [35] or hyperspectral imagery [36] were also reported. However, studies on CNC prediction and irrigation level classification based on high resolution multispectral imagery have seldom been reported.
In terms of modeling algorithms, both traditional machine learning algorithms and newly developed deep learning algorithms have been used, each with its own advantages and disadvantages. Deep learning algorithms perform better when sufficient samples are available. Ma et al. developed a county-level corn yield prediction model based on a Bayesian neural network (BNN) using multiple publicly available data sources spanning 20 years, including satellite images, climate observations, soil property maps, and historical yield records [37]. Khaki et al. proposed a convolutional neural network model called YieldNet to predict corn and soybean yield based on MODIS products [38]. Yang et al. used one year of hyperspectral imagery to train a CNN classification model to estimate corn grain yield [39]. Prodhan et al. monitored drought over South Asia using a deep learning approach with 16 years of remote sensing data [40]. Evidently, a large volume of image data, as well as years of ground truth data, is commonly needed to provide a sufficient dataset for training a deep learning network. Collecting such a large number of samples is very challenging, so a dataset suitable for deep learning is difficult to produce in some circumstances. By contrast, traditional machine learning methods, which are generally based on statistics, are suitable for most modeling problems when a relatively small number of samples is available [41,42]. Partial least squares (PLS), extreme learning machines (ELMs), backpropagation neural networks (BPNNs), support vector machines (SVMs), and others have been widely used in crop nutrient prediction. For example, Li et al. [43] used PLS to establish 12 models of fruits and seeds for rapid analysis and quality assessment. Kira et al. established a model for estimating the chlorophyll and carotenoid contents of three tree varieties based on BPNN [44]. Chen et al. constructed a BPNN model to invert rice pigment content with several spectral parameters as inputs [45]. Pal et al. used an ELM algorithm to classify land covers with multispectral and hyperspectral data [46]; it achieved better classification accuracy than models established with BPNN and SVM, with far less computational complexity. Different machine learning methods suit different cases, depending on the number of variables, the number of samples, and the underlying relationship between inputs and outputs.
In this study, in order to monitor the growth status of sugarcane canopies in a high-throughput manner, high-resolution multispectral images of an experimental sugarcane field were obtained by a low-altitude UAV. The objectives of this study were (1) to determine the sensitive spectral features for predicting the CNC and irrigation levels; (2) to establish CNC prediction models based on different machine learning algorithms, namely PLS, BPNN, and ELM; and (3) to establish classification methods for irrigation levels based on SVM and BPNN.

2. Materials and Methods

2.1. Study Area

The sugarcane experimental field was located in Nanning, Guangxi Autonomous Region, China (latitude 22.84° N, longitude 108.33° E), as shown in Figure 1. From the captured multispectral image (displayed in RGB) on the right of Figure 1, it can be seen that the experimental field had 12 plots with concrete partitions. Three irrigation treatments and five fertilization treatments were applied in the field. Urea, calcium magnesium phosphate, and potassium chloride were chosen as the N, phosphorus (P), and potassium (K) fertilizers, respectively. Eight plots with different irrigation and fertilizer treatments and four blank plots without fertilizer or irrigation (denoted by BL) were set in the field. Concrete partitions at a depth of 1.2 m were built between adjacent plots to prevent water and fertilizer infiltration. The planting density was limited to 975,000 plants per hectare. The two non-zero irrigation treatments were 180 m3/ha (denoted by W0.6) and 300 m3/ha (denoted by W1.0), while the four non-zero fertilizer treatments were F1.0 (250 kg/ha of N, 150 kg/ha of P2O5, 200 kg/ha of K2O), F0.9 (90% of the amount of F1.0), F1.1 (110% of F1.0), and F1.2 (120% of F1.0). Water and fertilizer were applied via drip irrigation pipes. Micronutrient fertilizers were equally applied to all the plots except the blank plots. The eight plots had the same size of 20 m × 6 m, and their treatments are denoted by W0.6F0.9, W0.6F1.0, W0.6F1.1, W0.6F1.2, W1.0F0.9, W1.0F1.0, W1.0F1.1 and W1.0F1.2.
The seed canes were planted on 24 March 2018. The seedling fertilizers, which accounted for 30% of the total fertilizer application, were applied on 11 May (28 days after planting). The tillering fertilizers, which accounted for 70% of the total fertilizer application, were applied on 29 June (67 days after planting). The irrigation schedule is listed in Table 1.
Rainfall was the other source of water input in the open field. The rainfall in this field was 509.8 mm from the day of planting (24 March) to the day of image acquisition (11 July), with a monthly average of 127 mm. The meteorological conditions of the experimental field, including precipitation (excluding irrigation), temperature, and mean relative humidity, are shown in Figure 2. It can be seen that there was almost no rainfall for 15 days before the day of image acquisition. As such, the last large water input was the controlled irrigation on 4 July, a week before canopy image acquisition. This means that rainfall had a very limited influence on the remote evaluation of water stress conditions under the specific irrigation amounts.

2.2. Data Collection

The multispectral images were captured at noon on 11 July 2018 (109 days after planting), in the elongating stage. The weather was sunny, cloudless, and windless. The image acquisition system was mainly composed of a Phantom 4 Pro drone (DJI, Shenzhen, China) and a RedEdge-MX multispectral image sensor (MicaSense, Seattle, WA, USA), as shown in Figure 3a,b, respectively. The RedEdge-MX sensor has five spectral bands at 475 nm (blue, B), 560 nm (green, G), 668 nm (red, R), 717 nm (red edge, RE), and 840 nm (near infrared, NIR), and is equipped with a light intensity sensor and a reflectance correction panel (Group VIII, USA, Figure 3c) for radiation correction. The light intensity sensor corrects for the influence of changes in sunlight on the spectral images during a flight, and the fixed reflectance correction panel can be used for reflectance transformation. The drone flew at an altitude of 40 m, with 85% forward overlap and 85% side overlap. The time interval of image acquisition was 2 s, and the ground sample distance (GSD) was 2.667 cm. Four calibration tarps with reflectivities of 5%, 20%, 40%, and 60%, respectively, were also placed in the open space next to the field before image acquisition, as shown in Figure 3d. A total of 260 multispectral images were collected.

2.3. Ground Sampling and CNC Determination

Each plot was divided into three sampling areas. Each sampling area was divided into nine grids, and one plant was randomly selected from each grid to collect the first fully unfolded leaf. The nine leaves formed one leaf sample per sampling area. A total of 36 samples were collected and immediately brought back to the laboratory for N determination. All the samples were oven-dried at 105 °C for 30 min and then at 75 °C for about 24 h until a constant weight was reached. The dried leaves were ground and weighed to 0.3 g, and the Kjeldahl method [47] was used to determine the total nitrogen (TN, %) content. The TN of these first-leaf samples, taken as the CNC (%), was calculated by Equation (1).
$$\mathrm{TN}\,(\%) = \frac{(V_1 - V_0) \times C \times 0.014}{m} \times 100\% \tag{1}$$
where % denotes the unit of TN and CNC; $V_1$ is the volume of acid standard solution consumed, mL; $V_0$ is the titration blank volume, mL; $C$ is the concentration of the acid standard solution, mol/L; 0.014 is the mass of N, in g, equivalent to 1 mL of 1 mol/L standard titration solution; and $m$ is the weight of the sample, g.
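For clarity, Equation (1) translates into a one-line calculation; a minimal sketch with hypothetical titration readings is shown below.

```python
def total_nitrogen_percent(v1_ml: float, v0_ml: float, c_mol_l: float, m_g: float) -> float:
    """Total nitrogen (%) by the Kjeldahl method, Equation (1).

    v1_ml   -- acid standard solution consumed by the sample (mL)
    v0_ml   -- titration blank volume (mL)
    c_mol_l -- concentration of the acid standard solution (mol/L)
    m_g     -- dry sample weight (g); 0.014 is the mass of N (g) per mmol of titrant
    """
    return (v1_ml - v0_ml) * c_mol_l * 0.014 / m_g * 100.0

# Hypothetical readings for a 0.3 g sample: about 1.89% TN
print(total_nitrogen_percent(v1_ml=8.2, v0_ml=0.1, c_mol_l=0.05, m_g=0.3))
```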

2.4. Multispectral Image Preprocessing

Pix4DMapper software (Pix4D, Prilly, Switzerland) was used to generate the mosaic image from the 260 original multispectral images, as shown in Figure 4. The mosaic image was then imported and processed in ENVI software (L3Harris Technologies, Melbourne, FL, USA). Two preprocessing steps were conducted in ENVI, including radiation correction and geometric correction.
Radiation calibration was implemented using the radiometric correction module in ENVI. The “empirical line” method was selected, since four calibration tarps with known reflectivity were captured in the image. An empirical line was fitted by comparing the DN values and the reflectivity of the tarps. Subsequently, all the DN values in the mosaic image could be converted into reflectance.
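A minimal sketch of the empirical line fit is given below, assuming the mean DN values over each tarp have already been extracted (the DN numbers here are hypothetical placeholders); ENVI performs an equivalent per-band fit internally.

```python
import numpy as np

# Nominal tarp reflectances from the text; the mean DN values over each tarp
# (for one band) are hypothetical placeholders.
tarp_reflectance = np.array([0.05, 0.20, 0.40, 0.60])
tarp_dn = np.array([3100.0, 11800.0, 23500.0, 35200.0])

# Least-squares line mapping DN to reflectance for this band
gain, offset = np.polyfit(tarp_dn, tarp_reflectance, deg=1)

def dn_to_reflectance(dn_band: np.ndarray) -> np.ndarray:
    """Convert a single-band DN image to reflectance via the fitted line."""
    return gain * dn_band + offset

# In practice, the fit is repeated independently for each of the five bands.
```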
Geometric correction was conducted to eliminate the distortion. Four ground control points were selected at four corners of the field, as marked in the false-color image in Figure 4. The “image to map” function was selected to implement geometric correction with the coordinate information of the ground control points. “Nearest neighbor”, which avoids introducing new pixel values, was used to resample the image to the same coordinate system (UTM projection, WGS-84 datum) as that of the ground control points.
In order to extract the region of interest (ROI) from the background, a decision tree (DT) classification method was used to separate the sugarcane canopy from the soil, weeds, shadow, concrete, and other interfering background features. Figure 5 shows the NDVI image of the extracted canopy; the white dots in the figure represent the 36 sampling areas. To increase the sample size, each area was further divided into nine grids of approximately 1.5 m × 2.0 m. The average value over each grid was taken as one spectral sample, yielding a total of 324 spectral samples.
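The canopy extraction and grid averaging can be sketched as follows; note that a fixed NDVI threshold is used here as a simplified stand-in for the decision tree classifier actually applied, and the threshold value is illustrative only.

```python
import numpy as np

def grid_mean_reflectance(band: np.ndarray, ndvi: np.ndarray,
                          ndvi_thresh: float = 0.5) -> float:
    """Mean reflectance of canopy pixels within one sampling grid.

    The NDVI threshold is a simplified stand-in for the decision tree used in
    this study to separate canopy from soil, weeds, shadow, and concrete.
    """
    canopy = ndvi > ndvi_thresh
    return float(band[canopy].mean())

def split_into_grids(img: np.ndarray, rows: int = 3, cols: int = 3):
    """Tile a sampling-area image into rows x cols grids (one sample each)."""
    h, w = img.shape[:2]
    for i in range(rows):
        for j in range(cols):
            yield img[i * h // rows:(i + 1) * h // rows,
                      j * w // cols:(j + 1) * w // cols]
```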

2.5. Feature Extraction and Data Analysis Methods

2.5.1. Extraction of VIs

VIs have been widely used to qualitatively and quantitatively evaluate vegetation cover types and crop vigor. NDVI is the most commonly used VI, and it is also one of the important parameters closely related to crop chlorophyll and N concentration. Besides NDVI, nine other commonly used VIs (shown in Table 2) were also selected to compare their effects on predicting the CNC. The optimal VI or combination of VIs was used to build the prediction models of the CNC and the irrigation levels.
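Most of the VIs in Table 2 are simple band algebra; as an illustration, NDVI and a generic normalized-difference helper are sketched below (the exact definitions of the other nine VIs follow Table 2 and its references).

```python
import numpy as np

def normalized_difference(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Generic normalized-difference index (a - b) / (a + b)."""
    return (a - b) / (a + b + 1e-12)  # epsilon guards against zero denominators

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """NDVI from the NIR (840 nm) and red (668 nm) reflectance bands."""
    return normalized_difference(nir, red)
```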

2.5.2. Grey Relational Analysis

Grey relational analysis (GRA), also called grey incidence analysis (GIA), is an important part of grey system theory, which was developed by Julong Deng [57]. At its core, it determines the primary and secondary relationships between various factors by calculating the grey relational degree (GRD). The higher the GRD value between any two factors, the more consistent the change between those two factors. Therefore, it can be used to select the factor with the greatest influence [58]. Let the reference sequence be $X_0 = \{x_0(k),\ k = 1, 2, \ldots, n\}$ and the comparison sequence be $X_i = \{x_i(k),\ k = 1, 2, \ldots, n\}$. The GRD between $X_0$ and $X_i$ is calculated by Equations (2) and (3).
$$\mathrm{GRD} = \frac{1}{n}\sum_{k=1}^{n} \gamma\left(x_0(k), x_i(k)\right) \tag{2}$$
$$\gamma\left(x_0(k), x_i(k)\right) = \frac{\min_i \min_k \left|x_0(k) - x_i(k)\right| + \rho \max_i \max_k \left|x_0(k) - x_i(k)\right|}{\left|x_0(k) - x_i(k)\right| + \rho \max_i \max_k \left|x_0(k) - x_i(k)\right|} \tag{3}$$
where $\rho$ is the identification coefficient, ranging from 0 to 1 and taken here as 0.5.
GRA was conducted between the CNC and all the spectral features, which were all normalized. A GRD higher than 0.8 indicates that a VI has a very strong influence on the CNC.
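Equations (2) and (3) translate directly into a few lines of vectorized code; the sketch below uses random placeholder data in place of the normalized VI and CNC sequences.

```python
import numpy as np

def grey_relational_degree(x0: np.ndarray, comparisons: np.ndarray,
                           rho: float = 0.5) -> np.ndarray:
    """GRD between a reference sequence x0 (shape (n,)) and m comparison
    sequences (shape (m, n)), per Equations (2) and (3). Sequences are
    assumed to be normalized beforehand."""
    diff = np.abs(comparisons - x0)        # |x0(k) - xi(k)| for every i, k
    d_min, d_max = diff.min(), diff.max()  # global extrema over i and k
    gamma = (d_min + rho * d_max) / (diff + rho * d_max)
    return gamma.mean(axis=1)              # average the coefficients over k

# Placeholder example: rank 10 candidate VIs against a CNC sequence
rng = np.random.default_rng(0)
cnc, vis = rng.random(36), rng.random((10, 36))
print(np.argsort(grey_relational_degree(cnc, vis))[::-1])  # most related first
```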

2.5.3. Correlation Analysis

The correlation coefficient (R) [59] reflects the degree of linear correlation between two datasets. It can be calculated by Equation (4).
$$R = \frac{\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}{\sqrt{\sum_{i=1}^{n}(x_i - \bar{x})^2}\sqrt{\sum_{i=1}^{n}(y_i - \bar{y})^2}} \tag{4}$$
where $n$ is the sample size, $x_i$ and $y_i$ are the individual sample points indexed by $i$, and $\bar{x}$ and $\bar{y}$ are the means of $x_i$ and $y_i$ over the $n$ samples.
The higher the absolute value of R, the stronger the linear correlation between the two factors. It is generally considered that 0.7 ≤ |R| ≤ 1 indicates a very high correlation, 0.4 ≤ |R| < 0.7 a significant correlation, and |R| < 0.4 a low correlation. Correlation analysis can be applied for multiple purposes in modeling, including: (1) analyzing the correlations between input variables and the predicted quantity to determine sensitive variables; (2) analyzing the correlations among input variables, so that highly correlated, redundant variables can be pruned to simplify the model; (3) analyzing the correlation between the predicted and measured values of a model to evaluate its performance.
In this study, the correlations between the spectral features were analyzed to pick proper variables with less redundant information for CNC modeling and irrigation level classification.
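A minimal sketch of this screening step, combining a GRD-based ranking with a pairwise-R redundancy check, is shown below; the redundancy threshold is illustrative.

```python
import numpy as np

def select_low_redundancy(features: np.ndarray, ranked: list,
                          r_max: float = 0.9) -> list:
    """Greedily keep features (rows of `features`, ranked e.g. by GRD with
    the CNC) whose |R| with every already-kept feature stays below r_max."""
    corr = np.abs(np.corrcoef(features))   # pairwise |R| between features
    kept = []
    for idx in ranked:
        if all(corr[idx, k] < r_max for k in kept):
            kept.append(idx)
    return kept
```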

2.6. Modeling Algorithms

Many machine learning algorithms are currently available. Based on previous research, four algorithms were selected after comprehensive consideration, as shown in Table 3.
PLS, BPNN, and ELM were selected for CNC modeling, and the simple hold-out validation method [60] was used for model validation. All 324 samples were divided into a calibration set and a validation set at a ratio of 7:3.
SVM and BPNN were selected for irrigation level classification. Three-fold cross validation [61] was used to produce more validation samples and, therefore, to generate a comprehensive confusion matrix of the classification results.
The PLS algorithm builds a model by minimizing the sum of the squared errors. It combines the advantages of multiple linear regression, canonical correlation analysis, and principal component analysis. BPNN has self-learning and self-adaptation characteristics, showing a strong ability to fit nonlinear functions; it also has strong anti-interference ability and may be suitable for complex field environments. The ELM algorithm randomly generates the weights and thresholds between the input layer and the hidden layer; users only need to specify the number of hidden layer neurons for the whole training process. Compared with traditional algorithms, ELM has a fast learning speed and strong generalization capability. These three algorithms have different characteristics and might achieve better prediction results under different conditions or scenarios, so all three were adopted and compared for CNC prediction in this study. The number of principal components of the PLS model was 6. The number of training epochs, the learning rate, and the number of hidden layer neurons of the BPNN model were 1000, 0.05, and 22, respectively. The transfer function and the number of hidden layer neurons of the ELM model were the sigmoidal function and 50, respectively.
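A minimal sketch of the PLS pipeline with the hyperparameters stated above follows, using random placeholder data in place of the spectral samples; the BPNN and ELM models would substitute for the estimator in the same pipeline.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Placeholders: 324 spectral samples (e.g., five bands + three VIs) and CNC values
rng = np.random.default_rng(1)
X, y = rng.random((324, 8)), rng.random(324)

# 7:3 hold-out split, as in the text
X_cal, X_val, y_cal, y_val = train_test_split(X, y, test_size=0.3, random_state=1)

pls = PLSRegression(n_components=6).fit(X_cal, y_cal)  # six principal components
y_pred = pls.predict(X_val).ravel()

rv = np.corrcoef(y_val, y_pred)[0, 1]
rmsev = np.sqrt(mean_squared_error(y_val, y_pred))
print(f"Rv = {rv:.2f}, RMSEv = {rmsev:.2f}")
```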
SVM is a classic machine learning method for classification. It maps data from a low-dimensional space to a high-dimensional space through a kernel function and separates the classes with a decision surface that maximizes the margin between them. Thus, SVM was selected for the irrigation level classification in this study. Due to its strong nonlinear mapping ability, BPNN is suitable not only for fitting problems but also for classification problems, so it was also selected here for comparison with SVM. The penalty factor and the kernel parameter of the SVM model were 10 and 0.167, respectively. The number of training epochs, the learning rate, and the number of hidden layer neurons of the BPNN model were 1000, 0.1, and 10, respectively.
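A corresponding sketch of the SVM classifier with the stated hyperparameters follows; an RBF kernel is assumed here (the text reports 0.167 as the kernel parameter), and the data are random placeholders.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import confusion_matrix

# Placeholders: five features (red, blue, SRPI, NPCI, NGBDI) and three
# irrigation levels (0, 180, 300 m3/ha encoded as 0/1/2)
rng = np.random.default_rng(2)
X, y = rng.random((324, 5)), rng.integers(0, 3, size=324)

# Penalty factor C = 10, kernel parameter gamma = 0.167 (RBF kernel assumed),
# evaluated by three-fold cross validation as in the text
svm = SVC(C=10.0, kernel="rbf", gamma=0.167)
y_pred = cross_val_predict(svm, X, y, cv=3)
print(confusion_matrix(y, y_pred))  # rows: actual levels; columns: predicted
```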

2.7. Accuracy Assessment Metrics

R and the root mean square error (RMSE) were used to evaluate the accuracies of the CNC prediction models. R was introduced in Section 2.5; here, the correlation between the predicted values and the actual values was calculated to evaluate the accuracy of the prediction models. RMSE, calculated by Equation (5), directly reflects the errors of the prediction models.
$$\mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2} \tag{5}$$
where $y_i$ and $\hat{y}_i$ represent the actual value and the estimated value for sample $i$, respectively.
The confusion matrix is also known as the probability matrix or error matrix [62]. It is a specific matrix for visualizing algorithm performance and is often used to evaluate classification results. The rows of the matrix represent the actual irrigation levels, and the columns represent the predicted irrigation levels. The confusion matrix is so named because it readily indicates whether multiple classes are confused (that is, one class is predicted as another). Common indicators, including producer’s accuracy (PA), user’s accuracy (UA), and overall accuracy (OA), can be calculated from the confusion matrix. PA refers to the ratio of the number of correctly classified samples in a class to the actual total number of samples in that class, also called the true positive rate (TPR). UA refers to the ratio of the number of correctly classified samples in a class to the total number of samples classified into that class, also called the positive predictive value (PPV). OA refers to the ratio of the number of all correctly classified samples to the total number of samples across all classes. The calculation formulas of the three indicators are shown in Equations (6)–(8).
$$\mathrm{PA/TPR} = \frac{TP}{TP + FN} \tag{6}$$
$$\mathrm{UA/PPV} = \frac{TP}{TP + FP} \tag{7}$$
$$\mathrm{OA} = \frac{TP + TN}{P + N} = \frac{TP + TN}{TP + TN + FP + FN} \tag{8}$$
where TP, FP, TN, and FN represent the numbers of true positive, false positive, true negative, and false negative samples in the classification result, respectively.
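The three indicators follow directly from the confusion matrix; the sketch below computes them, using a hypothetical matrix that is merely consistent with the SVM figures reported in Section 3.2.2, not the actual result.

```python
import numpy as np

def pa_ua_oa(cm: np.ndarray):
    """PA, UA, and OA (Equations (6)-(8)) from a confusion matrix whose rows
    are actual classes and columns are predicted classes."""
    diag = np.diag(cm).astype(float)
    pa = diag / cm.sum(axis=1)   # producer's accuracy / true positive rate
    ua = diag / cm.sum(axis=0)   # user's accuracy / positive predictive value
    oa = diag.sum() / cm.sum()   # overall accuracy
    return pa, ua, oa

# Hypothetical 3-class matrix, consistent with (but not identical to) the
# SVM results reported in Section 3.2.2
cm = np.array([[94, 13,  1],
               [ 5, 79, 24],
               [ 0, 20, 88]])
print(pa_ua_oa(cm))  # PA[0] = 87.0%, UA[0] = 94.9%, OA = 80.6%
```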

3. Results

3.1. CNC Prediction

3.1.1. Relation Analysis between the Spectral Features and the CNC

To eliminate the influence of the different magnitudes of the variables, the datasets were first normalized before GRA. In the GRA results listed in Table 4, all the VIs had a GRD with the CNC larger than 0.5, and SRPI had the strongest GRD of 0.94.
Correlation analysis was also conducted for these ten VIs, and the results are shown in Table 5. Most of the VIs had very high correlations (R > 0.9) with each other, except NGBDI and RVI2. A higher correlation between two variables means they carry more redundant information, so selecting both as inputs should be avoided.
Based on the results in Table 4 and Table 5, we can see that although most of the VIs (SRPI, NPCI, RVI, MSRI, NDVI, SIPI, OSAVI, and SAVI) had very high GRD values (>0.8) with the CNC, they also had very high correlations (R > 0.9) with each other. Two basic rules were therefore applied to select the more sensitive and less redundant spectral features as inputs for CNC prediction based on the GRA and R results: (1) the GRD of the selected spectral feature(s) with the CNC should be relatively high (GRD > 0.65); (2) the R between the selected spectral features should be relatively low, to avoid introducing redundant information and variable coupling. Consequently, not all the VIs with high GRD could be selected as model inputs. Here, the first two VIs in Table 4, SRPI and NPCI, were recommended as efficient inputs. Furthermore, NGBDI, which had a relatively high GRD with the CNC and a low R with SRPI and NPCI, was also recommended as an efficient input variable.

3.1.2. Modeling with the Five-Band Reflectance

The five-band reflectance values of the multispectral image were first taken as the input variables, and the PLS, BPNN, and ELM algorithms were each used to build CNC prediction models.
The modeling results are listed in Table 6, and scatter plots of the measured versus predicted values of each model are presented in Figure 6. As can be seen, compared to the ELM and BPNN models, the PLS model had the highest R and the lowest RMSE in both the calibration and validation sets (Rv = 0.73, RMSEv = 0.13), indicating better modeling performance.

3.1.3. Modeling with VIs

In addition to the three recommended VIs (SRPI, NPCI, and NGBDI), SIPI was also selected for building prediction models for comparison. Several models were established from a single VI or a combination of VIs with the different modeling algorithms. The results are listed in Table 7. The PLS models still performed more accurately and consistently than the BPNN and ELM models. The results of the single-VI models show that VIs with a higher GRD with the CNC produced better prediction models. Among the double- and multiple-VI models, the model based on SRPI and SIPI showed lower accuracy than the model based on SRPI and NGBDI, even though SIPI had a higher GRD than NGBDI. This confirmed the correctness of choosing NGBDI, rather than the others, as the supplementary variable. The model with the highest accuracy (Rv = 0.63) was established on SRPI, NPCI, and NGBDI rather than on all the VIs, indicating that simply increasing the number of input variables does not necessarily improve model accuracy. As long as the variables are selected properly, fewer variables may yield higher model accuracy.

3.1.4. Modeling with the Five-Band Reflectance and VIs

Other models based on different combinations of the five-band reflectance and VIs were established, and the results are listed in Table 8. Among the modeling algorithms, PLS still performed best, BPNN had slightly lower accuracy than PLS, and ELM had obviously lower accuracy than the other two. The PLS model based on the five-band reflectance and the three recommended VIs (SRPI, NPCI, and NGBDI) had the highest accuracy, with the highest Rv of 0.79 and the lowest RMSEv of 0.11. Scatter plots of the predictions of the best model are presented in Figure 7. The higher accuracy of the five-band reflectance combined with proper VIs was reached because this combination of input variables not only ensured the integrity of the spectral information but also highlighted the characteristic spectral information. The results also demonstrated that the selection of input variables is crucial.

3.2. Irrigation Level Recognition

Sugarcane is a crop with tall stalks, large biomass, and a long growth period. It has a large water requirement, as well as a strong dependence on fertilizers, especially in the early and middle growing stages (the seedling, tillering, and elongating stages). Knowledge of water conditions during these stages is of great importance for sugarcane irrigation management.

3.2.1. Correlation Analysis

The correlations between the irrigation amounts and the spectral features, including the band reflectance values and the ten VIs, were first analyzed, as shown in Table 9. As can be seen, regardless of the amount of fertilizer applied, irrigation had high correlations (above 0.65) with red, blue, SRPI, NPCI, and NGBDI. Red, blue, and NPCI were negatively correlated with irrigation, while SRPI and NGBDI were positively correlated with irrigation. Among the five spectral bands, the red, green, and blue bands were negatively correlated with irrigation, while the red edge and NIR bands were positively correlated with irrigation. This phenomenon is consistent with the spectral variation trend of green plants (the healthier the growth condition, the lower the reflectance in the visible range, and the higher the reflectance in the red edge and NIR range).

3.2.2. Classification Results

Based on the five parameters of red, blue, SRPI, NPCI, and NGBDI, which had the highest correlations with irrigation levels, SVM and BPNN were adopted to classify the irrigation levels. The results are listed in Table 10.
As shown in Table 10, the OA of SVM for irrigation level recognition was 80.6%, while the OA of BPNN was only 61.7%. In the SVM classification results, the UA and PA of Irrigation_0 were the highest, reaching 94.9% and 87.0%, respectively. The accuracies of the other two levels generally exceeded 70%. This result reflects that the larger the difference in irrigation amount, the higher the classification accuracy. In addition, most of the misclassified samples were misclassified into adjacent irrigation levels. For example, among the 108 samples in Irrigation_0, 94 samples were correctly identified, 13 samples were misidentified as Irrigation_180, and only one sample was misidentified as Irrigation_300. Among the total of 63 misclassified samples, 62 were misclassified into adjacent levels, and only one was misclassified into a non-adjacent level. This indicates that the identification of irrigation levels using multispectral images has great potential and could be used for the recognition of crop water stress conditions.

4. Discussion

The results in Table 6, Table 7 and Table 8 indicate that the PLS models had the best performance for sugarcane CNC prediction across the different input combinations, which is consistent with research on citrus CNC prediction by Liu et al. [63], grapevine LNC prediction by Moghimi et al. [64], etc. The sugarcane CNC prediction model showed a highest accuracy of R = 0.79 and RMSE = 0.11, which is also close to the previous results of Liu et al. (R = 0.65, RMSE = 0.13) for citrus CNC prediction [63] and Moghimi et al. (R = 0.74, RMSE = 0.23) for grapevine LNC prediction [64].
Compared to the five-band reflectance models in Table 6, the VI-based models in Table 7 had obviously lower accuracy, indicating that taking VIs alone as inputs decreased the prediction accuracy relative to using the entire five-band reflectance. This suggests that the VIs could not exert their main strength, which is to enhance spectral features and reduce environmental interference [65]. VIs are chiefly advantageous in reducing the influence of uneven illumination and varying backgrounds; in this study, however, only one image of a small field was acquired within a very short time, so differences in illumination and background were not significant. This made the contribution of the VIs smaller than that of the full spectral reflectance [66,67].
Regardless, VIs could still help improve modeling accuracy, as proved by the results listed in Table 8. Among the different combinations of input spectral features, the five-band reflectance combined with the three VIs (SRPI, NPCI, and NGBDI) had the highest accuracy in CNC prediction, with an Rv of 0.79 and an RMSEv of 0.11; this was 8.2% higher in Rv, and 15.4% lower in RMSEv, than the five-band prediction model (Rv = 0.73, RMSEv = 0.13).
Moreover, all three of these VIs were calculated from the visible bands (SRPI and NPCI are both calculated from the green and red bands, while NGBDI is calculated from the blue and green bands), indicating that the visible bands contained the more sensitive information for sugarcane CNC prediction. Ranjan et al. [68] explored the spectral characteristics of CNC in crops, finding characteristic wavelengths mainly around 430 nm, 460 nm, 640 nm, 910 nm, 1510 nm, 1940 nm, 2060 nm, 2180 nm, 2300 nm, and 2350 nm. In this study, the multispectral camera had a spectral range of about 450–850 nm, which contained only the visible sensitive bands. This supports the rationality of the VIs selected in this study being concentrated mainly in the visible range. As is generally known, different N inputs can lead to different leaf pigment concentrations, leaf internal structures, and canopy structures [64,69,70]. The visible bands are closely related to leaf pigments and canopy structures and, as such, offer great potential for N prediction.
Furthermore, this research also found that the irrigation levels could be effectively classified based on the reflectance at red and blue combined with SRPI, NPCI, and NGBDI, all of which are spectral features in the visible bands. Indeed, the NIR bands are generally more sensitive to plant water content; however, the most sensitive bands, which sit between 1480 and 1500 nm, are out of the range of the multispectral camera used in this study [69,70]. Insufficient water input can markedly affect plant metabolism, which indirectly affects leaf pigment concentrations. Therefore, the irrigation level recognition model could achieve good classification accuracy using only visible-band features.
This study achieved good results for CNC prediction and irrigation level classification based on multispectral remote sensing. Although research on crop monitoring based on UAV multispectral imagery has been widely carried out for more than 10 years, it is rarely applied in large-scale field management, and several bottlenecks remain to be addressed. Sozzi et al. [4] compared the advantages and disadvantages of satellite-, plane-, and UAV-based multispectral imagery for variable rate N application in terms of cost, economic benefit, optical quality, and usage scenarios. They pointed out that although satellite- and plane-based imagery have lower optical quality and resolution, they can provide applicable variable N rate suggestions and bring economic benefits for large-scale farms due to their relatively low cost. The UAV platform currently has limits in acquisition cost and flight coverage. However, as UAV technology develops and demand for UAVs increases, costs can be significantly reduced and battery performance enhanced. With the emergence of automated UAV base stations and the reduction of image processing costs, the large-scale application of UAVs is just around the corner. By then, UAV remote sensing technology can be widely accepted in farm-scale crop monitoring, thanks to its flexible and autonomous acquisition style, high-quality image data, and low cost.

5. Conclusions

High resolution multispectral images of a sugarcane field were collected by UAV, and their ability to predict CNC and irrigation levels was evaluated. The main conclusions were as follows:
  • Ten VIs were evaluated to determine sensitive spectral features for CNC and irrigation level prediction. SRPI, NPCI, and NGBDI, composed of visible bands, were sensitive to both the sugarcane CNC and the irrigation level, and contributed notably to the accuracy of the CNC prediction model.
  • Different modeling algorithms based on different spectral features were compared for predicting sugarcane CNC. The PLS model performed clearly better than the BPNN and ELM models. It was also crucial to select proper features among the band reflectance values and VIs. The PLS model based on the five-band reflectance combined with SRPI, NPCI, and NGBDI had the highest Rv of 0.79 and the lowest RMSEv of 0.11.
  • Based on their correlation coefficients with irrigation levels, the red band, blue band, SRPI, NPCI, and NGBDI were adopted as the variables to classify irrigation levels with SVM and BPNN, respectively. SVM performed obviously better than BPNN, with an overall classification accuracy of 80.6%.
High resolution multispectral images have been demonstrated to be effective for CNC prediction and irrigation level recognition. More extensive experiments could be conducted to collect more samples at different growth stages in sugarcane fields, and time- or period-dependent prediction models could be studied to give users more accurate information.

Author Contributions

Conceptualization, X.L.; methodology, Y.B., S.Z., M.N. and C.Y.; investigation, Y.B., S.Z. and M.N.; writing—original draft preparation, X.L. and Y.B.; writing—review and editing, S.Z., X.L., C.Y. and M.Z.; supervision, X.L.; project administration, X.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the National Natural Science Foundation of China, grant numbers 31760342 and 31760603, and by the Science and Technology Major Project of Guangxi, China, grant number Gui Ke 2018-266-Z01.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the author (X.L.).

Acknowledgments

The authors wish to thank the Nanning Irrigation Experiment Stations (Guangxi, China) for providing the experimental field, and Weigang Luo and Ce Wang for their kind help in field management and surveying.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Ruggeri, G.; Corsi, S. An analysis of the Fairtrade cane sugar small producer organizations network. J. Clean. Prod. 2019, 240, 118191.
  2. International Sugar Organization. About Sugar. Available online: https://www.isosugar.org/sugarsector/sugar (accessed on 14 March 2022).
  3. Zhang, C.; Marzougui, A.; Sankaran, S. High-resolution satellite imagery applications in crop phenotyping: An overview. Comput. Electron. Agric. 2020, 175, 105584.
  4. Sozzi, M.; Kayad, A.; Gobbo, S.; Cogato, A.; Sartori, L.; Marinello, F. Economic Comparison of Satellite, Plane and UAV-Acquired NDVI Images for Site-Specific Nitrogen Application: Observations from Italy. Agronomy 2021, 11, 2098.
  5. Jin, X.L.; Zarco-Tejada, P.; Schmidhalter, U.; Matthew, P.R.; Li, S.K. High-throughput Estimation of Crop Traits: A Review of Ground and Aerial Phenotyping Platforms. IEEE Geosci. Remote Sens. Mag. 2021, 9, 200–231.
  6. Zhang, Y.Y.; Yang, J.; Liu, X.G.; Du, L.; Shi, S.; Sun, J.; Chen, B.W. Estimation of Multi-Species Leaf Area Index Based on Chinese GF-1 Satellite Data Using Look-Up Table and Gaussian Process Regression Methods. Sensors 2020, 20, 2460.
  7. Ovakoglou, G.; Alexandridis, T.K.; Clevers, J.G.P.W.; Cherif, I.; Kasampalis, D.A.; Navrozidis, I.; Iordanidis, C.; Moshou, D.; Laneve, G.; Beltran, J.S. Spatial Enhancement of Modis Leaf Area Index Using Regression Analysis with Landsat Vegetation Index. In Proceedings of the IGARSS 2018—2018 IEEE International Geoscience and Remote Sensing Symposium, Valencia, Spain, 22–27 July 2018.
  8. Amani, M.; Mobasheri, M.R. A parametric method for estimation of leaf area index using landsat ETM+ data. GISci. Remote Sens. 2015, 52, 478–497.
  9. Qiu, L.; Xiong, Q.M.; Liu, Y.; Zhang, G. GeoEye Image Fusion Vegetation Information Extraction Based on Blue Noise Measurement Texture. IOP Conf. Ser. Earth Environ. Sci. 2017, 67, 012007.
  10. Lin, C.; Popescu, S.C.; Gavin, T.; Khongor, T.; Chang, C.I.; Prasad, V.K. Classification of Tree Species in Overstorey Canopy of Subtropical Forest Using QuickBird Images. PLoS ONE 2015, 10, e0125554.
  11. Duveiller, G.; Defourny, P. A conceptual framework to define the spatial resolution requirements for agricultural monitoring using remote sensing. Remote Sens. Environ. 2010, 114, 2637–2650.
  12. Yang, C.; Everitt, J.H.; Bradford, J.M. Comparison of QuickBird Satellite Imagery and Airborne Imagery for Mapping Grain Sorghum Yield Patterns. Precis. Agric. 2006, 7, 33–44.
  13. Pinter, P.J.; Jackson, R.D.; Moran, S.M. Bidirectional reflectance factors of agricultural targets: A comparison of ground-, aircraft-, and satellite-based observations. Remote Sens. Environ. 1990, 32, 215–228.
  14. Qi, J.; Moran, M.S.; Cabot, F.; Dedieu, G. Normalization of sun/view angle effects using spectral albedo-based vegetation indices. Remote Sens. Environ. 1995, 52, 207–217.
  15. Martin, C.; Morten, L.; Rasmus, J.; Skovsen, S.; Gislum, R. Designing and Testing a UAV Mapping System for Agricultural Field Surveying. Sensors 2017, 17, 2703.
  16. Wang, T.; Liu, Y.; Wang, M.; Fan, Q.; Tian, H.; Qiao, X.; Li, Y. Applications of UAS in Crop Biomass Monitoring: A Review. Front. Plant Sci. 2021, 12, 616689.
  17. Zhang, H.; Wang, L.; Tian, T.; Yin, J. A Review of Unmanned Aerial Vehicle Low-Altitude Remote Sensing (UAV-LARS) Use in Agricultural Monitoring in China. Remote Sens. 2021, 13, 1221.
  18. Zhou, X.; Zheng, H.B.; Xu, X.Q.; He, J.Y.; Ge, X.K.; Yao, X.; Cheng, T.; Zhu, Y.; Cao, W.X.; Tian, Y.C. Predicting grain yield in rice using multi-temporal vegetation indices from UAV-based multispectral and digital imagery. ISPRS J. Photogramm. Remote Sens. 2017, 130, 246–255.
  19. Ballester, C.; Hornbuckle, J.; Brinkhoff, J.; Smith, J.; Quayle, W. Assessment of In-Season Cotton Nitrogen Status and Lint Yield Prediction from Unmanned Aerial System Imagery. Remote Sens. 2017, 9, 1149.
  20. Maresma, Á.; Ariza, M.; Martínez, E.; Lloveras, J.; Martínez-Casasnovas, J.A. Analysis of Vegetation Indices to Determine Nitrogen Application and Yield Prediction in Maize (Zea mays L.) from a Standard UAV Service. Remote Sens. 2016, 8, 973.
  21. Schirrmann, M.; Giebel, A.; Gleiniger, F.; Pflanz, M.; Lentschke, J.; Dammer, K.-H. Monitoring Agronomic Parameters of Winter Wheat Crops with Low-Cost UAV Imagery. Remote Sens. 2016, 8, 706.
  22. Kefauver, S.C.; Vicente, R.; Vergara-Díaz, O.; Fernandez-Gallego, J.A.; Kerfal, S.; Lopez, A.; Melichar, J.P.E.; Molins, M.D.S.; Araus, J.L. Comparative UAV and Field Phenotyping to Assess Yield and Nitrogen Use Efficiency in Hybrid and Conventional Barley. Front. Plant Sci. 2017, 8, 1733.
  23. Xue, L.-H.; Cao, W.-X.; Yang, L.-Z. Predicting Grain Yield and Protein Content in Winter Wheat at Different N Supply Levels Using Canopy Reflectance Spectra. Pedosphere 2007, 17, 646–653.
  24. Tan, C.W.; Zhou, X.X.; Zhang, P.P.; Wang, Z.X.; Wang, D.L.; Guo, W.S.; Yun, F. Predicting grain protein content of field-grown winter wheat with satellite images and partial least square algorithm. PLoS ONE 2020, 15, e0228500.
  25. Abuzar, M.; O’Leary, G.; Fitzgerald, G. Measuring water stress in a wheat crop on a spatial scale using airborne thermal and multispectral imagery. Field Crop. Res. 2009, 112, 55–65.
  26. Zhou, Y.; Lao, C.; Yang, Y.; Zhang, Z.; Chen, H.; Chen, Y.; Chen, J.; Ning, J.; Yang, N. Diagnosis of winter-wheat water stress based on UAV-borne multispectral image texture and vegetation indices. Agric. Water Manag. 2021, 256, 107076.
  27. Peng, J.; Manevski, K.; Kørup, K.; Larsen, R.; Andersen, M.N. Random forest regression results in accurate assessment of potato nitrogen status based on multispectral data from different platforms and the critical concentration approach. Field Crop. Res. 2021, 268, 108158.
  28. Zhang, J.-H.; Wang, K.; Bailey, J.S.; Wang, R.-C. Predicting Nitrogen Status of Rice Using Multispectral Data at Canopy Scale. Pedosphere 2006, 16, 108–117.
  29. Osco, L.P.; Junior, J.M.; Ramos, A.P.M.; Furuya, D.E.G.; Santana, D.C.; Teodoro, L.P.R.; Gonçalves, W.N.; Baio, F.H.R.; Pistori, H.; Junior, C.A.D.S.; et al. Leaf Nitrogen Concentration and Plant Height Prediction for Maize Using UAV-Based Multispectral Imagery and Machine Learning Techniques. Remote Sens. 2020, 12, 3237.
  30. Mwinuka, P.R.; Mbilinyi, B.P.; Mbungu, W.B.; Mourice, S.K.; Mahoo, H.F.; Schmitter, P. The feasibility of hand-held thermal and UAV-based multispectral imaging for canopy water status assessment and yield prediction of irrigated African eggplant (Solanum aethopicum L.). Agric. Water Manag. 2021, 245, 106584.
  31. Jorge, J.; Vallbé, M.; Soler, J.A. Detection of irrigation inhomogeneities in an olive grove using the NDRE vegetation index obtained from UAV images. Eur. J. Remote Sens. 2019, 52, 169–177.
  32. Zhou, Z.; Huang, J.F.; Wang, J.; Zhang, K.Y.; Kuang, Z.M.; Zhong, S.Q.; Song, X.D. Object-Oriented Classification of Sugarcane Using Time-Series Middle-Resolution Remote Sensing Data Based on AdaBoost. PLoS ONE 2015, 10, e0142069.
  33. Luciano, A.C.D.S.; Picoli, M.C.A.; Duft, D.G.; Rocha, J.V.; Leal, M.R.L.V.; le Maire, G. Empirical model for forecasting sugarcane yield on a local scale in Brazil using Landsat imagery and random forest algorithm. Comput. Electron. Agric. 2021, 184, 106063.
  34. Morel, J.; Todoroff, P.; Bégué, A.; Bury, A.; Martiné, J.-F.; Petit, M. Toward a Satellite-Based System of Sugarcane Yield Estimation and Forecasting in Smallholder Farming Conditions: A Case Study on Reunion Island. Remote Sens. 2014, 6, 6620–6635.
  35. Martins, J.A.; Fiorio, P.R.; Barros, P.P.D.S.; Demattê, J.A.M.; Molin, J.P.; Cantarella, H.; Neale, C.M.U. Potential use of hyperspectral data to monitor sugarcane nitrogen status. Acta Sci. Agron. 2020, 43, e47632.
  36. Miphokasap, P.; Wannasiri, W. Estimations of Nitrogen Concentration in Sugarcane Using Hyperspectral Imagery. Sustainability 2018, 10, 1266.
  37. Ma, Y.; Zhang, Z.; Kang, Y.; Özdoğan, M. Corn yield prediction and uncertainty analysis based on remotely sensed variables using a Bayesian neural network approach. Remote Sens. Environ. 2021, 259, 112408.
  38. Khaki, S.; Pham, H.; Wang, L.Z. Simultaneous corn and soybean yield prediction from remote sensing data using deep transfer learning. Sci. Rep. 2021, 11, 11132.
  39. Yang, W.; Nigon, T.; Hao, Z.; Paiao, G.D.; Fernández, F.G.; Mulla, D.; Yang, C. Estimation of corn yield based on hyperspectral imagery and convolutional neural network. Comput. Electron. Agric. 2021, 184, 106092.
  40. Prodhan, F.A.; Zhang, J.; Yao, F.; Shi, L.; Sharma, T.P.; Zhang, D.; Cao, D.; Zheng, M.; Ahmed, N.; Mohana, H. Deep Learning for Monitoring Agricultural Drought in South Asia Using Remote Sensing Data. Remote Sens. 2021, 13, 1715.
  41. Zha, H.N.; Miao, Y.X.; Wang, T.T.; Li, Y.; Zhang, J.; Sun, W.C.; Feng, Z.Q.; Kusnierek, K. Improving Unmanned Aerial Vehicle Remote Sensing-Based Rice Nitrogen Nutrition Index Prediction with Machine Learning. Remote Sens. 2020, 12, 215.
  42. Jiang, Q.O.; Xu, L.D.; Sun, S.Y.; Wang, M.L.; Xiao, H.J. Retrieval model for total nitrogen concentration based on UAV hyper spectral remote sensing data and machine learning algorithms—A case study in the Miyun Reservoir, China. Ecol. Indic. 2021, 124, 107356.
  43. Li, R.; Zeng, C.; Li, J.; Zhu, K.; Lin, X.X.; Wen, J.F.; Guo, H.J.; Weng, W.F.; Wang, D.; Ji, S.G. Characterization of the Fruits and Seeds of Alpinia Oxyphylla Miq. by High-Performance Liquid Chromatography (HPLC) and near-Infrared Spectroscopy (NIRS) with Partial Least-Squares (PLS) Regression. Anal. Lett. 2020, 53, 1667–1682.
  44. Kira, O.; Linker, R.; Gitelson, A. Non-destructive estimation of foliar chlorophyll and carotenoid contents: Focus on informative spectral bands. Int. J. Appl. Earth Obs. Geoinf. 2015, 38, 251–260.
  45. Chen, L.; Huang, J.F.; Wang, F.M.; Tang, Y.L. Comparison between back propagation neural network and regression models for the estimation of pigment content in rice leaves and panicles using hyperspectral data. Int. J. Remote Sens. 2007, 28, 3457–3478.
  46. Pal, M.; Maxwell, A.E.; Warner, T.A. Kernel-based extreme learning machine for remote-sensing image classification. Remote Sens. Lett. 2013, 4, 853–862.
  47. Kjeldahl, J. New Method for the Determination of Nitrogen. Sci. Am. 1883, 16, 6470.
  48. Rouse, J.W., Jr.; Hass, R.H.; Schell, J.A.; Deering, D.W. Monitoring vegetation systems in the Great Plains with ERTS. In Proceedings of the Third Earth Resources Technology Satellite-1 (ERTS) Symposium, Washington, DC, USA, 10–14 December 1973; NASA SP-351: Washington, DC, USA, 1973; Volume 1, pp. 309–317. Available online: https://ntrs.nasa.gov/citations/19740022614 (accessed on 15 March 2022).
  49. Chen, J.M. Evaluation of Vegetation Indices and a Modified Simple Ratio for Boreal Applications. Can. J. Remote Sens. 1996, 22, 229–242.
  50. Rondeaux, G.; Steven, M.; Baret, F. Optimization of soil-adjusted vegetation indices. Remote Sens. Environ. 1996, 55, 95–107.
  51. Mishra, S.; Mishra, D.R. Normalized difference chlorophyll index: A novel model for remote estimation of chlorophyll-a concentration in turbid productive waters. Remote Sens. Environ. 2012, 117, 394–406.
  52. Huete, A.R. A soil-adjusted vegetation index (SAVI). Remote Sens. Environ. 1988, 25, 295–309.
  53. Peñuelas, J.; Filella, I.; Gamon, J.A. Assessment of photosynthetic radiation-use efficiency with spectral reflectance. New Phytol. 1995, 131, 291–296.
  54. Peñuelas, J.; Gamon, J.A.; Fredeen, A.L.; Merino, J.; Field, C.B. Reflectance indices associated with physiological changes in nitrogen- and water-limited sunflower leaves. Remote Sens. Environ. 1994, 48, 135–146.
  55. Xue, L.; Cao, W.; Luo, W.; Dai, T.; Zhu, Y. Monitoring Leaf Nitrogen Status in Rice with Canopy Spectral Reflectance. Agron. J. 2004, 96, 135–142.
  56. Verrelst, J.; Schaepman, M.E.; Koetz, B.; Kneubühler, M. Angular sensitivity analysis of vegetation indices derived from CHRIS/PROBA data. Remote Sens. Environ. 2008, 112, 2341–2353.
  57. Ju-Long, D. Control problems of grey systems. Syst. Control Lett. 1982, 1, 288–294.
  58. Wei, G. Grey relational analysis model for dynamic hybrid multiple attribute decision making. Knowl.-Based Syst. 2011, 24, 672–679.
  59. Nahler, G. Pearson correlation coefficient. In Dictionary of Pharmaceutical Medicine; Springer: Vienna, Austria, 2009; p. 132.
  60. Devroye, L.; Wagner, T. Distribution-free performance bounds for potential function rules. IEEE Trans. Inf. Theory 1979, 25, 601–604.
  61. Arlot, S.; Celisse, A. A survey of cross-validation procedures for model selection. Statist. Surv. 2010, 4, 40–79.
  62. Congalton, R.G. A review of assessing the accuracy of classifications of remotely sensed data. Remote Sens. Environ. 1991, 37, 35–46.
  63. Liu, X.F.; Liu, Q.; Shaolan, H.E.; Shilai, Y.; Xie, R.; Zheng, Y.; Deyu, H.U.; Wang, Z.; Deng, L. Estimation of carbon and nitrogen contents in citrus canopy by low-altitude remote sensing. Int. J. Agric. Biol. Eng. 2016, 9, 149–157.
  64. Moghimi, A.; Pourreza, A.; Zuniga-Ramirez, G.; Williams, L.E.; Fidelibus, M.W. A Novel Machine Learning Approach to Estimate Grapevine Leaf Nitrogen Concentration Using Aerial Multispectral Imagery. Remote Sens. 2020, 12, 3515.
  65. Huete, A.; Didan, K.; Miura, T.; Rodriguez, E.P.; Gao, X.; Ferreira, L.G. Overview of the radiometric and biophysical performance of the MODIS vegetation indices. Remote Sens. Environ. 2002, 83, 195–213.
  66. Lu, N.; Wang, W.H.; Zhang, Q.F.; Li, D.; Yao, X.; Tian, C.Y.; Zhu, Y.; Cao, W.X.; Baret, F.; Liu, S.Y. Estimation of Nitrogen Nutrition Status in Winter Wheat from Unmanned Aerial Vehicle Based Multi-Angular Multispectral Imagery. Front. Plant Sci. 2019, 10, 1601.
  67. Hunt, E.R., Jr.; Hively, W.D.; Fujikawa, S.J.; Linden, D.S.; Daughtry, C.S.T.; McCarty, G.W. Acquisition of NIR-Green-Blue Digital Photographs from Unmanned Aircraft for Crop Monitoring. Remote Sens. 2010, 2, 290–305.
  68. Ranjan, R.; Chopra, U.K.; Sahoo, R.N.; Singh, A.K.; Pradhan, S. Assessment of plant nitrogen stress in wheat (Triticum aestivum L.) through hyperspectral indices. Int. J. Remote Sens. 2012, 33, 6342–6360.
  69. Tian, Y.; Zhu, Y.; Cao, W.; Dai, T. Relationship between canopy reflectance and plant water status of wheat. Chin. J. Appl. Ecol. 2004, 15, 2072–2076. (In Chinese with English abstract). Available online: https://d.wanfangdata.com.cn/periodical/yystxb200411016 (accessed on 24 February 2022).
  70. Govender, M.; Govender, P.; Weiersbye, I.; Witkowski, E.; Ahmed, F. Review of commonly used remote sensing and ground-based technologies to measure plant water stress. Water SA 2009, 35.
Figure 1. The study site and the field management layout with different irrigation and fertilization levels, based on the captured multispectral image (displayed in RGB). W0.6 represents the irrigation rate of 180 m3/ha, and W1.0 represents the irrigation rate of 300 m3/ha; F1.0 represents the standard fertilization rate, F0.9 represents 90% of the amount of F1.0, F1.1 represents 110% of F1.0, and F1.2 represents 120% of F1.0; BL1, BL2, BL3, and BL4 indicate the four blank plots without fertilizer or irrigation.
Figure 2. Meteorological data of the experimental field.
Figure 3. The image acquisition system. (a) DJI Phantom 4 Pro; (b) RedEdge-MX multispectral image sensor; (c) reflectance correction panel of RedEdge-MX; and (d) calibration tarps.
Figure 4. The false-color (NIR, R, and G) mosaic image of the sugarcane experimental field. The white-circled numbers represent the four ground control points.
Figure 5. NDVI image of the extracted canopy. The 36 white circled numbers represent the 36 sampling areas; each sampling area was further divided into nine grids, as shown by the white dashed lines.
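The canopy extraction shown in Figure 5 amounts to thresholding an NDVI image so that soil and background pixels are discarded. The following is a minimal sketch in Python; the threshold value of 0.4 and all names are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

def extract_canopy_ndvi(nir, red, threshold=0.4):
    """Compute NDVI from NIR and red reflectance arrays and mask out
    non-canopy pixels (soil, shadow) below an assumed threshold."""
    ndvi = (nir - red) / (nir + red + 1e-10)  # small epsilon avoids division by zero
    return np.where(ndvi > threshold, ndvi, np.nan)  # keep canopy pixels only
```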
Figure 6. The CNC prediction results of the PLS, BPNN, and ELM models based on the five-band reflectance. (a) The calibration result of the PLS model; (b) the calibration result of the BPNN model; (c) the calibration result of the ELM model; (d) the validation result of the PLS model; (e) the validation result of the BPNN model; (f) the validation result of the ELM model.
Figure 7. CNC prediction result of the PLS model based on the five-band reflectance combined with NGBDI, SRPI and NPCI. (a) The calibration result; (b) the validation result.
Table 1. Irrigation amount at different growth stages.

Growth Stage | Irrigation Date  | W0.6 (m3/ha) | W1.0 (m3/ha)
Seedling     | 10 April 2018    | 60           | 90
Tillering    | 29 May 2018      | 30           | 60
Elongating   | 4 July 2018      | 60           | 120
Maturing     | 12 October 2018  | 30           | 30
Total        |                  | 180          | 300
Table 2. The selected VIs and their calculation formulas.

VI                                               | Calculation Formula            | Reference
Normalized Difference Vegetation Index (NDVI)    | (NIR − R)/(NIR + R)            | [48]
Modified Simple Ratio Index (MSRI)               | (NIR/R − 1)/(√(NIR/R) + 1)     | [49]
Optimized Soil-adjusted Vegetation Index (OSAVI) | 1.16(NIR − R)/(NIR + R + 0.16) | [50]
Ratio Vegetation Index (RVI)                     | NIR/R                          | [51]
Soil-adjusted Vegetation Index (SAVI)            | 1.5(NIR − R)/(NIR + R + 0.5)   | [52]
Structure Insensitive Pigment Index (SIPI)       | (NIR − B)/(NIR + B)            | [53]
Simple Ratio Pigment Index (SRPI)                | B/R                            | [54]
Normalized Pigment Chlorophyll Index (NPCI)      | (R − B)/(R + B)                | [54]
Ratio Vegetation Index 2 (RVI2)                  | NIR/G                          | [55]
Normalized Green-Blue Difference Index (NGBDI)   | (G − B)/(G + B)                | [56]
Note: The spectral reflectance of B, G, R, RE, and NIR is at the wavelength of 475 nm, 560 nm, 668 nm, 717 nm, and 840 nm, respectively.
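The formulas in Table 2 are simple band arithmetic. Below is a sketch of how they might be computed from the band reflectances, with illustrative function and variable names (NumPy assumed); the red-edge band feeds the models only as raw reflectance, since no VI in Table 2 uses it.

```python
import numpy as np

def vegetation_indices(B, G, R, NIR):
    """Compute the ten VIs of Table 2 from per-pixel (or per-plot mean)
    reflectance in the blue, green, red, and NIR bands."""
    eps = 1e-10               # guard against division by zero
    sr = NIR / (R + eps)      # simple ratio NIR/R, reused below
    return {
        "NDVI":  (NIR - R) / (NIR + R + eps),
        "MSRI":  (sr - 1.0) / (np.sqrt(sr) + 1.0),
        "OSAVI": 1.16 * (NIR - R) / (NIR + R + 0.16),
        "RVI":   sr,
        "SAVI":  1.5 * (NIR - R) / (NIR + R + 0.5),
        "SIPI":  (NIR - B) / (NIR + B + eps),
        "SRPI":  B / (R + eps),
        "NPCI":  (R - B) / (R + B + eps),
        "RVI2":  NIR / (G + eps),
        "NGBDI": (G - B) / (G + B + eps),
    }
```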
Table 3. Modeling algorithms.

Model                           | Validation Method            | Algorithms     | Ratio
CNC                             | Hold-out                     | PLS, BPNN, ELM | 7:3
Irrigation level classification | Three-fold cross validation  | SVM, BPNN      | 2:1
Note: Ratio is the ratio of the calibration set to the validation set.
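For the CNC models, the hold-out validation in Table 3 is a single random 7:3 partition of the samples. A minimal sketch, assuming NumPy arrays and a fixed random seed (the seed is an assumption for reproducibility):

```python
import numpy as np

def holdout_split(X, y, calib_ratio=0.7, seed=0):
    """Randomly split samples into calibration and validation sets
    (7:3 for the CNC models in Table 3). X, y are NumPy arrays."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))           # shuffle sample indices
    n_cal = int(round(calib_ratio * len(y)))
    cal, val = idx[:n_cal], idx[n_cal:]
    return X[cal], y[cal], X[val], y[val]
```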
Table 4. GRA results between each VI and the CNC.

VI    | GRD with the CNC | Rank
SRPI  | 0.94             | 1
NPCI  | 0.93             | 2
RVI   | 0.92             | 3
MSRI  | 0.89             | 4
NDVI  | 0.88             | 5
SIPI  | 0.87             | 6
OSAVI | 0.84             | 7
SAVI  | 0.82             | 8
NGBDI | 0.70             | 9
RVI2  | 0.58             | 10
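The grey relational degrees (GRD) in Table 4 follow grey relational analysis [57,58]. The sketch below computes the GRD of one VI against the CNC; it assumes min–max normalization and the conventional resolution coefficient ρ = 0.5, neither of which is restated in this section.

```python
import numpy as np

def grey_relational_degree(reference, comparison, rho=0.5):
    """Grey relational degree between a reference sequence (measured CNC)
    and a comparison sequence (one VI), after min-max scaling."""
    norm = lambda s: (s - s.min()) / (s.max() - s.min())
    delta = np.abs(norm(np.asarray(reference)) - norm(np.asarray(comparison)))
    # Grey relational coefficient per sample, then its mean as the GRD
    xi = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
    return float(xi.mean())
```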
Table 5. Correlations between each VI.

      | SRPI | NPCI  | RVI   | MSRI  | NDVI  | SIPI  | OSAVI | SAVI  | NGBDI | RVI2
SRPI  | 1.00 | −0.97 | 0.98  | 0.98  | 0.98  | −0.96 | 0.98  | 0.97  | −0.54 | 0.15
NPCI  |      | 1.00  | −0.96 | −0.96 | −0.96 | 0.99  | −0.98 | −0.99 | 0.60  | −0.20
RVI   |      |       | 1.00  | 1.00  | 1.00  | −0.93 | 0.99  | 0.97  | −0.51 | 0.14
MSRI  |      |       |       | 1.00  | 1.00  | −0.93 | 0.99  | 0.98  | −0.52 | 0.14
NDVI  |      |       |       |       | 1.00  | −0.94 | 0.99  | 0.98  | −0.52 | 0.14
SIPI  |      |       |       |       |       | 1.00  | −0.96 | −0.97 | 0.62  | −0.21
OSAVI |      |       |       |       |       |       | 1.00  | 1.00  | −0.53 | 0.13
SAVI  |      |       |       |       |       |       |       | 1.00  | −0.53 | 0.12
NGBDI |      |       |       |       |       |       |       |       | 1.00  | −0.88
RVI2  |      |       |       |       |       |       |       |       |       | 1.00
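Table 5 is an ordinary Pearson correlation matrix [59] over the ten VIs; together with Table 4 it motivates keeping SRPI, NPCI, and NGBDI (high GRD with the CNC, comparatively low correlation with one another). A sketch of the computation, with vi_matrix as a hypothetical samples-by-indices array:

```python
import numpy as np

# vi_matrix: hypothetical (n_samples, 10) array, columns ordered as in Table 5
vi_matrix = np.random.rand(324, 10)           # placeholder values for illustration
corr = np.corrcoef(vi_matrix, rowvar=False)   # 10 x 10 Pearson correlation matrix
print(np.round(corr, 2))
```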
Table 6. CNC prediction results with PLS, BPNN and ELM based on the five-band reflectance.

Input Variables       | Algorithm | Rc   | RMSEc | Rv   | RMSEv
Five-band reflectance | PLS       | 0.81 | 0.18  | 0.73 | 0.13
Five-band reflectance | BPNN      | 0.78 | 0.21  | 0.72 | 0.20
Five-band reflectance | ELM       | 0.75 | 0.28  | 0.68 | 1.00
Note: Rc and RMSEc represent the R and RMSE in the calibration set; Rv and RMSEv represent the R and RMSE in the validation set.
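The R and RMSE columns in Tables 6–8 follow the standard definitions of the Pearson correlation coefficient and root-mean-square error between measured and predicted CNC; a minimal sketch:

```python
import numpy as np

def r_and_rmse(y_true, y_pred):
    """Pearson correlation coefficient and root-mean-square error
    between measured and predicted CNC (1-D NumPy arrays)."""
    r = np.corrcoef(y_true, y_pred)[0, 1]
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    return r, rmse
```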
Table 7. CNC prediction results with PLS, BPNN and ELM based on VIs.

Input Variables     | Algorithm | Rc   | RMSEc | Rv   | RMSEv
SRPI                | PLS       | 0.63 | 0.13  | 0.56 | 0.11
SRPI                | BPNN      | 0.80 | 0.94  | 0.59 | 0.89
SRPI                | ELM       | 0.73 | 0.68  | 0.52 | 1.59
NPCI                | PLS       | 0.44 | 0.15  | 0.58 | 0.15
NPCI                | BPNN      | 0.81 | 0.62  | 0.50 | 1.73
NPCI                | ELM       | 0.79 | 0.86  | 0.45 | 1.95
SIPI                | PLS       | 0.60 | 0.14  | 0.54 | 0.19
SIPI                | BPNN      | 0.90 | 0.26  | 0.50 | 0.97
SIPI                | ELM       | 0.72 | 0.67  | 0.43 | 0.60
NGBDI               | PLS       | 0.57 | 0.10  | 0.42 | 0.15
NGBDI               | BPNN      | 0.79 | 0.89  | 0.46 | 1.05
NGBDI               | ELM       | 0.77 | 0.92  | 0.40 | 1.04
SRPI & NPCI         | PLS       | 0.60 | 0.15  | 0.57 | 0.15
SRPI & NPCI         | BPNN      | 0.83 | 0.21  | 0.58 | 1.14
SRPI & NPCI         | ELM       | 0.80 | 0.86  | 0.49 | 0.98
SRPI & SIPI         | PLS       | 0.49 | 0.15  | 0.49 | 0.17
SRPI & SIPI         | BPNN      | 0.72 | 1.28  | 0.48 | 1.25
SRPI & SIPI         | ELM       | 0.81 | 0.60  | 0.44 | 1.04
SRPI & NGBDI        | PLS       | 0.53 | 0.12  | 0.55 | 0.18
SRPI & NGBDI        | BPNN      | 0.76 | 0.96  | 0.54 | 1.30
SRPI & NGBDI        | ELM       | 0.71 | 0.86  | 0.51 | 1.27
SRPI & NPCI & NGBDI | PLS       | 0.64 | 0.14  | 0.63 | 0.14
SRPI & NPCI & NGBDI | BPNN      | 0.74 | 1.17  | 0.62 | 1.26
SRPI & NPCI & NGBDI | ELM       | 0.75 | 1.73  | 0.62 | 1.96
Ten VIs             | PLS       | 0.65 | 0.12  | 0.52 | 0.16
Ten VIs             | BPNN      | 0.81 | 1.51  | 0.52 | 1.12
Ten VIs             | ELM       | 0.78 | 1.62  | 0.50 | 1.19
Note: Rc and RMSEc represent the R and RMSE in the calibration set; Rv and RMSEv represent the R and RMSE in the validation set.
Table 8. CNC prediction results with PLS, BPNN and ELM based on different input variables (FR represents five-band reflectance).

Input Variables          | Algorithm | Rc   | RMSEc | Rv   | RMSEv
FR & SRPI                | PLS       | 0.82 | 0.08  | 0.71 | 0.17
FR & SRPI                | BPNN      | 0.91 | 0.01  | 0.72 | 0.14
FR & SRPI                | ELM       | 0.90 | 0.02  | 0.64 | 1.07
FR & SRPI & NPCI         | PLS       | 0.82 | 0.14  | 0.72 | 0.27
FR & SRPI & NPCI         | BPNN      | 0.92 | 0.01  | 0.66 | 0.39
FR & SRPI & NPCI         | ELM       | 0.84 | 0.01  | 0.60 | 0.69
FR & SRPI & NPCI & NGBDI | PLS       | 0.85 | 0.04  | 0.79 | 0.11
FR & SRPI & NPCI & NGBDI | BPNN      | 0.87 | 0.13  | 0.79 | 0.39
FR & SRPI & NPCI & NGBDI | ELM       | 0.84 | 0.24  | 0.68 | 1.31
FR & ten VIs             | PLS       | 0.81 | 0.19  | 0.72 | 0.68
FR & ten VIs             | BPNN      | 0.93 | 0.01  | 0.69 | 1.26
FR & ten VIs             | ELM       | 0.84 | 0.18  | 0.53 | 1.68
Note: Rc and RMSEc represent the R and RMSE in the calibration set; Rv and RMSEv represent the R and RMSE in the validation set.
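As a worked example of the best-performing configuration in Table 8 (FR plus SRPI, NPCI, and NGBDI fed to a PLS model), the sketch below uses scikit-learn's PLSRegression. The input data here are random placeholders, and the number of latent components is an assumption, since it is not reported in this section.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Hypothetical design matrix: 5 band reflectances + SRPI, NPCI, NGBDI per sample
X = np.random.rand(324, 8)   # placeholder predictors
y = np.random.rand(324)      # placeholder measured CNC

pls = PLSRegression(n_components=4)  # assumed latent-variable count
pls.fit(X, y)
cnc_pred = pls.predict(X).ravel()    # predicted CNC values
```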
Table 9. Correlation analysis result between the spectral features and the irrigation levels.

Feature Type         | Spectral Feature | R with the Irrigation Levels
Spectral reflectance | NIR              | 0.48
Spectral reflectance | Red edge         | 0.25
Spectral reflectance | Red              | −0.71
Spectral reflectance | Green            | −0.45
Spectral reflectance | Blue             | −0.69
VI                   | NDVI             | 0.21
VI                   | MSRI             | 0.25
VI                   | OSAVI            | 0.29
VI                   | RVI              | 0.28
VI                   | SAVI             | 0.34
VI                   | SIPI             | 0.18
VI                   | SRPI             | 0.65
VI                   | NPCI             | −0.68
VI                   | RVI2             | 0.33
VI                   | NGBDI            | 0.75
Table 10. Confusion matrix of the classification results of the irrigation levels.

SVM:
Actual Class                 | Irrigation_0 | Irrigation_180 | Irrigation_300 | PA
Irrigation_0 (108 samples)   | 94           | 13             | 1              | 87.0%
Irrigation_180 (108 samples) | 5            | 89             | 14             | 82.4%
Irrigation_300 (108 samples) | 0            | 30             | 78             | 72.2%
Total                        | 99           | 132            | 93             |
UA                           | 94.9%        | 67.4%          | 83.9%          | OA = 80.6%

BPNN:
Actual Class                 | Irrigation_0 | Irrigation_180 | Irrigation_300 | PA
Irrigation_0 (108 samples)   | 80           | 25             | 3              | 74.1%
Irrigation_180 (108 samples) | 15           | 61             | 32             | 56.5%
Irrigation_300 (108 samples) | 2            | 47             | 59             | 54.6%
Total                        | 97           | 133            | 94             |
UA                           | 82.5%        | 45.9%          | 62.8%          | OA = 61.7%
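The producer's accuracy (PA), user's accuracy (UA), and overall accuracy (OA) in Table 10 follow the standard confusion-matrix definitions for remotely sensed classifications [62]; the sketch below reproduces them from the SVM counts:

```python
import numpy as np

# Rows = actual class, columns = predicted class (SVM side of Table 10)
cm = np.array([[94, 13,  1],
               [ 5, 89, 14],
               [ 0, 30, 78]])

pa = np.diag(cm) / cm.sum(axis=1)  # producer's accuracy: correct / actual totals
ua = np.diag(cm) / cm.sum(axis=0)  # user's accuracy: correct / predicted totals
oa = np.trace(cm) / cm.sum()       # overall accuracy
print(pa, ua, oa)  # -> [0.870 0.824 0.722], [0.949 0.674 0.839], 0.806
```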