
Unmanned Aerial System (UAS) data transfer, processing, and delivery workflow for the Wheat CAP project


Prepared by the Texas A&M AgriLife – Purdue Wheat CAP Group – December 2021
This user manual is based on the experience and efforts of a multi-disciplinary team of
scientists at Texas A&M AgriLife to develop Unmanned Aerial System (UAS)-based High
Throughput Phenotyping (UAS-HTP) tools for crop breeding and precision crop management.
The team has developed and tested standardized protocols for UAS data collection, processing,
and analysis to collect high-spatiotemporal phenotypic data on plant morphological traits such as
canopy height (CH) (Chang et al., 2017; Hu and Lanzon, 2018), canopy cover (CC) (Ashapure et
al., 2019a), canopy volume (CV) (Ashapure et al., 2019b), and several spectral vegetation indices
(Yeom et al., 2019). The resulting UAS-based phenotypic traits have been successfully used to (i)
assess disease severity (Bhandari et al., 2020) and drought in wheat (Bhandari et al., 2021), (ii)
evaluate the effect of tillage management practices on cotton growth and development (Ashapure
et al., 2019b), (iii) select high-yielding cotton genotypes (Jung et al., 2018), (iv) monitor crop
germination (Chen et al., 2018), (v) estimate plant population/stand count (Oh et al., 2020), (vi)
model crop growth and estimate yield of cotton (Ashapure et al., 2020) and tomato (Ashapure et
al., 2019c; Chang et al., 2021), and (vii) characterize citrus greening disease (Chang et al., 2020).
This user manual was created to standardize Unmanned Aerial System (UAS) data collection,
processing, and data sharing procedures among the Wheat CAP breeding programs. Specifically,
we discuss the following components that breeders can follow to successfully collect high-
quality UAS data and utilize a web-based digital portal to transfer raw/processed data and
visualize processed data:
1. Basic protocols and procedures for UAS data collection (RGB and multispectral imagery
data)
2. Utilizing UAS data hub/portal (Wheat CAP UAS hub) for data handling

1. Basic protocols and procedures for UAS data collection


1.1. Preplanning
UAS operation in the United States is subject to Federal Aviation Administration (FAA) 14
CFR Part 107 Small UAS rules, available at https://www.faa.gov/uas/. These rules include pilot
requirements, aircraft requirements, location requirements, and operating procedures. Pilot
requirements are: 1) possess a valid Remote Pilot Certificate, 2) be at least 16 years old, 3) be
able to read, write, speak, and understand English, and 4) be in a physical and mental condition
to safely fly a UAS. After studying the FAA Knowledge Test Suggested Study Materials found
at https://www.faa.gov/uas/resources/policy_library/#107, the pilot must obtain an FAA
Tracking Number (FTN) before the Knowledge Test can be scheduled at an FAA-approved
Knowledge Testing Center. Finally, the pilot must complete FAA Form 8710-13. More details
are available at https://www.faa.gov/uas/commercial_operators/. It is up to the pilot to maintain
the currency of their Remote Pilot Certificate every two years and keep up to date on changing
FAA regulations. Aircraft requirements under Part 107 are as follows: 1) the aircraft must weigh
less than 55 pounds, 2) it must be registered at https://faadronezone.faa.gov/#/ if it weighs over
0.55 pounds, and 3) the pilot must conduct a pre-flight check to ensure the UAS is in a condition
for safe operation. Part 107 also governs the location of UAS operations, which can be checked
with the FAA B4UFLY mobile app. Flights conducted in Class G airspace do not require
approval from the FAA. All other airspace classes (B, C, D, and E) are restricted, and operations
there must be approved in advance by the FAA. General operating rules under Part 107 include:
1) keep the aircraft in sight (visual line-of-sight), 2) remain below 400 feet above ground level,
3) operate during the day, 4) operate below 100 mph, 5) always yield the right of way to manned
aircraft, 6) do not operate over people, and 7) do not operate the UAS from a moving vehicle.
In terms of equipment, UAVs flown under Part 107 must be registered before operation, and
the pilot must carry proof of registration when operating them. The FAA also requires that the
UAV be made available for inspection or testing on request and that the pilot provide any
associated records required to be kept under the rule. The pilot must report to the FAA, within
10 days, any operation that results in serious injury, loss of consciousness, or property damage
of at least $500 (https://www.faa.gov/newsroom/small-unmanned-aircraft-systems-uas-regulations-part-107).
Although agricultural UAS missions are typically flown in sparsely populated areas, an operator
who is conducting business, flying on behalf of a company/university/institute, or flying for
another non-recreational purpose in which other stakeholders may be involved may need to
purchase a liability insurance policy for the drone.
Efficient data collection begins with planning the nursery field layout. Global Positioning
System (GPS) guidance and auto-trip capability on the planting tractor and planting equipment
are vital in laying out a uniform boundary of plots. Plots with consistent size and shape are
necessary for automated data processing. Grouping germplasm or trials for UAS data collection
together in the field will maximize the efficiency of data collection. Permanent, semi-permanent,
or temporary ground control points (GCPs) should be installed in the field and surveyed with
survey-grade RTK (Real-Time Kinematic) GPS devices to provide the precise georeferencing
needed for successful UAS-HTP over the whole cropping season. It is strongly recommended to
distribute GCPs well around and in the middle of the study area (Figure 1). We also recommend
using checkerboard-pattern GCPs (square or circular) of about 1×1 foot (the appropriate size
depends on the flight altitude) (Figure 2).

Figure 1. Ground Control Point (GCP) distribution

Figure 2. Pattern examples of GCPs

As the accuracy of GPS measurements affects the quality of UAS-based products, the
coordinates of all GCPs should be measured by Differential GPS (DGPS), which provides
improved location accuracy (<2-3 cm). Most quadcopter batteries offer less than 20-25 minutes
of flight time; therefore, multiple batteries are required to collect data over larger areas.
Collecting high-spatial-resolution images requires low-altitude flights with high image overlap
and multiple battery changes, with each flight restricted to a portion of the entire nursery. In
contrast, lower-spatial-resolution images require less image overlap, and flights may be
conducted at higher altitudes with fewer battery changes. Other factors to consider are
obstructions such as trees or utility transmission lines, interference from other GPS guidance
systems, and Wi-Fi/cellular data service for the aircraft and controller.

1.2. Equipment
Our wheat breeding program adopted DJI platforms (SZ DJI Technology Co., Ltd.,
Shenzhen, China) equipped with RGB (DJI Phantom 4 RTK) and multispectral sensors (DJI
Phantom 4 Multispectral) in 2018-2021 (Bhandari et al., 2020; Bhandari et al., 2021). DJI
Phantom 4 and Mavic 2 Pro series equipped with RGB cameras and DJI Matrice 100 with a
SlantRange 3P multispectral camera (SlantRange, San Diego, CA, USA) were used for RGB and
multispectral imagery data collection.
For multispectral sensors, radiometric calibration is an essential step that converts pixel
values in raw images to spectral reflectance so that crop traits such as vegetation indices can be
derived accurately. Traditionally, radiometric calibration is conducted through the relationship
between the actual reflectance values and the pixel values in images of several reflectance
panels (Sapkota et al., 2020). Recent multispectral cameras for UAVs provide two different
ways of radiometric calibration: 1) using images of a reflectance panel taken before and after
flights (Chang et al., 2021), and 2) using an upward-facing light sensor that records the
illumination conditions (Chang et al., 2020).
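As an illustration of the first (reflectance-panel) approach, the empirical-line method fits a linear model between known panel reflectances and raw pixel values. Below is a minimal Python sketch; the panel reflectances and digital numbers (DNs) are hypothetical examples, not values from any particular sensor.

```python
import numpy as np

# Hypothetical known reflectances of four calibration panels (from the manufacturer)
panel_reflectance = np.array([0.05, 0.20, 0.40, 0.60])
# Hypothetical mean digital numbers (DNs) measured over each panel in one band
panel_dn = np.array([1200.0, 4800.0, 9500.0, 14200.0])

# Fit the empirical line: reflectance = gain * DN + offset
gain, offset = np.polyfit(panel_dn, panel_reflectance, 1)

def dn_to_reflectance(band_dn: np.ndarray) -> np.ndarray:
    """Convert a raw band (DN) to surface reflectance using the fitted line."""
    return gain * band_dn + offset

# Example: calibrate a synthetic 100x100-pixel band
raw_band = np.random.randint(1000, 15000, size=(100, 100)).astype(float)
reflectance = dn_to_reflectance(raw_band)
```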
During the last four years of our work on UAS-HTP development, we identified the following
basic equipment features for smooth and efficient UAS data collection: 1) a stable and uniform
UAS with an autonomous flight mode is needed to collect high-quality UAS data consistently
over a cropping season, and 2) a UAS that can measure light conditions, such as the Ambient
Illumination Sensor (AIS) on SlantRange sensors, or that has a calibrated reflectance panel
(CRP), such as the one included with MicaSense RedEdge sensors (AgEagle Aerial Systems
Inc., Seattle, WA, USA), can be used for radiometric calibration of multispectral images and
avoids the need to place calibration panels in the field during data collection.
For those programs that can purchase DJI platforms, below are some recommendations with
respect to UAS platforms and associated sensors (the costs listed here may have changed):
Option #1 (1 multi-spectral platform)
1. DJI Phantom 4 RTK Multispectral (6 bands) with Ground Station: $9,100
2. Reach RS2 base/rover: $6,000 (the rover and base together are around $5,000, but you
will need survey rods for each, bringing the total to around $6,000)
3. Reflectance Tarps (optional): $1,200
Option #2 (1 multi-spectral platform)
1. DJI Matrice M200 V2: $6,000
2. SlantRange 4P+ (6 bands): $6,000 or MicaSense RedEdge MX (5 bands): $6,300
3. Reach RS2 base/rover: $6,000 (the rover and base together are around $5,000, but you
will need survey rods for each, bringing the total to around $6,000)
4. Reflectance Tarps (optional): $1,200
Option #3 (1 RGB and 1 multi-spectral platform)
1. DJI Phantom 4 RTK with Ground Station: $8,500
2. DJI Phantom 4 RTK Multispectral (6 bands): $6,500
3. Reach RS2 base/rover: $6,000 (the rover and base together are around $5,000, but you
will need survey rods for each, bringing the total to around $6,000)
4. Reflectance Tarps (optional): $1,200
Option #4
1. DJI Matrice 300 RTK: ~ $10,000
2. DJI ZenMuse P1 (45MP full frame RGB): $6,800
3. Reach RS2 base/rover: $6,000 (the rover and base together are around $5,000, but you
will need survey rods for each, bringing the total to around $6,000)
4. DJI Phantom 4 Multispectral: $6,500, or the SlantRange 4P+ should also work with the
M300.

1.3. UAS campaign preparation and mission planning


To prepare a UAS campaign over agricultural fields, weather conditions and flight
parameters should be carefully considered for the targeted field, as both can strongly affect the
actual flight time, mainly through the battery life of the platform. Although battery life varies
with the specifications of the UAV platform and sensors, batteries drain more quickly when the
aircraft must hold its position against high winds. In addition, batteries and sensors may
overheat and fail to work properly under high summer temperatures. In terms of flight
parameters for the imaging campaigns, there is a trade-off among flight altitude, image overlap,
and field size: at the same image overlap, a higher flight altitude covers a larger area, while a
lower flight altitude yields higher spatial image resolution over a smaller area. Therefore, an
operator must seek optimum weather and flight conditions when planning a UAS mission (de
Lima et al., 2021). Based on our experience with UAS missions under various conditions, we
have established the following best practices for mission planning: 1) prepare sufficient fully
charged batteries, including one or two extras, 2) set the optimum flight altitude and overlap
according to the required image resolution and field size, 3) conduct UAS missions under low
wind speed (<15 mph), a clear sky, and no ice/water droplets on plants, and 4) select bright-
colored platforms and sensors, if possible, to avoid overheating. A rough estimate of flight time
and battery needs is sketched below.
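The sketch below is a simple planning aid for the altitude/overlap/battery trade-off; all camera and mission values are hypothetical placeholders, and turns between flight lines are ignored.

```python
import math

# Hypothetical camera and mission parameters
sensor_width_px = 5472     # image width in pixels
gsd_m = 0.007              # target ground sampling distance (m/pixel)
side_overlap = 0.80        # 80% side overlap
flight_speed_ms = 5.0      # cruise speed (m/s)
field_width_m = 200.0
field_length_m = 300.0
battery_minutes = 20.0     # usable flight time per battery

# Across-track footprint of one image and the spacing between flight lines
swath_m = sensor_width_px * gsd_m
line_spacing_m = swath_m * (1.0 - side_overlap)

# Number of parallel flight lines and total path length
n_lines = math.ceil(field_width_m / line_spacing_m)
path_m = n_lines * field_length_m

flight_minutes = path_m / flight_speed_ms / 60.0
batteries_needed = math.ceil(flight_minutes / battery_minutes)
print(f"{n_lines} lines, ~{flight_minutes:.1f} min, {batteries_needed} battery(ies)")
```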

Users can plan flight missions and control the drones for aerial mapping with the DJI GO 4,
DJI GS Pro, or Pix4Dcapture apps (or CrystalSky devices for the latest platforms) for the DJI
Phantom and Mavic series with an RGB camera. The software supports most DJI platforms,
with flight parameters depending on the UAS model and camera specifications. The DJI Matrice
100 with the SlantRange 3P camera can be operated through DroneDeploy with an additional
plug-in to set up flight conditions for the multispectral camera. Based on previous experience
and research on UAS data collection for breeding programs, we developed flight specifications
for image overlap, flight altitude, and flight pattern to design UAS missions. For example, the
RGB platform was flown at 20-30 m altitude with 80-85% forward and side overlap to obtain
sub-centimeter (0.5-1 cm/pixel) Ground Sampling Distance (GSD) orthomosaics (Bhandari et
al., 2021; Yeom et al., 2018). As a multispectral camera has a narrower field of view (FOV), a
multispectral platform is flown over the study area at a higher altitude (>50 m) with lower
overlap (70-75%) than the RGB platform; 1.2-1.7 cm/pixel GSD orthomosaic images were
obtained from the DJI Matrice 100 with the SlantRange 3P camera when flown at 30-35 m with
70-75% overlap (Bhandari et al., 2021; Yeom et al., 2018).
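These altitude/GSD combinations can be cross-checked with the standard photogrammetric GSD formula. In the sketch below, the sensor values are hypothetical stand-ins for a Phantom-class RGB camera and should be replaced with your camera's specifications.

```python
def gsd_cm_per_px(altitude_m: float, focal_length_mm: float,
                  sensor_width_mm: float, image_width_px: int) -> float:
    """Ground sampling distance (cm/pixel) for a nadir-looking camera."""
    return (altitude_m * 100.0 * sensor_width_mm) / (focal_length_mm * image_width_px)

# Hypothetical 1-inch sensor: 13.2 mm wide, 8.8 mm focal length, 5472-px-wide images
print(gsd_cm_per_px(altitude_m=30, focal_length_mm=8.8,
                    sensor_width_mm=13.2, image_width_px=5472))  # ~0.82 cm/px
```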
2. Utilizing the UAS data portal/hub for data handling
A UAS data hub/portal was created specifically for the Wheat CAP project in the Oracle
cloud system. The Wheat CAP UASHub dashboard (Figure 3) can be accessed at the link below.
https://wheatcap.uashubs.com/
Users can access the hub by submitting an email address and password. The hub is equipped
with data sharing, visualization, and analysis features. A project for each individual breeding
program will be created.

Figure 3. Wheat CAP UASHub Dashboard


2.1. Uploading raw UAS data:
General rules before uploading raw UAS data:
1. Create a folder for the specific flight date. The format of the folder name is:
YYYYMMDD_location (two letters)_crop name_experimental condition (if any)_flight
altitude (meters)_overlap. Example for the Amarillo datasets:
20220124_ar_wheat_dryland_30m_75.
2. Create a sub-folder inside this folder for the RGB and multispectral sensors separately.
Subfolder names depend on the platform used. Examples for the Mavic 2 Pro and
SlantRange 3P (respectively):
m2p, s3p.
Note: Do not rename images or folders from the platform. Copy them from
the memory card and paste them as they are into the new folder.
3. Include the GCP information in this folder as well.
4. Zip the folder to reduce the file size and upload it to the UAS hub (.zip format) using the
Upload Raw UAS Data tool (a scripted sketch of these steps follows this list).
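Steps 1-4 can be scripted to avoid naming mistakes. The following minimal sketch builds the folder structure and the .zip archive locally, assuming hypothetical paths and values; the upload itself is still done through the hub's Upload Raw UAS Data tool.

```python
import shutil
from pathlib import Path

# Hypothetical inputs following the naming convention above
flight_date = "20220124"   # YYYYMMDD
location = "ar"            # two-letter location code
crop = "wheat"
condition = "dryland"      # experimental condition, if any
altitude_m = 30
overlap = 75

folder_name = f"{flight_date}_{location}_{crop}_{condition}_{altitude_m}m_{overlap}"
flight_dir = Path("uploads") / folder_name

# One sub-folder per platform (e.g., Mavic 2 Pro, SlantRange 3P);
# images are copied into these folders as-is, never renamed
for platform in ("m2p", "s3p"):
    (flight_dir / platform).mkdir(parents=True, exist_ok=True)

# GCP information (e.g., a coordinates file) goes alongside the image sub-folders

# Zip the whole flight folder for upload to the hub
shutil.make_archive(str(flight_dir), "zip",
                    root_dir=str(flight_dir.parent), base_dir=folder_name)
```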

Hover the mouse over Manage data > Upload Raw UAS Data > Select a project, a platform, a
sensor, a date, and a flight (if the flight is not listed, click on the "Add Flight" button (Figure 4)).
To add a flight, fill out the input fields and click on the "Add" button > Click on the "Upload"
button and select the file to upload (Figure 5).
Notes:
- The flight name is YYYYMMDD.
- Use only numbers when filling out the flight altitude, overlap, and name.
- Upload only ONE .zip file at a time. Once uploading has been
completed, change the flight details as needed and upload the next .zip file.

Figure 4. Add Flight

Figure 5. Upload Raw UAS Data tool


2.2. Uploading an Excel sheet with the field layout and plot identifiers
The field layout and plot identifiers should be uploaded with the first .zip file (the file containing
the raw images). The field layout should look similar to Figure 6. The plot-identifier worksheet
should contain the information shown in Table 1. The file is expected to be in Excel or CSV
format (.xlsx or .csv); a quick validation sketch follows Table 1.

Figure 6. Field layout template


Table 1. Sample of plot identifier information
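Before uploading, the plot-identifier file can be sanity-checked with a short pandas script. The required column names below are hypothetical placeholders; the actual fields are those shown in Table 1.

```python
import pandas as pd

# Hypothetical required columns; replace with the fields defined in Table 1
REQUIRED = ["plot_id", "row", "column", "genotype"]

def check_plot_identifiers(path: str) -> pd.DataFrame:
    """Load the plot-identifier sheet (.xlsx or .csv) and run basic checks."""
    df = pd.read_csv(path) if path.endswith(".csv") else pd.read_excel(path)
    missing = [c for c in REQUIRED if c not in df.columns]
    if missing:
        raise ValueError(f"Missing columns: {missing}")
    if df["plot_id"].duplicated().any():
        raise ValueError("Duplicate plot_id values found")
    return df
```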

2.3. Data processing pipeline (Figure 7) and data product visualization

The image processing workflow starts with the collection of raw images (the Level 0 data
product from the different sensors and platforms) (Figure 7). The Level 0 data are then processed
using the Structure from Motion (SfM) algorithm to generate Level 1 geospatial data products
such as Digital Elevation Models (DEMs), orthomosaic images, and 3D point-cloud data. The
Level 2 data products (obtained from the Level 1 data) include crop features such as canopy
height (CH), canopy cover (CC), canopy volume (CV), the Normalized Difference Vegetation
Index (NDVI), the Normalized Difference Red-edge Index (NDRE), and the Excess Green Index
(ExG) (see the sketch after Figure 7). Plot-level phenotypic features are extracted using the plot
boundaries. The Level 1 data product (RGB orthomosaic) can be visualized using the
visualization tool in the UAS hub. Extracted plot-level phenotypic data (in an Excel file) will
either be shared as an online spreadsheet or be available for download from the T3 database.

Figure 7. Data processing pipeline
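For reference, the three spectral indices named above can be computed per pixel from radiometrically calibrated bands. This is a generic sketch (band arrays assumed to hold reflectance values), not code from the Wheat CAP pipeline.

```python
import numpy as np

EPS = 1e-10  # guards against division by zero over bare soil or shadows

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red + EPS)

def ndre(nir: np.ndarray, red_edge: np.ndarray) -> np.ndarray:
    """Normalized Difference Red-edge Index."""
    return (nir - red_edge) / (nir + red_edge + EPS)

def exg(red: np.ndarray, green: np.ndarray, blue: np.ndarray) -> np.ndarray:
    """Excess Green Index on chromatic coordinates (2g - r - b)."""
    total = red + green + blue + EPS
    return 2.0 * green / total - red / total - blue / total
```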
2.4. Downloading UAS data products (orthomosaic):
Hover the mouse over Manage data > Download UAS Data > Data Product > Select a project, a
platform, a sensor, and a type. Click on Search > Download a file by clicking on the blue icon
next to the desired orthomosaic (Figure 8).
Figure 8. UAS data product (orthomosaic) list
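A downloaded orthomosaic (GeoTIFF) can then be used outside the hub, for example to extract a plot-level mean value. The sketch below assumes the third-party rasterio package and a hypothetical plot-boundary polygon; the file name is a placeholder following the naming convention above.

```python
import rasterio
from rasterio.mask import mask

def plot_mean(ortho_path: str, polygon: dict, band: int = 1) -> float:
    """Mean pixel value of one band inside a plot-boundary polygon."""
    with rasterio.open(ortho_path) as src:
        # filled=False returns a masked array; pixels outside the polygon stay masked
        data, _ = mask(src, [polygon], crop=True, filled=False)
    return float(data[band - 1].mean())

# Hypothetical plot polygon in the orthomosaic's coordinate reference system
plot = {"type": "Polygon",
        "coordinates": [[(0.0, 0.0), (2.0, 0.0), (2.0, 5.0),
                         (0.0, 5.0), (0.0, 0.0)]]}
value = plot_mean("20220124_ar_wheat_dryland_30m_75_ortho.tif", plot)
```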

Integrating the Wheat CAP UASHub with the T3 database:

We will work on integrating the current Wheat CAP UASHub with the T3 database and making
it Breedbase-compliant in the first year of the Wheat CAP project.
References:
Ashapure, A., Jung, J., Chang, A., Oh, S., Maeda, M., Landivar, J. (2019a). A Comparative
Study of RGB and Multispectral Sensor-Based Cotton Canopy Cover Modelling Using
Multi-Temporal UAS Data. Remote Sensing, 11, 2757. https://doi.org/10.3390/rs11232757
Ashapure, A., Jung, J., Yeom, J., Chang, A., Maeda, M., Maeda, A., Landivar, J. (2019b). A
Novel Framework to Detect Conventional Tillage and No-Tillage Cropping System Effect on
Cotton Growth and Development Using Multi-Temporal UAS Data. ISPRS Journal of
Photogrammetry and Remote Sensing, 152, 49–64.
https://doi.org/10.1016/j.isprsjprs.2019.04.003
Ashapure, A., Oh, S., Marconi, T. G., Chang, A., Jung, J., Landivar, J., Enciso, J. (2019c).
Unmanned Aerial System Based Tomato Yield Estimation Using Machine Learning.
Proceedings of SPIE 11008, Autonomous Air and Ground Sensing Systems for Agricultural
Optimization and Phenotyping IV, 22. https://doi.org/10.1117/12.2519129
Ashapure, A., Jung, J., Chang, A., Oh, S., Yeom, J., Maeda, M., Maeda, A., Dube, N., Landivar,
J., Hague, S., Smith, W. (2020). Developing a Machine Learning Based Cotton Yield
Estimation Framework Using Multi-Temporal UAS Data. ISPRS Journal of Photogrammetry
and Remote Sensing, 169, 180–194. https://doi.org/10.1016/j.isprsjprs.2020.09.015
Barnes, E. M., Clarke, T. R., Richards, S. E., Colaizzi, P. D., Haberland, J., Kostrzewski, M.,
Waller, P., Choi, C., Riley, E., Thompson, T., Lascano, R. J., Li, H., Moran, M. S. (2000).
Coincident Detection of Crop Water Stress, Nitrogen Status and Canopy Density Using
Ground Based Multispectral Data. Proceedings of the 5th International Conference on
Precision Agriculture, 1–15.
Bendig, J., Yu, K., Aasen, H., Bolten, A., Bennertz, S., Broscheit, J., Gnyp, M. L., Bareth, G.
(2015). Combining UAV-Based Plant Height from Crop Surface Models, Visible, and Near
Infrared Vegetation Indices for Biomass Monitoring in Barley. International Journal of
Applied Earth Observation and Geoinformation, 39, 79–87.
https://doi.org/10.1016/j.jag.2015.02.012
Bhandari, M., Baker, S., Rudd, J. C., Ibrahim, A. M. H., Chang, A., Xue, Q., Jung, J., Landivar,
J., Auvermann, B. (2021). Assessing the Effect of Drought on Winter Wheat Growth Using
Unmanned Aerial System (UAS)-Based Phenotyping. Remote Sensing, 13, 1144.
https://doi.org/10.3390/rs13061144
Bhandari, M., Ibrahim, A. M. H., Xue, Q., Jung, J., Chang, A., Rudd, J. C., Maeda, M., Rajan,
N., Neely, H., Landivar, J. (2020). Assessing Winter Wheat Foliage Disease Severity Using
Aerial Imagery Acquired from Small Unmanned Aerial Vehicle (UAV). Computers and
Electronics in Agriculture, 176, 105665. https://doi.org/10.1016/j.compag.2020.105665
Chang, A., Jung, J., Maeda, M. M., Landivar, J. (2017). Crop Height Monitoring with Digital
Imagery from Unmanned Aerial System (UAS). Computers and Electronics in Agriculture,
141, 232–237. https://doi.org/10.1016/j.compag.2017.07.008
Chang, A., Jung, J., Yeom, J., Maeda, M. M., Landivar, J. A., Enciso, J. M., Avila, C. A.,
Anciso, J. R. (2021). Unmanned Aircraft System- (UAS-) Based High-Throughput
Phenotyping (HTP) for Tomato Yield Estimation. Journal of Sensors, 2021, 8875606.
https://doi.org/10.1155/2021/8875606
Chang, A., Yeom, J., Jung, J., Landivar, J. (2020). Comparison of Canopy Shape and Vegetation
Indices of Citrus Trees Derived from UAV Multispectral Images for Characterization of
Citrus Greening Disease. Remote Sensing, 12, 4122. https://doi.org/10.3390/rs12244122
Chawade, A., Van Ham, J., Blomquist, H., Bagge, O., Alexandersson, E., Ortiz, R. (2019). High-
Throughput Field-Phenotyping Tools for Plant Breeding and Precision Agriculture.
Agronomy, 9, 258. https://doi.org/10.3390/agronomy9050258
Chen, R., Chu, T., Landivar, J. A., Yang, C., Maeda, M. M. (2018). Monitoring Cotton
(Gossypium hirsutum L.) Germination Using Ultrahigh-Resolution UAS Images. Precision
Agriculture, 19, 161–177. https://doi.org/10.1007/s11119-017-9508-7
De Leon, N., Jannink, J. L., Edwards, J. W., Kaeppler, S. M. (2016). Introduction to a Special
Issue on Genotype by Environment Interaction. Crop Science, 56, 2081–2089.
https://doi.org/10.2135/cropsci2016.07.0002in
De Lima, R. S., Lang, M., Burnside, N. G., Peciña, M. V., Arumäe, T., Laarmann, D., Ward, R.
D., Vain, A., Sepp, K. (2021). An Evaluation of the Effects of UAS Flight Parameters on
Digital Aerial Photogrammetry Processing and Dense-Cloud Production Quality in a Scots
Pine Forest. Remote Sensing, 13, 1121. https://doi.org/10.3390/rs13061121
Fasoula, D. A., Ioannides, I. M., Omirou, M. (2020). Phenotyping and Plant Breeding:
Overcoming the Barriers. Frontiers in Plant Science, 10, 1713.
https://doi.org/10.3389/fpls.2019.01713
Fehr, W. R. (1991). Principles of Cultivar Development: Theory and Technique. Agronomy
Books: New York, USA.
Gitelson, A. A., Gritz, Y., Merzlyak, M. N. (2003). Relationships between Leaf Chlorophyll
Content and Spectral Reflectance and Algorithms for Non-Destructive Chlorophyll
Assessment in Higher Plant Leaves. Journal of Plant Physiology, 160, 271–282.
https://doi.org/10.1078/0176-1617-00887
Hu, J., Lanzon, A. (2018). An Innovative Tri-Rotor Drone and Associated Distributed Aerial
Drone Swarm Control. Robotics and Autonomous Systems, 103, 162–174.
https://doi.org/10.1016/j.robot.2018.02.019
Hunt, R. (1983). Plant Growth Curves: The Functional Approach to Plant Growth Analysis.
Biometrics, 39, 537. https://doi.org/10.2307/2531040
Jung, J., Maeda, M., Chang, A., Landivar, J., Yeom, J., McGinty, J. (2018). Unmanned Aerial
System Assisted Framework for the Selection of High Yielding Cotton Genotypes.
Computers and Electronics in Agriculture, 152, 74–81.
https://doi.org/10.1016/j.compag.2018.06.051
Neto, J. C. (2010). A Combined Statistical-Soft Computing Approach for Classification and
Mapping Weed Species in Minimum-Tillage Systems. ETD Collection for University of
Nebraska-Lincoln, 22, 64–64.
Oh, S., Chang, A., Ashapure, A., Jung, J., Dube, N., Maeda, M., Gonzalez, D., Landivar, J.
(2020). Plant Counting of Cotton from UAS Imagery Using Deep Learning-Based Object
Detection Framework. Remote Sensing, 12, 2981. https://doi.org/10.3390/rs12182981
Poehlman, J. M. (2006). Breeding Field Crops, 5th ed. Wiley-Blackwell: Hoboken, NJ, USA.
https://doi.org/10.1007/978-94-015-7271-2
Qi, J., Chehbouni, A., Huete, A. R., Kerr, Y. H., Sorooshian, S. (1994). A Modified Soil
Adjusted Vegetation Index. Remote Sensing of Environment, 48, 119–126.
https://doi.org/10.1016/0034-4257(94)90134-1
Rouse, J. W., Haas, R. H., Schell, J. A., Deering, D. W. (1973). Monitoring Vegetation Systems
in the Great Plains with ERTS (Earth Resources Technology Satellite). Proceedings of the
Third Earth Resources Technology Satellite Symposium, 309–317.
Sapkota, B., Singh, V., Cope, D., Valasek, J., Bagavathiannan, M. (2020). Mapping and
Estimating Weeds in Cotton Using Unmanned Aerial Systems-Borne Imagery.
AgriEngineering, 2, 350–366. https://doi.org/10.3390/agriengineering2020024
Song, P., Wang, J., Guo, X., Yang, W., Zhao, C. (2021). High-Throughput Phenotyping:
Breaking through the Bottleneck in Future Crop Breeding. Crop Journal, 9, 633–645.
https://doi.org/10.1016/j.cj.2021.03.015
Tucker, C. J. (1979). Red and Photographic Infrared Linear Combinations for Monitoring
Vegetation. Remote Sensing of Environment, 8, 127–150.
https://doi.org/10.1016/0034-4257(79)90013-0
Woebbecke, D. M., Meyer, G. E., Von Bargen, K., Mortensen, D. A. (1995). Color Indices for
Weed Identification under Various Soil, Residue, and Lighting Conditions. Transactions of
the ASAE, 38, 259–269. https://doi.org/10.13031/2013.27838
Xue, J., Su, B. (2017). Significant Remote Sensing Vegetation Indices: A Review of
Developments and Applications. Journal of Sensors, 2017, 1353691.
https://doi.org/10.1155/2017/1353691
Yeom, J., Jung, J., Chang, A., Ashapure, A., Maeda, M., Maeda, A., Landivar, J. (2019).
Comparison of Vegetation Indices Derived from UAV Data for Differentiation of Tillage
Effects in Agriculture. Remote Sensing, 11, 1548. https://doi.org/10.3390/rs11131548
Yeom, J., Jung, J., Chang, A., Maeda, M., Landivar, J. (2018). Automated Open Cotton Boll
Detection for Yield Estimation Using Unmanned Aircraft Vehicle (UAV) Data. Remote
Sensing, 10, 1895. https://doi.org/10.3390/rs10121895
Yin, X., Goudriaan, J., Lantinga, E. A., Vos, J., Spiertz, H. J. (2003). A Flexible Sigmoid
Function of Determinate Growth. Annals of Botany, 91, 361–371.
https://doi.org/10.1093/aob/mcg029
