
Precision Agriculture, 1, 95-113 (1999)
© 1999 Kluwer Academic Publishers. Manufactured in The Netherlands.

Robotic Weed Control System for Tomatoes


W. S. LEE, D. C. SLAUGHTER, AND D. K. GILES [email protected]
Biological and Agricultural Engineering, University of California, Davis, CA 95616

Abstract. A real-time intelligent robotic weed control system was developed for selective herbicide application to in-row weeds using machine vision and precision chemical application. The robotic vision system took 0.34 s to process one image, representing an 11.43 cm by 10.16 cm region of seedline containing 10 plant objects, allowing the prototype robotic weed control system to travel at a continuous rate of 1.20 km/h. The overall performance of the robotic system in a commercial processing tomato field and in simulated trials is discussed.

Keywords: robotics, machine vision, weed control, tomatoes

Introduction

Tomatoes are one of the leading vegetable crops produced in California. In 1996, over 9 billion kg of processing tomatoes were produced in California, accounting for 93% of all processing tomatoes produced in the U.S. (USDA and NASS, 1997). A total of 5.9 million kg of agricultural chemicals (herbicides, insecticides, fungicides, and other chemicals) were used to produce processing tomatoes in California alone in 1994 (USDA, NASS and ERS, 1995). This heavy reliance on chemicals raises many environmental and economic concerns, causing many farmers to seek alternatives for weed control in order to reduce chemical use in farming.

Conventional mechanical cultivation cannot selectively remove weeds located in the seedline, and there are no selective herbicides for some crop/weed situations. Since hand labor is costly, an automated weed control system may be economically feasible. A precision robotic weed control system could also reduce or eliminate the need for chemicals. Although there have been many efforts to control in-row weeds, no system is currently available for real-time field use.

Objectives

The goal of this project was to build a real-time machine vision based robotic weed
control system that can detect crop and weed locations, kill weeds and thin crop
plants. The system was required to recognize tomato plants and weeds outdoors in
commercial tomato fields using image processing techniques while moving forward
at a constant speed.

Background

Machine vision technologies have been applied to agriculture to identify and locate individual plants. Many researchers have tried various image processing methods, working in different environments; however, most of the work has been done indoors with controlled illumination and an adequate setup for the acquisition of high quality images. In an early study typical of those to follow, Guyer et al. (1986) studied the feasibility of using machine vision to identify the species of potted, greenhouse-grown weeds. They noted that plants grown in a greenhouse had a different appearance from those grown under the natural outdoor environment of a commercial farm.

The visual characteristics that have been used in indoor plant identification fall into three categories: spectral reflectance, morphology, and texture. Many studies (e.g., Franz, Gebhardt, and Unklesbay (1991b), Woebbecke et al. (1995a), Brivot and Marchant (1996), and Shiraishi and Sumiya (1996)) have used color or near-infrared reflectance to distinguish plants from the background. In a few situations, researchers have found that spectral characteristics alone can be used to distinguish between selected plant species, but this technique is usually insufficient to distinguish crop plants from weeds on a typical California farm. Morphological characteristics of plant leaves such as complexity, central moment, principal axis of moment of inertia, first invariant moment, aspect ratio, radius permutation, ratio of perimeter to longest axis, curvature, compactness, and elongation have been used to classify plant species with some success (Guyer et al. (1986), Franz, Gebhardt and Unklesbay (1991a), Woebbecke et al. (1995b), and Shiraishi and Sumiya (1996)). In a few cases textural feature analysis has also been used to identify plant species (Shearer and Holmes (1990), and Woebbecke et al. (1995b)).
Once image processing technology has been developed, the natural next step is its "real-time" field application. A few machine vision systems have achieved real-time operation. Slaughter et al. (1992, 1997) developed a real-time guidance system for precision cultivation (later named the "UC Davis Robotic Cultivator") that could identify the center of the row under normal field conditions. The vision guidance system identified the location of the seedline, then the offset between the current position and the desired position was corrected by moving the toolbar laterally. The system was tested in tomato fields, and the results indicated that the prototype could operate at speeds exceeding 8.0 km/h while precisely positioning the cultivator, with an overall RMS error ranging from 4.2 mm when there were no weeds to 11.9 mm when the area ratio of weed to tomato was 3:1. This precision UC Davis Robotic Cultivator was used as the guidance system for the robotic weed control system reported here. Liao, Paulsen and Reid (1994) studied the feasibility of real-time detection of color and surface defects of maize kernels. Using a Matrox Image-1280 real-time image processing board, they reported that the processing time required from acquiring live images to the end of primitive (basic) feature extraction ranged from 0.87 s for 1 object to 2.08 s for 12 objects. Haney, Precetti and Gibson (1994) applied machine vision to sort wood

based on its color. They reported that the system could operate at conveyor speeds up to 110 m/min. Alchanatis and Searcy (1995) built and tested a high speed inspection system for fresh-market carrots. They reported that the system could handle 2 carrots/s with a classification accuracy of more than 90%.

Tian (1995) studied the feasibility of using a machine vision system to identify individual plants with images taken in the natural outdoor environment. He reported problems associated with non-uniform illumination. Four features, elongation (ELG), compactness (CMP), the logarithm of the ratio of height to width (LHW), and the ratio of length to perimeter (LTP), were used as the optimum subset among all features tested for tomato cotyledon recognition. He successfully identified between 61 and 82 percent of all the individual plants in about 270 frames of field images in a laboratory environment. However, the research ended before the high speed algorithms needed to implement a real-time computer vision system for use in a commercial field were developed.

Some work has been done in selective application of herbicides. However, in the majority of the work, "selective" referred to selectivity of plants vs. soil, not crop vs. weeds. Most of this work utilized a difference in reflectance levels between plants and soil background, based upon the chlorophyll in the foliage absorbing the red radiation which is reflected by the soil. In earlier studies, the ratio of visible to near-infrared radiation (Hooper, Harries and Ambler (1976)) and the ratio of red to near-infrared (Haggar, Stent and Isaac (1983), Felton et al. (1991), Felton and McCloy (1992), and Merritt et al. (1994)) were used to distinguish green vegetation from the soil background. Some of these led to commercial plant detector-sprayers (Weed Seeker PhD 1620, Patchen California, Inc., Los Gatos, CA; and Detectspray-S45, Concord Inc., Fargo, ND). Visser and Timmermans (1996) developed an automatic selective herbicide spraying system for weed control. They used the chlorophyll fluorescence effect and optically filtered LEDs in a sensor to detect weeds, and solenoid valves to control weed spray. However, that system also detects and sprays all green plants as "weeds." None of the systems described above sprayed weeds selectively as opposed to crop plants.

A group of researchers in Spain has worked to distinguish crop plants from weeds (Moltó et al. (1996) and Moltó et al. (1997)). They developed a machine vision system for robotic weeding of artichokes and lettuce. The average processing time was about 500 ms per image. A mobile robot for non-chemical weed control is planned.

Materials and methods

Machine vision system

The video and computer hardware used here to develop and implement the real-time computer vision system is shown in figure 1. A Sharp GPB-2 board (Sharp Digital Information Products, Inc.) was used as the main hardware portion of the real-time image processing system. To facilitate real-time color image processing, a Sharp Incard was used to input the camera's RGB (red, green, and

Figure 1. Schematic of the components of the real-time machine vision system.

blue) video signals and to transfer them to the GPB-2 board. A Sharp AUXLUT card (daughter board to the GPB-2) was used to implement a real-time look-up table for true color to binary image conversion. A multipurpose input/output board (Model CIO-DAS 1600, ComputerBoards, Inc.) was used to send an asynchronous reset signal to the camera for on-the-fly image acquisition control. An RGB color video camera (Model 2222-1340/0000, Cohu, Inc.) was used for high resolution NTSC (National Television System Committee) color image acquisition. A Dell Dimension XPS Pro200n computer equipped with a 200 MHz Pentium Pro CPU was used as the main microprocessor. The computer was operated using the MS-DOS operating system (version 6.2) and all machine vision algorithms were developed using the Microsoft C compiler (version 7.0).

The prototype robotic weed control system is shown in figure 2. The UC Davis Robotic Cultivator (Alloway cultivation tool and camera #1) was utilized as a guidance system to center the weed control system over the row. Each step, from image acquisition to actuating the weed control device, was synchronized using an encoded (Model HR6251000006, Danaher Controls) gage wheel on the toolbar of a tractor (Model 7800, John Deere Co.). This encoder generated a pulse whenever the tractor moved 0.13 mm forward on the seedbed. A microcontroller (SensorWatch™, TERN Inc.) was used to count the number of encoder pulses in order to determine the tractor's location along the row. The SensorWatch™ communicated via an RS-232 serial port with the Dell computer containing the Sharp boards. The image size was 11.43 cm long (in the travel direction) by 10.16 cm wide, and a new

Figure 2. The prototype robotic weed control system.

image was acquired every 879 pulses (11.43 cm) by sending an asynchronous reset signal to the color camera (camera #2 in figure 2).
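For concreteness, the synchronization logic can be sketched as follows. This is an illustrative reconstruction, not the actual SensorWatch firmware, and the function names (read_encoder_count, trigger_camera_reset) are hypothetical:

```python
# Illustrative sketch of the odometry-triggered acquisition loop (assumed names).
PULSE_DISTANCE_MM = 0.13    # tractor travel per encoder pulse
PULSES_PER_IMAGE = 879      # 879 pulses x 0.13 mm = 114.27 mm, i.e., one 11.43 cm frame

def acquisition_loop(read_encoder_count, trigger_camera_reset):
    """Fire an asynchronous camera reset each time the toolbar advances one frame."""
    last_trigger = read_encoder_count()
    while True:
        count = read_encoder_count()            # cumulative pulses from the gage wheel
        if count - last_trigger >= PULSES_PER_IMAGE:
            trigger_camera_reset()              # acquire the next 11.43 cm of seedline
            last_trigger += PULSES_PER_IMAGE    # advance a full frame so frames stay contiguous
```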
A uniform illumination device was developed using a specially designed cultivation tunnel which was attached to the end frame of the "Alloway" cultivation tool and was composed of a C channel beam (10.16 cm wide, 60.96 cm long, and 0.48 cm thick), two dichroic halogen lamps (Iwasaki Electric Co. Model MR16CG, 12 Vdc, 50 W), two flashed opal optical diffusers (Oriel Model No. 48030, 5.08 cm diameter and 0.22 cm thick), two metal side shields, and front and rear rubber flaps (figure 2). The two lamps were positioned at 30° relative to the optical axis of the camera. The side shields and rubber flaps were designed to block the sunlight and to minimize the amount of soil falling on top of the tomato plants during cultivation.

Image processing algorithm

Image acquisition. The first step of image processing was to acquire an image of the juvenile tomato plants in a commercial tomato field. A shutter speed of 1/500 second was used to prevent blurring due to tractor motion and wind. The red, green, and blue interlaced video signals were input to the Incard, which digitized a single field of the interlaced image and subsampled the input image columnwise to eliminate motion effects. The actual field of view was 11.43 cm × 10.16 cm, digitized into 256 × 240 color (24 bit) pixels.

Images were taken in eight commercial processing tomato fields in Northern California during the normal cultivation season, from late March until mid-May 1997. The tomato plants in these fields were in various stages of maturity

Table 1. Execution time for each image processing step

Image processing step                                          Execution time (ms)   Percent of total time (%)
Prepare image acquisition                                              0.02                   0.01
Acquire color image (one field)                                       16.76                   4.87
Transfer and subsample acquired image                                 27.21                   7.91
Check synchronization of main computer and spray controller           2.08                   0.60
Check image buffer overflow                                            1.19                   0.35
Binarize                                                               2.92                   0.85
Morphology analysis                                                   32.04                   9.31
Label objects                                                          9.89                   2.87
Extract features                                                     144.58                  42.00
Make decision with a Bayesian classifier                               0.94                   0.27
Find tomato & weed locations                                          58.10                  16.88
Send tomato & weed locations to spray controller                      37.44                  10.88
Miscellaneous commands                                                11.03                   3.20
Total time                                                           344.20                 100.00

from just emerging to the second true leaf stage. The following weeds were also commonly found in these processing tomato fields: Black nightshade (Solanum nigrum), Hairy nightshade (Solanum sarrachoides), Ground cherry (Physalis spp.), Lambsquarters (Chenopodium album), Mustard (Brassica spp.), Nettleleaf goosefoot (Chenopodium murale), Shepherd's-purse (Capsella bursa-pastoris), Redroot pigweed (Amaranthus retroflexus), Groundsel (Senecio vulgaris), Velvetleaf (Abutilon theophrasti), Field bindweed (Convolvulus arvensis L.), and grass weeds (Yellow nutsedge, Cyperus esculentus L.; and Large crabgrass, Digitaria sanguinalis). It was an especially windy spring in Northern California in 1997 and, despite the limited protection offered by the illumination device, most of the tomato plants in the commercial fields studied were laid down along the direction of wind travel.

Binarization. After an image was digitized and stored as a 24 bit color image in computer memory, it was segmented (Rosenfeld and Kak, 1982) into plant and non-plant regions using color information in hue, saturation and intensity color space. In this step a Bayesian decision rule was applied to build a color look-up table (Lee, 1998) and the AUXLUT card was used for real-time conversion of the color image into a binary image (black for plant leaves and white for background). The segmentation process took less than 3 ms using the AUXLUT card in a Dell Dimension XPS Pro200n computer with a 200 MHz Pentium Pro processor (Table 1).

After segmentation, the image was enhanced through a series of image processing steps, including shrinking and swelling, to remove noise and to obtain a smooth shape for leaf recognition (Horn, 1986; Sharp, 1993). Figure 3c shows the segmented and enhanced image of a tomato seedling and weeds in the field.
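The hardware look-up table itself is not given in the paper; the following sketch only illustrates the underlying idea of a Bayesian color LUT. Note that the actual table was built in hue-saturation-intensity space, whereas this simplified version quantizes RGB directly, and the training-data names are hypothetical:

```python
import numpy as np

BITS = 5                                   # quantize 8-bit channels to 5 bits -> 32^3 table
lut = np.zeros((32, 32, 32), dtype=bool)   # True = classify this color as "plant"

def train_lut(plant_pixels, background_pixels):
    """plant_pixels, background_pixels: (N, 3) uint8 RGB arrays from hand-labeled images."""
    plant_hist = np.zeros((32, 32, 32))
    bg_hist = np.zeros((32, 32, 32))
    for hist, pix in ((plant_hist, plant_pixels), (bg_hist, background_pixels)):
        q = pix >> (8 - BITS)
        np.add.at(hist, (q[:, 0], q[:, 1], q[:, 2]), 1)
    # Bayes rule with equal priors: plant wherever P(color | plant) > P(color | background)
    lut[:] = (plant_hist / len(plant_pixels)) > (bg_hist / len(background_pixels))

def binarize(rgb_image):
    """Segment an (H, W, 3) uint8 image with one table look-up per pixel."""
    q = rgb_image >> (8 - BITS)
    return lut[q[..., 0], q[..., 1], q[..., 2]]
```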

Figure 3. Tomato identification procedure. a. Poor quality image of tomato seedlings and weeds.

Plant recognition procedure. Shape features (area, major axis, minor axis, centroid, area to length ratio (ATL), compactness (CMP), elongation (ELG), the logarithm of the ratio of height to width (LHW), the ratio of length to perimeter (LTP), and the ratio of perimeter to broadness (PTB)) were obtained for each plant leaf. In an effort to recognize true leaves of tomato seedlings, the curvature of the leaf boundaries was also studied (Lee, Slaughter and Giles, 1997). Tomato true leaves frequently have notches or concave regions along their boundary, while most of the weeds were round and convex (figure 3c). Using curvature, the sum of the radius of curvature (SUMINV) was also calculated for each leaf in order to improve the Bayesian classifier built to distinguish tomatoes from weeds.

The centroid ('+' sign), major and minor axes, and perimeter from the feature extraction process are shown in figure 3d. The pattern recognition features used in this study were defined as follows.

Figure 3. b. High quality image of tomato seedling and weeds.

ATL = Area / Major Axis                                          (1)
CMP = 16 × Area / Perimeter²                                     (2)
ELG = (Major Axis − Minor Axis) / (Major Axis + Minor Axis)      (3)
LHW = log₁₀(Height / Width)                                      (4)
LTP = Major Axis / Perimeter                                     (5)
PTB = Perimeter / [2 (Height + Width)]                           (6)
SUMINV = Σ (1 / Curvature)                                       (7)
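As a worked illustration of equations 1-6, the derived features can be computed directly from the basic per-leaf measurements (area, perimeter, axis lengths, and bounding-box height and width, here assumed to come from an earlier labeling step):

```python
import math

def shape_features(area, perimeter, major_axis, minor_axis, height, width):
    """Derived shape features of one leaf, following equations (1)-(6)."""
    return {
        "ATL": area / major_axis,                                      # eq. (1)
        "CMP": 16.0 * area / perimeter ** 2,                           # eq. (2)
        "ELG": (major_axis - minor_axis) / (major_axis + minor_axis),  # eq. (3)
        "LHW": math.log10(height / width),                             # eq. (4)
        "LTP": major_axis / perimeter,                                 # eq. (5)
        "PTB": perimeter / (2.0 * (height + width)),                   # eq. (6)
    }
```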

Figure 3. c. Binary image of Figure 3a.

Using these features, non-occluded plant leaves in typical training images could be classified either as tomato cotyledons or as weeds (figure 3e) with a Bayesian classifier (Proc Discrim, SAS Institute Inc., 1993).
All images were divided into two groups, good and bad images, based on the image focus and exposure level, presence of wind, cotyledon display angle, state of maturity, and occlusion. Tomato leaves in bad images were harder to recognize because the leaf shape was abnormal due to occlusion or from being blown down by wind (figure 3a). From each group, a training set and a validation set were created in order to assess the plant recognition performance. A small subset of images was carefully selected for training in order to minimize the size of the training set while ensuring that the wide range of scene conditions encountered in a commercial field was represented. A larger set of validation images was selected randomly from each group. There was no overlap between the images in the training and validation sets. The objects in each image were divided into 4

Figure 3. d. Feature extraction of each object in Figure 3c.

classes: tomato cotyledon, tomato true leaf, miscellaneous tomato leaf, and weeds. The miscellaneous tomato leaf group consisted of plants with cotyledons or true leaves which were curled, occluded, eaten by bugs, or partially occluded by the edge of the image. For the good image group, a total of 10 images were used for the training set and 41 images were used for the validation set. For the bad group, a total of 16 images and 46 images were used, respectively (Table 2). After the true class of each leaf was determined by manual inspection, the performance of the image processing algorithm was determined using a Bayesian discrimination procedure (Proc Discrim).
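The paper's classifier was built with SAS Proc Discrim; as a rough stand-in, a Gaussian (quadratic) discriminant over a small feature vector such as [ELG, CMP] can be sketched as follows. This is a generic Bayesian discriminant, not a reproduction of the SAS procedure or its defaults:

```python
import numpy as np

class GaussianBayes:
    """Minimal Gaussian discriminant classifier, a stand-in for Proc Discrim."""

    def fit(self, X, y):
        """X: (N, d) feature rows, e.g., [ELG, CMP]; y: (N,) class labels."""
        self.classes = np.unique(y)
        self.params = {}
        for c in self.classes:
            Xc = X[y == c]
            self.params[c] = (Xc.mean(axis=0),           # class mean
                              np.cov(Xc, rowvar=False),   # class covariance
                              len(Xc) / len(X))           # class prior
        return self

    def predict_one(self, x):
        """Return the class with the largest Gaussian log posterior."""
        def log_posterior(c):
            mu, cov, prior = self.params[c]
            d = x - mu
            return (np.log(prior) - 0.5 * np.log(np.linalg.det(cov))
                    - 0.5 * d @ np.linalg.solve(cov, d))
        return max(self.classes, key=log_posterior)
```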

Figure 3. e. Processed image, tomato leaves in black, weed leaves in gray.

Precision spraying system

A robotic spraying system (figure 2) was developed with eight 12 Vdc solenoid valves (Capstan Ag Systems, Inc., Topeka, Kansas), a stainless steel manifold (3.18 cm × 3.18 cm × 13.97 cm), a specially designed accumulator, and eight driver circuits for valve control. The robotic spraying system was mounted at the end of the tunnel, about three image frames behind the camera.
Five hypodermic tubes (Heavy Wall Stainless Steel Type 304-W, 22 gauge, I.D. = 0.28 mm, 12.7 mm long, Small Parts, Inc.) were used in a line 2.54 mm apart to form a micro-spray nozzle. When moving at 1.20 km/h and activated for 10 ms, each micro-spray nozzle emitted an elliptical deposit 0.9 cm along the direction of travel and 1.27 cm perpendicular to the direction of travel when operated at 103 kPa from a nozzle height of 10.16 cm above the seedbed. A spraying time of 10 ms

Figure 3. f. Tomato cells (_), buffer zone (/), and spray zone (×) overlaid on Figure 3b.

gave a flow rate of 0.098 L/min for each valve and an exit velocity from the nozzle of 6.4 m/s. A CO₂ tank was used to pressurize the spray system. The eight solenoid valves (2.54 cm outside diameter) were aligned to allow the entire 10.16 cm wide seedline to be sprayed when all were opened at the same time. An accumulator was attached to the manifold in order to maintain a constant flow rate, independent of the number of valves opened simultaneously.
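As a quick arithmetic check (ours, not stated in the paper), the dose metered out per activation follows directly from the per-valve flow rate:

\[
V = 0.098\ \tfrac{\mathrm{L}}{\mathrm{min}} \times \frac{0.010\ \mathrm{s}}{60\ \mathrm{s/min}} \approx 1.6 \times 10^{-5}\ \mathrm{L} \approx 16\ \mu\mathrm{L}\ \text{per sprayed cell.}
\]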
After distinguishing tomatoes from weeds, the system divided each image into an 8 row by 18 column spray grid. The 8 rows correspond to the eight valves/nozzles of the precision spray system. The image was divided into 18 columns for precise spray application, giving a spray cell size of 1.27 cm by 0.64 cm. The weed leaf locations were then sent to the spray controller. A valve was opened for 10 ms to spray the proper amount of herbicide onto each spray cell containing a weed. Figure 4 contains an illustration showing the operating concept of the robotic weed control system. The algorithm did not spray any cells adjacent to tomato cells, in order to protect tomato seedlings from spray drift.
Table 2. Performance of the prototype machine vision system using ELG and CMP

Group                                   Good      Bad      Total
No. of images in training set             10       16        26
No. of images in validation set           41       46        87
Total no. of tomato leaves               192      128       320
Total no. of weed leaves                  26      102       128
Avg. no. of tomato leaves per image      4.7      2.8       3.7
Avg. no. of weeds per image              0.6      2.2       1.5
Tomato cotyledons found                80.0%    62.5%     75.0%
Tomato true leaves found               38.0%    14.7%     30.5%
Miscellaneous tomato groups found      52.5%    32.9%     42.0%
Weeds found                            53.8%    72.5%     68.8%
Avg. tomato leaves found per image     85.9%    53.9%     73.1%
Avg. weeds found per image             53.8%    72.5%     68.8%

Table 3. Targeting accuracy and precision of spraying system

No. of spray targets    Avg. error (mm)    Std. dev. of error (mm)
99                      6.58               4.90

Table 4. System performance under ideal laboratory conditions

Trial    No. of rectangular    No. of rectangular    No. of circular    No. of circular
         objects (tomatoes)    objects sprayed       objects (weeds)    objects sprayed
1                23                    0                    26                 26
2                 6                    4                    39                 39
3                22                    0                    19                 19
Total            51                    4                    84                 84

Figure 3f shows the three types of spray cells, those containing weeds, those containing tomato plants, and those used as a buffer zone to protect the tomato leaves from drift, overlaid on the original image of a tomato row in figure 3b.
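The cell-level decision logic can be sketched as follows; the grid dimensions and the one-cell protection zone come from the text, while the set-based representation and function names are our own illustration:

```python
import numpy as np

ROWS, COLS = 8, 18   # 8 nozzle rows x 18 columns per 11.43 cm x 10.16 cm frame

def spray_map(tomato_cells, weed_cells):
    """tomato_cells, weed_cells: sets of (row, col) grid indices found in one frame.
    Returns a boolean grid: True = open that valve when that column passes the nozzles."""
    spray = np.zeros((ROWS, COLS), dtype=bool)
    buffer_zone = set()
    for r, c in tomato_cells:                      # grow a one-cell protection zone
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if 0 <= r + dr < ROWS and 0 <= c + dc < COLS:
                    buffer_zone.add((r + dr, c + dc))
    for r, c in weed_cells:
        if (r, c) not in buffer_zone:              # never spray adjacent to a tomato
            spray[r, c] = True
    return spray
```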
In order to assess the accuracy and precision of the spray system independently from the pattern recognition performance of the machine vision system, a test was conducted outdoors on a tomato bed using circular green targets (coins) 2.54 cm in diameter. The targets were considered as "weeds" and were attached every 22.86 cm to a 10.16 cm wide strip of cardboard using double sided tape in order to prevent them from moving. A color look-up table was created with a few training images to identify the color of the coins. Their centroids were sprayed with a blue dye (Precision Laboratories, Inc., SIGNAL™) by the robotic precision spraying system after they were detected by the machine vision system, while the tractor was

Figure 4. Concept drawing of the robotic weed control system.

moving forward at 0.8 km/h. The distance was measured between the center of the coins and the center of the spray drops.

Tests of overall system performance

Two tests of the overall system performance were conducted: an outdoor test in a commercial processing tomato field and an indoor test under ideal operating conditions. The outdoor test was conducted in late May of 1997 in a field with one of the last commercial plantings of processing tomatoes in Northern California. The travel speed was about 0.8 km/h and the tomato plants ranged from just emerging up to the first true leaf stage, with the majority of tomato plants at the cotyledon stage. The total numbers of tomato plants and weeds were 520 and 21, respectively.
The indoor test was conducted on a smooth concrete floor using green rectangular targets (0.64 cm × 1.27 cm) to simulate tomato cotyledons and green circular targets (1.27 cm diameter) to simulate weeds. Three replicate trials were conducted in which a random number of weed and tomato targets were placed in random locations along a row the length of 10 image frames (1.14 m) for each trial. A total of 135 targets (51 rectangular and 84 circular) were used in the indoor study and the travel speed of the system was 0.8 km/h during each trial.

Results and discussion

Speed of image processing algorithm

Processing time is a major concern in real-time machine vision applications. Since the goal of this project was to develop a real-time robotic weed control system, computationally intensive steps were avoided. The time required for each image processing step is shown in Table 1. For a 256 by 240 pixel image representing an 11.43 cm × 10.16 cm field of view, the image processing algorithm took a total of 0.344 seconds to identify 10 tomato cotyledons in the image using only the features of elongation and compactness. Thus the prototype cultivator could travel at a continuous rate of 1.20 km per hour under these conditions. Higher speeds could be achieved simply by dedicating more image processing hardware to extracting the morphological features from the leaves in parallel, since the execution time depends on the number of objects in an image and feature extraction takes about 42% of the time to process one image (Table 1).
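This travel rate is a direct consequence of the frame length and the processing time:

\[
v = \frac{0.1143\ \mathrm{m/frame}}{0.344\ \mathrm{s/frame}} \approx 0.332\ \mathrm{m/s} \approx 1.20\ \mathrm{km/h}.
\]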
To reduce the processing time, objects were not processed, and were considered to be weeds, if they were at the very top or bottom of an image (i.e., outside the seedline) or if their area was too small or too large to be a tomato. To save additional time, the algorithm checked only the center and 4 corner points of each spray cell in the processed image for weeds in order to determine whether to spray that cell, rather than scanning every pixel of the entire image.
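A sketch of that five-point test follows; the mask and bound names are hypothetical, and at roughly 0.42-0.45 mm per pixel a full 1.27 cm × 0.64 cm cell would otherwise require scanning several hundred pixels:

```python
def cell_contains_weed(weed_mask, x0, y0, x1, y1):
    """weed_mask: 2-D boolean array of weed pixels; (x0, y0)-(x1, y1): inclusive cell bounds.
    Samples only the center and four corners instead of every pixel in the cell."""
    points = [((x0 + x1) // 2, (y0 + y1) // 2),        # center
              (x0, y0), (x0, y1), (x1, y0), (x1, y1)]  # four corners
    return any(weed_mask[y, x] for x, y in points)
```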

Plant recognition performance

Preliminary tests using all shape features described in equations 1-7, plus the number of concave regions and the standard deviation of curvature of each object, were

conducted to find the optimal Bayesian classifier. After several preliminary tests,
two features, ELG and CMP, were found to provide the optimal Bayesian classifier
for the images in this study. Table 2 shows the performance for both image groups
with the Bayesian classifier built using these features.
Adding more features to the classifier had inconsistent results, and in no case did it reduce the total error rate of the classifier. For example, when the feature SUMINV was used along with ELG and CMP, the rate of true leaf recognition increased dramatically from 38.0% to 62.0% in the good image group; however, the decrease in the cotyledon recognition rate was even more dramatic, going from 80% to a very poor 6.7%. Similarly, these three features increased the weed recognition rate to 76.5% in the bad image group, but decreased the recognition rate of miscellaneous tomato leaves to 0.0%. These results show that a high level of variability in shape patterns exists when the shape is characterized from a single two-dimensional top view of plants growing in an uncontrolled environment such as that found in an agricultural setting.

Precision spray system performance

On tomato beds, the average error between the center of the targets and the spray drops was 6.58 mm and the standard deviation was 4.90 mm (Table 3). Many sources of error affected the accuracy of spray targeting. First, there was an intrinsic error due to the spatial resolution of the spray system. The physical size of the spray valves resulted in a nozzle spacing of 1.27 cm, which led to a corresponding spray cell length of 1.27 cm. A spatial error of 7.1 mm in spray targeting would occur whenever the centroid of an object happened to be located in the corner of a spray cell rather than the center, since the nozzle pattern is centered about the center of the cell.
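That 7.1 mm figure is consistent with the center-to-corner distance of a 1.27 cm by 0.64 cm spray cell:

\[
e = \sqrt{\left(\tfrac{12.7}{2}\right)^{2} + \left(\tfrac{6.4}{2}\right)^{2}}\ \mathrm{mm} \approx 7.1\ \mathrm{mm}.
\]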
A second, "systematic" error occurred whenever the guidance system adjusted the lateral position of the system. This error arose when any lateral movement occurred between the time an image was acquired and the time the weeds in that image were actually sprayed, due to the three image frame offset between the camera and the spray nozzles (figure 2). An improved system might correct for this by allowing the spray controller to communicate with the guidance system and adjust the weed map in its memory to compensate for lateral movement. A similar error is caused by wind when spraying actual weeds in the field, because the wind frequently displaces the weeds between the time the camera is overhead and the time the nozzles spray.
A third source of error was caused by the displacement sensor when traveling over uneven surfaces. The sensor did not generate the same number of pulses for the same distance traveled whenever there were clods and bumps on the bed where the gage wheel traveled, or whenever the compressibility of the soil varied from location to location within a field.

Overall system performance

The overall system results from the outdoor test showed that 24.2% of the tomatoes were incorrectly identified and sprayed and that 52.4% of the weeds were not sprayed. The percentage of tomatoes recognized was consistent with the results from the separate, independent evaluation of the machine vision system; however, the percentage of weeds sprayed was somewhat below what was expected (Table 2). The low percentage of weeds sprayed was due to several factors. Some weeds were near tomatoes, so they were not sprayed because of the protection zone around the tomato leaves. Some grass weeds looked very similar to tomato cotyledons that were held in a more vertical position. Some weeds were outside of the camera's field of view, but they were counted as processed objects since it was very difficult to determine the exact boundary of the camera's view as it traveled down the row; real-time visual monitoring of the image boundary was not possible. Occasionally the tractor's travel speed was a little too fast for the number of objects in a single image, causing the vision system to skip a few segments of the row and some weeds to be missed. In addition, some tomato plants were hidden by weeds, so they were identified as weeds and sprayed.
In contrast, the overall system results in the indoor test were much better. Only 8% (4 of 51) of the rectangular targets used to simulate tomato plants were incorrectly sprayed, and all of the circular targets used to simulate weeds were correctly sprayed (Table 4). Post-test analysis of the images from the indoor trials showed that the 4 incorrectly sprayed rectangular targets were not recognized because they appeared a little too dark for the look-up table and their shape was distorted in the segmentation process. These results show that the prototype performed well when operated on a smooth surface with distinctly shaped, well separated objects.

Future work

For commercial success, improvements in accuracy and speed are needed. Ideally, a robotic weed control system would be able to travel at a continuous speed of 3 km/h to 8 km/h. In order to improve the tomato recognition performance, additional techniques to recognize the wide variety of tomato true leaf shapes and distinguish them from weeds need to be developed. Occlusion of plant leaves is also a significant problem in distinguishing tomato leaves in the uncontrolled outdoor environment of an agricultural field, and an accurate real-time algorithm for separating partially occluded leaves at the point of occlusion is needed.

Conclusions

• A real-time intelligent robotic weed control system was developed and tested for selective spraying of in-row weeds using a machine vision system and a precision chemical application system.

• The image processing algorithm took 0.344 s to process one frame of a 256 × 240 pixel image representing an 11.43 cm by 10.16 cm field of view with 10 objects in an image, allowing the prototype robotic weed control system to travel at a continuous speed of 1.20 km/h.

• The image processing algorithm correctly identified, in real-time, 73.1% of tomatoes and 68.8% of weeds in the validation set of field images taken in commercial tomato fields.

References

1. L. Alchanatis and S. W. Searcy, "High speed inspection of carrots with a pipelined image processing system," ASAE Paper No. 95-3170, St. Joseph, MI, USA, 1995.
2. R. Brivot and J. A. Marchant, "Segmentation of plants and weeds for a precision crop protection robot using infrared images," IEE Proc. - Vision, Image and Signal Process. vol. 143, no. 2, pp. 118-124, 1996.
3. W. L. Felton, A. F. Doss, P. G. Nash, and K. R. McCloy, "A microprocessor controlled technology to selectively spot spray weeds," Proc. of Symposium on Automated Agriculture for the 21st Century, ASAE, 1991, pp. 427-432.
4. W. L. Felton and K. R. McCloy, "Spot spraying," Agricultural Engineering, pp. 9-12, November 1992.
5. E. Franz, M. R. Gebhardt, and K. B. Unklesbay, "Shape description of completely visible and partially occluded leaves for identifying plants in digital images," Trans. of ASAE vol. 34, no. 2, pp. 673-681, 1991a.
6. E. Franz, M. R. Gebhardt, and K. B. Unklesbay, "The use of local spectral properties of leaves as an aid for identifying weed seedlings in digital images," Trans. of ASAE vol. 34, no. 2, pp. 682-687, 1991b.
7. D. E. Guyer, G. E. Miles, M. M. Schreiber, O. R. Mitchell, and V. C. Vanderbilt, "Machine vision and image processing for plant identification," Trans. of ASAE vol. 29, no. 6, pp. 1500-1507, 1986.
8. R. J. Haggar, C. J. Stent, and S. Isaac, "A prototype hand-held patch sprayer for killing weeds, activated by spectral differences in crop/weed canopies," J. Ag. Eng. Res. vol. 28, pp. 349-358, 1983.
9. L. Haney, C. Precetti, and H. Gibson, "Color matching of wood with a real-time machine vision system," ASAE Paper No. 94-3579, St. Joseph, MI, USA, 1994.
10. A. W. Hooper, G. O. Harries, and B. Ambler, "A photoelectric sensor for distinguishing between plant material and soil," J. Ag. Eng. Res. vol. 21, pp. 145-155, 1976.
11. B. K. P. Horn, Robot Vision, McGraw-Hill: New York, 1986.
12. W. S. Lee, Robotic weed control system for tomatoes, Ph.D. Dissertation, University of California, Davis, 1998.
13. W. S. Lee, D. C. Slaughter, and D. K. Giles, "Robotic weed control system for tomatoes using machine vision and precision chemical application," ASAE Paper No. 97-3093, St. Joseph, MI, USA, 1997.
14. K. Liao, M. R. Paulsen, and J. F. Reid, "Real-time detection of colour and surface defects of maize kernels using machine vision," J. Ag. Eng. Res. vol. 59, no. 4, pp. 263-271, 1994.
15. S. J. Merritt, G. E. Meyer, K. Von Bargen, and D. A. Mortensen, "Reflectance sensor and control system for spot spraying," ASAE Paper No. 94-1057, St. Joseph, MI, USA, 1994.
16. E. Moltó, J. Blasco, N. Aleixos, J. Carrión, and F. Juste, "Machine vision discrimination of weeds in horticultural crops," AGENG 96, Madrid, Report N. 96G-037, 1996.
17. E. Moltó, J. Blasco, N. Aleixos, J. Carrión, and F. Juste, "A machine vision system for robotic weeding of row crops," Proc. 5th Int. Symp. Fruit, Nut, and Vegetable Production Engineering, Davis, CA, USA, 1997.
18. A. Rosenfeld and A. C. Kak, Digital Picture Processing, Academic Press: Orlando, FL, 1982.
19. SAS Institute Inc., SAS/STAT User's Guide, Version 6, 4th ed., vol. 1, SAS Institute Inc., Cary, NC, 1993.
20. Sharp, Image Processing Board GPB-2 Manual, Sharp Digital Information Products, Inc., 1993.
21. S. A. Shearer and R. G. Holmes, "Plant identification using color co-occurrence matrices," Trans. of ASAE vol. 33, no. 6, pp. 2037-2044, 1990.
22. M. Shiraishi and H. Sumiya, "Plant identification from leaves using quasi-sensor fusion," J. of Manufacturing Sci. and Eng. vol. 118, pp. 382-387, 1996.
23. D. C. Slaughter, R. Curley, P. Chen, and C. Brooks, "Development of a robotic system for nonchemical weed control," Proc. 44th Annual California Weed Conf., Sacramento, CA, 1992.
24. D. C. Slaughter, P. Chen, and R. G. Curley, "Computer vision guidance system for precision cultivation," ASAE Paper No. 97-1079, St. Joseph, MI, USA, 1997.
25. L. Tian, Knowledge-based machine vision system for outdoor plant identification, Ph.D. Dissertation, University of California, Davis, 1995.
26. USDA and NASS, Agricultural Statistics 1997, United States Government Printing Office, Washington, D.C., 1997.
27. USDA, NASS, and ERS, Agricultural Chemical Usage, Vegetables, 1994 Summary, United States Government Printing Office, Washington, D.C., 1995.
28. R. Visser and A. J. M. Timmermans, "WEED-IT: a new selective weed control system," Proc. of SPIE, Optics in Agriculture, Forestry, and Biological Processing II vol. 2907, pp. 120-129, 1996.
29. D. M. Woebbecke, G. E. Meyer, K. Von Bargen, and D. A. Mortensen, "Color indices for weed identification under various soil, residue, and lighting conditions," Trans. of ASAE vol. 38, no. 1, pp. 259-269, 1995a.
30. D. M. Woebbecke, G. E. Meyer, K. Von Bargen, and D. A. Mortensen, "Shape features for identifying young weeds using image analysis," Trans. of ASAE vol. 38, no. 1, pp. 271-281, 1995b.
