Development of Monitoring Robot System for Tomato Fruits in Hydroponic Greenhouses
Department of Agricultural Engineering, National Institute of Agricultural Sciences, Jeonju 54875, Korea;
[email protected]
* Correspondence: [email protected] (B.-H.C.); [email protected] (K.K.)
Abstract: Crop monitoring is highly important in terms of the efficient and stable performance of tasks such as planting, spraying, and harvesting, and for this reason, several studies are being conducted to develop and improve crop monitoring robots. In addition, the applications of deep learning algorithms are increasing in the development of agricultural robots, since deep learning algorithms that use convolutional neural networks have been proven to show outstanding performance in image classification, segmentation, and object detection. However, most of these applications are focused on the development of harvesting robots, and thus, there are only a few studies that improve and develop monitoring robots through the use of deep learning. For this reason, we aimed to develop a real-time robot monitoring system for the generative growth of tomatoes. The presented method detects tomato fruits grown in hydroponic greenhouses using the Faster R-CNN (region-based convolutional neural network). In addition, we sought to select a color model that was robust to external light, and we used hue values to develop an image-based maturity standard for tomato fruits; furthermore, the developed maturity standard was verified through comparison with expert classification. Finally, the number of tomatoes was counted using a centroid-based tracking algorithm. We trained the detection model using an open dataset and tested the whole system in real-time in a hydroponic greenhouse. A total of 53 tomato fruits were used to verify the developed system, and the developed system achieved 88.6% detection accuracy when completely obscured fruits not captured by the camera were included. When excluding obscured fruits, the system’s accuracy was 90.2%. For the maturity classification, we conducted qualitative evaluations with the assistance of experts.
robots. In addition, there are some studies that utilize monitoring robots in the context of
hydroponic greenhouses.
The concept of smart farming relates to the use of information and communications
technologies (ICT) in farms to enable the remote and automatic monitoring of crop condi-
tions and the control of growth environments in greenhouses, orchards, livestock barns,
and so on [20]. Smart farming aims to increase productivity and improve the quality of
products while reducing production costs, and is growing in importance due to the in-
creasing demand for higher crop yields, the aging of farmers, the reduction in agricultural
workers as a proportion of the population, the increasing use of ICT, and so on [20–22].
For this reason, several studies have applied agricultural robots to harvesting tasks [11,12,23] in hydroponic greenhouses. However, most of these studies focused on the development of harvesting robots, and thus, only a few studies have used deep learning algorithms to improve and develop monitoring robots for hydroponic greenhouses.
Meanwhile, the development of artificial intelligence (AI) techniques has led to more
studies being conducted on the application of machine learning algorithms in computer
vision tasks in agriculture [24]. The application of deep learning algorithms in the devel-
opment of agricultural robots is increasing since deep learning algorithms that use convo-
lutional neural networks (CNNs) have been proven to show outstanding performance in
image classification, segmentation, and object detection tasks [24,25]. For example, several
deep learning algorithms that use CNNs, such as YOLOv3 [24], modified YOLOv3, Mask
R-CNN [26], Faster R-CNN [27], and so on, were applied to detect fruits. In particular,
Faster R-CNN was confirmed to be suitable for the recognition of fruits of different sizes,
and its strong performance in the detection of tomato fruits was confirmed [28].
Tomato (Solanum lycopersicum L.) is one of the most economically important crops in the world, with approximately 180.8 million tons produced worldwide in 2019 [29]. Tomato undergoes both vegetative and generative growth: vegetative growth includes the growth of the roots, stems, and leaves, while generative growth includes the growth of the flowers and fruits. During generative growth, tomato fruits undergo several changes, such as increases in their size, color changes, and so on. In particular, tomato fruits undergo changes in their skin color from green to red, and these changes are closely related to the maturity and production quality of tomatoes [30]. For this reason, several studies have attempted to quantitatively classify the maturity of tomatoes using image processing [31–33]. However, it is very difficult to consistently determine the maturity of tomatoes because of their abundance at harvest time, which leads to problems during their distribution and export to distant places [31]. Furthermore, farmers harvest tomatoes based on practical experience and/or maturity classification charts, and these methods are easily influenced by the environment, so mistakes commonly occur [31].
Therefore, we studied the development of a robot that can monitor a large number
of tomatoes in real-time following objective criteria. First, we developed the tomato de-
tection model using Faster R-CNN, and the counting model was developed using a cen-
troid-based tracking algorithm. Second, we investigated the use of a color model that is
robust to external light changes, and developed an image-based maturity standard for
tomato fruits. Subsequently, tomato fruits were classified into six maturity levels using
the aforementioned newly developed maturity standard. Third, the number of tomatoes
was counted for each maturity stage by tracking the centroid measurements of the de-
tected bounding boxes.
2. System Configuration
Figure 1 shows an image of a typical hydroponic greenhouse in Korea. In the sections in which crops are planted, there are rails that also serve as hot-water pipes, and the other sections are concrete surfaces. For this reason, the drive wheels of the robot were designed as a dual structure. For automatic driving of the robot in the sections where crops are planted, two proximity sensors are installed at the bottom of the robot to recognize the start and end positions of each crop section. On the concrete surface, the robot recognizes a magnetic line on the floor using a magnetic sensor and drives along this line. Figure 2 shows the schematic and actual images of the robot used in this study.
Figure 2. The schematic and the actual image of the monitoring robot.
Figure 4. (a) Validation total loss graph and (b) the learning rate schedule.
3.1.2. Postprocess
Since the outputs of the detection model (Faster R-CNN) are rectangular bounding boxes that include part of the background, the tomato fruit area must first be separated in order to classify its maturity. The k-means clustering algorithm was used to separate the tomato area from the background area, and the k value was set to 2 in this study. There are object segmentation methods such as Mask R-CNN [40], but they cannot be processed in real-time because Mask R-CNN takes about 200 milliseconds to process one frame [40], whereas Faster R-CNN takes 55 milliseconds with a ResNet-101 backbone. Since the processing time also increases with the number of objects, we detected bounding boxes using Faster R-CNN and then separated the fruit pixels using k-means clustering.
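As an illustration of this post-processing step, the sketch below assumes OpenCV and a BGR crop taken from one detected bounding box; the rule used to decide which of the two clusters is the fruit (the cluster with the redder centre) is our own assumption, since the text does not specify it.

```python
import cv2
import numpy as np

def segment_fruit(crop_bgr, k=2):
    # Flatten the bounding-box crop into a list of pixels and cluster them
    # into k = 2 groups (fruit vs. background) with k-means.
    pixels = crop_bgr.reshape(-1, 3).astype(np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 10, 1.0)
    _, labels, centers = cv2.kmeans(pixels, k, None, criteria, 5,
                                    cv2.KMEANS_PP_CENTERS)
    labels = labels.flatten()
    # Heuristic (assumption, not stated in the paper): treat the cluster whose
    # centre is "redder" (higher R minus G) as the fruit cluster.
    centers_rgb = centers[:, ::-1]  # BGR -> RGB order
    fruit_cluster = int(np.argmax(centers_rgb[:, 0] - centers_rgb[:, 1]))
    return (labels == fruit_cluster).reshape(crop_bgr.shape[:2])
```

Running k-means only on the small crop, rather than on the full frame, keeps this step fast enough for real-time use, which matches the motivation given above for avoiding Mask R-CNN.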
To track and count the detected objects without duplication in the real-time video, centroid-based object tracking was used. The center point of each bounding box, the centroid, was computed and assigned a unique ID (identification). When the video frame was updated and new points were obtained, each new point was associated with the existing centroid that minimized the Euclidean distance to it. If a point could not be associated with any existing centroid, it was given a new ID.
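The tracking logic can be sketched as follows; the distance threshold and the greedy nearest-neighbour matching are illustrative assumptions, as the text only describes the Euclidean-distance association rule and the assignment of new IDs.

```python
import numpy as np

class CentroidTracker:
    # Greedy centroid-based tracker: each detection is reduced to the centre of
    # its bounding box, matched to the nearest existing centroid by Euclidean
    # distance, and otherwise given a new persistent ID for counting.
    def __init__(self, max_distance=50.0):
        self.next_id = 0
        self.objects = {}                  # ID -> (x, y) centroid
        self.max_distance = max_distance   # association threshold (assumed)

    def update(self, boxes):
        for x1, y1, x2, y2 in boxes:
            c = np.array([(x1 + x2) / 2.0, (y1 + y2) / 2.0])
            if self.objects:
                ids = list(self.objects.keys())
                dists = [np.linalg.norm(c - self.objects[i]) for i in ids]
                j = int(np.argmin(dists))
                if dists[j] < self.max_distance:
                    self.objects[ids[j]] = c   # update the associated track
                    continue
            self.objects[self.next_id] = c     # start a new track (new ID)
            self.next_id += 1
        return self.objects
```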
3.2.1. Maturity
In this study, tomato maturity was divided into six levels (Green, Breakers, Turning,
Pink, Light Red, and Red) according to the USDA (United States Department of Agricul-
ture) standard [32]. Table 1 shows the maturity levels of the tomatoes referenced in this
study; the maturity levels are classified by the ratio of the red region. Thus, we considered
the quantitative classification of tomato maturity through the use of both the USDA stand-
ard for tomato maturity and image processing; this process is detailed in Section 3.2.
Maturity Description
Turning Over 10% but not more than 30% red or pink
Pink Over 30% but not more than 60% pinkish or red
Light Red Over 60% but not more than 90% red
In general, the color changes in tomato fruits are closely related to the accumulated temperature, which is the integrated excess or deficiency of temperature relative to a fixed datum. In particular, the accumulated temperature can be used in crop growth models, and it may prove increasingly important for assessing the impact of climate change [41]. The maturity levels of tomato fruits may therefore be estimated using the accumulated temperature, and it is known that the accumulated temperatures at the light red and red stages are approximately 1000 and 1100 °C·day, respectively [42].
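For reference, a degree-day style accumulation consistent with this description could be computed as sketched below; the daily time step and the base temperature are assumptions, since the text does not state the datum used.

```python
def accumulated_temperature(daily_mean_temps, base_temp=0.0):
    # Accumulated temperature (degree-days): sum of the daily mean temperature
    # excess above a fixed base temperature. The base temperature here is a
    # placeholder; the cited sources' datum is not given in the text.
    return sum(max(t - base_temp, 0.0) for t in daily_mean_temps)

# Per the text, the light-red and red stages correspond to roughly
# 1000 and 1100 degC*day of accumulated temperature, respectively.
```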
The region of interest (ROI) for each tomato was segmented, and the pixel values of the red, green, and blue (RGB) channels were extracted from each tomato. For each channel of the RGB image, the pixel values within the ROI were averaged. Then, the averaged values for nine images were averaged again to represent each RGB color characteristic, c̄, in relation to the accumulated temperature of the day; Equation (1) describes this process. Note that the value of n was 9 in our case, because n refers to the number of target fruits. Correlations between the accumulated temperature and the c̄ values of each channel are shown in Figure 6. As shown in Figure 6, there was no clear correlation between the RGB values and the accumulated temperature. The RGB color model is considered an additive color model [43]: it produces colors by mixing the primary colors of light (red, green, and blue), and its channel values are therefore affected by lightness.
\bar{c} = \frac{1}{n}\sum_{i=1}^{n} c_i \quad (1)

where c_i is the averaged pixel value of a given channel for fruit i.
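Read as code, Equation (1) simply averages the per-fruit channel means; the sketch below assumes the segmented ROIs are available as RGB arrays.

```python
import numpy as np

def daily_channel_means(fruit_rois):
    # Equation (1): c_i is the mean pixel value of one channel within the ROI
    # of fruit i; c-bar averages c_1..c_n over the n target fruits (n = 9 here).
    # fruit_rois: list of HxWx3 RGB arrays, one segmented ROI per fruit.
    per_fruit = np.array([roi.reshape(-1, 3).mean(axis=0) for roi in fruit_rois])
    return per_fruit.mean(axis=0)  # c-bar for the R, G and B channels
```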
The light in the greenhouse was unstable due to the sunlight and the shading system. In this case, the image brightness changed according to several environmental factors, such as the weather, the circumstances of the greenhouse, and so on. Thus, RGB values were not suitable for the analysis of skin color changes in tomatoes growing in the greenhouse, because RGB values are significantly affected by these factors. For this reason, to determine the relationship between the maturity level (i.e., the accumulated temperature) and the image characteristics, it was necessary to conduct image pre-processing, such as gamma correction, in addition to finding robust features that are resistant to changes in brightness.
Figure 7. The grayscale area of the color checker obtained from the image.
Table 2 indicates the changes in the color-model values of tomatoes according to brightness. Although all of the tomatoes were at a similar maturity stage, the brightness differed by up to 93.68, which is a large difference given that each pixel can only take values from 0 to 255 and reflects how unstable the light conditions in the greenhouse were. The red and green values also showed no consistent tendency. This difference could affect the RGB values, which contain color and lightness information simultaneously, in which case the quantification of maturity from the image becomes difficult. In Table 2, the red and green values also differed significantly, with standard deviations as high as 26.24 and 39.35, respectively.
Brightness (Accumulated Temperature)    37.82 (982.0 °C·day)    68.92 (1115.1 °C·day)    131.50 (1066.6 °C·day)    Standard Deviation
Red            8.45      26.75      70.95     26.24
Green         27.53      49.80     119.88     39.35
Blue          78.02     144.80     233.1      63.51
Hue           11.18       8.28      10.475     1.23
Saturation   223.90     205.13     175.23     20.04
Value         78.45     145.23     233.33     63.43
L*            46.08      90.43     162.00     47.76
a*           149.53     165.13     167.38      7.94
b*           150.68     163.40     174.80      9.85
To find color features that could overcome the influence of external light, we converted the RGB color model to the HSV and L*a*b* models, and the obtained average values of each channel are shown in Table 2. When the standard deviations between the three images were compared, the hue value showed the smallest difference, 1.23. Figure 9 shows the correlation between the hue values and the accumulated temperature; a linear regression model was fitted, and the R-squared value was 0.96. It can be seen that the hue channel value was robust to external light changes and had a linear relationship with the accumulated temperature. For these reasons, the HSV color model was applied to classify the maturity level.
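A sketch of the hue extraction used for classification is given below, assuming OpenCV's 8-bit HSV representation (hue stored in the range 0–179) and the boolean fruit mask produced by the k-means step; the exact hue scale used in the study is not stated.

```python
import cv2
import numpy as np

def fruit_mean_hue(crop_bgr, fruit_mask):
    # Convert the detected crop to the HSV color model and average the hue
    # channel over the pixels assigned to the fruit by k-means (Section 3.1.2).
    hsv = cv2.cvtColor(crop_bgr, cv2.COLOR_BGR2HSV)
    hue = hsv[:, :, 0].astype(np.float32)   # OpenCV stores 8-bit hue as 0-179
    return float(hue[fruit_mask].mean())
```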
Using our temperature–hue data, we divided the maturity into six levels. The point at which the hue value began to decrease was set as the “Green” standard, and the “Red” standard was set at the point at which the hue stopped changing and the accumulated temperature reached about 1100 °C·day. The range between these two points was then divided into four sections. Figure 10 shows the relationship between the hue color model and the maturity levels of tomato fruits, and the image-based maturity standard for tomato fruits was defined by this relationship. In addition, we confirmed the relationship between the color model and the maturity levels with the assistance of three experts who run tomato farms. Twenty tomato fruits were classified into six levels of maturity by the three experts, and the a* value of each tomato fruit was measured using a portable colorimeter (CR20, Konica Minolta, Tokyo, Japan). The a* values obtained from the images were calculated as average values. As a result, it was confirmed that the a* values calculated from the images were within the range of each maturity level, as classified by the three experts (Figure 11).
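The resulting standard can be applied as a simple threshold rule, sketched below; the endpoint values hue_green and hue_red are placeholders for the values read off the temperature–hue curve, which are not given numerically in the text.

```python
LEVELS = ["Green", "Breakers", "Turning", "Pink", "Light Red", "Red"]

def maturity_level(hue, hue_green, hue_red):
    # Map a mean hue value to one of six maturity levels. Hue decreases as the
    # tomato ripens (green -> red), so hue_green > hue_red; the span between
    # the two endpoints is split into four equal sections, as described above.
    if hue >= hue_green:
        return "Green"
    if hue <= hue_red:
        return "Red"
    step = (hue_green - hue_red) / 4.0           # four intermediate sections
    index = int((hue_green - hue) // step) + 1   # 1..4 -> Breakers..Light Red
    return LEVELS[min(index, 4)]
```

For a measured mean hue h, maturity_level(h, hue_green, hue_red) returns one of the six labels used in Table 1.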
Figure 10. The relationship between hue color model and maturity level for the tomato fruits.
Figure 11. Comparison of the measured a * value through expert classification, and the a * values
calculated from images.
Figure 11 shows the relationship between the measured and calculated a * values.
The hue values obtained from the images were used for monitoring the maturity levels of the tomato fruits.
Figure 12. The test environments: (a) the robot on the rail in the greenhouse; (b) the detection area.
The test was repeated 10 times in the same area to evaluate the detection and counting performance. We scored the system using an accuracy value defined as the ratio of true predictions to the total number of objects. The developed system achieved 88.6% detection accuracy when fruits that were not captured by the camera because they were completely obscured were included. The actual number of tomatoes was 53, and the average of the 10 predicted counts was 54.4. Over the 10 runs, fewer than 53 objects were counted twice, exactly 53 objects were counted four times, and 54 or more objects were counted in the remaining four runs. Meanwhile, when the completely obscured fruits were excluded, the system accuracy was 90.2%. Duplicate detections sometimes occurred for 4 to 6 objects, and in one case the same fruit was counted twice, leading to a duplicate error.
error. As mentioned in the above section, we verified the image-based maturity standard
through comparison with expert classification, and we confirmed that the maturity stand-
ard could be used to monitor the maturity level of tomato fruits in hydroponic green-
houses. However, the maturity standard in the field may differ depending on the expert,
farm, and so on. Thus, it will be necessary to collect more classification data from experts
in order to achieve more comprehensive classification in the future. This method has the
potential to predict the harvest times of tomato fruits according to their maturity levels.
As mentioned in Section 2, a GPU was included in our hardware system, which meant that the processing time was sufficiently low to run deep learning-based programs in real-time. Table 3 describes the processing time during the field test. For a given frame, the inference of object detection using Faster R-CNN took 0.16 s, and the total processing time was 0.18 s on average. The robot in Figure 12a moved at 0.16 m/s; therefore, the processing time was appropriate for the monitoring of tomato fruits in real-time.
In fact, the processing time could be longer when many objects are present, as shown in Figure 13, since inference then takes significantly longer and each output bounding box must pass through the whole process from k-means clustering to the centroid-tracking stage. Even in this case, a speed of at least 2 FPS was achieved while measuring 10 objects.
Table 3. Processing speed of monitoring with the NVIDIA RTX 2080 Ti GPU.
5. Conclusions
This article presents a real-time robotic system for monitoring the generative growth
of tomato fruits. The system photographed tomato fruits in real-time with an RGB camera,
detected their presence, and classified their maturity into six levels. It also counted the
number of tomato fruits at each maturity level. Tomatoes grown under hydroponic conditions were detected using a deep learning-based object detection model, Faster R-CNN, and
the tomato fruit regions were separated from the background region using the k-means
clustering algorithm. We converted the frame image from an RGB to an HSV color model
and the maturity was classified into six levels using the mean hue values of the fruit re-
gions. In general, a one-step deep learning method is used to detect fruits and classify their maturity levels at the same time. However, this method might not be suitable for the uniform classification of maturity levels, because there is no quantitative standard for classifying these maturity levels. Thus, we adopted a two-step method that separates the detection and classification, and we found that the developed system has the
potential to monitor the maturity levels of tomato fruits in hydroponic greenhouses. Each
object was identified for counting purposes using the centroid-based object tracking algo-
rithm.
The presented system was able to monitor the numbers and maturity levels of tomato
fruits with appropriate accuracy. Thus, we conclude that the presented system could be
useful in the prediction of the harvest times and production levels of tomatoes, and that
it could also be applied to develop a tomato harvesting robot. However, to ensure greater
accuracy in the detection and counting of tomato fruits in hydroponic greenhouses, the
detection performance of the model must be improved, and the occlusion and duplicate
problems need to be solved.
Author Contributions: D.S., as first author, planned the experiments and wrote the manuscript. B.-H.C. and K.K. led the overall research as corresponding authors and helped revise the manuscript.
All authors have read and agreed to the published version of the manuscript.
Funding: This work was supported by Korea Institute of Planning and Evaluation for Technology
in Food, Agriculture and Forestry (IPET) and Korea Smart Farm R&D Foundation through Smart
Farm Innovation Technology Development Program, funded by MAFRA, MSICT and RDA (421031-
04).
Data Availability Statement: The data presented in this study are available on request from the
corresponding author. The data are not publicly available due to privacy reasons.
Conflicts of Interest: The authors declare no conflict of interest.
References
1. Yang, D.; Li, H.; Zhang, L. Study on the fruit recognition system based on machine vision. Adv. J. Food Sci. Technol. 2016, 10, 18–
21.
2. Tang, Y.; Chen, M.; Wang, C.; Luo, L.; Li, J.; Lian, G.; Zou, X. Recognition and localization methods for vision-based fruit picking
robots: A review. Front. Plant Sci. 2020, 11, 1–17. https://doi.org/10.3389/fpls.2020.00510.
3. Zhang, Q.; Karkee, M.; Tabb, A. The use of agricultural robots in orchard management. In Robotics and Automation for Improving
Agriculture; Billingsley, J., Ed.; Burleigh Dodds Science Publishing: Cambridge, UK, 2019; pp. 187–214.
https://arxiv.org/abs/1907.13114.
4. Srinivasan, N.; Prabhu, P.; Smruthi, S.S.; Sivaraman, N.V.; Gladwin, S.J.; Rajavel, R.; Natarajan, A.R. Design of an autonomous
seed plating robot. In Proceedings of the 2016 IEEE Region 10 Humanitarian Technology Conference (R10-HTC), Agra, India,
21–23 December 2016; pp. 1–4. https://doi.org/10.1109/R10-HTC.2016.7906789.
5. Santhi, P.V.; Kapileswar, N.; Chenchela, V.K.R.; Prasad, C.H.V.S. Sensor and vision based autonomous AGRIBOT for sowing
seeds. In Proceedings of the 2017 International Conference on Energy, Communication, Data Analysis and Soft Computing
(ICECDS), Chennai, India, 1–2 August 2017; pp. 242–245. https://doi.org/10.1109/ICECDS.2017.8389873.
6. Khuantham, C.; Sonthitham, A. Spraying robot controlled by application smartphone for pepper farm. In Proceedings of the
2020 International Conference on Power, Energy and Innovations (ICPEI), Chiangmai, Thailand, 14–16 October 2020; pp. 225–
228. https://doi.org/10.1109/ICPEI49860.2020.9431544.
7. Cantelli, L.; Bonaccorso, F.; Longo, D.; Melita, C.D.; Schillaci, G.; Muscato, G. A small versatile electrical robot for autonomous spraying in agriculture. AgriEngineering 2019, 1, 391–402. https://doi.org/10.3390/agriengineering1030029.
8. Danton, A.; Roux, J.C.; Dance, B.; Cariou, C.; Lenain, R. Development of a spraying robot for precision agriculture: An edge
following approach. In Proceedings of the 2020 IEEE Conference on Control Technology and Applications (CCTA), Montreal,
QC, Canada, 24–26 August 2020; pp. 267–272. https://doi.org/10.1109/CCTA41146.2020.9206304.
9. Murugan, K.; Shankar, B.J.; Sumanth, A.; Sudharshan, C.V.; Reddy, G.V. Smart automated pesticide spraying bot. In Proceed-
ings of the 2020 3rd International Conference on Intelligent Sustainable Systems (ICISS), Thoothukudi, India, 3–5 December
2020; pp. 864–868. https://doi.org/10.1109/ICISS49785.2020.9316063.
10. Mu, L.; Cui, G.; Liu, Y.; Cui, Y.; Fu, L.; Gejima, Y. Design and simulation of an integrated end-effector for picking kiwifruit by
robot. Inf. Process. Agric. 2020, 7, 58–71. https://doi.org/10.1016/j.inpa.2019.05.004.
11. Arad, B.; Balendonck, J.; Barth, R.; Ben-Shahar, O.; Edan, Y.; Hellström, T.; Hemming, J.; Kurtser, P.; Ringdahl, O.; Tielen, T.;
Tuiji, B.V. Development of a sweet pepper harvesting robot. J. Field Robot. 2020, 37, 1027–1039. https://doi.org/10.1002/rob.21937.
12. Xiong, Y.; Ge, Y.; Grimstad, L.; From, P.J. An autonomous strawberry-harvesting robot: Design, development, integration, and
field evaluation. J. Field Robot. 2020, 37, 202–224. https://doi.org/10.1002/rob.21889.
13. Kuznetsova, A.; Maleva, T.; Soloviev, V. Using YOLOv3 algorithm with pre- and post-processing for apple detection in fruit-
harvesting robot. Agronomy 2020, 10, 1016. https://doi.org/10.3390/agronomy10071016.
14. Taqi, F.; Al-Langawi, F.; Abdulraheem, H.; El-Abd, M. A cherry-tomato harvesting robot. In Proceedings of the 2017 18th Inter-
national Conference on Advanced Robotics (ICAR), Hong Kong, China, 10–12 July 2017; pp. 463–468.
https://doi.org/10.1109/ICAR.2017.8023650.
15. Badeka, E.; Vrochidou, E.; Papakostas, G.A.; Pachidis, T.; Kaburlasos, V.G. Harvest crate detection for grapes harvesting robot
based on YOLOv3 model. In Proceedings of the 2020 Fourth International Conference On Intelligent Computing in Data Sci-
ences (ICDS), Fez, Morocco, 21–23 October 2020; pp. 1–5. https://doi.org/10.1109/ICDS50568.2020.9268751.
16. Chou, W.C.; Tsai, W.R.; Chang, H.H.; Lu, S.Y.; Lin, K.F.; Lin, P.L. Prioritization of pesticides in crops with a semi-quantitative
risk ranking method for Taiwan postmarket monitoring program. J. Food Drug Anal. 2019, 27, 347–354.
https://doi.org/10.1016/j.jfda.2018.06.009.
17. Ravankar, A.; Ravankar, A.A.; Watanabe, M.; Hoshino, Y.; Rawankar, A. Development of a low-cost semantic monitoring sys-
tem for vineyards using autonomous robots. Agriculture 2020, 10, 182. https://doi.org/10.3390/agriculture10050182.
18. Kim, W.S.; Lee, D.H.; Kim, Y.J.; Kim, T.; Lee, W.S.; Choi, C.H. Stereo-vision-based crop height estimation for agricultural robots.
Comput. Electron. Agric. 2021, 181, 105937. https://doi.org/10.1016/j.compag.2020.105937.
19. Fernando, S.; Nethmi, R.; Silva, A.; Perera, A.; De Silva, R.; Abeygunawardhana, P.K.W. Intelligent disease detection system for
greenhouse with a robotic monitoring system. In Proceedings of the 2020 2nd International Conference on Advancements in
Computing (ICAC), Malabe, Sri Lanka, 10–11 December 2020; pp. 204–209. https://doi.org/10.1109/ICAC51239.2020.9357143.
20. Yoon, C.; Lim, D.; Park, C. Factors affecting adoption of smart farms: The case of Korea. Comput. Hum. Behav. 2020, 108, 106309.
https://doi.org/10.1016/j.chb.2020.106309.
21. Santos, L.C.; Aguiar, A.S.; Santos, F.N.; Valente, A.; Petry, M. Occupancy grid and topological maps extraction from satellite
images for path planning in agricultural robots. Robotics 2020, 9, 77. https://doi.org/10.3390/robotics9040077.
22. Moysiadis, V.; Sarigiannidis, P.; Vitsas, V.; Khelifi, A. Smart farming in Europe. Comput. Sci. Rev. 2021, 39, 100345. https://doi.org/10.1016/j.cosrev.2020.100345.
23. Rong, J.; Wang, P.; Yang, Q.; Huang, F. A field-tested harvesting robot for oyster mushroom in greenhouse. Agronomy 2021, 11,
1210. https://doi.org/10.3390/agronomy11061210.
24. Liu, G.; Nouaze, J.C.; Mbouembe, P.L.T.; Kim, J.H. YOLO-tomato: A robust algorithm for tomato detection based on YOLOv3.
Sensors 2020, 20, 2145. https://doi.org/10.3390/s20072145.
25. Lawal, M.O. Tomato detection based on modified YOLOv3 framework. Sci. Rep. 2021, 11, 1447. https://doi.org/10.1038/s41598-
021-81216-5.
26. Afonso, M.; Fonteijn, H.; Fiorentin, F.S.; Lensink, D.; Mooij, M.; Faber, N.; Polder, G.; Wehrens, R. Tomato fruit detection and
counting in greenhouses using deep learning. Front. Plant Sci. 2020, 11, 571299. https://doi.org/10.3389/fpls.2020.571299.
27. Hu, C.; Liu, X.; Pan, Z.; Li, P. Automatic detection of single ripe tomato on plant combining Faster R-CNN and intuitionistic
Fuzzy set. IEEE Access 2019, 7, 154683–154696. https://doi.org/10.1109/ACCESS.2019.2949343.
28. Iwasaki, Y.; Yamane, A.; Itoh, M.; Goto, C.; Matsumuto, H.; Takaichi, M. Demonstration of year-round production of tomato
fruits with high soluble-solids content by low node-order pinching and high-density planting. Bull. NARO Crop. Sci. 2019, 3,
41–51. http://doi.org/10.24514/00002002.
29. FAOSTAT. Available online: http://www.fao.org/faostat/en/#home (accessed on 15 August 2021).
30. Alexander, L.; Grierson, D. Ethylene biosynthesis and action in tomato: A model for climacteric fruit ripening. J. Exp. Bot. 2002,
53, 2039–2055. https://doi.org/10.1093/jxb/erf072.
31. Garcia, M.B.; Ambat, S.; Adao, R.T. Tomayto, tomahto: A machine learning approach for tomato ripening stage identification
using pixel-based color image classification. In Proceedings of the 2019 IEEE 11th International Conference on Humanoid, Nan-
otechnology, Information Technology, Communication and Control, Environment, and Management (HNICEM), Laoag, Phil-
ippines, 29 November 2019–1 December 2019; pp. 1–6. https://doi.org/10.1109/HNICEM48295.2019.9072892.
32. Rupanagudi, S.R.; Ranjani, B.S.; Nagaraj, P.; Bhat, V.G. A cost effective tomato maturity grading system using image processing
for farmers. In Proceedings of the 2014 International Conference on Contemporary Computing and Informatics (IC3I), Mysore,
India, 27–29 November 2014; pp. 7–12. https://doi.org/10.1109/IC3I.2014.7019591.
33. Pacheco, W.D.N.; Lopez, F.R.J. Tomato classification according to organoleptic maturity (coloration) using machine learning
algorithms K-NN, MLP, and K-Means Clustering. In Proceedings of the 2019 XXII Symposium on Image, Signal Processing and
Artificial Vision (STSIVA), Bucaramanga, Colombia, 24–26 April 2019; pp. 1–5. https://doi.org/10.1109/STSIVA.2019.8730232.
34. Girshick, R.; Donahue, J.; Darrell, T.; Malik, J. Rich feature hierarchies for accurate object detection and semantic segmentation.
In Proceedings of the 2014 IEEE Conference on Computer Vision and Pattern Recognition, Columbus, OH, USA, 23–28 June 2014; pp. 580–587.
https://doi.org/10.1109/CVPR.2014.81.
35. Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You Only Look Once: Unified, Real-Time Object Detection. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 779–788.
https://doi.org/10.1109/CVPR.2016.91.
36. Ren, S.; He, K.; Girshick, R.; Sun, J. Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks. IEEE Trans. Pattern Anal. Mach. Intell. 2017, 39, 1137–1149. https://doi.org/10.1109/TPAMI.2016.2577031.
37. Liu, W.; Anguelov, D.; Erhan, D.; Szegedy, C.; Reed, S.; Fu, C.Y.; Berg, A.C. SSD: Single shot multibox detector. In Proceedings
of the European Conference on Computer Vision, Amsterdam, The Netherlands, 11–14 October 2016; pp. 21–37.
https://doi.org/10.1007/978-3-319-46448-0_2.
38. He, K.; Zhang, X.; Ren, S.; Sun, J. Deep Residual Learning for Image Recognition. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 770–778. https://doi.org/10.1109/CVPR.2016.90.
39. Make ML, Tomato Dataset, Make ML. Available online: https://makeml.app/datasets/tomato (accessed on 25 August 2021).
40. He, K.; Gkioxari, G.; Dollár, P.; Girshick, R. Mask R-CNN. In Proceedings of the 2017 IEEE International Conference on Computer Vision, Venice, Italy, 22–29 October 2017; pp. 2980–2988. https://doi.org/10.1109/ICCV.2017.322.
41. Hallett, S.H.; Jones, R.J.A. Compilation of an accumulated temperature database for use in an environmental information system. Agric. For. Meteorol. 1993, 63, 21–34. https://doi.org/10.1016/0168-1923(93)90020-I.
42. Harvest Timer. Available online: https://harvest-timer.com (accessed on 23 August 2021).
43. Hirsch, R. Exploring Colour Photography: A Complete Guide; Laurence King Publishing: London, UK, 2004; ISBN 1-85669-420-8.