
2017 2nd International Conference on Image, Vision and Computing

Inception-v3 for Flower Classification

Xiaoling Xia
College of Computer Science, Donghua University
Shanghai, China
e-mail: [email protected]

Cui Xu
College of Computer Science, Donghua University
Shanghai, China
e-mail: [email protected]

Bing Nan
College of Computer Science, Donghua University
Shanghai, China

Abstract-The study of flower classification systems is a very important subject in the field of Botany. A flower classifier with high accuracy will also bring a lot of fun to people's lives. However, because of the complex backgrounds of flower images, the similarity between different species of flowers, and the differences among flowers of the same species, there are still some challenges in the recognition of flower images. Traditional flower classification is mainly based on three features: color, shape and texture. This kind of classification requires people to select the features manually, and its accuracy is not very high. In this paper, based on the Inception-v3 model of the TensorFlow platform, we use transfer learning to retrain the model on flower category datasets, which can greatly improve the accuracy of flower classification.

Keywords-flower classification; TensorFlow; Inception-v3; transfer learning

I. INTRODUCTION

Flower classification is a fundamental research topic in the field of Botany. Up to now, hundreds of thousands of species of flowers have been found, making flowers one of the most prosperous groups of species in the world. With the development of the economy and technology, more and more people love to travel in the blooming season. At the same time, people use cameras, mobile phones and other devices to photograph flowers, but they often get confused because they do not know the species of the flowers. Therefore, the design of a flower classifier will also bring a lot of fun to people's lives. There are some challenges in flower classification: the background of a flower image is complex and there is similarity between different species of flowers, so we cannot rely on a single feature such as color, shape or texture to distinguish the species; in addition, flowers of the same species can look different because of variations in shape, scale, viewpoint and so on.

The most primitive method of flower classification is to observe the living habits, morphological structure and other features of flowers, compare them with registered flowers, and ultimately determine the type of flower. This classification method is entirely manual, the workload is large, and it requires the guidance of professional staff with a wealth of professional knowledge and experience. With the development of computer technology and digital image processing technology, people began to explore methods of automatic flower classification by computer. These methods are mainly based on the color, shape, texture and other features of the flowers, which are used to calculate the similarity between flower images and then determine the species. In 2004, Jie Zou and George Nagy from Rensselaer Polytechnic Institute introduced the concept of Computer Assisted Visual InterActive Recognition (CAVIAR) and implemented a flower recognition system [14]. In 2004, Takeshi Saitoh et al. from Toyohashi University of Technology proposed an automatic method for recognizing a blooming flower; it was evaluated on 600 pictures and obtained a flower recognition rate of 90% [15]. In 2006, Maria-Elena Nilsback and Andrew Zisserman from the University of Oxford developed a visual vocabulary that explicitly represents the various aspects (colour, shape and texture) that distinguish one flower from another, which can overcome the ambiguities that exist between flower categories [13]. They proposed an algorithm for automatically segmenting flowers in 2007 [12]. In 2008, Nilsback and Zisserman computed four different features for the flowers and combined them using a multiple kernel framework with an SVM classifier; the accuracy on the Oxford-102 flower dataset reached 88.33% [11]. They also perfected the algorithm for automatically segmenting flowers in 2010 [9]. In 2013, Anelia Angelova and Shenghuo Zhu proposed an efficient object detection and segmentation method for fine-grained recognition [7]. In 2014, Xiaodong Xie studied fine-grained flower classification on the Oxford-17 flower dataset (recognition accuracy 93.14%) and the Oxford-102 flower dataset (recognition accuracy 79.1%) [6]. During this period, people paid more attention to image segmentation and artificial feature selection. These traditional computer classification methods are not fully automatic: the feature selection process requires human intervention, the accuracy of feature selection directly affects the overall classification, and the resulting accuracy is not very high.

Convolutional neural networks are an efficient recognition method developed in recent years. This kind of network avoids complex preprocessing of the image; the original image can be input directly.

It uses local receptive fields, weight sharing and pooling, which greatly reduces the number of training parameters compared with a fully connected neural network. It also provides a certain degree of invariance to translation, rotation and distortion of the image, and it has made great progress in the field of image classification.

TensorFlow [1] is the second generation of the artificial intelligence learning system developed by Google. It supports convolutional neural networks (CNN), recurrent neural networks (RNN) and other deep neural network models, and can be used in speech recognition, image recognition and many other deep learning fields.

In this paper, we use transfer learning to retrain the Inception-v3 [3] model of the TensorFlow [1] platform on flower category datasets [13] [11]. We implemented an effective flower classification model with a short training time and achieved a higher accuracy. The rest of this paper is organized as follows: Part II introduces related work on flower classification; Part III introduces the construction process of the flower classification model; Part IV proves the validity of the model through experiments.

II. RELATED WORK

Up to now, hundreds of thousands of species of flowers have been found. Research on flower classification widely uses the Oxford-17 flower dataset and the Oxford-102 flower dataset; therefore, we also use these two datasets to train our flower classification model.

Compared with traditional flower classification methods, convolutional neural networks use multiple convolutional layers to extract features and combine the features automatically. They have a higher recognition rate and a wider range of applications.

TensorFlow [1], as the second generation of Google's artificial intelligence learning system, has received much attention and affirmation in the field of machine learning all over the world, and has so far ranked first among machine learning and deep learning frameworks. TensorFlow has the advantages of high availability and high flexibility, and with the support of the TensorFlow research community its efficiency keeps improving. Today, Google has released a number of pretrained models on TensorFlow's official website to facilitate their use by researchers in different fields.

Inception-v3 [3] is one of the pretrained models available for TensorFlow. It is a rethinking of the original Inception architecture for computer vision, following Inception-v1 [5] and Inception-v2 [4], and was published in 2015. The Inception-v3 [3] model is trained on the ImageNet dataset and contains the information needed to identify the 1000 ImageNet classes; its top-5 error rate is 3.5% and its top-1 error rate drops to 17.3%. TensorFlow [1] also provides a detailed tutorial for retraining Inception's final layer for new categories using transfer learning.

Transfer learning is a machine learning method which uses existing knowledge learned in one setting to solve a new problem that is different from, but related to, the old one. For example, we can apply the knowledge learned from a motorcycle recognition problem to the study of a bicycle recognition problem. Compared with training a traditional neural network from scratch, it only needs a small amount of data to train the model and achieves high accuracy with a short training time.

III. CONSTRUCTION OF FLOWER CLASSIFICATION MODEL

This section focuses on the construction process of the flower classification model, which is divided into four steps: image preprocessing, training, verification and testing.

A. Image Preprocessing

The learning method of a convolutional neural network belongs to supervised learning, so in the image preprocessing step we need to label the data (a sketch of one possible labelling layout is given below).

B. Inception-v3 Model

The main graph of the Inception-v3 [3] model is shown in Figure 1 at the end of the paper.

Figure 1. Main graph of Inception-v3 model.
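As an illustration of the labelling step described in Section III-A, one common way to label an image dataset is to place every image in a sub-directory named after its class, so that the directory name is the label. The paper does not give code for this step; the sketch below is a minimal example in which the source and destination paths and the file-naming scheme are purely illustrative assumptions.

```python
import os
import shutil

# Illustrative paths; adjust to the real location of the flower images.
SRC_DIR = "flowers_raw"        # unlabelled images
DST_DIR = "flowers_labelled"   # one sub-directory per class = the label

# Assumption: each file name starts with its class name, e.g. "daffodil_001.jpg".
for fname in os.listdir(SRC_DIR):
    if not fname.lower().endswith(".jpg"):
        continue
    label = fname.rsplit("_", 1)[0]           # class name taken from the file name
    class_dir = os.path.join(DST_DIR, label)
    os.makedirs(class_dir, exist_ok=True)     # create the class directory if needed
    shutil.copy(os.path.join(SRC_DIR, fname), os.path.join(class_dir, fname))
```

A directory-per-class layout of this kind is also what the TensorFlow image-retraining tutorial mentioned in Section II expects as its input.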

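To make Section III-B more concrete, the sketch below shows how the frozen Inception-v3 graph released by Google can be loaded and used as a fixed feature extractor, producing the 2048-dimensional output of the last pooling layer (the bottleneck) for a single image. This is only a sketch under stated assumptions: it assumes a TensorFlow 1.x environment and uses the file name and tensor names from the official image-retraining example; none of these identifiers come from the paper itself.

```python
import tensorflow as tf  # assumes a TensorFlow 1.x environment

GRAPH_PB = "classify_image_graph_def.pb"   # frozen Inception-v3 graph (illustrative path)
IMAGE = "example_flower.jpg"               # any JPEG image (illustrative path)

# Load the pretrained graph definition from disk.
with tf.gfile.GFile(GRAPH_PB, "rb") as f:
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())

with tf.Graph().as_default():
    tf.import_graph_def(graph_def, name="")
    with tf.Session() as sess:
        # Tensor names as used in TensorFlow's image-retraining example (assumption).
        jpeg_input = sess.graph.get_tensor_by_name("DecodeJpeg/contents:0")
        bottleneck = sess.graph.get_tensor_by_name("pool_3/_reshape:0")
        with open(IMAGE, "rb") as img:
            # One forward pass: raw JPEG bytes in, 2048-dimensional feature vector out.
            features = sess.run(bottleneck, {jpeg_input: img.read()})
        print(features.shape)   # expected: (1, 2048)
```

In the transfer learning setup described next, only a new classification layer on top of these bottleneck features is trained, while all earlier layers keep their ImageNet parameters.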
C. Transfer Learning Based on Inception-v3 Model

The Inception-v3 [3] network is a deep neural network; it is very difficult for us to train it directly on a low-end computer, as this would take at least a few days. TensorFlow [1] provides a tutorial for retraining Inception's final layer for new categories using transfer learning. We use a transfer learning method which keeps the parameters of the previous layers, removes the last layer of the Inception-v3 [3] model, and then retrains a new last layer. The number of output nodes in the last layer is equal to the number of categories in the dataset. For example, the ImageNet dataset has 1000 classes, so the last layer of the original Inception-v3 model has 1000 output nodes.

IV. EXPERIMENT

This experiment is based on the Inception-v3 [3] model on the TensorFlow [1] platform. The hardware platform is a MacBook Pro with a 2.9 GHz Intel i7 processor and 8 GB of 1600 MHz DDR3 memory. The experimental datasets are the Oxford-17 flower dataset and the Oxford-102 flower dataset.

The rest of this section is organized as follows: first, we give a brief introduction to the datasets; second, we describe the experimental procedure in detail; then, we show the results of the experiment; finally, we verify the effectiveness of the method through a comparison experiment.

A. Dataset

Oxford-17 flower dataset [13]: this dataset was created by Maria-Elena Nilsback and Andrew Zisserman from the University of Oxford in 2006 and contains 17 species of flowers, with 80 images per species. The selected flowers are common in Britain. In this dataset the scale, pose and lighting of the images vary greatly, and there is similarity between different species of flowers as well as differences between flowers of the same species.

Oxford-102 flower dataset [11]: this dataset contains 102 species of flowers and was also created by Maria-Elena Nilsback and Andrew Zisserman, in 2008; each species contains 40 to 258 images. Compared with the Oxford-17 flower dataset, the Oxford-102 flower dataset contains more species and there are more similarities between different species, so the classification task is more complex.

There are many types of flowers in the two datasets and it is difficult to display them all in the paper, so we select 28 kinds of flowers from the Oxford-102 flower dataset [11] as a sample. Figure 2 shows examples of these 28 species.

(Figure 2 contains one sample image per class: alpine sea holly, anthurium, artichoke, azalea, ball moss, balloon flower, barbeton daisy, bearded iris, bee balm, bird of paradise, bishop of llandaff, blackberry lily, black-eyed susan, blanket flower, bolero deep blue, bougainvillea, bromelia, buttercup, californian poppy, camellia, canna lily, canterbury bells, cape flower, carnation, cautleya spicata, clematis, colt's foot and columbine.)
Figure 2. The example of 28 species of flowers in Oxford-102 flower dataset.

B. Experimental Procedure

Image preprocessing. For the Oxford-17 flower dataset [13], each species has 80 flower images, so we only need to label the images in the dataset. For the Oxford-102 flower dataset [11], each species contains 40 to 258 flower images, and the number of images in some species is too small. So in addition to labelling the images, we also need to add some images to the species which contain few images, to enlarge the dataset.

Transfer learning based on the Inception-v3 [3] model. We keep the parameters of the previous layers, then remove the last layer and feed the flower dataset in to retrain a new last layer; the number of output nodes is changed to 17 (Oxford-17 flower dataset) or 102 (Oxford-102 flower dataset).

The last layer of the model is trained by the back propagation algorithm, and the cross entropy cost function is used to adjust the weight parameters by calculating the error between the output of the softmax layer and the label vector of the given sample category.
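Written out explicitly (the paper gives no formula, so the notation below is ours), the cost minimised when retraining the last layer is the cross entropy between the softmax output y and the one-hot label vector t:

```latex
% z_i: logit of the new last layer for class i, N: number of classes (17 or 102)
y_i = \frac{e^{z_i}}{\sum_{j=1}^{N} e^{z_j}}, \qquad
L = -\sum_{i=1}^{N} t_i \log y_i
```

Back propagation of the gradient of L with respect to the weights of the new last layer is what adjusts the weight parameters described above.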

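A minimal sketch of this retraining step is given below, assuming a TensorFlow 1.x environment and that the 2048-dimensional bottleneck features have already been computed for every image (see the sketch after Section III-B). The layer initialisation and learning rate are illustrative choices, not values reported in the paper.

```python
import tensorflow as tf  # assumes a TensorFlow 1.x environment

NUM_CLASSES = 17          # 17 for Oxford-17, 102 for Oxford-102
BOTTLENECK_SIZE = 2048    # size of Inception-v3's bottleneck feature vector

# Inputs: precomputed bottleneck features and one-hot label vectors.
bottlenecks = tf.placeholder(tf.float32, [None, BOTTLENECK_SIZE])
labels = tf.placeholder(tf.float32, [None, NUM_CLASSES])

# The new last layer: a single fully connected layer followed by softmax.
weights = tf.Variable(tf.truncated_normal([BOTTLENECK_SIZE, NUM_CLASSES], stddev=0.001))
biases = tf.Variable(tf.zeros([NUM_CLASSES]))
logits = tf.matmul(bottlenecks, weights) + biases

# Cross entropy between the softmax output and the label vector,
# minimised by gradient descent (back propagation), as described above.
cross_entropy = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits))
train_step = tf.train.GradientDescentOptimizer(learning_rate=0.01).minimize(cross_entropy)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Training loop (omitted): repeatedly run
    #   sess.run(train_step, feed_dict={bottlenecks: batch_x, labels: batch_y})
    # on batches of bottleneck features and labels.
```

Only the weights and biases of this new layer are updated; the rest of the Inception-v3 parameters stay fixed, which is why only a small amount of data and a short training time are needed.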
C. The Result of the Experiment

Figure 3 and Figure 4 show the variation of accuracy and cross entropy on the Oxford-17 flower dataset [13]. Figure 5 and Figure 6 show the variation of accuracy and cross entropy on the Oxford-102 flower dataset [11]. The orange line represents the training set, and the green line represents the validation set.

(Figures 3-6 are accuracy and cross entropy curves over training steps; only the captions are reproduced here.)
Figure 3. The variation of accuracy on Oxford-17 flower dataset.
Figure 4. The variation of cross entropy on Oxford-17 flower dataset.
Figure 5. The variation of accuracy on Oxford-102 flower dataset.
Figure 6. The variation of cross entropy on Oxford-102 flower dataset.

Table I summarizes the four figures. For the Oxford-17 flower dataset [13], the training accuracy can reach 100%, and the validation accuracy is maintained at 98%-99%. For the Oxford-102 flower dataset [11], the training accuracy stays between 99% and 100%, and the validation accuracy is maintained at around 95%.

TABLE I. DESCRIPTION OF THE FOUR FIGURES

Dataset                       Index                              Performance
Oxford-17 flower dataset      accuracy of training set           100%
                              accuracy of validation set         98%-99%
                              cross entropy of training set      0.01
                              cross entropy of validation set    0.07
Oxford-102 flower dataset     accuracy of training set           99%-100%
                              accuracy of validation set         95%
                              cross entropy of training set      0.10
                              cross entropy of validation set    0.24

D. Control Experiment

In order to verify the effectiveness of the proposed method, comparison experiments are carried out against the flower classification algorithms of [6], [7] and [14] on the Oxford-17 flower dataset [13], and against the algorithms of [6] and [11] on the Oxford-102 flower dataset [11]. The results are shown in Table II and Table III.

TABLE II. CLASSIFICATION PERFORMANCE COMPARISON OF DIFFERENT ALGORITHMS ON OXFORD-17 FLOWER DATASET

Method          Performance
[7]             85%
[14]            89%
[6]             93%
Our Method      95%

TABLE III. CLASSIFICATION PERFORMANCE COMPARISON OF DIFFERENT ALGORITHMS ON OXFORD-102 FLOWER DATASET

Method          Performance
[6]             79.1%
[11]            88.33%
Our Method      94%

From the experimental results in Table II and Table III, we can see that the classification accuracy of our method is higher than that of the other methods.

V. CONCLUSION

In this paper, based on the Inception-v3 model of the TensorFlow platform, we use transfer learning to train a flower classification model on flower category datasets. The classification accuracy of the model is 95% on the Oxford-17 flower dataset and 94% on the Oxford-102 flower dataset, which is higher than that of the other methods.

Future work is to study and develop a more effective model for image classification.

REFERENCES

[1] Martin Abadi, Ashish Agarwal, et al., TensorFlow: Large-Scale Machine Learning on Heterogeneous Distributed Systems. CoRR abs/1603.04467, 2016.
[2] Abhineet Saxena, Convolutional neural networks: an illustration in TensorFlow. ACM Crossroads 22(4): 56-58, 2016.
[3] Christian Szegedy, Vincent Vanhoucke, et al., Rethinking the Inception Architecture for Computer Vision. arXiv:1512.00567, 2015.
[4] Sergey Ioffe, Christian Szegedy, et al., Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift. ICML 2015: 448-456, 2015.
[5] Christian Szegedy, Wei Liu, et al., Going Deeper with Convolutions. arXiv:1409.4842, 2014.
[6] Xiaodong Xie, Research on Fine-Grained Classification for Visual Flower Image [D]. Xiamen University, 2014.
[7] Angelova A, Zhu S, Efficient Object Detection and Segmentation for Fine-Grained Recognition [C]. Computer Vision and Pattern Recognition: 811-818, IEEE, 2013.
[8] Alex Krizhevsky, Ilya Sutskever, Geoffrey E. Hinton, ImageNet Classification with Deep Convolutional Neural Networks. NIPS 2012: 1106-1114, 2012.
[9] Maria-Elena Nilsback, Andrew Zisserman, Delving deeper into the whorl of flower segmentation. Image Vision Comput. 28(6): 1049-1062, 2010.
[10] Nilsback M-E, An Automatic Visual Flora -- Segmentation and Classification of Flower Images. PhD thesis, 2009.
[11] Maria-Elena Nilsback, Andrew Zisserman, Automated Flower Classification over a Large Number of Classes. ICVGIP 2008: 722-729, 2008.
[12] Maria-Elena Nilsback, Andrew Zisserman, Delving into the Whorl of Flower Segmentation. BMVC 2007: 1-10, 2007.
[13] Maria-Elena Nilsback, Andrew Zisserman, A Visual Vocabulary for Flower Classification. CVPR (2) 2006: 1447-1454, 2006.
[14] Zou J, Nagy G, Evaluation of model-based interactive flower recognition [C]. International Conference on Pattern Recognition: 311-314, IEEE, 2004.
[15] Saitoh T, Aoki K, Kaneko T, Automatic recognition of blooming flowers [C]. International Conference on Pattern Recognition: 27-30, IEEE, 2004.
