3. CNN for Transfer Learning
Over the past few years, researchers have developed increasingly advanced convolutional neural networks to solve computer vision problems more accurately, and CNN architectures are now leveraged in many applications. In this paper, to classify flower images, we utilize a deep convolutional neural network pre-trained on ImageNet using TensorFlow, a deep learning framework developed by Google. The type of transfer learning to apply to a new dataset depends on two factors: whether the new dataset is small or large, and how similar it is to the data used to train the original model. An additional factor to be vigilant about is overfitting. For all practical purposes, however, it is still advisable to initialize weights from a pre-trained model, as this allows more robust fine-tuning throughout the entire network.
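As an illustration of this workflow, the following minimal sketch shows transfer learning in TensorFlow with a CNN pre-trained on ImageNet. The specific backbone (MobileNetV2), input size, and number of flower classes are assumptions for the example, since the paper does not fix these details here; the pattern of freezing the pre-trained layers first and then unfreezing them for fine-tuning is the point being demonstrated.

# Minimal sketch of transfer learning with TensorFlow/Keras.
# Assumptions (not specified in this section): MobileNetV2 backbone,
# 224x224 RGB inputs, and 5 flower classes.
import tensorflow as tf

NUM_CLASSES = 5                 # assumed number of flower categories
IMG_SHAPE = (224, 224, 3)       # assumed input size

# Load a CNN pre-trained on ImageNet, without its original classification head.
base_model = tf.keras.applications.MobileNetV2(
    input_shape=IMG_SHAPE, include_top=False, weights="imagenet")

# Freeze the pre-trained weights (feature extraction); the usual choice when
# the new dataset is small and similar to the original ImageNet data.
base_model.trainable = False

model = tf.keras.Sequential([
    base_model,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),                              # helps limit overfitting
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),  # new flower classifier head
])

model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# If the new dataset is larger or less similar to ImageNet, unfreeze the
# backbone and fine-tune the entire network at a lower learning rate.
base_model.trainable = True
model.compile(optimizer=tf.keras.optimizers.Adam(1e-5),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

In practice, the frozen-backbone stage trains only the new classification head, while the second compile step enables end-to-end fine-tuning; the reduced learning rate is what keeps the pre-trained weights from being destroyed and helps guard against overfitting on the smaller flower dataset.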