Image Recognition using TensorFlow
Last Updated: 03 May, 2025
Image recognition is a task where a model identifies objects in an image and assigns labels to them. For example, a model can be trained to distinguish between different types of flowers, animals or traffic signs. In this article, we will use TensorFlow and Keras to build a simple image recognition model.
Implementation of Image Recognition
Let's go through the steps involved in its implementation:
Step 1: Importing TensorFlow and Other Libraries
Here we will be using the Matplotlib, NumPy, TensorFlow, Keras and PIL libraries.
Python
import matplotlib.pyplot as plt
import numpy as np
import os
import PIL
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers
from tensorflow.keras.models import Sequential
Step 2: Loading image datasets
We will be using the flower dataset, which contains 3,670 images across five classes: daisy, dandelion, roses, sunflowers and tulips. The pathlib library is used to handle the path of the downloaded dataset.
- data_dir = tf.keras.utils.get_file('flower_photos', origin=dataset_url, untar=True): Downloads the dataset from the provided URL (origin=dataset_url), extracts it (untar=True) and returns the path where the dataset is stored.
- data_dir = pathlib.Path(data_dir): Converts the directory path (data_dir) into a Path object to allow easier manipulation of file paths.
Python
import pathlib

dataset_url = "https://storage.googleapis.com/download.tensorflow.org/example_images/flower_photos.tgz"
data_dir = tf.keras.utils.get_file('flower_photos', origin=dataset_url, untar=True)
data_dir = pathlib.Path(data_dir)
data_dir = data_dir / "flower_photos"

if not os.path.exists(data_dir):
    raise FileNotFoundError(f"Dataset directory not found at: {data_dir}")
print("Dataset loaded successfully")
Output:
Dataset loaded successfully
After downloading the dataset, we can count the total number of images using len(). The glob() method finds all .jpg files inside the class subdirectories.
Python
image_count = len(list(data_dir.glob('*/*.jpg')))
print(f"Total images found: {image_count}")

if image_count == 0:
    print("Warning: No images found. Check your dataset path and format.")
    all_files = list(data_dir.glob('*/*'))
    print(f"Found files (first 5): {[str(f) for f in all_files[:5]]}")
Output:
Total images found: 3670
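If we also want a per-class breakdown, a small optional sketch like the one below counts the .jpg files in each class subfolder. It assumes one subfolder per class (as in this dataset) and is not part of the main pipeline:
Python
# Optional: count images per class (assumes one subfolder per class)
for class_dir in sorted(data_dir.glob('*')):
    if class_dir.is_dir():
        n_images = len(list(class_dir.glob('*.jpg')))
        print(f"{class_dir.name}: {n_images} images")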
Let's look at a sample rose image from the dataset. Here we find and list all files inside the roses folder of the dataset directory.
Python
roses = list(data_dir.glob('roses/*'))
PIL.Image.open(str(roses[0]))
Output:

Rose image
Step 3: Creating a model
To work with the images, we load them using the tf.keras.utils.image_dataset_from_directory function. We will split the dataset into 80% training and 20% validation data.
Training split: the data on which the model trains.
- validation_split=0.2, subset="training": Reserves 20% of the images for validation and selects the training portion here.
- seed=123, image_size=(180, 180), batch_size=32: Sets a fixed random seed, the target image size and the batch size respectively.
Python
train_ds = tf.keras.utils.image_dataset_from_directory(
    data_dir,
    validation_split=0.2,
    subset="training",
    seed=123,
    image_size=(180, 180),
    batch_size=32)
Output:
Found 3670 files belonging to 5 classes.
Using 2936 files for training.
Validation split: the data on which the model is evaluated during training.
Python
val_ds = tf.keras.utils.image_dataset_from_directory(
    data_dir,
    validation_split=0.2,
    subset="validation",
    seed=123,
    image_size=(180, 180),
    batch_size=32)
Output:
Found 3670 files belonging to 5 classes.
Using 734 files for validation.
We can check the class names via the class_names attribute of the training dataset; they are listed in alphabetical order.
Python
class_names = train_ds.class_names
print(class_names)
Output:
['daisy', 'dandelion', 'roses', 'sunflowers', 'tulips']
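To confirm the image_size and batch_size settings described earlier, we can also inspect the shape of a single batch; the shapes in the comments below assume the settings used above:
Python
# Inspect one batch of images and labels
for image_batch, labels_batch in train_ds.take(1):
    print(image_batch.shape)   # expected: (32, 180, 180, 3)
    print(labels_batch.shape)  # expected: (32,)
    break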
Step 4: Visualizing the Datasets
Before training the model, let's visualize some images from the training dataset using Matplotlib. This helps us understand what the data looks like. We will view 25 images from the training dataset.
- ax = plt.subplot(5, 5, i + 1): Adds a subplot at the specified position in the 5×5 grid.
Python
import matplotlib.pyplot as plt

plt.figure(figsize=(10, 10))
for images, labels in train_ds.take(1):
    for i in range(25):
        ax = plt.subplot(5, 5, i + 1)
        plt.imshow(images[i].numpy().astype("uint8"))
        plt.title(class_names[labels[i]])
        plt.axis("off")
Output:

Sample training images
Step 5: Building the Model
Here we design CNN (Convolutional Neural Network) model using Keras Sequential()
model which is commonly used model. We will use three convolution layers with Conv2D
and MaxPooling2D
followed by a dense layer to classify images.
- layers.Rescaling(1./255, input_shape=(180,180, 3)): Rescales images to [0,1] and sets input image size.
- layers.Conv2D(16, 3, padding=’same’, activation=’relu’): Adds a convolutional layer with 16 filters and ReLU activation.
- layers.MaxPooling2D(): Adds a max-pooling layer to down sample feature maps.
Python
num_classes = len(class_names)

model = Sequential([
    layers.Rescaling(1./255, input_shape=(180, 180, 3)),
    layers.Conv2D(16, 3, padding='same', activation='relu'),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, padding='same', activation='relu'),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, padding='same', activation='relu'),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation='relu'),
    layers.Dense(num_classes)
])
Step 6: Compiling the Model
Now we compile the model with the Adam optimizer and the sparse categorical cross-entropy loss. Since the final Dense layer outputs raw logits (there is no softmax in the model), we pass from_logits=True. Tracking accuracy lets us evaluate the model's performance.
Python
model.compile(optimizer='adam',
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
              metrics=['accuracy'])

model.summary()
Output:

Model summary
Step 7: Training the Model
We can now train the model using the model.fit() function, passing the training and validation datasets and training for 10 epochs.
Python
epochs = 10
history = model.fit(
    train_ds,
    validation_data=val_ds,
    epochs=epochs
)
Output:

Training log across the 10 epochs
With each epoch, the training and validation accuracy change as the model learns.
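To see exactly how the metrics changed, a small sketch like the one below reads the final-epoch values from the returned history object (the exact numbers will vary from run to run):
Python
# history.history is a dict of per-epoch metric lists;
# the last entry of each list is the value after the final epoch
final_train_acc = history.history['accuracy'][-1]
final_val_acc = history.history['val_accuracy'][-1]
print(f"Final training accuracy: {final_train_acc:.3f}")
print(f"Final validation accuracy: {final_val_acc:.3f}")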
Step 8: Visualizing Training Results
We create plots of accuracy and loss on the training and validation sets to assess bias and variance (underfitting and overfitting).
- acc = history.history['accuracy']: Stores the training accuracy from each epoch in the acc variable.
- val_acc = history.history['val_accuracy']: Stores the validation accuracy from each epoch in the val_acc variable.
Python
acc = history.history['accuracy']
val_acc = history.history['val_accuracy']
loss = history.history['loss']
val_loss = history.history['val_loss']
epochs_range = range(epochs)
plt.figure(figsize=(8, 8))
plt.subplot(1, 2, 1)
plt.plot(epochs_range, acc, label='Training Accuracy')
plt.plot(epochs_range, val_acc, label='Validation Accuracy')
plt.legend(loc='lower right')
plt.title('Training and Validation Accuracy')
plt.subplot(1, 2, 2)
plt.plot(epochs_range, loss, label='Training Loss')
plt.plot(epochs_range, val_loss, label='Validation Loss')
plt.legend(loc='upper right')
plt.title('Training and Validation Loss')
plt.show()
Output:

Visualization of Accuracy and Loss
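As an optional last step, here is a minimal sketch of how the trained model could be used to classify a new image. The file name 'sample.jpg' is a placeholder for any local image; rescaling is already handled by the Rescaling layer inside the model, and softmax converts the model's logits into class probabilities:
Python
# Load a new image, resize it to the input size the model expects and add a batch dimension
img = tf.keras.utils.load_img('sample.jpg', target_size=(180, 180))  # 'sample.jpg' is a placeholder path
img_array = tf.keras.utils.img_to_array(img)
img_array = tf.expand_dims(img_array, 0)  # shape: (1, 180, 180, 3)

# The model outputs logits, so apply softmax to get probabilities
predictions = model.predict(img_array)
score = tf.nn.softmax(predictions[0])

print(f"This image most likely belongs to {class_names[np.argmax(score)]} "
      f"with {100 * np.max(score):.2f}% confidence.")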
By following these steps, we have built a simple model that classifies images into categories and covered the core concepts of image recognition with TensorFlow.