Machine Learning
Neural Networks
Semester Project
Presented to:
Dr. Aisha Batool
Presented by:
Soban Khan (UW-22-AI-BS-052)
Shahzaib Khan (UW-22-AI-BS-028)
Image Classification of Cats and Dogs using Convolutional Neural Networks
Problem Description
Objective: The aim of this project is to develop a Convolutional Neural Network (CNN) model
for classifying images of cats and dogs. This binary classification problem is a foundational
benchmark in computer vision, with practical applications in areas such as animal welfare and
pet adoption.
Benefits:
Animal Welfare Organizations: Can use the model to automate the identification of pets in
shelters.
Research Community: Provides a basis for further exploration into more complex image
classification tasks.
Dataset Description:
Name of Dataset: Kaggle Cats and Dogs Dataset
Description: The dataset contains 25,000 labeled images of cats and dogs, split roughly
evenly between the two classes.
Potential Work
Machine Learning Approaches:
Models: CNN architectures such as VGGNet, ResNet, or custom architectures can be utilized
(a minimal custom architecture is sketched after the evaluation metrics list below).
Pre-processing: Image resizing, pixel-value normalization, and data augmentation (detailed
under Methodology).
Evaluation Metrics:
Accuracy
Precision
Recall
F1-score
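
To make the custom-architecture option concrete, a minimal Keras model is sketched below.
This is an illustrative starting point, not a tuned design: the layer counts, filter sizes,
and dropout rate are assumptions, and the 128x128 input shape follows the preprocessing
plan in the Methodology section.

from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(128, 128, 3)),             # 128x128 RGB inputs
    layers.Conv2D(32, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(128, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.5),                           # regularization against overfitting
    layers.Dense(1, activation="sigmoid"),         # single output: cat (0) vs. dog (1)
])
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])
model.summary()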
Methodology
Data Acquisition
Download the dataset using the Kaggle API or wget.
Unzip the archive and index the images in a suitable format (e.g., a pandas DataFrame of
file paths and labels).
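
A minimal sketch of this step using the Kaggle CLI and pandas is shown below. The
competition slug (dogs-vs-cats), the archive name, and the data/train folder layout of
cat.*.jpg / dog.*.jpg files are assumptions and may need adjusting to the actual download.

import os
import zipfile
import pandas as pd

# Download via the Kaggle CLI (requires Kaggle API credentials);
# the competition slug "dogs-vs-cats" is an assumption.
os.system("kaggle competitions download -c dogs-vs-cats -p data/")

# Unpack the archive; the archive name and internal layout are assumptions.
with zipfile.ZipFile("data/dogs-vs-cats.zip") as zf:
    zf.extractall("data/")

# Index the images in a DataFrame of file paths and labels,
# inferring each label from the filename prefix.
paths = [os.path.join("data/train", f) for f in os.listdir("data/train")]
df = pd.DataFrame({
    "filename": paths,
    "label": ["cat" if os.path.basename(p).startswith("cat") else "dog"
              for p in paths],
})
print(df.head())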
Data Preprocessing
Resize images to 128x128 pixels.
Normalize pixel values (e.g., scale to [0, 1]).
Implement data augmentation using Keras' ImageDataGenerator.
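
These three steps can be combined in Keras' ImageDataGenerator, as sketched below. The
sketch assumes the df DataFrame built in the acquisition step; the augmentation parameters
and the 80/20 train/validation split are illustrative choices.

from tensorflow.keras.preprocessing.image import ImageDataGenerator

datagen = ImageDataGenerator(
    rescale=1.0 / 255,        # normalize pixel values to [0, 1]
    rotation_range=20,        # augmentation: small random rotations
    zoom_range=0.2,           # augmentation: random zooms
    horizontal_flip=True,     # augmentation: random mirroring
    validation_split=0.2,     # hold out 20% of the data for validation
)

train_gen = datagen.flow_from_dataframe(
    df, x_col="filename", y_col="label",
    target_size=(128, 128),   # resize every image to 128x128
    class_mode="binary", batch_size=32, subset="training",
)
val_gen = datagen.flow_from_dataframe(
    df, x_col="filename", y_col="label",
    target_size=(128, 128),
    class_mode="binary", batch_size=32, subset="validation",
    shuffle=False,            # fixed order so predictions align with labels
)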
Expected Results
Aim for an accuracy of at least 97.5% on the test set.
Anticipate strong performance metrics (precision, recall, F1-score) across both classes
(cats and dogs).
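
Once a model is trained, these per-class metrics can be computed with scikit-learn. The
snippet below is a sketch that assumes the model and val_gen objects from the earlier
sketches.

from sklearn.metrics import classification_report

# Predict on the (unshuffled) validation generator and threshold
# the sigmoid outputs at 0.5.
y_prob = model.predict(val_gen)
y_pred = (y_prob.ravel() > 0.5).astype(int)
y_true = val_gen.classes  # 0 = cat, 1 = dog (alphabetical class order)

# Per-class precision, recall, and F1-score, plus overall accuracy.
print(classification_report(y_true, y_pred, target_names=["cat", "dog"]))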
Conclusion
This project aims to leverage CNNs for effective image classification of cats and dogs using the
Kaggle dataset. The expected outcomes can significantly benefit animal welfare initiatives and
further research in image classification tasks.