
Department of Civil Engineering

Emotion Detector App Development


Using CNN

Title Page

Title: Development of an Emotion Detector App Using CNN


Student Name: Sahil Yadav
Internship Period: 27-03-2024 to 10-05-2024
Organization: Null Class Ed Tech
Enrollment no: 21411026
Student Declaration

I, Sahil Yadav, hereby declare that the work presented in this report entitled "Development of an Emotion Detector App Using CNN" is my own and has been carried out under the supervision of a Senior Data Scientist at Null Class Ed Tech. This work has not been submitted elsewhere for any other degree or diploma.

(signature of student) (Date of submission)


Certification
Abstract
This report details the development of an Emotion Detector app utilizing Convolutional Neural Networks (CNN) within the TensorFlow framework. The primary aim of this project was to create an application capable of recognizing and classifying human emotions from facial expressions in real-time. The process involved several key phases:

1) Data Collection and Preprocessing: A comprehensive dataset of facial images representing various emotions was collected from publicly available sources. This dataset underwent extensive preprocessing, including normalization, resizing, and data augmentation techniques, to enhance the model's robustness and generalizability.

2) Model Design and Training: A deep learning model based on CNN architecture was designed to extract and learn features from the facial images. The CNN comprised multiple layers, including convolutional layers for feature extraction, pooling layers for dimensionality reduction, and fully connected layers for classification.

3) Evaluation and Optimization: The trained model was evaluated using various metrics, including accuracy, precision, recall, and F1-score.

Table of contents

1. Introduction
 General Introduction
 Field application
 Case studies
2. Details
 Collecting a comprehensive dataset of facial expressions.
 Designing and training a CNN model using TensorFlow
 Developing a user-friendly app interface for real-time emotion
detection.

3. Methodology
 Data collection
 Model training
 App development

4. Results
5. Conclusion
List of Figures

1. CNN Architecture Diagram


2. Code Screenshots
3. Model Training Screenshots
4. Model Accuracy Screenshots
5. App Interface Screenshots

List of Tables
1. Parameter Table

Abbreviations
1. CNN: Convolutional Neural Network
2. AI: Artificial Intelligence
3. ReLU: Rectified Linear Unit
4. FER-2013: Facial Expression Recognition 2013
Introduction

General Introduction

Emotion detection is a vital aspect of modern artificial intelligence (AI) systems. It involves identifying and interpreting human emotions from various forms of data such as text, speech, and facial expressions. Among these, facial expression recognition is particularly significant due to its wide range of applications in social and professional contexts. Emotion detection through facial expressions can enhance human-computer interaction, improve customer service experiences, and aid in mental health assessments.

This project focuses on developing an Emotion Detector app that leverages Convolutional Neural Networks (CNNs) to recognize and classify human emotions from facial images. CNNs are a class of deep learning models that have proven highly effective in image recognition tasks due to their ability to automatically learn spatial hierarchies of features.
Field Applications

Emotion detection has numerous applications across various industries:

 Customer Service: Emotion detection can enhance user experiences by allowing systems to respond appropriately to customer emotions. For instance, customer support chatbots can detect frustration or dissatisfaction and escalate issues to human agents.

 Healthcare: Emotion recognition can play a crucial role in mental health monitoring, providing insights into patients' emotional states and identifying signs of depression, anxiety, or other mental health conditions.

 Marketing: Understanding consumer emotions in response to advertisements or products can help companies tailor their marketing strategies and improve customer engagement.

 Education: Emotion detection can be used in e-learning platforms to gauge student engagement and provide personalized learning experiences.

 Security: Surveillance systems equipped with emotion detection can identify suspicious or threatening behavior in real-time.

Case Studies

A notable example of emotion detection technology in action is Affectiva, an AI company specializing in emotion AI. Affectiva's technology analyzes facial expressions and emotions in real-time, providing valuable insights for applications in automotive, advertising, and media analytics. Their emotion AI is used to enhance driver safety by monitoring driver emotions and detecting signs of drowsiness or distraction.

Another example is the use of emotion detection in call centers, where AI systems analyze customer emotions during interactions to improve service quality. Companies like Cogito provide real-time emotional intelligence solutions that help agents adapt their responses based on the detected emotions, leading to better customer satisfaction.
Detailed Problem Statement
The goal of this project was to develop an emotion detection app that can
accurately identify human emotions from facial expressions in real-time.

Here’s the step-by-step process:

1. Importing necessary libraries
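A minimal sketch of the imports such a pipeline typically relies on (the exact modules used in the original code may differ):

# Typical imports for this pipeline (assumed; the original notebook may differ).
import glob
import os
import random

import numpy as np
import matplotlib.pyplot as plt
import tensorflow as tf
from tensorflow.keras.preprocessing.image import ImageDataGenerator, load_img
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import (Conv2D, MaxPooling2D, BatchNormalization,
                                     Dropout, Flatten, Dense)
from tensorflow.keras.callbacks import ModelCheckpoint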

2. Dataset Visualization: The code's main purpose is to visualize a sample of nine randomly chosen images from the specified directory (train/**/**). This can be useful for:
 Data Exploration: Quickly viewing a sample of your dataset to understand the types of images it contains.
 Verification: Ensuring that the images are correctly loaded and displayed, and the file paths are correctly specified.
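A sketch of this visualization step, assuming the directory layout implied by the train/**/** pattern (one sub-folder per emotion):

# Preview nine random training images; the folder name is used as the label.
sample_paths = random.sample(glob.glob("train/**/**"), 9)

plt.figure(figsize=(9, 9))
for i, path in enumerate(sample_paths):
    plt.subplot(3, 3, i + 1)
    plt.imshow(load_img(path))
    plt.title(os.path.basename(os.path.dirname(path)))  # emotion folder name
    plt.axis("off")
plt.show()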
3. Loading and preprocessing image data for training and validation: ImageDataGenerator is a class in Keras that provides various image augmentation techniques to create batches of tensor image data with real-time data augmentation. It helps to generate more varied images from the existing dataset, thus helping the model to generalize better.
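A sketch of how the generators might be set up; the directory names and batch size are assumptions:

# Build training and validation generators; pixels are rescaled to [0, 1].
train_datagen = ImageDataGenerator(rescale=1.0 / 255)
val_datagen = ImageDataGenerator(rescale=1.0 / 255)

train_generator = train_datagen.flow_from_directory(
    "train",
    target_size=(48, 48),      # FER-2013 image size
    color_mode="grayscale",
    batch_size=64,
    class_mode="categorical")

validation_generator = val_datagen.flow_from_directory(
    "validation",
    target_size=(48, 48),
    color_mode="grayscale",
    batch_size=64,
    class_mode="categorical")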

4. Defining the convolutional layers and fully connected layers: This code defines a CNN architecture for facial expression recognition, applying convolutional and dense layers with batch normalization, ReLU activation, max-pooling, and dropout to prevent overfitting. The final layer uses softmax activation for multi-class classification into 7 categories. The model is then compiled for training.
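A minimal sketch of such an architecture; the filter counts, layer depth, and dropout rates are illustrative assumptions rather than the exact values used:

# Small CNN for 48x48 grayscale faces, ending in a 7-way softmax.
model = Sequential([
    Conv2D(64, (3, 3), padding="same", activation="relu",
           input_shape=(48, 48, 1)),
    BatchNormalization(),
    MaxPooling2D(2, 2),
    Dropout(0.25),

    Conv2D(128, (3, 3), padding="same", activation="relu"),
    BatchNormalization(),
    MaxPooling2D(2, 2),
    Dropout(0.25),

    Flatten(),
    Dense(256, activation="relu"),
    BatchNormalization(),
    Dropout(0.5),
    Dense(7, activation="softmax"),   # 7 emotion classes
])

model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])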

5. Model Instance (Creating the Model)


6. Model Training: This code snippet sets up and trains a machine learning model using the fit method. It specifies the number of epochs, the steps per epoch, and the validation steps based on the sizes of the training and validation datasets. It also includes a callback for saving the model weights when the validation accuracy improves, ensuring the best model is saved during training. The ModelCheckpoint callback is particularly useful for saving the best version of the model based on a monitored metric.
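A sketch of this training setup; the epoch count and checkpoint file name are assumptions:

# Save weights whenever validation accuracy improves, then train.
checkpoint = ModelCheckpoint(
    "best_model.h5",
    monitor="val_accuracy",
    save_best_only=True,
    verbose=1)

history = model.fit(
    train_generator,
    epochs=50,
    steps_per_epoch=train_generator.samples // train_generator.batch_size,
    validation_data=validation_generator,
    validation_steps=validation_generator.samples // validation_generator.batch_size,
    callbacks=[checkpoint])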
7. Model Evaluation

8. Plotting graphs for the visualization of loss, accuracy, and other important factors.
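A sketch of these plots, using the history object returned by model.fit above:

# Plot training/validation loss and accuracy over the epochs.
plt.figure(figsize=(12, 4))

plt.subplot(1, 2, 1)
plt.plot(history.history["loss"], label="train loss")
plt.plot(history.history["val_loss"], label="val loss")
plt.xlabel("epoch")
plt.legend()

plt.subplot(1, 2, 2)
plt.plot(history.history["accuracy"], label="train accuracy")
plt.plot(history.history["val_accuracy"], label="val accuracy")
plt.xlabel("epoch")
plt.legend()

plt.show()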
Methodology
Data Collection

The initial step in developing the Emotion Detector app involved assembling a comprehensive dataset of facial expressions. Key steps included:

 Sourcing Data: We utilized public databases such as the FER-2013 dataset, which contains thousands of images categorized into different emotion classes (e.g., happy, sad, angry, surprised, neutral).

 Data Preprocessing: Images were standardized to a consistent


size (48x48 pixels) and converted to grayscale to simplify the
model's input requirements. Normalization was applied to scale pixel
values to the [0, 1] range, enhancing model performance.

 Data Augmentation: To improve the robustness of the model, data augmentation techniques were employed (a configuration sketch follows this list). These included:

o Rotation: Randomly rotating images within a range of ±20 degrees.

o Flipping: Horizontally flipping images to simulate different orientations.

o Scaling: Randomly zooming in and out to create variations in image sizes.
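A configuration sketch of the augmentation settings listed above, expressed with Keras' ImageDataGenerator (the zoom range is an assumption):

# Augmentation settings matching the list above.
augmented_datagen = ImageDataGenerator(
    rescale=1.0 / 255,       # normalize pixels to [0, 1]
    rotation_range=20,       # random rotation within ±20 degrees
    horizontal_flip=True,    # random horizontal flips
    zoom_range=0.1)          # random zoom in/out (assumed range)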

Model Training

The heart of the project was the design and training of a Convolutional Neural Network (CNN) using TensorFlow. The process was divided into several stages:

1. Model Architecture

 Convolutional Layers: Multiple convolutional layers were used to extract features from the images. Each layer applied a set of filters to detect edges, textures, and patterns relevant to different emotions.

 Pooling Layers: Max-pooling layers were introduced to reduce the spatial dimensions of the feature maps, making the model more computationally efficient and reducing overfitting.

 Fully Connected Layers: After several convolutional and pooling layers, the model included fully connected (dense) layers to interpret the extracted features and classify the emotions.

 Activation Functions: ReLU (Rectified Linear Unit) was used as the activation function in hidden layers to introduce non-linearity, while softmax was applied in the output layer to obtain probability distributions over emotion classes.
2. Hyperparameter Tuning

 Learning Rate: Experimented with different learning rates to find the optimal value that balanced convergence speed and model accuracy.

 Batch Size: Tested various batch sizes to ensure efficient training and better generalization.

 Epochs: The model was trained over multiple epochs, monitoring performance on a validation set to prevent overfitting.
3. Model Evaluation

 Training and Validation Split: The dataset was split into training
(80%) and validation (20%) sets to evaluate model performance
during training.

 Performance Metrics: Key metrics included accuracy, precision, recall, and F1-score. The model's confusion matrix was also analyzed to understand its performance across different emotion classes (see the evaluation sketch after this list).

 Cross-Validation: Performed cross-validation to ensure the model's robustness and reliability.
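An illustrative evaluation sketch using scikit-learn's metrics; it assumes the validation generator was created with shuffle disabled so that predictions line up with the true labels:

# Per-class precision/recall/F1 and the confusion matrix on the validation set.
from sklearn.metrics import classification_report, confusion_matrix

predictions = model.predict(validation_generator)
y_pred = np.argmax(predictions, axis=1)
y_true = validation_generator.classes

print(classification_report(
    y_true, y_pred,
    target_names=list(validation_generator.class_indices)))
print(confusion_matrix(y_true, y_pred))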

App Development

The app was developed using a Python-based framework. Key features included (an illustrative sketch follows this list):

 Real-time emotion detection using the trained CNN model.

 A user-friendly interface to capture and analyze facial expressions.


 Integration with TensorFlow for model inference.
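The report does not name the exact framework used for the interface; one common approach is an OpenCV webcam loop with a Haar cascade face detector, sketched below purely as an illustration (the label order must match the training generator's class_indices):

# Illustrative real-time loop; OpenCV and the Haar cascade are assumptions.
import cv2
import numpy as np
from tensorflow.keras.models import load_model

EMOTIONS = ["angry", "disgust", "fear", "happy", "neutral", "sad", "surprise"]

model = load_model("best_model.h5")
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        # Crop the face, resize to the model's 48x48 grayscale input, predict.
        face = cv2.resize(gray[y:y + h, x:x + w], (48, 48)) / 255.0
        probs = model.predict(face.reshape(1, 48, 48, 1), verbose=0)
        label = EMOTIONS[int(np.argmax(probs))]
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, label, (x, y - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.9, (0, 255, 0), 2)
    cv2.imshow("Emotion Detector", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()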
Conclusion
The project successfully developed an Emotion Detector app that utilizes CNNs for accurate emotion detection. The app demonstrated high performance in real-time emotion recognition, with potential applications in various fields such as customer service, healthcare, and marketing. Future work could involve expanding the dataset, improving model accuracy, and adding more features to the app.

Thank you
