
Facial Emotion Recognition

1. INTRODUCTION

1.1 Project Idea


The project "Facial Emotion Recognition" aims to develop a Python software application
that leverages machine learning and OpenCV to detect and analyze users' emotions
through their facial expressions. By utilizing computer vision techniques, the application
captures real-time video input from the user's camera and employs trained machine
learning models to identify and interpret facial expressions, providing valuable insights
into the user's emotional state.
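
For context, capturing a single frame from the user's camera with OpenCV looks roughly like the sketch below (a minimal illustration; the full capture-and-predict loop is described in the implementation chapter):

import cv2

cap = cv2.VideoCapture(0)   # open the default camera
ok, frame = cap.read()      # grab one frame as a NumPy array
cap.release()
if ok:
    # the trained models operate on grayscale face regions
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    print("Captured a frame of size", gray.shape)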

1.2 Need of Project


Understanding and interpreting human emotions plays a crucial role in many domains, including human-computer interaction, psychology, marketing, and user experience design. Traditional methods of assessing emotions rely heavily on self-reporting or subjective interpretation, which often leads to inaccuracies and biases. The need for a reliable, non-intrusive, and objective approach to emotion detection has driven the development of the "Facial Emotion Recognition" software application. By harnessing machine learning and computer vision, this project aims to provide a robust and efficient tool for accurately detecting and analyzing emotions in real time.

1.3 Literature Survey


1) How Emojis Might Improve Your Health Care, January 28, 2022
- Virginia Pelley, Nassim Assefi

The emoji symbols on our phones, tablets and computers are so ubiquitous that some worry they are helping to destroy language. Others argue that emojis should be an accepted language of their own: a universal, inclusive and simple mode of communication. Many doctors say those same benefits can improve how they care for patients.


2) The Melting Face Emoji Has Already Won Us Over, New York Times
- Anna P. Kambhapaty

There are times when words feel inadequate, when one's dread, shame, exhaustion or discomfort seems too immense to be captured in written language. The melting face was conceived back in 2019 by Jennifer Daniel and Neil Cohn, who connected over their mutual appreciation for visual language. Ms. Daniel, who uses the pronouns they and them, is an emoji subcommittee chair for Unicode and a creative director at Google; Mr. Cohn is an associate professor of cognition and communication at Tilburg University. Mr. Cohn had published some work on representations of emotion in Japanese Visual Language that caught the eye of Ms. Daniel. Among Mr. Cohn's findings was "paperification", which, according to him, is "what happens in a manga sometimes when people become embarrassed, they will turn into a piece of paper and flutter away."

3) "DeepMoji”
- Bjarke Felbo
The first number of The Graphic appeared on December 4, 1869, and scored an
immediate success. The Islington Cattle Show and the National Cycle Show at the
Crystal Palace are attracting huge crowds. One newfangled idea is the motorbicycle
featuring a two horse-power four-cylinder engine, built by Major Holdenof the Royal
Artillery.Throughout the book, Felbo provides insights into the research and
experimentation process behind DeepMoji, explaining the techniques and
methodologies used to train the model. He discusses the challenges faced when working
with emotional data and outlines the steps taken to overcome them. The author also
offers practical examples of how DeepMoji can be applied in various fields, such as
sentiment analysis, social media monitoring, and market research


4) "EmojiNet"
- Sanjaya Wijeratne

"Emotion Analysis of Social Network Texts Using Emoji-Based Features"

Through engaging storytelling and in-depth analysis, the author sheds light on the
origins of emojis and their transformation from simple emoticons to a diverse and
universally recognized language. Wijeratne discusses the rise of emojis as a powerful
means of expression, allowing individuals to convey emotions, attitudes, and ideas in a
concise and visually appealing manner.

Moreover, the book examines the societal and psychological implications of emojis.
Wijeratne explores how emojis influence human behavior, shape online conversations,
and contribute to the formation of digital identities.


2. PROBLEM STATEMENT AND SCOPE

2.1 Problem Statement

The way we communicate in the digital era is constantly evolving, and there is a growing
need for effective methods to convey emotions in online conversations. While text-based
communication lacks the nuances of face-to-face interactions, emojis have emerged as a
popular means of expressing emotions in digital conversations. However, manually
selecting emojis to match the intended emotion can be time-consuming and subjective,
leading to potential misunderstandings or misinterpretations.

2.2 Project Scope

• Emojis for Accessibility: There may be opportunities to create emojis that are specifically designed for people with disabilities, such as emojis that represent sign language or emojis that are designed to be more inclusive of people with diverse abilities.

• Collaborative Emoji Design: There may be opportunities to create collaborative platforms or communities where people can work together to design and develop new emojis, sharing their ideas and expertise.

• Emojis for Social Causes: There may be opportunities to create emojis that are designed to raise awareness or promote social causes, such as emojis that represent climate change, social justice, or mental health awareness.

2.3 Area of Project

The "EmotionSense" project falls within the realm of computer vision, machine learning, and human-computer interaction. It intersects the fields of emotion recognition, facial expression analysis, and user experience research. The application can find use in many domains, such as psychology, market research, virtual reality, gaming, and customer feedback analysis.


2.4 Goals & objectives

 Develop an Intelligent Emoji Recommendation System

 Enhance Emotional Expression

 Improve Contextual Clarity


3. SOFTWARE REQUIREMENT SPECIFICATION

3.1 Software Requirements

Requirement         Specification
Operating system    Windows 10 and above
Front end           Tkinter for GUI
Back end            Python and SQLite3 for database
Tool                Python 3.10

Table 3.1: Software Requirements

3.2 Hardware Requirements

Requirement         Specification
Desktop/PC/Laptop   Intel Core i7 processor
Hard disk           20 GB
Monitor             15" VGA colour
RAM                 4 GB

Table 3.2: Hardware Requirements


4. PROJECT PLAN

Feb 2023:    Week 1 - Introduction to mini project; Week 2 - Search the topic; Week 3 - Collect the related information; Week 4 - Create the synopsis
March 2023:  Week 1 - Start working on the project; Week 2 - Design and coding; Week 3 - Coding; Week 4 - Testing
April 2023:  Week 1 - Improvement in code; Week 2 - Testing; Week 3 - Report; Week 4 - Updated report
May 2023:    Week 1 - Enhancing project code; Week 2 - Created executable file; Week 3 - Testing; Week 4 - Final project and presentation

Table 4.1: Project schedule


5. SOFTWARE DESIGN

5.1 DFD
A Data Flow Diagram (DFD) illustrates the flow of data within the EmotionSense
software application, highlighting the interactions between different components and
the data transformations. Here's a high-level representation of the DFD for
EmotionSense:

[Figure placeholder: the DFD traces facial expression images through image processing, face detection, lip/eye/mouth detection, face point mapping, and facial expression analysis to the final emotion detection stage.]

Figure 5.1: DFD


5.2 Flow Chart

[Figure placeholder: the flow chart starts by checking whether a trained model already exists. If not, the FER-2013 and emoji data sets are loaded, the images are preprocessed, and a deep CNN is trained to produce the model. With a trained model in hand, the input facial expression is preprocessed and classified; once an expression is detected, it is mapped to the related emoji, which is returned as output.]

Figure 5.2: Flow Chart
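In code, the flow above reduces to a simple branch. The sketch below assumes hypothetical helper functions build_cnn, train_model, and preprocess and a hypothetical EMOJI_MAP dictionary; only the weights file name comes from the implementation chapter:

import os

MODEL_WEIGHTS = "emotionsModel.h5"

model = build_cnn()                         # hypothetical: builds the deep CNN architecture
if os.path.exists(MODEL_WEIGHTS):
    model.load_weights(MODEL_WEIGHTS)       # reuse the already-trained model
else:
    train_model(model, "data/fer2013")      # hypothetical: train on the FER-2013 data set
    model.save_weights(MODEL_WEIGHTS)

face = preprocess(input_image)              # hypothetical: image preprocessing step
expression = model.predict(face).argmax()   # index of the detected expression
print(EMOJI_MAP[expression])                # map the expression to its related emoji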


5.3 Architecture Diagram

The architecture diagram illustrates the high-level structure and organization of the
EmotionSense software application. It demonstrates the components and their
interactions within the system. Here's a simplified representation of the architecture
diagram for EmotionSense

Figure 5.3: Architecture Diagram


5.4 UML Diagram

UML (Unified Modeling Language) diagrams provide a standardized notation for visualizing the structure, behavior, and relationships within a software application. Here's an example of a UML class diagram for the EmotionSense application:

<<Main Tkinter
Window>> Root

Lable, Button, Entry

Username
Password

1..*

1..1 1..1 1..1

<<ModelTrain>> <<Project>> <<Database >>


train Show_vid `User details

Keras, OpenCv, Numpy Keras, OpenCv Username


Password
Detection
Sequential Model train
Login
Register

Figure: 5.4 UML Diagram
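A skeletal Python rendering of those classes, sketching the structure only (method bodies are omitted; the names follow the diagram):

class ModelTrain:
    """Builds and trains the Keras Sequential model (uses Keras, OpenCV, NumPy)."""
    def train(self): ...

class Project:
    """Runs detection on the live video feed (uses Keras and OpenCV)."""
    def show_vid(self): ...

class Database:
    """Stores user details and handles account operations."""
    def login(self, username, password): ...
    def register(self, username, password): ...

class Root:
    """Main Tkinter window holding the Label, Button, and Entry widgets."""
    def __init__(self):
        self.model_train = ModelTrain()   # 1..1 association from the diagram
        self.project = Project()
        self.database = Database()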


5.5 Use Case Diagram

Use-case diagrams describe the high-level functions and scope of a system. These diagrams
also identify the interactions between the system and its actors. The use cases and actors in
use-case diagrams describe what the system does and how the actors use it, but not how the
system operates internally.

[Figure placeholder: the use case diagram shows a single User actor who can load an image or take a new picture, perform face detection (either automatic or manual), and request image classification.]

Figure 5.5: Use Case Diagram


5.6 Database Design

Database: user_detail

Table: users

Field      Data Type  Description
username   VARCHAR    Unique username for each user
password   VARCHAR    Password associated with the user's account

Table: images

Field  Data Type  Description
id     INTEGER    Primary key
name   TEXT       Name of the image
data   BLOB       Image data in binary form

The users table stores each user's login credentials, while the images table stores the generated emoji images as binary BLOBs so they can be retrieved and displayed later.

6. IMPLEMENTATION DETAILS

The implementation of the EmotionSense software application involves several modules.

A. train_test.py: This module performs emotion recognition using a Convolutional Neural Network (CNN) on a live video feed. It consists of the following components:

1. Data Processing: The module uses the Keras `ImageDataGenerator` to preprocess the
training and validation data by rescaling the pixel values.

2. Model Architecture: The CNN model is created using the Sequential API from Keras. It
contains several Convolution, Pooling, Dropout, and Dense layers to learn features and
classify emotions.

3. Model Training: The model is compiled with categorical cross-entropy loss, Adam
optimizer, and accuracy metrics. If a pre-trained model weights file exists, it is loaded and
evaluated on the training data. Otherwise, the model is trained using the `fit_generator`
function. The module assumes the presence of the emotion labels and emoji images in
specific directories. It requires the installation of OpenCV and Keras libraries with
dependencies.
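A condensed sketch of the kind of model this module builds (the exact layer sizes are illustrative rather than the project's configuration; the 48x48 grayscale input and seven output classes follow the FER-2013 data set named in the flow chart):

from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Dropout, Flatten, Dense
from keras.preprocessing.image import ImageDataGenerator

# rescale pixel values to [0, 1] as described above
train_gen = ImageDataGenerator(rescale=1.0 / 255).flow_from_directory(
    "data/train", target_size=(48, 48), color_mode="grayscale",
    class_mode="categorical", batch_size=64)

model = Sequential([
    Conv2D(32, (3, 3), activation="relu", input_shape=(48, 48, 1)),
    Conv2D(64, (3, 3), activation="relu"),
    MaxPooling2D(pool_size=(2, 2)),
    Dropout(0.25),
    Flatten(),
    Dense(1024, activation="relu"),
    Dropout(0.5),
    Dense(7, activation="softmax"),   # seven emotion classes
])

model.compile(loss="categorical_crossentropy", optimizer="adam",
              metrics=["accuracy"])
model.fit(train_gen, epochs=50)       # older Keras versions use fit_generator here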

B. gui_with_login_emoji_store.py: PIL is a library for opening, manipulating, and saving many different image file formats; it is used in EmotionSense for image-handling tasks such as converting images between formats and displaying them in the GUI. The module imports various libraries such as tkinter, cv2, PIL, numpy, sqlite3, io, and keras. It defines functions for saving generated emojis to a SQLite database, retrieving emojis from the database, and displaying a graphical user interface (GUI) with a live webcam feed and the predicted emotions. The module also creates a Sequential model using Keras for emotion recognition.


It loads pre-trained weights for the model from a file named "emotionsModel.h5". The
emotions are detected by applying Haar cascades for face detection and using the trained
model to predict the emotion from the detected faces. The predicted emotion is then displayed
as text above the face and as an emoji image on the GUI. Additionally, the module includes
functions for capturing and displaying the live video feed from the default camera, as well as
saving the predicted emotion and corresponding emoji to the database. It also provides a
registration function for adding new users to the user_details.db database. The module uses
various image files located in the "images/emojis" and "images" directories to display the
emoji images and GUI background.
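A minimal sketch of the detection loop described above; it assumes the trained model has been loaded (the project rebuilds the Sequential model and loads emotionsModel.h5 as weights) and uses an illustrative list of emotion labels:

import cv2
import numpy as np
from keras.models import load_model

model = load_model("emotionsModel.h5")   # assumption: a full saved model; the project loads weights instead
labels = ["Angry", "Disgust", "Fear", "Happy", "Sad", "Surprise", "Neutral"]

# Haar cascade shipped with OpenCV for frontal-face detection
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)                # default camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        roi = cv2.resize(gray[y:y + h, x:x + w], (48, 48))
        roi = roi.astype("float32")[np.newaxis, :, :, np.newaxis] / 255.0
        emotion = labels[int(np.argmax(model.predict(roi)))]
        cv2.rectangle(frame, (x, y), (x + w, y + h), (255, 0, 0), 2)
        cv2.putText(frame, emotion, (x, y - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.9, (255, 0, 0), 2)
    cv2.imshow("EmotionSense", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break
cap.release()
cv2.destroyAllWindows()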

C. emoji_viewer.py: The emoji_viewer.py module is a simple image database viewer built with Tkinter. It creates a main window with a Treeview widget on the left to display image names and a canvas on the right to display the selected image. It uses SQLite to store image data and retrieves the image bytes from the database. The selected image is displayed by converting the bytes to a PIL Image, resizing it, converting it to a Tkinter-compatible format, and then drawing it on the canvas. The Treeview's selection event is bound to a function that retrieves the image data and displays it. The module connects to the database, retrieves the image names, populates the Treeview, and starts the Tkinter event loop.
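A compact sketch of that viewer flow, assuming the images table from the database design in section 5.6 (the widget layout details are illustrative):

import io
import sqlite3
import tkinter as tk
from tkinter import ttk
from PIL import Image, ImageTk

conn = sqlite3.connect("user_detail.db")
root = tk.Tk()

tree = ttk.Treeview(root, show="tree")    # image names on the left
tree.pack(side="left", fill="y")
canvas = tk.Canvas(root, width=300, height=300)
canvas.pack(side="right")

for (name,) in conn.execute("SELECT name FROM images"):
    tree.insert("", "end", text=name)

def show_selected(event):
    name = tree.item(tree.selection()[0], "text")
    (blob,) = conn.execute("SELECT data FROM images WHERE name = ?",
                           (name,)).fetchone()
    img = Image.open(io.BytesIO(blob)).resize((300, 300))
    photo = ImageTk.PhotoImage(img)
    canvas.delete("all")
    canvas.create_image(0, 0, anchor="nw", image=photo)
    canvas.image = photo   # keep a reference so the image is not garbage-collected

tree.bind("<<TreeviewSelect>>", show_selected)
root.mainloop()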


7. TESTING

Introduction
Testing plays a crucial role in ensuring the reliability, accuracy, and effectiveness of the
EmotionSense software application. It involves various techniques and strategies to verify that
the system meets the specified requirements and functions as intended. This section outlines
the different types of testing performed on EmotionSense, including system testing, white box
testing, black box testing, unit testing, integration testing, and performance testing. It also
discusses the test items, test plan, and criteria for determining the pass/fail status of each test.

System Testing
System testing focuses on evaluating the entire EmotionSense system as a whole. It involves
testing the integrated components and their interactions to ensure that they work together
seamlessly. System testing verifies that the application functions correctly, user interface
elements are responsive, and all features are working as expected. Test scenarios include
logging in and registering, capturing video input, performing real-time emotion detection, and
displaying accurate feedback. The aim is to validate that the system meets the defined
requirements and operates reliably under different scenarios.

White Box Testing


White box testing, also known as code-based testing, examines the internal structure and
implementation details of the EmotionSense application. It involves testing specific code
segments, functions, and modules to ensure they function correctly and handle edge cases
properly. White box testing verifies the logic and flow of the code, checks for error handling
and exception handling, and assesses the accuracy of data processing algorithms. This testing
technique ensures the robustness and correctness of the codebase.


Black Box Testing


Black box testing focuses on testing the Facial Emotion Recognition application without
considering its internal code structure or implementation. Testers assess the system's
functionalities based on the provided specifications and expected behavior. Black box testing
includes scenarios such as entering valid and invalid login credentials, registering new users,
simulating various facial expressions, and analyzing the accuracy of emotion detection. This
technique ensures that the application behaves as expected from the user's perspective, without
knowledge of the internal workings.

Test Items
The test items for EmotionSense encompass various components and functionalities, including:

• User login and registration system
• User interface elements and responsiveness
• Video input capture and processing
• Emotion detection algorithms and accuracy
• Feedback generation and display
• Error handling and exception management
• Database management and user data storage

Test Plan
The test plan outlines the approach and strategies for testing the EmotionSense application. It
includes a detailed description of test scenarios, test objectives, testing techniques, and the
timeline for each testing phase. The plan defines the roles and responsibilities of the testing
team, the resources required, and the environments in which the tests will be conducted. It
also identifies any specific tools or frameworks used for testing purposes, such as test
automation frameworks or mock data generators.


Unit Testing

Unit testing focuses on testing individual units or components of the EmotionSense application in isolation. It ensures that each unit functions correctly and meets its specified requirements. Unit tests are performed on individual functions, methods, or classes to validate their behavior, input/output handling, and edge-case scenarios. Unit testing helps identify and fix issues at an early stage, promoting code quality and maintainability.
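As an illustration, a unit test for the prediction step might look like the sketch below (the weights file name comes from the implementation chapter; everything else is hypothetical):

import unittest
import numpy as np
from keras.models import load_model

class TestEmotionModel(unittest.TestCase):
    @classmethod
    def setUpClass(cls):
        cls.model = load_model("emotionsModel.h5")  # assumption: a full saved model

    def test_prediction_shape_and_range(self):
        frame = np.zeros((1, 48, 48, 1), dtype="float32")   # blank 48x48 grayscale input
        probs = self.model.predict(frame)[0]
        self.assertEqual(len(probs), 7)                     # seven emotion classes
        self.assertAlmostEqual(float(probs.sum()), 1.0, places=5)  # softmax sums to one

if __name__ == "__main__":
    unittest.main()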

Integration Testing
Integration testing verifies the interactions and communication between different components
of the EmotionSense application. It ensures that the integrated components work together
seamlessly and produce the expected results. Integration tests validate the integration of the
user interface, login system, emotion detection module, feedback generation, and database
management. The goal is to identify any inconsistencies or issues that may arise due to the
interactions between these components.

Performance Testing
Performance testing evaluates the EmotionSense application's performance under various load
conditions and measures its responsiveness and scalability. It tests the system's ability to
handle a large number of concurrent users and process video input in real-time. Performance
testing also assesses the application's resource usage, response times, and stability under
different workloads. The objective is to ensure that the application performs optimally and
meets the performance requirements defined for EmotionSense.


Item Pass/Fail Criteria


The pass/fail criteria for each test item are defined based on the expected behavior and
requirements of the EmotionSense application. The criteria may include factors such as
accuracy thresholds for emotion detection, successful login and registration processes,
responsive user interface, absence of critical errors or crashes, and adherence to performance
benchmarks. Each test item should have specific acceptance criteria that determine whether it
passes or fails the test.

By conducting thorough testing using a combination of system testing, white box testing, black
box testing, unit testing, integration testing, and performance testing, the EmotionSense
application can be validated for its functionality, reliability, accuracy, and performance. The
testing process ensures a robust and user-friendly application that accurately detects and
analyzes user emotions.


8. SNAPSHOTS

1. Login Page

Figure 8.1: Registration


Figure 8.2: Login Page


Figure 8.3: Start Button


9. CONCLUSION

The EmotionSense project has successfully addressed the challenge of accurately conveying emotions in expression-based communication. By leveraging computer vision and machine learning techniques, we have developed an application that predicts and suggests appropriate emojis based on the emotional content of a given facial expression. EmotionSense offers users an intuitive and interactive interface that lets them effortlessly enrich their communication with expressive emojis and personalize how they convey their intended emotions. The project has not only bridged the gap between facial expression and emoji but also fostered better emotional understanding in online interactions: by automating the selection of appropriate emojis, users can convey their emotions effectively, reducing the chances of misinterpretation and misunderstanding. Overall, EmotionSense has made significant strides in improving emotional expression and understanding in digital communication.


10. FUTURE SCOPE

While the current version of EmotionSense successfully detects and analyzes emotions from
facial expressions, there are several areas that can be explored to enhance the application and
expand its capabilities:

Multi-platform support: Currently, the application is developed for the Python platform.
Expanding the application to other platforms, such as mobile devices (iOS, Android), would
increase its accessibility and reach.

Enhanced emotion detection: Continual improvement of the emotion detection algorithms can be pursued to enhance the accuracy and robustness of the application. Incorporating state-of-the-art machine learning techniques, such as deep learning models, could lead to improved emotion recognition results.


11. REFERENCES

1. Python Software Foundation. (n.d.). Tkinter - Python interface to Tcl/Tk. Retrieved from https://docs.python.org/3/library/tkinter.html

2. OpenCV. (n.d.). OpenCV: Open Source Computer Vision Library. Retrieved from https://opencv.org/

3. Pillow (PIL Fork). (n.d.). Python Imaging Library (Fork). Retrieved from https://pillow.readthedocs.io/en/stable

4. NumPy. (n.d.). NumPy: The fundamental package for scientific computing with Python. Retrieved from https://numpy.org/

5. SQLite. (n.d.). SQLite Home Page. Retrieved from https://www.sqlite.org/index.html

6. Keras. (n.d.). Keras: Deep Learning for humans. Retrieved from https://keras.io/

7. The Python Software Foundation. (n.d.). The Python Standard Library - sqlite3. Retrieved from https://docs.python.org/3/library/sqlite3.html

8. Smith, R. (2018). Neural Networks: An In-depth Visual Introduction with Python. Retrieved from https://towardsdatascience.com/neural-networks-an-in-depth-visual-introduction-with-python-1dff1f78417c

9. Shleif, A. (2020). GUI with Python - Tkinter. Retrieved from https://realpython.com/python-gui-tkinter/

10. Das, A. (2020). A Comprehensive Guide to Convolutional Neural Networks — the ELI5 way. Retrieved from https://towardsdatascience.com/a-comprehensive-guide-to-convolutional-neural-networks-the-eli5-way-3bd2b1164a53
