B.E Cse Batchno 16
B.E Cse Batchno 16
RECOMMENDATION SYSTEM
Submitted in partial fulfillment of the
requirements for the award of
Bachelor of Engineering degree in Computer Science and Engineering
By
SCHOOL OF COMPUTING
SATHYABAMA
INSTITUTE OF SCIENCE AND TECHNOLOGY
(DEEMED TO BE UNIVERSITY)
Accredited with Grade “A” by NAAC | 12B Status by UGC | Approved by AICTE
JEPPIAAR NAGAR, RAJIV GANDHISALAI,
CHENNAI - 600119
APRIL - 2023
i
DEPARTMENT OF COMPUTER SCIENCE AND ENGINEERING
BONAFIDE CERTIFICATE
This is to certify that this Project Report is the bonafide work of MEGHANA ELISETTI
(Reg.No - 39110295) and PULIPATI ANSHU (Reg.No - 39110811) who carried out
the Project Phase-2 entitled “EMOTION BASED MOVIE RECOMMENDATION
SYSTEM” under my supervision from January 2023 to April 2023.
Internal Guide
Dr. J. ALBERT MAYAN, M.E., Ph.D
DATE: 19-04-2023
iii
ACKNOWLEDGEMENT
I would like to express my sincere and deep sense of gratitude to my Project Guide
Dr. J. Albert Mayan M.E., Ph.D, for his valuable guidance, suggestions and
constant encouragement paved way for the successful completion of my phase-1
project work.
I wish to express my thanks to all Teaching and Non-teaching staff members of the
Department of Computer Science and Engineering who were helpful in many
ways for the completion of the project.
iv
ABSTRACT
The modern world comprises many forms of entertainment, the most common being
– movies. Over the past decades, the ways of production, creation and distribution of
movies have been greatly advanced. People have a variety of movies to choose from
and require a recommendation system to guide them in this process.
Emotions are a strong reaction to stimuli and are an intelligent and rational form of
behavior. It is difficult to assess the emotional responses to the movies based on the
diverse reactions to movies. Color psychology can be used to detect various
emotional states such as happiness, sadness, anger, fear, and excitement.
The aim of this project is to develop an user interface where colors are used to
represent the user‟s appropriate emotions and compare the emotion with facial
expressions and then recommend a movie based on that. A hybrid approach,
combining collaborative filtering and content-based filtering is used to recommend
movies.
v
INDEX
CHAPTER TITLE PAGE
NO NO.
ABSTRACT v
LIST OF TABLES x
LIST OF FIGURES ix
LIST OF ABBREVIATIONS viii
1 INTRODUCTION 1
1.1 History and Introduction of Recommendation System 1
1.2 Emotion based recommendation system 3
2 LITERATURE SURVEY 5
2.1 Review of Literature Survey 5
2.2 Inferences from Literature Survey 9
2.3 Open Problems in Existing System 10
3 REQUIREMENTS ANALYSIS 11
3.1 Software Requirements Specification Document 11
3.1.1 Hardware Requirements 13
3.1.2 Software Requirements 13
4 DESCRIPTION OF THE PROPOSED SYSTEM 22
4.1 Proposed System 22
4.2 Selected Methodology or process model 22
4.3 Architecture /Overall Design of Proposed System 23
4.4 Description of Software for Implementation and Testing 24
Plan of The Proposed Model/System
4.5 Project Management Plan 27
4.6 Transition / Software to Operations Plan 28
5 IMPLEMENTATION DETAILS 30
5.1 Development Setup 30
5.2 Algorithms 33
6 RESULTS AND DISCUSSION 38
7 CONCLUSION 39
7.1 Conclusion 39
7.2 Future Work 40
7.3 Research Issues 40
vi
7.4 Implementation Issues 42
REFERENCES 43
APPENDIX 45
A. SOURCE CODE 45
B. SCREENSHOTS 49
C. RESEARCH PAPER 49
vii
LIST OF ABBREVIATIONS
ABBREVIATION EXPANSION
RS Recommendation System
EEG Electroencephalogram
vii
i
LIST OF FIGURES
PAGE
FIGURE NO FIGURE NAME
No.
4.1 Development of the New Methodology 23
ix
LIST OF TABLES
TABLE PAGE
TABLE NAME
No. No.
x0
CHAPTER 1
INTRODUCTION
Many decisions are made in our daily lives, often without our knowledge. Users are
often overwhelmed by choice and need help finding what they're looking for. For
example, what should we put on in the morning that is suitable for our daily
program? Which menu to choose in the dining room? Which task should we
perform first? Which school should we enroll our child in? In the past, specialists
have attempted to assist users in making judgments, but for a while there have
been other options. A website that sells books (as well), like Amazon, can assist
individuals in choosing their next book instead of only the librarian or retail
employee. The videos that YouTube suggests are all based on prior browsing and
provide engaging audiovisual content with a high hit rate. This is where the
recommendation system comes into the picture.
In the past two decades, e-commerce has surged extensively due to changes in
consumer behavior. Several customers are more inclined to shop from the comfort of
their home and use online services. The pandemic has also caused more businesses
and consumers to turn to digital platforms for sales and purchases of goods online.
Recommender systems have the ability to predict whether the user would prefer a
certain item or not. This helps the customer by reducing their transaction costs during
online shopping, and in turn helps the service provider by creating more traffic for
their website / service.
Recommender systems also showed a positive growth in revenue for online services
and operations. Recommender systems are further categorized into non-
personalized and personalized systems. A non-personalized recommender system is
a system which recommends the general most popular items available, for example:
The top ten places to visit for a vacation. A personalized recommender system is a
system which recommends the most suitable choices for the user based on their past
preferences. A good, personalized recommender system is one that is
diverse and can recommend different items to the user.
2
the business environment. Primarily, because customers stand to benefit from an
increased variety of goods and a simplified path to information pertinent to their
needs as a result of the market's continued expansion, in addition to the services.
Most of the time, the customer is the one who supplies the recommendation system
with data. This may include things like the information on the items that he is looking
for, as well as his ratings, numerous demographics, and other data. The
recommender system may use a single strategy or several techniques for making
recommendations. based on this information, and as a result, provides product
suggestions to the customers.
To provide suggestions that can be trusted, the recommender system must be able
to grasp exactly what it is that the user is seeking. The requirements of the customer
as well as their preferences. On the other hand, when it comes to things that are
more subjective and complex, such as movies, music, and novels, aroma, the
process of giving a rating to or determining the desirable. There are occasions when
consumers have difficulty comprehending the characteristics of a product. In addition,
since consumers' preferences for these essentially subjective products change often
depending on how they are feeling, the It is not enough to try to comprehend a
person based just on the conventional profile that they fit.
4
CHAPTER 2
LITERATURE SURVEY
5
The research shows that emotion- and sentiment-based models are better than
traditional ones. This research supports utilizing sentiment and emotion in movie
recommender systems.
[3] Title: Analysis of Intelligent movie recommender system from facial expression
Author: S. Chauhan, R. Mangrola and D. Viji
Year: 2021
Machine learning helps tackle real-time business and research difficulties. Machine
learning is a conversion of classical mathematics. Facial recognition uses machine
learning models. Real-time facial recognition is utilized in security systems,
workplaces, etc. Using facial detection, machine learning models may propose
movies. This is done through recording user reaction rather than seeking particular
videos. Some relevant research is based on attentional convolutional neural
(recognizes facial micro expressions), and a recommender system has been created
to suggest movies or songs based on CNN output. Boosting algorithms, another face
recognition method employing decision trees, are less effective than CNN. CNN
seems more accurate. Combining content-based and collaborative filtering
recommendation systems increases their power.
[5] Title: On the Influence of Shot Scale on Film Mood and Narrative Engagement in
Film Viewers
Author: S. Benini, M. Savardi, K. Bálint, A. B. Kovács and A. Signoroni
6
Year: 2022
Shot scale, the apparent distance of the camera from a scene's subject, has artistic
and narrative effects. To quantify how shot scale influences both lower and higher
complexity reactions in viewers, we first studied how Close, Medium, and Long Shots
connect to viewers' ratings on film mood, measured in terms of hedonic tone,
energetic arousal, and tense arousal on 50 film clips. Then, the evaluation of shot
scale's influence on violent scene viewers' narrative engagement and its sub-scales:
narrative comprehension, attentional focus, emotional engagement, and narrative
presence is done. Convolutional Neural Networks trained on the filmographies of six
filmmakers, analyze 120 full-length movies at one frame per second to automate shot
size categorization. This research examines the link between shot size and viewer
emotional engagement using big corpora. Beyond style study, understanding cinema
narrative impacts may help with movie recommendations and film therapy.
[6] Title: EEG Based Neuromarketing Recommender System for Video Commercials
Author: S. K. Bandara, U. C. Wijesinghe, B. P. Jayalath, S. K. Bandara, P. S.
Haddela and L. M. Wickramasinghe
Year: 2021
Video ads are popular due to technological advancements and corporate
competitiveness. Before marketing and promotions begin, it's important to analyze
these video advertisements' impact. This research uses EEG data from a Brain-
Computer Interface (BCI) to analyze the viability of movie trailers. This helps the firm
consider the commercial's impact and worth. The analysis offers movie ads based on
user choices. The video's emotion, attentiveness, and pleasure will be assessed.
Random Forest prediction algorithm with 91.97% accuracy was used for emotion
analysis, while c-Support Vector Classifier method with 91.70% accuracy was utilized
for attention analysis. Central Limit theorem and Empirical rule were applied to
analyze pleasure. Initial findings validate the suggested framework's promise. This
study focuses on the movie and entertainment sector, but it may be applied to other
industries.
[7] Title: Evaluation Method for Video Advertisetments Using EEG Signals
Author: A. Dushantha, R. Akalanka, H. Gayan, K. C. Siriwardhana, P. S. Haddela
and L. Wickramasinghe
7
Year : 2020
Video advertisements became very popular in recent past due to the
technology advancement and competitiveness of businesses. Therefore, analyzing
the impact of commercial video advertisements is important before they launch the
marketing campaign. This paper presents a unique method that can evaluate the
effectiveness of movie commercials (trailers) using Electroencephalogram (EEG)
signals captured from a brain computer interface. Randomly selected fifteen movie
lovers participated to capture EEG signals. For a selected set of movie trailers, three
different types of classification models were trained and tested. With the help of
classification models, it measures attention and enjoyment levels and also emotional
status of a viewer to compute effectiveness of an advertisement. It also consists of a
recommender system which suggests movie advertisements based on the
preferences of the users. From the initial results received, it confirms that the
proposed framework is producing promising results. Even though this work focuses
on the movie/entertainment industry, it has the potential to be developed and applied
for many other industries as well.
[8] Title: Sentiment Analysis using deep learning for use in recommendation systems
of various public media applications
Author: K. Arava, R. S. K. Chaitanya, S. Sikindar, S. P. Praveen and S. D
Year: 2022
Sentiment Analysis is a method of analyzing text and extracting
opinions from it. It‟s also known as emotion or opinion extraction, and it‟s part of the
machine learning as well as data mining categories. There are numerous ways to
convey one‟s sentiments. It can be articulated in a variety of ways, such as through
feelings, making judgments, or expressing one‟s vision or insight. Sentiment
investigation is the act of detecting, recognizing, and categorizing a user‟s emotion or
view for any service, such as movies, product issues, events, or any other attribute
that can be good, negative, or neutral. This analysis is based on social
communication channels such as websites that include ratings, forum conversations,
blogs, micro-blogs, Twitter, and other social media platforms. The important goal of
suggested systems is to improvise accuracy and to generate recommendation
systems using deep learning algorithms.
8
[9] Title: A Hybrid Recommender System for Improving Rating Prediction of Movie
Recommendation
Author: N. Kannikaklang, S. Wongthanavasu and W. Thamviset
Year: 2022
Because of the COVID-19 pandemic, online movies are now extremely
popular. While movie theaters have not been serviced and people are staying
quarantined, movies are the best choice for relaxing and treating stress. At present,
recommender systems are widely integrated into many platforms of movie
applications. A hybrid recommender system is one promising technique to improve
system performance, especially for cold-start, data sparsity, and scalability. This
paper proposed a hybrid of matrix factorization, biased matrix factorization, and
factor-wise matrix factorization to solve all mentioned drawback problems. Simulation
shows that the proposed hybrid algorithm can decrease approximately 11.91% and
10.70% for RMSE and MAE, respectively, when compared with the traditional
methods. In addition, the proposed algorithm is capable of scalability. While the
number of datasets is tremendously increased by 10 times, it are still effectively
executed.
The work of [1] A.Hitz S, A Naas and S.Sigg regarding the paper based on the
sharing of geotagged pictures for an Emotion based recommend system shows that
an emotion based recommendation system showed a 19% increase in average rating
when compared to regular recommendation systems. The benefits of this are that,
enhancement is done automatically when an input is given. However, the accuracy is
questioned.
9
2.3 OPEN PROBLEMS IN EXISTING SYSTEM
The existing systems use facial recognition to detect the emotion of the user. A facial
recognition system is a technology capable of matching a human face from a digital
image or a video frame against a database of faces. Facial recognition can be used
in airports, ATM machines, surveillance (monitoring and searching for drug offender
and criminals, CCTV control), security (office access, building access control,
airports, flight boarding system, email authentication on multimedia workstation).
The reason for the low accuracy is due to the large variety of responses people
showcase. Some people are more expressive and react obviously to certain
scenarios. while others are unable to do so and react in less obvious manners.
According to an IEEE journal written by Chung-Hsien Wu and Jen-Chun Lin, the
facial expressions of introverts are significantly different from the distinct expression
of an extroverted person [23]. Hence this suggests that it is hard a system to
accurately recognize the emotion of the user merely based on their facial recognition,
as there is no standard reaction to showcase that the user is feeling a certain way.
For example, the user who is an introvert might feel happy but just react with a small
smile, while an extrovert would react with a big grin on their face. This does not mean
that the introvert was less happy than the extrovert, but it means that there is more
than one way to showcase your feeling.
10
CHAPTER 3
REQUIREMENT ANALYSIS
Scope
Movie Recommender is a movie recommendation system, which provides users with
movies which they may like, based on the movies that they previously saw. Every
logged-in user should have access to the recommender system. The system will go
through the movies that the user previously saw and rated, and then according to that
information, it should provide movies to the user. The project's main aim is to provide
accurate movie recommendations to the user through emotions. These emotions are
detected by using the concept of colours and face recognition. This project is
beneficial for the users and the companies. Users may find movies that they may like
without consuming time and even they can encounter new movies which they like
from the recommendations. For the company, they make the website more attractive,
so they draw more users to the website and the system makes the users of the
website spend more time online.
Operating Environment
User Interfaces
The user interface for the software shall be compatible to any browser such as
11
Internet Explorer, Mozilla or Google Chrome by which user can access to the system.
The user interface shall be implemented using any framework or software package
like streamlit.
Hardware Interfaces
Since the application must run over the internet, this brings out the requirement of a
network interface on the device. User should have a device with valid internet
connection, Wi-Fi or 3G/4G/5G.
Software Interfaces
Get_pos_neg () function:
The calculation of positive and negative emotions is done here. Initially the pos and
neg variables are assigned to zero. The zero and one values are mapped to each
color by the color selection of the user. Each color represents different emotions.
Get_final_sentiment () function:
Based on the colors that the user selects, the number of ones and zeroes will be
calculated with the help of get_pos_neg() function. The emotion detected by the face
is also mapped to zeroes and ones which represents emotion. The face emotion is
compared with the color emotion using dot operator.
Get_poster() function:
The posters of the movies will be shown based on the genres and the use of api. The
genres are also mapped to zeroes and ones. This function returns the path of the
poster using api.
12
Suggest_movie() function:
Easy-to-read − Python code is more clearly defined and visible to the eyes.
13
Portable − Python can operate on a broad range of hardware devices and
has the same interface across them all.
Scalable − Python provides a better structure and support for large programs
than shell scripting.
Python Libraries
Scikit-learn
14
Features of Scikit-learn
Keras
Keras is a deep learning API written in Python, running on top of the machine
learning platform TensorFlow. It was developed with a focus on enabling fast
experimentation. Being able to go from idea to result as fast as possible is key
to doing good research.
- Simple -- but not simplistic. Keras reduces developer cognitive load to free
you to focus on the parts of the problem that really matter.
15
- Flexible -- Keras adopts the principle of progressive disclosure of
complexity: simple workflows should be quick and easy, while arbitrarily
advanced workflows should be possible via a clear path that builds upon
what you've already learned.
- Powerful -- Keras provides industry-strength performance and scalability:
it is used by organizations and companies including NASA, YouTube, and
Waymo.
- Highly Flexible -- Keras provide high flexibility to all of its developers by
integrating low-level deep learning languages such as TensorFlow or
Theano, which ensures that anything written in the base language can be
implemented in Keras.
PyTorch
PyTorch is a fully featured framework for building deep learning models, which is
a type of machine learning that‟s commonly used in applications like image
recognition and language processing. PyTorch is written in Python, thus making it
easier for most machine learning developers to learn and use. PyTorch is
distinctive for its excellent support for GPUs and its use of reverse-mode auto-
differentiation, which enables computation graphs to be modified on the fly. This
makes it a popular choice for fast experimentation and prototyping. PyTorch is
known for having three levels of abstraction as given below −
Features
16
Computational graphs − PyTorch provides an excellent platform which
offers dynamic computational graphs. Thus, a user can change them
during runtime. This is highly useful when a developer has no idea of how
much memory is required for creating a neural network model.
Advantages
It is easy to debug and understand the code.
It includes many layers as Torch.
It includes a lot of loss functions.
It can be considered as a NumPy extension to GPUs.
It allows for building networks whose structure is dependent on
computation itself.
SymPy(latex)
NumPy
- Sophisticated functions
Pandas
pandas is a software library written for the Python programming language for data
manipulation and analysis. It offers data structures and operations for
17
manipulating numerical tables and time series. It is free software released under
the three-clause BSD license.
Features
It has a fast and efficient DataFrame object with default and customized
indexing.
Handle multiple operations of the data sets such as sub setting, slicing,
filtering, groupBy, re-ordering, and re-shaping.
Provides fast performance, and If you want to speed it, up even more,
you can use Cython.
Matplotlib
Matplotlib is a plotting library for the Python programming language and its
numerical mathematics extension NumPy. It provides an object-oriented API
for embedding plots into applications using general-purpose GUI toolkits like
Tkinter, wxPython, Qt, or GTK.
Features
18
- Use a rich array of third-party packages built on Matplotlib.
Machine Learning
Machine Learning comes into the picture when problems cannot be solved using
typical approaches. ML algorithms combined with new computing technologies
promote scalability and improve efficiency. Modern ML models can forecast
everything from disease outbreaks to stock market fluctuations.
Visual Studio Code has built-in multi-language support that enables programmers to
make use of one editor for different languages. VScode can detect if any snippet of
code is left incomplete. Resources can also be pulled from online GitHub
Repositories that allow cloning of the code and storing it online.
19
Jupyter Notebook
The original web application for creating and sharing computational documents is
Jupyter Notebook. It provides a straightforward, streamlined, document-centric
experience. Jupyter Notebook allows users to collect all components of a data project
in one location, making it easier to demonstrate the complete project process to your
target audience. Users may utilize the web-based application to generate data
visualizations and other project components to share with others via the platforms.
Deep Learning
Deep learning is a subset of machine learning, which is essentially a neural network
with three or more layers. These neural networks attempt to simulate the behaviour of
the human brain, allowing it to learn from large amounts of data. While a neural
network with a single layer can still make approximate predictions, additional hidden
layers can help to optimize and refine for accuracy.
Definition of Neural Network: Neural networks are comprised of a node layer that
contains an input layer, one or more hidden layers, and an output layer. Each node
connects to another and has an associated weight and threshold. If the output of any
individual node is above the specified threshold value, that node is activated, sending
data to the next layer of the network. Otherwise, no data is passed along to the next
layer of the network.
Working:
The first neuron layer i.e. input layer receives the input data and passes it to
the first hidden layer.
The hidden layers perform the computations on the received data. The biggest
challenge in neural network creation is to decide the number of neurons and
many hidden layers.
Finally, the output layer produces the required output.
Deep learning can generate new features from the limited available training data
sets. One of the major benefits of deep learning is the reduced time required for
feature engineering as compared to machine learning. Some fields where deep
learning is applied are driverless cars, virtual assistants, and facial recognition.
20
Virtual assistants such as Siri, Alexa and Cortana use deep learning to translate
human speech and language into necessary instructions. Deep learning aids
driverless cars like Tesla‟s to understand different scenarios of the road, speed
limits, signals and pedestrian behaviours.
21
CHAPTER 4
DESCRIPTION OF PROPOSED SYSTEM
The summary of the relation between colors and emotions that will be used in our
system is shown below in Table 4.1.
22
Collaborative filtering is best suited to problems with known data on users (age,
gender, occupation, etc.) but a lack of data for items of interest or difficult feature
extraction. In contrast to the content-based approach, collaborative recommender
systems attempt to predict a user's utility for an item based on the utility of other
users with the item in the past.
Content-based filtering methods are based on item featurization (as opposed to user
featurization) and a profile of a user's utility. It is best suited to problems that have
known data on items (e.g., leading actors, year of release, genre for movies) and how
the user has historically interacted with the recommender system but lack personal
information about the user. Content-based recommenders are essentially a user-
specific learning problem that uses item features to quantify the user's utility (likes
and dislikes, rating, etc.).
The main objective is to develop a website where a user can get the
recommendations of movie by their emotions using color psychology and facial
gestures. Color psychology is an effective way of detecting. a user's emotions
through color. The TMDB movie dataset is used to train and test a model to detect
the appropriate emotions.
23
The collaborative filtering technique deals with the similarities of features
between users and recommends personalized movies, whereas content-based
filtering is all about filtering a movie's like, dislike, or ratings. The user profiles
vector using matrix system and emotions of a user will be extracted with the help of
these two techniques.
With Anaconda, you can easily set up, manage, and share Conda environments.
Moreover, you can deploy any required project with a few clicks when you‟re using
Anaconda. There are many advantages to using Anaconda and the following are the
most prominent ones among them: Anaconda is free and open source. This means
you can use it without spending any money. In the data science sector, Anaconda is
an industry staple.
It is open source too, which has made it widely popular. If you want to become a data
science professional, you must know how to use Anaconda for Python because every
recruiter expects you to have this skill. It is a must-have for data science. It has more
than 1500 Python and R data science packages, so you don‟t face any compatibility
issues while collaborating with others.
For example, suppose your colleague sends you a project which requires packages
called A and B but you only have package A. Without having package B, you wouldn‟t
24
be able to run the project. Anaconda mitigates the chances of such errors. You can
easily collaborate on projects without worrying about any compatibility issues. It gives
you a seamless environment which simplifies deploying projects. You can deploy any
project with just a few clicks and commands while managing the rest.
25
conda create -n <your_environment_name> python=3.6
Similarly, if you want to create an environment with a particular package, you can use
the following command:
conda create -n <your_environment_name>pack_name
Here, you can replace pack_name with the name of the package you want to use.
If you have a .yml file, you can use the following command to create a new Conda
environment based on that file:
conda env create -n <your_environment_name> -f <file_name>.yml
We have also discussed how you can export an existing Conda environment to a .yml
file later in this article.
Activating an Environment
You can activate a Conda environment by using the following command:
conda activate <environment_name>
You should activate the environment before you start working on the same. Also,
replace the term <environment_name> with the environment name you want to
activate. On the other hand, if you want to deactivate an environment use the
following command:
conda deactivate
26
conda install <package_name>=<version>
Deleting an Environment
Sometimes, you don‟t need to add a new environment but remove one. In such
cases, you must know how to delete a Conda environment, which you can do so by
using the following command:
conda env remove –name <env_name>
The above command would delete the Conda environment right away.
Project Scope: Define the project's goals, objectives, and requirements. Identify the
key stakeholders, team members, and resources needed to complete the project.
Project Planning: Develop a detailed project plan that outlines the tasks, timelines,
and deliverables for each phase of the project. Create a work breakdown structure
27
(WBS) to organize the project tasks into manageable parts.
Resource Allocation: Identify the resources needed for the project, including
personnel, hardware, and software. Allocate resources based on project
requirements, timelines, and budget.
Risk Management: Identify potential risks to the project and develop a risk
management plan to mitigate or avoid them. Create contingency plans to handle
unexpected events that may affect the project timeline or budget.
Project Execution: Implement the project plan by assigning tasks to team members,
monitoring progress, and addressing any issues that arise. Regularly communicate
project status to stakeholders and make adjustments as needed.
Project Monitoring and Control: Monitor project progress against the project plan and
make adjustments as necessary. Track project costs, timelines, and quality to ensure
the project is completed within budget, on time, and to the required quality standards.
Project Closure: Evaluate the project's success against the project goals and
objectives. Close out the project by delivering the final product, documenting project
outcomes, and conducting lessons learned review to identify areas for improvement
in future projects.
Deployment and Integration: Once the recommendation system has been tested and
refined, it can be deployed to the production environment. The system should be
integrated with the existing infrastructure and made accessible to users through an
easy-to-use interface.
28
User Training: Users need to be trained on how to use the recommendation system
to ensure they get the most out of it. Training materials, such as user guides,
tutorials, and videos, should be provided to help users navigate and utilize the system
effectively.
29
CHAPTER - 5
IMPLEMENTATION DETAILS
Install Python:
Download and install python from the official Python website. Make sure to select the
latest version of the Python. Click the appropriate link for your system to download
the executable file.
Build a streamlit application to create a framework for the machine learning and deep
learning algorithms. The code is saved in the python file named webapp.py in the
project directory.
To run the application locally, execute the following command in the command
prompt/ terminal : “streamlit run webapp.py” or “python -m streamlit run webapp.py”.
This command will start the streamlit application, and it will be accessible at
http://localhost:8501.
Deployment Setup:
Web Server: The system can be deployed on a web server, such as Apache or
Nginx, to provide a web interface for users to access the recommendation system.
Application Server: The web server can be connected to an application server, such
as Tomcat or JBoss, to run the recommendation system application.
30
server, such as MySQL or PostgreSQL, to store and retrieve movie data, user
profiles, and recommendation history.
Load Balancer: To ensure scalability and high availability, a load balancer, such as
HAProxy or Nginx, can be used to distribute the traffic across multiple web servers.
Cloud Platform: The system can be deployed on a cloud platform, such as Amazon
Web Services (AWS) or Google Cloud Platform (GCP), to take advantage of their
scalability and flexibility features. Cloud platforms provide infrastructure-as-a-service
(IaaS) and platform-as-a-service (PaaS) options to deploy, manage and scale the
system.
Security: Security measures should be taken into consideration to ensure the system
is protected against attacks. For instance, deploying an SSL/TLS certificate to encrypt
communication between the server and clients, restricting access to sensitive data,
and implementing firewalls to prevent unauthorized access.
Monitoring and Logging: Monitoring and logging tools, such as Prometheus and ELK
stack, can be used to monitor system performance, detect and alert any potential
issues, and log system events and user activities.
1. Environment setup:
Configure the network settings and firewall rules to allow traffic to and
from the application.
31
Train the convolutional neural network (CNN) model on a dataset of
movie posters and emotions using a deep learning framework such as
TensorFlow or PyTorch.
Integrate the CNN model and emotion detection algorithm into the user
interface, so that users can upload a picture of themselves, and the
system can detect their emotion and recommend a movie.
3. Deployment:
4. Maintenance:
Deploying an E-MRS involves many steps, and there may be variations depending on
the specific requirements of the project. However, this deployment setup should
provide a solid foundation for building and deploying an emotion-based movie
recommendation system.
32
5.2 ALGORITHMS
The TMDB film database, known for its extensive marking, was utilized to narrow the
pool of possible outcomes. As the primary identifier for clustering, the use of genre
was chosen. This brief examination has resulted in the identification of 24 distinct
genres. These films span the gamut from action to history, and many fall into multiple
categories. There are many of products that may not even come close to
summarizing a complete film story. To make meaningful discoveries, the need of
more information is necessary. Obtaining as much information as possible is mainly
limited by the time period. As the data model becomes more refined, less of it is
considered. There may be one significant advantage depending on how
recommendations are made. Structures for sharing information with social media,
movie forums, machine learning forums, and other online channels are required. Our
data collection for the recommender system took roughly eight weeks. Users of the
sentiment-based recommender contributed thousands of individual opinions.
The datasets used in this project were obtained from the Kaggle website. The
datasets are “tmdb_5000_movies.csv” and “tmdb_5000_credits.csv” as shown below.
In the “tmdb_5000_movies.csv” dataset, there are 4803 rows and 20 columns. It
comprises budget, genre, homepage, original language, id, original title, companies,
overview, popularity, keywords, release date, production countries, revenue, spoken
languages, runtime, tagline, title, vote average, status, and vote count as fields. There
are 4 columns and 4803 rows in the "tmdb 5000 credits.csv" dataset, with fields like
crew, cast, title, and movie id. The two datasets contain 24 genres. The datasets are
joined based on the column shared by both CSV files. To improve results, null or
missing values will be checked while training the model. If there are any null values or
missing values, drop those values. Among all the columns in the merged dataset, the
selected features are genres, keywords, cast, crew, title, overview, and movie id.
33
Table 5.1: Credits TMDB dataset
This algorithm's job is to deduce the user's emotional state from a palette of three
colors. So that it can achieve that, it will use the following reasoning to examine the
color sequence: If the user selects three colors and at least two of them have the
same emotional connotation, then that feeling is the one they are experiencing at the
moment. If the user selects the hues yellow, blue, and green or yellow, blue, and
black, for instance, they are experiencing elation. An alternative scenario exists
where the first hue chosen represents happiness, the second hue signifies love, and
the third represents either sadness or wrath (negative emotion).
The user's current emotional state is "joy-love" in this case. If, on the other hand, two
of the colors signify melancholy and one suggests fury, then sadness and anger
34
represent the user's current emotional state. If a person selects the positive feelings
of "joy-love". For instance, the result will be a combination of the following colors:
yellow, light red, and black. Movie evaluations and associated feelings are recorded
in user accounts. All of this data may be found in the User Profiles repository. The
user will be asked to fill out a survey during registration to share his thoughts on
which kind of movies best represent certain moods. A user's choice of movie is
treated as an implicit 5-star rating. Love, rage, joy, and sadness are the four
categories of user emotions that make up the profile vector.
The user's preferred genres and titles of films to view when experiencing an emotion
E are subsequently catalogued under each emotion group E. If user u selects films A
and B for the emotion "love," films C and D for "anger," and film D for "joy," then
user u's profile vector would look like this: u = {{A: 5, B: 5}, {C: 5}, {D: 5}, {}}. To
provide input on the system's film suggestion, the user can rate the degree to which
the film is in line with his present may suggest film F; nevertheless, the user may not
agree that he would like to see this film while he is in love, and so he may give it a
rating of 1 out of 5. This transforms the letter u into a profile vector: u = {{A: 5, B: 5, F:
1}, {C: 5}, {D: 5}, {}}.
The colors used in this project are black, white, red, yellow, and blue. The user is
required to select three colors among the colors mentioned. Each color is assigned 1
and 0 values. Negative and positive emotions are represented by numbers 1 and 0.
The colors are mapped by 1 and 0 if the user selects black, white, and blue. The
mapped values are 0, 1, and 1. The resulting emotion is positive if the mapped value
of positive emotions exceeds the assigned value of negative emotions. Otherwise,
the outcome would be negative.
For a given database U of user profiles, and a target user u, a movie m, an emotion
e, the executive steps of the CF algorithm can be outlined as following:
35
Step 1: Extract the profile vector of the target user u from the database. We
consider the information of user preferences available from the User Profiles
database U. For example, for the target user u, his profile vector is u = {{A : 5,
B : 5, F : 1}, {C : 5}, {D : 5}, {}}
Step 2: Search for other users who have rated at least one movie in common
with the target user u.
In order to reduce the computing user-film matrix, the system will consider only the
users who have rated at least one movie-emotion in common with the target user u.
For example: Let Un is a sub collection of U, containing only users who have at least
one movie-emotion in common with u. Un =< u1, u2, u3 > There are 3 users in Un
with: u1 = {{A : 5, F : 2}, {C : 5}, {E : 5}, {I : 5}} u2 = {{B : 5}, {}, {G : 5}, {H : 4}}
However, users do not provide consistent ratings when asked to rate the same movie
at different times. As a result, it is impossible to create an algorithm more accurate
than the variance in a user‟s ratings for the same item. Even when accuracy
differences between recommender systems are measurable, they are usually too
small to be noticed by the users.
Another problem is that some recommender systems produce highly accurate but
totally useless recommendations to users. For example, Lord of the Rings is a very
famous movie and nearly all of the movie lovers have already seen it. So, it is useless
to recommend it again to the users.
Therefore, to make the evaluation results to be more reliable, we propose to use the
precision and mean absolute error with consideration to the novelty factor.
Mean absolute error (MAE) measures the average absolute deviation between
a predicted rating and the user‟s true rating. Thus, it can measure how close
the recommender system‟s predicted ratings are to the true user ratings.
Precision is defined as the ratio between the number of well-recommended
items and the total number of rated recommendations. Well recommended
movies are movies which the users have never seen before and are rated at
least 4 out of 5.
The main goal of our tests is to measure how close our system‟s recommended
products are to the true user‟s needs and preferences. Another goal is to determine
the precision of the Emotion detector. After providing a list of movie recommendations
36
to the users, the system will ask them to evaluate each movie and to specify whether
they have already seen that movie. The users are also invited to answer a survey at
the end of the recommendation process:
Questions about the overall evaluation of the user about the system,
Questions about the user interface and recommendation explanations, and
Questions about the quality and the precision of the Emotion detector.
All responses from this survey will be stored and calculated to determine the user
satisfaction after using the system.
Six user emotions are considered in this project, like anger, sadness, happiness,
disgust, surprise, and fear, and these emotions will be recognized by the user's face.
Angry, fear, sadness, and disgust come under negative emotions. Surprised and
happy come under positive emotions. If the face-detected emotion is negative and the
three-color choice produces a positive emotion, then the movies which result in
positive emotions to the user will be recommended. The Dot operator is used to
compare the face emotion and color emotion.
37
CHAPTER - 6
RESULTS AND DISCUSSION
In this paper, A recommendation system is designed to help you find the best movies
to watch based on the genre you like. The simple reason that viewers either love or
hated the movies in our system is, it only considers ratings of 1 to 5. This approach
provides far superior recommendations to consumers since it lets them comprehend
the connection between their feelings and the suggested content. When a user gives
a film's genre a high rating, similar film genres are suggested.
What makes this system unique is that it considers the user's feelings and compares
those emotions with that of emotion detected by the face. By using this information to
make informed product recommendations. From these findings, we can infer that
using more dense data will improve the recommender's performance and that include
additional genres will allow us to provide more targeted suggestions. In addition to
this, we have carried out studies to demonstrate the accuracy of our predictions
when employing emotional feature data.
Movies that have received ratings of 3 or above are presumed to be loved by the
user, whilst movies that have received ratings of 1 or 2 are supposed to be despised
by the user. We do not consider unclear 3 rating. The result matrix is produced by
making use of the ratings matrix and the genres matrix. The result matrix is the dot
product of the two matrices that came before it. The outcome then undergoes an
additional conversion into a binary format. If the result of the dot product is greater
than 0, then the value 1 is given to the cell in question; otherwise, the value 0 is used.
Emotions are the undeniable reliance of information in bridging human and machine
intercommunication. Machines can recommend better when they can comprehend an
individual‟s emotions. Producing emotions in the users is conventionally recognized
as the fundamental goal of movies. Hence, movie recommendations based on one‟s
emotional trajectory is key as it allows them to map movie recommendations based
on their emotional stage.
38
CHAPTER – 7
CONCLUSION
7.1 CONCLUSION
To address these research issues, researchers can develop new methods and
techniques, such as user-based evaluations, surveys, and interviews, to evaluate the
system's performance and identify areas for improvement. Researchers can also
explore the integration of multiple modalities for emotion recognition and develop
personalized recommendation algorithms that consider a user's historical viewing
preferences and emotional responses to movies.
39
with social media and streaming platforms, and user-based evaluation of
recommendation quality. These avenues for improvement and expansion would
enable the system to provide a more comprehensive and personalized movie
recommendation experience for users, leveraging social signals and multi-modal
emotion recognition, and integrating seamlessly with streaming platforms.
Furthermore, user-based evaluations would provide a more comprehensive and user-
centric evaluation of the system's performance.
Another research issue is the ethical and social implications of the system's
recommendations. The recommendation system has the potential to influence user
behavior and shape their movie preferences. As such, research could explore the
ethical and social implications of the system's recommendations. For example, what
are the potential biases in the recommendation system, and how can they be
mitigated? How can the system's recommendations avoid reinforcing stereotypes or
perpetuating discrimination? Exploring these issues can help ensure that the system's
40
recommendations are fair, unbiased, and aligned with the values and expectations of
its users.
41
language processing technologies. By exploring these issues, researchers can
develop new methods and techniques to enhance the recommendation system's
performance, improve user satisfaction, and mitigate potential ethical and social
concerns.
Data collection and pre-processing: The first and foremost task is to collect and pre-
process data related to movies, including their genres, plot summary, cast, ratings,
and reviews. This data needs to be processed and cleaned to extract relevant
features that can be used to build an emotion-based recommendation system.
Emotion classification: One of the key challenges is to classify emotions from user
reviews or ratings. Various techniques such as sentiment analysis, emotion detection
using machine learning algorithms, and deep learning models can be used to classify
emotions.
Recommendation engine: Once the emotions are classified, the next step is to build a
recommendation engine that can suggest movies based on a user's emotional state.
The recommendation engine can use techniques such as collaborative filtering,
content-based filtering, or a hybrid of both.
Scalability: As the system grows, the scalability of the recommendation engine needs
to be considered. The system should be able to handle a large number of users and
provide real-time recommendations.
Data privacy and security: The recommendation system may collect sensitive data
such as user preferences, which needs to be handled with care. Proper data privacy
and security measures need to be implemented to protect user data.
42
REFERENCES
[1] A. Hitz, S. -A. Naas and S. Sigg, "Sharing geotagged pictures for an Emotion-
based Recommender System," 2021
[2] C. Lee, D. Han, S. Choi, K. Han and M. Yi, "Multi-Relational Stacking Ensemble
Recommender System Using Cinematic Experience," 2022
[3] S. Chauhan, R. Mangrola and D. Viji, "Analysis of Intelligent movie recommender
system from facial expression," 2021
[4] A. Dushantha, R. Akalanka, H. Gayan, K. C. Siriwardhana, P. S. Haddela and L.
Wickramasinghe, "Evaluation Method for Video Advertisetments Using EEG
Signals," 2020
[5] S. Benini, M. Savardi, K. Bálint, A. B. Kovács and A. Signoroni, "On the Influence
of Shot Scale on Film Mood and Narrative Engagement in Film Viewers,"2022
[6] H. Cao and J. Kang, "Study on Improvement of Recommendation Algorithm
Based on Emotional Polarity Classification," 2020
[7] S. K. Bandara, U. C. Wijesinghe, B. P. Jayalath, S. K. Bandara, P. S. Haddela and
L. M. Wickramasinghe, "EEG Based Neuromarketing Recommender System for
Video Commercials," 2021
[8] K. Arava, R. S. K. Chaitanya, S. Sikindar, S. P. Praveen and S. D, "Sentiment
Analysis using deep learning for use in recommendation systems of various public
media applications," 2022
[9] N. Kannikaklang, S. Wongthanavasu and W. Thamviset, "A Hybrid Recommender
System for Improving Rating Prediction of Movie Recommendation," 2022
[10] https://towardsdatascience.com/recommendation-systems-a-review-
d4592b6caf4b
[11] https://www.onespire.net/news/history-of-recommender-systems/
[12] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6514576/
[13] https://www.ptidej.net/courses/ift6251/fall06/article/Projet%20Ai%20-
%20Ilusca%20-%20Yousra.doc.pdf
[14] https://www.verywellmind.com/color-psychology-2795824
[15] https://www.tutorialspoint.com/python/python_overview.htm#:~:text=Python%2
0is%20a%20high%2Dlevel,syntactical%20constructions%20than%20other%20lan
guages
[16] https://www.verywellmind.com/color-psychology-2795824M.
[17] Borgaonkar, M. Katta, P. Kudale, V. Deshpande, Prof. V. Babanne, "PVR
43
System: Personalized Video Recommendation", IJRECE, Vol. 7, Issue 4, pp. 302-
305, Oct-Dec2019.
[18] Najmeh Samadiani, Guangyan Huang, Borui Cai, Wei Luo, Yong Xiong, Jing
He, A Review on Automatic Facial Expression Recognition Systems Assisted by
Multimodal Sensor Data
[19] https://www.geeksforgeeks.org/introduction-machine-learning/
[20] Chung-Hsien Wu, Jen-Chun Lin, Wen-Li Wei, Two-Level Hierarchical
Alignment for Semi-Coupled HMM-Based Audiovisual Emotion Recognition with
Temporal Course
44
APPENDIX
A. SOURCE CODE
Webapp.py file:
import streamlit as st
from PIL import Image
import cv2
import numpy as np
from keras.models import model_from_json
from support import *
from PIL import Image
def detect_emotion(img):
json_file = open('model/emotion_model.json', 'r')
loaded_model_json = json_file.read()
json_file.close()
emotion_model = model_from_json(loaded_model_json)
emotion_model.load_weights('model/emotion_model.h5')
emotion_prediction = emotion_model.predict(cropped_img)
maxindex = int(np.argmax(emotion_prediction))
cv2.putText(frame, emotion_dict[maxindex], (x+5, y-20),
cv2.FONT_HERSHEY_SIMPLEX, 1, (255, 0, 0), 2, cv2.LINE_AA)
if __name__=='__main__':
45
emotion_dict = {0: "Angry", 1: "Disgusted", 2: "Fearful", 4: "Happy", 5:
"Sad", 6: "Surprised"}
genres = {"Action":28, "Comedy":35, "Crime":80, "Fantasy":14, "Horror":27,
"Thriller":53}
color_select1 = st.selectbox('Select a Color', ['Black', 'White', 'Red',
"Yellow", 'Blue'], key=1)
color_select2 = st.selectbox('Select a Color', ['Black', 'White', 'Red',
'Yellow', 'Blue'], key=2)
color_select3 = st.selectbox('Select a Color', ['Black', 'White', 'Red',
'Yellow', 'Blue'], key=3)
img_file_buffer = st.camera_input("Capture")
suggest = st.button('Suggest')
if suggest:
if img_file_buffer is not None:
image = Image.open(img_file_buffer)
cv2_img = np.array(image)
try:
img, id = detect_emotion(cv2_img)
st.write("Detected Emotion from the face:
"+emotion_dict[id])
all_colors = [color_select1, color_select2, color_select3]
all_movie_names, all_poster_links =
get_all_recom(all_colors, id)
for name, poster in zip(all_movie_names,
all_poster_links):
st.write(name)
st.image(poster)
except:
id = 1
all_colors = [color_select1, color_select2, color_select3]
all_movie_names, all_poster_links =
get_all_recom(all_colors, id)
for name, poster in zip(all_movie_names,
all_poster_links):
st.write(name)
st.image(poster)
else:
all_colors = [color_select1, color_select2, color_select3]
all_movie_names, all_poster_links = get_all_recom(all_colors,
1)
for name, poster in zip(all_movie_names, all_poster_links):
st.write(name)
st.image(poster)
support.py file:
import requests
import random as r
color_map = {
46
'Black':0,
'White': 1,
'Red':0,
'Yellow':1,
'Blue': 1
}
face_map = {
'Angry': 0,
'Disgusted': 0,
'Fearful': 0,
'Happy': 1,
'Sad': 0,
'Surprised': 1,
}
emotion_dict = {
0: "Angry",
1: "Disgusted",
2: "Fearful",
4: "Happy",
5: "Sad",
6: "Surprised"
}
apiKey = 'bdfe31153439600a352617c3ca93d2e1'
a=r.sample([14,27,28,35,878,11749],2)
b=r.sample([28,35,53,80],2)
genre_map = {
'pos' : a,
'neg' : b
}
def get_pos_neg(sentiments):
pos, neg = 0,0
for sentiment in sentiments:
if color_map[sentiment] == 1:
pos += 1
else:
neg += 1
return (pos,neg)
47
if neg + 1 > pos:
return 'neg'
else:
return 'pos'
else:
pos, neg = color_sentiment
if pos > neg:
return 'pos'
return 'neg'
def get_poster(movie_id):
url =
"https://api.themoviedb.org/3/movie/{}?api_key=bdfe31153439600a352617c3ca93d2e
1&language=en-US".format(movie_id)
data = requests.get(url).json()
poster_path = data["poster_path"]
full_path = "https://image.tmdb.org/t/p/w500/" + poster_path
return full_path
def suggest_movie(final_emotion):
genre_list = genre_map[final_emotion]
movie_names = []
movie_poster_links = []
for genre in genre_list:
response =
requests.get(f"https://api.themoviedb.org/3/discover/movie?api_key={apiKey}&la
nguage=en-
US&sort_by=popularity.desc&include_adult=false&include_video=false&page=1&with
_genres={genre}")
movies = response.json()['results']
for movie in movies:
movie_names.append(movie['original_title'])
movie_poster_links.append(get_poster(movie['id']))
return movie_names, movie_poster_links
48
B. SCREENSHOTS
C. RESEARCH PAPER
49
Emotionally Driven Film Referral System Using Color Psychology
Pulipati Anshu 1 , Meghana Elisetti 2 , Albert Mayan J 3, Ashok Kumar K 4
1,2
U.G Student, Department of CSE, Sathyabama Institute of Science and Technology, Chennai
3,4
Professor, School of Computing, Sathyabama Institute of Science and Technology, Chennai.
1
[email protected], [email protected] , [email protected],
4
[email protected]
Abstract - The modern world comprises many forms of entertainment, the most common being movies. Over the past decades, the methods of production, creation, and distribution of movies have greatly advanced. People have a variety of movies to choose from and require a recommendation system to guide them in this process. Emotion-based movie recommendation systems (E-MRS) are systems that recommend movies based on the user's emotions. Emotions are a strong reaction to stimuli and are an intelligent and rational form of behaviour. It is difficult to assess the emotional responses to movies based on the diverse reactions to movies. Colour psychology can be used to detect various emotional states such as happiness, sadness, anger, fear, and excitement. In this paper, a user interface is created where colours are used to represent the user's appropriate emotions, and then a movie is recommended based on that. A hybrid approach, combining collaborative filtering and content-based filtering, is used to recommend movies.

Keywords - Movie Recommendation, Color Psychology, Emotion Recognition, Collaborative filtering technique, Content-based filtering technique.

In [1], shot scale, the camera's apparent distance from a scene's subject, has artistic and narrative functions in any film. The first step is to examine how the distribution and rotation of close, medium, and long shots affect viewers' ratings of film mood, to determine the impact of shot scale on responses of lower and higher complexity. The next step is analyzing how shot scale affects violent scene viewers' narrative engagement and its sub-scales. The shot scale and viewer emotional involvement are being further investigated in this study using large corpora.

In [2], machine learning helps tackle real-time commercial and research problems. Machine learning is simply the expansion of mathematical applications. Facial recognition uses machine-learning models. Security, automatic attendance, and offices employ facial recognition in real time. Facial detection-based movie recommendation is an important machine learning application. Capturing emotion instead of browsing movies saves time. Compared to CNN, decision tree-based facial recognition employing boosting methods is inefficient. CNN is better for accuracy. Social filtering and content-based systems enhance the recommended systems' power.
Another study lately addressed the seeming difficulties in judging movie-induced emotions and the indisputable, significant heterogeneity in participants' emotional responses to film content. Connotation uses audio-visual descriptors' objectivity to anticipate user emotions. No physiological signs are needed. It uses connotative concepts and user reactions to similar stimuli, not other people's widely varying emotional ratings. This study extracts audio-visual and cinematic language descriptors and uses users' connotative ratings to place, compare, and recommend movie scenes.

In [7], machine learning helps tackle real-time commercial and research problems. Machine learning is simply the expansion of mathematical applications. Facial recognition uses machine learning models. Real-time security, automated attendance, and workplace applications use facial recognition. Facial detection-based movie recommendation is an important machine learning application. Capturing emotion instead of browsing movies saves time. Some studies are based on attentional response, convolutional neural networks, and recommender systems that suggest movies or songs based on CNN output. Compared to CNN, decision tree-based facial recognition employing boosting methods is inefficient. CNN is better for accuracy. Content-based and collaborative filtering recommender systems work better together.

In [8], one recent software recommends items based on customer needs. It cannot generalize the recommender system since user needs vary. Connotation predicts viewers' emotions using audio-visual descriptors' objectivity. A connotative space is formed by extracting audio-visual descriptors and the user's emotional state. Then the connotatively closest movies are suggested. Finally, the framework is evaluated subjectively by asking consumers to validate the film components that match their affective needs.

In [9], Mind Frame, the planned online app, recommends music and movies based on mood. The suggested system detects and recommends music and movies based on user moods like joy, sadness, and tranquility. The Python OpenCV library extracts facial features. A neural network trained with labeled photos of the user's emotional states predicts their mood. Django, a web framework, is used here to store, view, and process user data and requests.

Finally, in [10], researchers can create context-aware apps that adjust to users' emotions by recognizing facial expressions. Computer vision researchers studied face recognition. FaceFetch, a novel context-based multimedia content recommendation system, analyses a user's facial expressions to determine their emotional state (joy, despair, panic, displeasure, amazement, and wrath) and provides multimedia content accordingly.

The existing systems use facial recognition to detect the emotion of the user. A facial recognition system can match a human face from an image or video to a database of faces. It can be useful in airports, ATMs, surveillance (watching and looking for drug users and criminals, controlling CCTV), and security (building access control, flight boarding systems, airports, office access, email authentication on multimedia workstations).

Facial Emotion Recognition (FER) is a technology that recognizes emotions from photos and static videos. Some applications of this software are for research areas targeting mental disease diagnosis and human interaction detection. This technology is ground-breaking in areas of research. However, it is challenging to use in practical settings, as demonstrated in a journal article by Najmeh Samadiani distributed by PubMed Central (PMC). According to the journal, 97% of laboratory-controlled FER systems have high accuracy. However, when these results are transferred to real-world applications, 50% of the systems have low accuracy [11].

The low accuracy is due to the large variety of responses people showcase. Some people are more expressive and react obviously to peculiar scenarios, while others are unable to do so and react in less obvious ways. According to an IEEE journal article by Chung-Hsien Wu and Jen-Chun Lin, introverts' facial expressions differ significantly from those of extroverted people [12]. Hence it is hard for a system to accurately recognize the emotions of the user merely from facial recognition, as there is no standard reaction that shows the user is feeling a certain way. While an introvert might feel happy but only smile modestly, an extrovert would respond with a broad grin. However, this does not imply that the introvert is less happy than the extrovert; it means that there is more than one way to showcase the feeling.

III. METHODOLOGY

The proposed system is to create a user interface to recommend movies based on user emotions using color psychology and the user's facial expressions. There are five basic emotions that a human being has: love, joy, anger, sadness, and fear. Some researchers discovered that colors significantly influence users' emotions and feelings. Color is a natural form to represent human emotions. According to Figure 1 below, the colors represent a user's positive and negative emotions.

Fig. 1: Emotions Varying with Colors

A. Data acquisition and pre-processing

The TMDB film database, known for its extensive marking, was utilized to narrow the pool of possible outcomes. As the primary identifier for clustering, the use of genre was chosen. This brief examination has resulted in the identification of 24 distinct genres. These films span the gamut from action to history, and many fall into multiple categories. There are many products that may not even come close to summarizing a complete film story. To make meaningful discoveries, more information is necessary. Obtaining as much information as possible is mainly limited by the time period. As the data model becomes more refined, less of it is considered. There may be one significant advantage depending on how recommendations are made. Structures for sharing information with social media, movie forums, machine learning forums, and other online channels are required. Our data collection for the recommender system took roughly eight weeks. Users of the sentiment-based recommender contributed thousands of individual opinions.

The datasets used in this project were obtained from the Kaggle website. The datasets are "tmdb_5000_movies.csv" and "tmdb_5000_credits.csv", as shown below.
In the "tmdb_5000_movies.csv" dataset, there are 4803 rows and 20 columns. It comprises budget, genre, homepage, original language, id, original title, companies, overview, popularity, keywords, release date, production countries, revenue, spoken languages, runtime, tagline, title, vote average, status, and vote count as fields. There are 4 columns and 4803 rows in the "tmdb_5000_credits.csv" dataset, with fields like crew, cast, title, and movie id. The two datasets contain 24 genres. The datasets are joined based on the column shared by both CSV files. To improve results, null or missing values will be checked while training the model; if there are any null or missing values, those rows are dropped. Among all the columns in the merged dataset, the selected features are genres, keywords, cast, crew, title, overview, and movie id.

Table 1: Credits TMDB dataset

Table 2: Movies TMDB dataset
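A minimal sketch of the preprocessing just described, assuming pandas and the two Kaggle files named above; joining on the shared 'title' column follows the public TMDB 5000 dataset layout and is an assumption about this project's exact merge key.

import pandas as pd

movies = pd.read_csv("tmdb_5000_movies.csv")    # 4803 rows x 20 columns
credits = pd.read_csv("tmdb_5000_credits.csv")  # 4803 rows x 4 columns

# Join the two files on their shared column, keep the selected features,
# and drop rows with null or missing values before training.
merged = movies.merge(credits, on="title")
merged = merged[["movie_id", "title", "overview", "genres", "keywords", "cast", "crew"]]
merged = merged.dropna()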
B. Building Emotion Detector

This algorithm's job is to deduce the user's emotional state from a palette of three colors. To achieve that, it uses the following reasoning to examine the color sequence: if the user selects three colors and at least two of them have the same emotional connotation, then that feeling is the one they are experiencing at the moment. If the user selects the hues yellow, blue, and green or yellow, blue, and black, for instance, they are experiencing elation. An alternative scenario exists where the first hue chosen represents happiness, the second hue signifies love, and the third represents either sadness or wrath (a negative emotion). The user's current emotional state is "joy-love" in this case. If, on the other hand, two of the colors signify melancholy and one suggests fury, then sadness and anger represent the user's current emotional state. If a person selects the positive feelings of "joy-love", for instance, the result will be a combination of the following colors: yellow, light red, and black. Movie evaluations and associated feelings are recorded in user accounts. All of this data may be found in the User Profiles repository. The user will be asked to fill out a survey during registration to share his thoughts on which kinds of movies best represent certain moods. A user's choice of movie is treated as an implicit 5-star rating.

Love, rage, joy, and sadness are the four categories of user emotions that make up the profile vector. The user's preferred genres and titles of films to view when experiencing an emotion E are subsequently catalogued under each emotion group E. If user u selects films A and B for the emotion "love," film C for "anger," and film D for "joy," then user u's profile vector would look like this: u = {{A: 5, B: 5}, {C: 5}, {D: 5}, {}}. To provide feedback on the system's film suggestions, the user can rate the degree to which a film is in line with his present emotion. The system may suggest film F; nevertheless, the user may not agree that he would like to see this film while he is in love, and so he may give it a rating of 1 out of 5. This transforms u's profile vector into: u = {{A: 5, B: 5, F: 1}, {C: 5}, {D: 5}, {}}.

The colors used in this project are black, white, red, yellow, and blue. The user is required to select three colors among the colors mentioned. Each color is assigned a value of 1 or 0; negative and positive emotions are represented by the numbers 1 and 0. If the user selects black, white, and blue, the mapped values are 0, 1, and 1. The resulting emotion is positive if the mapped value of positive emotions exceeds the assigned value of negative emotions. Otherwise, the outcome would be negative.

C. Training Model

Collaborative filtering and content-based recommendation cascade in the recommendation algorithm. Collaborative filtering: below is the explanation of how our collaborative filtering approach is used in recommendation. The executive phases of the CF algorithm, for a database U of user profiles, a target user u, a movie m, and an emotion e, are:

Step 1: Get u's profile vector from the User Profiles database U; the preferences stored there are considered. For the target user u, his profile vector is u = {{A: 5, B: 5, F: 1}, {C: 5}, {D: 5}, {}}.

Step 2: Find users who have rated at least one movie in common with u. The system will only consider users who share at least one movie-emotion with the target user u, to reduce the computed user-film matrix. For example, let Un be the subcollection of U containing just the movie-emotion-sharing participants, with u1, u2, u3 ∈ Un, where u1 = {{A: 5, F: 2}, {C: 5}, {E: 5}, {I: 5}} and u2 = {{B: 5}, {}, {G: 5}, {H: 4}}.

Users rate the same movie differently. Thus, no algorithm can be more accurate than the variance in user ratings for the same item. Users rarely detect recommender system accuracy changes. Some recommender systems give users accurate but pointless recommendations. Most movie fans have seen Lord of the Rings; as a result, recommending it to users again is worthless. Accuracy, mean absolute error, and a novelty factor should be used to improve evaluation findings. The mean absolute error is defined as the average absolute difference between a predicted rating and the actual rating provided by the user. Thus, it may assess how well the recommender system predicts user ratings. Precision is the ratio of well-recommended items to rated recommendations. Well-recommended movies are new releases with a minimum 4/5 rating. Our tests determine how well our system's recommended products match user needs and preferences. Emotion detector precision is another goal.

D. Comparison of Face Emotion and Color Emotion

Six user emotions are taken into account in this project: anger, sadness, happiness, disgust, surprise, and fear; these emotions are recognized from the user's face. Angry, fearful, sad, and disgusted come under negative emotion. Surprised and happy come under positive emotion. If the face-detected emotion is negative and the three-color choice produces a positive emotion, then the movies which result in positive emotions for the user will be recommended. The dot operator is used to compare the face emotion and the color emotion, as sketched below.
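One possible reading of the dot-operator comparison described in section D, with the greater-than-zero threshold mentioned later in the paper; the +1/-1 encoding of the two signals is an assumption made for illustration.

POSITIVE_FACE = {"happy", "surprised"}

def valence(label):
    # Map a face label, or a colour-based label ('pos'/'neg'), to +1 or -1.
    return 1 if label in POSITIVE_FACE or label == "pos" else -1

def emotions_agree(face_emotion, color_emotion):
    # The dot product of the two valences is > 0 only when both signals point
    # the same way; the cell in question is then set to 1, otherwise 0.
    dot = valence(face_emotion) * valence(color_emotion)
    return 1 if dot > 0 else 0

# Example: emotions_agree("happy", "pos") returns 1; emotions_agree("sad", "pos") returns 0.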
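A small sketch of the profile-vector bookkeeping in Steps 1 and 2 of the collaborative filtering phase above, using the same toy users; storing each vector as a dictionary keyed by emotion is an assumption about the representation.

EMOTIONS = ["love", "anger", "joy", "sadness"]

u  = {"love": {"A": 5, "B": 5, "F": 1}, "anger": {"C": 5}, "joy": {"D": 5}, "sadness": {}}
u1 = {"love": {"A": 5, "F": 2}, "anger": {"C": 5}, "joy": {"E": 5}, "sadness": {"I": 5}}
u2 = {"love": {"B": 5}, "anger": {}, "joy": {"G": 5}, "sadness": {"H": 4}}

def shares_movie_emotion(target, other):
    # Step 2: keep only users who rated at least one movie under the same emotion.
    return any(set(target[e]) & set(other[e]) for e in EMOTIONS)

neighbours = [v for v in (u1, u2) if shares_movie_emotion(u, v)]
# u1 shares movies A and C with u, and u2 shares movie B, so both are kept.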
E. Structure of the Developed Framework

Fig 2: Development of the New Methodology

The primary goal is to create a website where a user can get movie recommendations based on their emotions using color psychology and facial gestures. Color psychology is an effective way of detecting a user's emotions through color. The TMDB movie dataset is used to train and test a model to detect the appropriate emotions. Any missing values, errors, or outliers must be corrected. To accomplish this, a preprocessing technique must be used to improve accuracy. The following step is to visualize the data.

The hybrid technique is then used to extract features in the following step. The two techniques used in this project to accomplish this are content-based filtering and social filtering. The collaborative filtering technique deals with the similarities of features between users and recommends personalized movies, whereas content-based filtering is all about filtering on a movie's likes, dislikes, or ratings. The user profile vectors, built using a matrix system, and the emotions of a user will be extracted with the help of these two techniques.

As a result, a model is developed to recommend movies based on emotions. The user will create his profile and choose any three distinct colors here. Following that, the user must complete a survey. Every color represents a different emotion (from color psychology). Based on the survey results, the model will recommend movies based on the emotions expressed by the user through the use of colors and facial emotion.

Fig 3: Detection of Emotions

If the result of the dot product is greater than 0, then the value 1 is given to the cell in question; otherwise, the value 0 is used.

IV. RESULTS AND DISCUSSION

In this paper, a recommendation system is designed to help the user find the best movies to watch based on the genres the user likes. For the simple reason that viewers either loved or hated the movies, our system only considered ratings of 1 and 5. This approach provides far superior recommendations to consumers since it lets them comprehend the connection between their feelings and the suggested content. When a user gives a film's genre a high rating, similar film genres are suggested. What makes this system unique is that it takes into account the user's feelings and uses that information to make informed product recommendations. From these findings, it can be inferred that using more dense data will improve the recommender's performance and that including additional genres will allow us to provide more targeted suggestions.

Fig 4: Films Referred According to Emotions

V. CONCLUSION

The study's findings show how emotion-based movie recommendation algorithms might improve user interaction and engagement with movie streaming services. The system can improve user satisfaction and loyalty by making personalised recommendations that are in line with users' emotional states and preferences. The experience of watching a movie is heavily influenced by emotions, and neglecting them might result in lower user engagement. This would ultimately promote the business growth of these services. Future studies in this field might examine the usefulness of emotion-based movie recommendation systems in various cultural and demographic contexts, as well as the feasibility of combining other user data (such as social media usage and browsing history) into the recommendation algorithm.
4. N. G. Meshram and A. P. Bhagat, "Connotative features based affective movie recommendation system", International Conference on Information Communication and Embedded Systems (ICICES 2014), pp. 1-7, 2014.
5. H. Cao and J. Kang, "Study on Improvement of Recommendation Algorithm Based on Emotional Polarity Classification", 5th International Conference on Computer and Communication Systems (ICCCS), pp. 182-186, 2020, doi: 10.1109/ICCCS49078.2020.9118414.
6. S. Benini, L. Canini, and R. Leonardi, "Affective Recommendation of Movies Based on Selected Connotative Features", IEEE Transactions on Circuits and Systems for Video Technology, vol. 23, no. 4, pp. 636-647, April 2013.
7. S. Chauhan, R. Mangrola and D. Viji, "Analysis of Intelligent movie recommender system from facial expression", 5th International Conference on Computing Methodologies and Communication, pp. 1454-1461, 2021.
8. N. Balaganesh, K. Muneeswaran and N. Sivakumar, "Feature selection for recommendation of movies", Global Conference on Communication Technologies (GCCT), pp. 250-255, 2015.
9. Mayan J. A., Arifa S. and Pavithra R., "Semantic based multi lexical ranking technique for an effective search in protected cloud", 2016 International Conference on Control, Instrumentation, Communication and Computational Technologies, pp. 570-576, 2016.
10. M. Gupta, B. Nithin, S. V. Amirishetty, and H. Kurakula, "Mind Frame Music and Movie Recommendations to uplift the current mood using Deep Learning", International Conference on System, Computation, Automation and Networking, pp. 1-6, 2021.
11. M. B. Mariappan, M. Suk and B. Prabhakaran, "FaceFetch: A User Emotion Driven Multimedia Content Recommendation System Based on Facial Expression Recognition", IEEE International Symposium on Multimedia, pp. 84-87, 2012, doi: 10.1109/ISM.2012.24.
12. Albert Mayan J., Karthikeyan S., Nikhil C., Bharat M. and Padmavathy J., "Facial attendance system technology using Microsoft Cognitive Services", International Journal of Engineering Systems Modelling and Simulation, vol. 12, nos. 2/3, pp. 180-187, 2021.
13. Mary Posonia A., S. Vigneshwari, Albert Mayan J. and D. Jamunarani, "Service Direct: Platform that Incorporates Service Providers and Consumers Directly", International Journal of Engineering and Advanced Technology (IJEAT), vol. 8, no. 6, 2019.