CCTV Project
DEGREE OF BACHELOR OF ENGINEERING (COMPUTER ENGINEERING)
SUBMITTED BY
GAURI KADLAG B190554234
SHREYA DESHMUKH
SUMEDH JOSHI
“Novel
Machine Learning-Based Approach for
Real-Time Suspicious Activity Detection in CCTV
Footage”
Submitted by
GAURI KADLAG B190554234
SHREYA DESHMUKH
SUMEDH JOSHI
are bonafide students of this institute, and the work has been carried out by them under the
supervision of Mr. Satyajit Sirsath. It is approved for the partial fulfillment of the requirements
of Savitribai Phule Pune University for the award of the degree of Bachelor of Engineering
(Computer Engineering).
Place : Pune
Date :
ACKNOWLEDGEMENT
We express our sincere thanks to our guide, Mr. Satyajit Sirsath, for his constant
encouragement and support throughout our project, especially for the useful suggestions given during
the course of the project and for having laid down the foundation for the success of this work.
We would also like to thank our Project Coordinator, Prof. , for her assistance, genuine support
and guidance from the early stages of the project. We would like to thank
Prof. Dr. Saurabh Saoji, Head of the Computer Department, for her unwavering support during the
entire course of this project work. We are very grateful to our Director, Prof. Dr. , for providing us with
an environment in which to complete our project successfully. We also thank all the staff members of our college
and the technicians for their help in making this project a success.
We also thank all the web committees for enriching us with their immense knowledge. Finally, we
take this opportunity to extend our deep appreciation to our family and friends for all that they meant
to us during the crucial stages of the completion of our project.
GAURI KADLAG
SHREYA DESHMUKH
SUMEDH JOSHI
ABSTRACT
Predicting the placement of a person's body parts or joints from an image or a video underpins the
recognition of suspicious activity. In this study, neural networks are used to identify suspicious human
activity in live CCTV footage.
In order to prevent terrorism, theft, accidents, illegal parking, vandalism, fighting, chain snatching,
crime and other suspicious activities, human activities can be monitored in sensitive and public areas
such as bus stations, railway stations, airports, banks, shopping malls, schools and colleges, parking
lots and roads through visual surveillance. Since it is exceedingly challenging to monitor public
spaces continuously, it is necessary to install intelligent video surveillance that can track people's
movements in real time, classify them as routine or exceptional, and raise alerts. The vast majority
of existing research concentrates on images rather than videos.
Keywords—
TABLE OF CONTENTS
01 Introduction
1.1 Overview
1.2 Motivation
1.4 Objectives
1.6 Limitations
02 Literature Survey
2 Comparative Study
04 System Design
4.1 System Architecture
05 Project Plan
5.4.2 Management
06 Project Implementation
07 Software Testing
8.1 Outcomes
09 Conclusions
9.1 Conclusions
9.3 Applications
INTRODUCTION
Suspicious human activity recognition from surveillance video is an active research area of image
processing and computer vision. Through visual surveillance, human activities can be
monitored in sensitive and public areas such as bus stations, railway stations, airports, banks,
shopping malls, schools and colleges, parking lots and roads to prevent terrorism, theft, accidents,
illegal parking, vandalism, fighting, chain snatching, crime and other suspicious activities. It is
very difficult to watch public places continuously; therefore, an intelligent video surveillance system is
required that can monitor human activities in real time, categorize them as usual or
unusual, and generate an alert. The recent decade has witnessed a good number of
publications in the field of visual surveillance for recognizing abnormal activities.
1.2 MOTIVATION
Human suspicious-activity recognition is one of the main problems in computer vision and has
been researched for more than 15 years. The sheer number of applications that can benefit from
activity detection makes it crucial.
Human pose estimation, for instance, is utilised in a variety of applications, such as
marker-less motion capture, advanced human-computer interaction, video surveillance,
animal tracking and behaviour analysis, and sign language recognition.
Low-cost depth sensors have drawbacks, including being restricted to indoor
use, and it is challenging to infer human poses from depth images because of
their low resolution and noisy depth information.
We therefore intend to use neural networks to address these issues. The detection of
suspicious human activity in surveillance footage is an active field of study in image
processing and computer vision.
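As a rough illustration of this direction, the sketch below scores individual CCTV frames with a convolutional neural network. It is a minimal sketch only: it assumes TensorFlow and OpenCV are available, uses a pretrained MobileNetV2 backbone purely as a feature extractor, and attaches an untrained binary (normal/suspicious) head. The file name footage.mp4, the alert threshold and the classifier head are illustrative placeholders, not the project's actual model.

```python
# Minimal sketch: frame-level scoring of CCTV footage with a CNN backbone.
# The binary head below is untrained and would need to be fitted on
# labelled "normal" vs "suspicious" frames before the scores mean anything.
import cv2
import numpy as np
import tensorflow as tf

IMG_SIZE = 224

# Pretrained MobileNetV2 used purely as a feature extractor.
backbone = tf.keras.applications.MobileNetV2(
    weights="imagenet", include_top=False, pooling="avg",
    input_shape=(IMG_SIZE, IMG_SIZE, 3))
backbone.trainable = False

model = tf.keras.Sequential([
    backbone,
    tf.keras.layers.Dense(1, activation="sigmoid"),  # suspicious-activity score
])

cap = cv2.VideoCapture("footage.mp4")  # placeholder path; use 0 for a live camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    rgb = cv2.cvtColor(cv2.resize(frame, (IMG_SIZE, IMG_SIZE)), cv2.COLOR_BGR2RGB)
    batch = tf.keras.applications.mobilenet_v2.preprocess_input(
        rgb.astype(np.float32))[np.newaxis, ...]
    score = float(model.predict(batch, verbose=0)[0][0])
    if score > 0.8:  # illustrative alert threshold
        print("ALERT: possible suspicious activity, score =", round(score, 2))
cap.release()
```

In a full system, the per-frame scores would typically be smoothed over time or replaced by a sequence model before raising alerts.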
1.4 OBJECTIVES
Probing trends in the stock market and the factors affecting stock prices.
Supporting sentiment analysis based on results derived from Twitter and Reddit.
Developing a feature to enable mock trading within the system.
Deriving productive results to filter stocks based on fundamental and technical analysis.
Developing a backtesting engine to ensure the correctness of a strategy.
Developing a valuation-determination module for individual stocks, based on several financial
metrics, to determine whether a stock is undervalued, overvalued or fairly valued.
Developing a rating system for stocks to indicate their fundamental as well as their
technical health.
Probing the available tools and techniques for data mining and selecting those that are an
appropriate fit to arrive at a conclusive result.
The system will have provision for mock trading for the users.
The system can display sentiment analysis results derived from Twitter analysis.
The system will have the ability to filter potential SIP candidates based on results derived
from fundamental analysis.
The system will have provision for the users to backtest the filtered SIP candidates to check
the performance of the system.
Data Availability:
The availability and quality of data is one of the major problems in any financial evaluation effort.
Particularly when dealing with multivariate analysis, which requires data from many sources,
access to precise, trustworthy and thorough data can be constrained. The correctness of results can
be impacted by data that is biased, incomplete or unreliable.
Financial data may be noisy, have missing values, or contain outliers. Cleaning and preprocessing
the data to ensure its reliability and quality can take a lot of time and effort.
Financial markets are prone to a variety of external influences, such as the state of the economy,
geopolitical developments and investor sentiment. Because of the complex nature of these
interactions, it may be challenging to capture all the pertinent variables and components precisely,
which could result in errors in the analysis.
LITERATURE SURVEY
1. Stock Price Prediction using Machine Learning and Sentiment Analysis
In this study, they have analyzed and compared the accuracy of three algorithms, LSTM,
ARIMA and Linear Regression, to predict stock prices. They have used Tweepy, a Python
library, to access the Twitter API and perform sentiment analysis of tweets. The app forecasts stock
prices for the subsequent seven days for any stock listed under NASDAQ or NSE. The
sentiment analysis of tweets, combined with the predicted prices, recommends to the user whether
to buy or sell a particular stock.
2. Predicting Stock Closing Price After COVID-19 Based on Sentiment Analysis and
LSTM: Christopher Chou, Junho Park, Eric Chou, IAEAC 2021 (ISSN 2689-6621)
In this study, they focus on LSTM, GloVe, StockTwits, sentiment analysis, deep learning and Twitter.
The methods used in this paper are A. Data Collection, B. Data Pre-processing, C. Sentiment
Analysis Based on LSTM, D. Price Prediction with LSTM and Attention Mechanism, and E.
Evaluation Metrics. The reported accuracy is 74%, evaluated using recall and precision.
The LSTM based on sentiment analysis has the highest accuracy in predicting the stock
closing price of AAPL across the year 2020.
3. Sentiment Stock Based Prediction: B. L. Pooja, Suvarna Kanakaraddi, Meenaxi M. Raikar,
2018 International Conference on Computational Techniques, Electronics and Mechanical
Systems (CTEMS)
In this paper, they focus on Prophet, prediction, investment, sentiment analysis and linear
regression. The process for developing the model was 1) Data Pre-processing, 2) Data Transformation,
3) Data Cleaning, 4) Data Integration, 5) Feature Selection and Generation, 6) Support Vector
Machine (SVM) classification, and 7) Simple Linear Regression for model learning. In this paper,
they analyze the daily stock prices of four
companies and classify the prices as positive or negative. The accuracy for the Apple Company is
81.3488%, for Google 98.2878%, for Microsoft 98.2186% and for Amazon 98.2186%.
4. A Novel Twitter Sentiment Analysis Model with Baseline Correlation for Financial
Market Prediction with Improved Efficiency: Xinyi Guo, Jinfeng Li, 2019 Sixth
International Conference on Social Networks Analysis, Management and Security
(SNAMS)
This paper studies Twitter sentiment, financial prediction, closed-end fund discounts, lexicon-based
classification and big data analytics. In this paper, they build a Twitter sentiment
analysis solution targeting stock market prediction and conduct a comparative study against
traditional investor sentiment analysis (ISA) methods. The methodology comprises 1) the Closed-end Fund
Discount (Premium) method, 2) Twitter sentiment analysis, and 3) machine learning and lexicon-based
classification. The Twitter sentiment analysis model setup covers 1) Data Sources
and Collection, 2) Sentiment Model and Data Analytics, and 3) Baseline Correlation for Decoupling
Historical. In this paper, the Twitter sentiment analysis model informs fast decision-making
in the FTSE 100 stock market with decent prediction accuracy. The accuracy reported in this paper is
67.22% under a 9th-order polynomial regression fit.
5. Stock Price Prediction Using News Sentiment Analysis: Saloni Mohan, Sahitya
Mullapudi, Sudheer Sammeta, 2019 IEEE Fifth International Conference on Big
Data Computing Service and Applications (BigDataService).
In this paper, they work on stock market prediction, cloud, big data, machine learning and
regression. They improve the accuracy of forecasts by collecting a significant amount of time-series
data and using deep learning models. For the evaluation methodology they use the Mean
Absolute Percentage Error (MAPE). The methodologies used are 1) ARIMA (autoregressive (AR),
integrated (I) and moving average (MA) models), 2) Facebook Prophet, 3) Recurrent Neural
Network (RNN) and 4) Long Short-Term Memory (LSTM). They achieved good results with RNN and LSTM models.
8. Stock Price Prediction using MLR on Sentiments and Fundamental Profile: Akshat
Goe, 2021 12th International Conference on Computing Communication and
Networking Technologies (ICCCNT)
This paper outlines a new approach to the long-standing problem of stock price prediction by
incorporating sentiment analysis and analysis of a company's fundamental financial profile. The
methodology uses time-series analysis and applies ML techniques to the data to obtain the stock price.
The predicted output was compared with the actual output, and the root mean square percentage
error and mean absolute percentage error were computed.
9. Stock market analysis using candlestick regression and market trend prediction
(CKRM): K. Vijayakumar, 2020, Springer-Verlag GmbH Germany, part of Springer
Nature
The paper on stock market analysis using candlestick regression and market trend prediction
(CKRM) uses a K-NN regression machine learning model for market trend prediction. The
accuracy obtained for K-NN regression was in the range of 90-96%, while linear
regression achieved an accuracy of 80-85% and SVM (Support Vector Machine) achieved
60-80%.
10. Efficient predictability of stock return volatility: The role of stock market implied
volatility: Zhifeng Dai, Huiting Zhou, Fenghua Wen, Shaoyi He, 2020, North
American Journal of Economics and Finance 52
In this paper, summary statistics of realized volatility are given. Predictors such as the S&P 500,
NIKKEI 225, DAX 30, CAC 40, FTSE 100, WTI prices and BRT are taken into consideration.
They state that stock market implied volatility has significant predictability for return volatility.
11. Neural networks and arbitrage in the VIX: A deep learning approach for the VIX:
Joerg Osterrieder, Daniel Kucharczyk, Silas Rudolf, Daniel Wittwer, Digital
Finance (2020)
The study reports prediction of the VIX with an accuracy of 61.2% using a single LSTM layer.
The LSTM network was trained on SPX option quote data to predict VIX values.
Table 2: Comparative Study

Sr. No. | Paper Title | Methodology | Advantages | Limitations
1 | Stock Price Prediction using Machine Learning and Sentiment Analysis | Analyzed and compared the accuracy of three algorithms (LSTM, ARIMA and Linear Regression) to predict stock prices | Paper provides 67% accuracy | Data availability, limited timeframe
2 | Stock Price Prediction Using News Sentiment Analysis | Support Vector Machines (SVM), Gradient Boosting Methods, Convolutional Neural Networks (CNN) | Enhanced information incorporation, real-time decision-making | Data quality and accuracy, sentiment analysis limitations
3 | An Entropy-based Evaluation for Sentiment Analysis of Stock Market Prices using Twitter Data | Lexicon-based approaches, Support Vector Machines (SVM), Naive Bayes | Objective evaluation metric, utilization of Twitter data | Generalizability of Twitter data, dependency on sentiment analysis
Assumptions: Financial data for the tickers is accurate, the data-gathering libraries remain
open source, and the tweets fetched from the Twitter API are genuine and not generated
by bots.
1. Valuation Determiner:
Retrieve real-time financial data of companies.
Perform valuation calculations based on various methods (e.g., discounted cash flow, price-
to-earnings ratio).
Provide valuation reports and recommendations.
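A minimal sketch of such a valuation check is given below. It assumes the yfinance package as the data source; the trailing P/E rule and the thresholds are illustrative assumptions only, not the module's actual valuation method (which, as noted above, may also use discounted cash flow).

```python
# Illustrative valuation check using the trailing P/E ratio from Yahoo Finance data.
# Assumes the yfinance package; field names in .info can vary by ticker.
import yfinance as yf

def pe_based_view(ticker: str, fair_pe: float = 20.0) -> str:
    info = yf.Ticker(ticker).info
    pe = info.get("trailingPE")
    if pe is None:
        return f"{ticker}: P/E unavailable"
    if pe < 0.8 * fair_pe:
        verdict = "possibly undervalued"
    elif pe > 1.2 * fair_pe:
        verdict = "possibly overvalued"
    else:
        verdict = "fairly valued"
    return f"{ticker}: trailing P/E = {pe:.2f} -> {verdict}"

if __name__ == "__main__":
    print(pe_based_view("AAPL"))  # example ticker
```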
2. Mock Trader:
Simulate trading scenarios using historical data.
Allow users to practice trading strategies without real money.
Provide performance metrics and feedback on trades.
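A bare-bones paper-trading ledger along these lines is sketched below; the class name, starting balance and example trades are hypothetical, and no real orders are placed.

```python
# Bare-bones mock-trading ledger: simulated cash and holdings only, no real orders.
class MockTrader:
    def __init__(self, cash: float = 100_000.0):
        self.cash = cash
        self.holdings = {}   # ticker -> quantity
        self.trades = []     # simple trade log

    def buy(self, ticker: str, qty: int, price: float) -> None:
        cost = qty * price
        if cost > self.cash:
            raise ValueError("insufficient mock balance")
        self.cash -= cost
        self.holdings[ticker] = self.holdings.get(ticker, 0) + qty
        self.trades.append(("BUY", ticker, qty, price))

    def sell(self, ticker: str, qty: int, price: float) -> None:
        if self.holdings.get(ticker, 0) < qty:
            raise ValueError("not enough mock holdings")
        self.cash += qty * price
        self.holdings[ticker] -= qty
        self.trades.append(("SELL", ticker, qty, price))

    def value(self, prices: dict) -> float:
        # Mark-to-market portfolio value given current prices.
        return self.cash + sum(q * prices.get(t, 0.0) for t, q in self.holdings.items())

trader = MockTrader()
trader.buy("INFY", 10, 1500.0)
trader.sell("INFY", 5, 1550.0)
print(trader.value({"INFY": 1550.0}))
```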
3. Sentiment Analyzer:
Analyze news articles and social media data for sentiment related to specific stocks or
cryptocurrencies.
Generate sentiment scores or indicators for decision-making.
Provide real-time sentiment updates.
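One way to produce such sentiment scores is sketched below using NLTK's VADER lexicon; the library choice and the sample texts are assumptions made for illustration, and the project may use a different model (e.g., an LSTM, as discussed in the literature survey).

```python
# Illustrative sentiment scoring of stock-related texts with NLTK's VADER lexicon.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download
sia = SentimentIntensityAnalyzer()

texts = [
    "Great quarterly results, the stock looks strong",   # sample texts, not real tweets
    "Massive losses reported, investors are worried",
]
for text in texts:
    compound = sia.polarity_scores(text)["compound"]     # score in [-1, 1]
    label = "positive" if compound > 0.05 else "negative" if compound < -0.05 else "neutral"
    print(f"{compound:+.2f} ({label}): {text}")
```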
4. Pattern Analyzer:
Analyze historical price patterns of stocks or cryptocurrencies.
Identify common patterns such as head and shoulders, double tops, or ascending triangles.
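As a simplified illustration, a double-top can be approximated by locating two local price maxima of similar height with a dip between them, as in the SciPy-based sketch below; the tolerance values and the toy price series are arbitrary, and real chart-pattern detection is considerably more involved.

```python
# Naive double-top check: two local maxima of similar height with a dip between them.
import numpy as np
from scipy.signal import argrelextrema

def has_double_top(close: np.ndarray, order: int = 5, tol: float = 0.02) -> bool:
    peaks = argrelextrema(close, np.greater, order=order)[0]
    for i in range(len(peaks) - 1):
        p1, p2 = peaks[i], peaks[i + 1]
        similar = abs(close[p1] - close[p2]) / close[p1] < tol
        dipped = close[p1:p2].min() < 0.97 * min(close[p1], close[p2])
        if similar and dipped:
            return True
    return False

prices = np.array([100, 104, 110, 105, 101, 104, 109.5, 103, 98], dtype=float)
print(has_double_top(prices, order=2))  # True for this toy series
```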
5. Indices Health:
Monitor the overall health and performance of stock market indices.
Track key indicators like market breadth, sector performance, and index composition.
Provide insights into market trends and potential investment opportunities.
6. SWOT Analyzer:
Analyze the strengths, weaknesses, opportunities, and threats of companies or
cryptocurrencies.
Evaluate factors such as financials, competitive landscape, market conditions, and
regulatory changes.
Generate SWOT reports and recommendations.
7. Fundamental Scans:
Conduct fundamental analysis of companies.
Evaluate financial statements, earnings reports, and key performance indicators.
Identify undervalued or overvalued stocks and provide analysis reports.
9. Strategy Backtester:
Allow users to test their trading strategies using historical data.
Provide performance metrics and comparison against benchmark indices.
Identify strengths and weaknesses of strategies.
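The sketch below illustrates the idea with a simple moving-average crossover strategy backtested on daily closes fetched via yfinance; both the strategy and the reported metrics are assumptions for illustration, not the hardcoded strategy used by the module.

```python
# Toy backtest of a 20/50-day SMA crossover on daily closes (yfinance assumed).
import yfinance as yf

df = yf.Ticker("AAPL").history(period="2y")[["Close"]].copy()
df["fast"] = df["Close"].rolling(20).mean()
df["slow"] = df["Close"].rolling(50).mean()
# Hold the stock (position = 1) while the fast SMA is above the slow SMA.
df["position"] = (df["fast"] > df["slow"]).astype(int).shift(1).fillna(0)

daily_ret = df["Close"].pct_change().fillna(0)
strat_ret = daily_ret * df["position"]
equity = (1 + strat_ret).cumprod()

n_trades = int(df["position"].diff().abs().sum())
total_return = equity.iloc[-1] - 1
max_drawdown = (equity / equity.cummax() - 1).min()

print(f"trades: {n_trades}, total return: {total_return:.1%}, max drawdown: {max_drawdown:.1%}")
```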
3.3.1.1. Yahoo Finance
The Yahoo Finance app offers hours of live, daily market coverage, along with professional
analysis and current market information. Investors, financial experts and corporate executives who
take their money seriously use it because it is insight-driven.
3.3.1.2. Twitter API
Information is obtained using the Twitter public API. An Application Programming Interface, or API, is a
means for software to connect to the Twitter platform (as opposed to the Twitter website, which is
how people connect to Twitter). Although it supports a wide range of Twitter interaction tasks, the
API functions most important for obtaining a Twitter dataset include the following (a short usage
example follows the list):
Retrieving tweets from a user timeline, or the collection of tweets that an account
has posted;
Tweet searching
Filtering real-time tweets (tweets as they are posted and are being processed by the
Twitter platform).
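A minimal example of the tweet-searching task with the Tweepy library (v2-style client) is sketched below; the bearer token is a placeholder credential and the query string is an illustrative assumption.

```python
# Minimal recent-tweet search via the Twitter API using Tweepy's v2 client.
import tweepy

client = tweepy.Client(bearer_token="YOUR_BEARER_TOKEN")  # placeholder credential

# Search recent English tweets mentioning a ticker, excluding retweets.
response = client.search_recent_tweets(
    query="AAPL lang:en -is:retweet", max_results=10)

for tweet in response.data or []:
    print(tweet.id, tweet.text[:80])
```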
3.3.1.3. NSEPy
• The library NSEpy is used for extracting both historical and present-day data. The API of this
library has been designed to be as simple as possible.
• The primary objective of NSEpy is to provide data series that can be used and analyzed with the
SciPy stack.
• The Technical Analysis library, often known as TA-Lib, which contains 200 indicators including
the MACD and RSI, can easily be used together with NSEpy.
• For automated or semi-automatic algorithmic trading systems or backtesting systems for Indian
markets, this library serves as an essential building block.
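A short usage sketch is given below: it pulls historical NSE data with NSEpy and computes indicators with TA-Lib. The symbol and date range are arbitrary examples.

```python
# Fetch historical NSE data with NSEpy and compute TA-Lib indicators on it.
from datetime import date

import talib
from nsepy import get_history

# Daily OHLCV for an example symbol and date range (both arbitrary).
df = get_history(symbol="SBIN", start=date(2022, 1, 1), end=date(2022, 12, 31))

close = df["Close"].values.astype(float)
df["RSI_14"] = talib.RSI(close, timeperiod=14)   # Relative Strength Index
macd, macd_signal, _ = talib.MACD(close)         # MACD line and signal line

print(df[["Close", "RSI_14"]].tail())
```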
3.3.2.1. Visual Studio Code
Visual Studio Code, a lightweight code editor, supports development tasks such as task execution,
debugging and version control. It aims to provide only the tools a developer requires for a quick
code-build-debug cycle and leaves more complex workflows to fuller-featured IDEs,
such as Visual Studio IDE.
3.3.2.2.GitHub
GitHub is a commercial company that offers a hosting service for online Git repositories. It
substantially facilitates the use of Git for version control and team collaboration by individuals
and teams. GitHub's user-friendly design makes Git accessible even to non-programmers.
Generally speaking, using Git without GitHub requires a little more technical
know-how and command-line knowledge.
1. Performance:
The app should respond quickly to user interactions and provide real-time data updates.
Processing large datasets and complex calculations should be efficient and timely.
2. Reliability:
The app should be reliable and available for use whenever users require it.
It should handle errors gracefully and provide appropriate error messages.
3. Usability:
4. Security:
User data, including login credentials and trading strategies, should be securely stored and
protected.
Communication between the app and external APIs should be encrypted.
5. Scalability:
The app should be able to handle a growing user base and increased data volume.
It should be scalable in terms of computational resources and storage capacity.
6. Compatibility:
The app should be compatible with various operating systems and devices (e.g., web,
mobile).
7. Maintainability:
The app should be designed and implemented using modular and well-documented code.
It should be easy to maintain and extend with new features or updates.
8. Integration:
The app should integrate with external data providers or APIs to retrieve financial data and
news feeds.
It should allow users to import/export data or connect with their brokerage accounts.
9. Accessibility:
The app should be accessible to users with disabilities, complying with accessibility
standards such as WCAG (Web Content Accessibility Guidelines).
10. Data Privacy:
The app should comply with relevant data privacy regulations (e.g., GDPR) and protect
user data.
Clear privacy policies should be provided, and user consent should be obtained for data
processing.
A file system is a piece of software that manages and organises files on a storage medium.
It controls data storage and retrieval.
The file system is provided as a component of the operating system itself and is therefore
closely tied to the operating system.
The file system offers information on data formats and storage options.
Compactness: Information can be stored in a compact area.
Data Retrieval: Computer-based systems provide enhanced data retrieval methods that
make it easy and effective to extract data from files.
Editing: It is simple to make changes to any data stored as files on a computer.
3.5.2.1. Python
Python served as the language of choice for this project. For many reasons, this was an easy call.
Python has a sizable community supporting it as a language. A quick visit to Stack
Overflow can resolve most issues that arise; Python is among the most frequently used languages
on the site and has readily available answers to most queries.
Python has several strong packages for scientific computing. Packages such as
NumPy, Pandas and SciPy are thoroughly documented and completely free. These
packages drastically reduce the amount of code required to create a given
program, which speeds up iteration.
Python is a forgiving language that allows programs to read almost like pseudocode. This is
useful when the pseudocode provided in tutorials and papers has to be implemented and checked,
a step that is often quite simple in Python. Python does have certain flaws, though. Because Python
is a dynamically typed language, packages rely heavily on duck typing. When a
package method returns something that, for example, resembles an array but is not an array,
this can be frustrating. Additionally, the return type of a method is often not explicitly stated in the
standard Python documentation. This necessitated a great deal of trial-and-error testing that would
not be needed in a statically typed language. This issue makes learning to use an unfamiliar
Python package or library more challenging than it would otherwise be.
The agile software development life cycle (SDLC) model combines iterative and incremental
process models, with a focus on process adaptability and customer satisfaction through rapid
delivery of working software. The product is broken down into small, manageable incremental
builds, which are delivered in iterations. The typical duration of each
iteration is one to three weeks. Cross-functional teams work in parallel on various activities
in each iteration, including:
Planning
Requirements Analysis
Design
Coding
Unit Testing and Acceptance Testing.
A working product is shown to the client and other key stakeholders at the end of each
iteration. According to the Agile model, every project should be handled differently, and existing
processes should be adjusted as needed to better meet the project's demands. In Agile, tasks
are broken down into time boxes.
The Agile mindset emerged early in software development and became popular
over time owing to its flexibility and adaptability.
SYSTEM DESIGN
PROJECT PLAN
We have defined metrics for some of the factors directly involved in the project, called
project metrics, such as cost and time. Process metrics help measure the quality of the
product that is produced. These metrics can be analyzed to provide indicators that guide
management and technical actions.
Identification of risks is one of the key topics covered during routine project status and reporting
meetings. The project team will be able to identify some risks right away, while others may
require more diligence to uncover.
1. Dataset authentication: The data is collected from different open-source platforms. Hence,
the validity and authenticity of the data needs to be monitored.
2. Data Sources: Unreliable data source (in case the library is not maintained by the
collaborators regularly) due to involvement of open source data gathering libraries. This
may lead to inaccurate results.
3. System Failure: The system may fail under unforeseen circumstances.
Risk analysis involves examining how project outcomes and objectives might change due
to the impact of the risk event.
Once the risks are identified, they are analyzed to determine the qualitative and quantitative
impact of each risk on the project so that appropriate steps can be taken to mitigate them.
Risk mitigation is a technique for a data center to prepare for and mitigate the effects of risks. Risk
mitigation, like risk reduction, aims to lessen the negative effects of risks and disasters on business
continuity. Cyber-attacks, weather occurrences, and other causes of physical or virtual harm to a
data center are all potential threats to a corporation. Risks must be revisited at regular intervals for
the team to re-evaluate each risk to see whether new conditions have changed its probability or
impact. Individually owned and managed proactive measures are critical to successful risk
management.
Individual tasks and subtasks have interdependencies based on their sequence. A task network is a
graphic representation of the task flow for a project. Project tasks and their dependencies are noted
in this network.
The project team is established. The team consists of four members, each with distinct
responsibilities.
5.4.2. Management
PROJECT IMPLEMENTATION
Inputs:
1. Stock name
Output:
1. SWOT for the requested stock
Inputs:
1. Cryptocurrency name
Output:
1. List of stocks that are attractive as per their book value.
2. List of stocks that are attractive as per their sales.
3. List of stocks that are attractive as per their earnings.
4. List of stocks trading below the Graham Number (see the sketch below).
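For reference, the Graham Number screen in item 4 compares the current price with sqrt(22.5 × EPS × book value per share). A small sketch is given below, with yfinance as an assumed data source and .info field names that may vary across tickers; the ticker list is an arbitrary example.

```python
# Graham Number screen: flag stocks trading below sqrt(22.5 * EPS * BVPS).
# yfinance is an assumed data source; .info keys can differ across tickers.
import math
import yfinance as yf

def below_graham_number(ticker: str):
    info = yf.Ticker(ticker).info
    eps, bvps, price = info.get("trailingEps"), info.get("bookValue"), info.get("currentPrice")
    if not eps or not bvps or not price or eps <= 0 or bvps <= 0:
        return None                       # Graham Number undefined for non-positive inputs
    graham = math.sqrt(22.5 * eps * bvps)
    return ticker if price < graham else None

candidates = [t for t in ["AAPL", "MSFT", "INFY.NS"] if below_graham_number(t)]
print("Trading below Graham Number:", candidates)
```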
6.1.9. STRATEGY BACKTESTER
For any requested stock, this module now backtests a hardcoded trading strategy and
generates a visually appealing report with information on the number of trades, total
returns, maximum drawdown, and average return.
Inputs:
1. Stock name
Output:
For any requested stock, this module produces a visually appealing plot with green (long)
and red (short) markers as signals. These signals can be used to interpret the likely further
direction of the stock (a plotting sketch follows the output description below).
Inputs:
1. Stock name
2. Number of signals (total number of buy or sell signals that must be plotted)
Output:
1. A visually appealing plot indicating buy and sell signals for the requested stock.
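A plotting sketch along those lines is shown below, using an SMA-crossover rule as a stand-in for the module's actual signal logic; green markers mark long (buy) signals, red markers mark short (sell) signals, and the ticker and signal count are example inputs.

```python
# Plot buy/sell markers on the price series (SMA crossover as a stand-in signal rule).
import matplotlib.pyplot as plt
import yfinance as yf

ticker, n_signals = "AAPL", 6                      # example inputs
df = yf.Ticker(ticker).history(period="2y")[["Close"]].copy()
df["fast"], df["slow"] = df["Close"].rolling(20).mean(), df["Close"].rolling(50).mean()

# +1 where the fast SMA crosses above the slow SMA, -1 where it crosses below.
cross = (df["fast"] > df["slow"]).astype(int).diff()
buys = df[cross == 1].tail(n_signals)
sells = df[cross == -1].tail(n_signals)

plt.plot(df.index, df["Close"], label=f"{ticker} close")
plt.scatter(buys.index, buys["Close"], color="green", marker="^", label="long signal")
plt.scatter(sells.index, sells["Close"], color="red", marker="v", label="short signal")
plt.legend()
plt.show()
```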
6.2.1. Python
The Python programming language was used to implement the backend and many of the
scripts involved in the web application.
6.2.2. Vue.js
The front end was built using Vue.js framework of JavaScript.
6.2.4. GitHub
It's used for storing, tracking, and collaborating on software projects. It makes it
easy for us to share code files and collaborate with fellow developers on open-source
projects.
SOFTWARE TESTING
The purpose of testing is to identify flaws. Testing is the process of looking for defects or
vulnerabilities in a work product. Individual components, subassemblies, assemblies and/or
a completed product can be tested. It is the process of verifying that software fulfils
specifications, meets user expectations, and does not fail in an unacceptable manner. There
are several types of tests to choose from; the following is a list of them.
The black box testing method is applicable to the following levels of software testing:
Integration Testing
System Testing
Acceptance Testing
White-box testing is a method of software testing that tests internal structures or workings
of an application, as opposed to its functionality. In white-box testing an internal perspective of the
system, as well as programming skills, are used to design test cases.
White box testing is testing of a software solution's internal structure, design and coding.
In this type of testing, the code is visible to the tester. It focuses primarily on verifying the flow of
inputs and outputs through the application, improving design and usability, and strengthening security.
White box testing is also known as Clear Box testing, Open Box testing, Structural testing,
Transparent Box testing, Code-Based testing, and Glass Box testing. It is usually performed by
developers.
7.1.3.1.UNIT TESTING
Unit testing is a level of software testing where individual units/ components of a software are
tested. The purpose is to validate that each unit of the software performs as designed. A unit is the
smallest testable part of any software. It usually has one or a few inputs and usually a single
output.
Unit testing is a technique in which individual modules are tested by the developer to
determine whether there are any issues. A unit test is a way of testing a unit, the smallest
piece of code that can be logically isolated in a system.
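For instance, a unit test for a small, pure helper such as a Graham Number calculator could look like the pytest sketch below; the helper is defined inline here purely for illustration, whereas in the project it would live in its own module.

```python
# Illustrative pytest unit tests for a small, pure valuation helper.
import math
import pytest

def graham_number(eps: float, bvps: float) -> float:
    """Return sqrt(22.5 * EPS * book value per share); inputs must be positive."""
    if eps <= 0 or bvps <= 0:
        raise ValueError("EPS and BVPS must be positive")
    return math.sqrt(22.5 * eps * bvps)

def test_graham_number_known_value():
    # sqrt(22.5 * 4 * 10) = sqrt(900) = 30
    assert graham_number(4.0, 10.0) == pytest.approx(30.0)

def test_graham_number_rejects_negative_eps():
    with pytest.raises(ValueError):
        graham_number(-1.0, 10.0)
```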
7.1.3.2.INTEGRATION TESTING
Integration testing is the phase in software testing in which individual software modules are
combined and tested as a group. Integration testing is conducted to evaluate the compliance of a
system or component with specified functional requirements. It occurs after unit testing and before
validation testing.
7.1.3.3.SYSTEM TESTING
System testing is testing conducted on a complete integrated system to evaluate the system's
compliance with its specified requirements. System testing takes, as its input, all of the integrated
components that have passed integration testing.
System testing is a level of testing that validates the complete and fully integrated software
product. The purpose of a system test is to evaluate the end-to-end system specifications. Usually,
the software is only one element of a larger computer-based system.
7.1.3.4.PERFORMANCE TESTING
Performance testing is the process of determining the speed, responsiveness and stability of a
computer, network, software program or device under a workload. Performance testing can involve
quantitative tests done in a lab, or occur in the production environment in limited scenarios. In
software quality assurance, performance testing is in general a testing practice performed to
determine how a system performs in terms of responsiveness and stability under a particular
workload.
RESULTS
Fin-Maestro has been tested extensively to ensure the accuracy and reliability of the financial data
provided to users. The modules have been tested individually and in combination to ensure
seamless integration and a smooth user experience. The application has received positive feedback
from users, who have praised its ease of use and accuracy.
8.2. SCREENSHOTS
CHAPTER 09
CONCLUSION
A full-fledged platform has been developed that will aid retail traders and investors in gaining
insights over both long-term and short-term time frames. Fin-Maestro is a web application that
aims to provide an efficient and intelligent way for market participants to analyze various
parameters of financial instruments. The nine main techniques included in the application have
demonstrated improved accuracy in predicting financial outcomes, indicating that it is indeed
possible to achieve greater accuracy and efficiency in predicting the stock market using these
techniques. In a nutshell, this platform will help retail investors and traders spot good opportunities
in the markets more quickly and efficiently and make profitable trades.
The financial sector is constantly evolving, and there are several factors and parameters that play a
crucial role in its growth and development. The future of the financial sector is likely to be shaped
by various trends and technologies, including:
Artificial Intelligence (AI): AI has already started to transform the financial sector by
automating processes, reducing costs, and improving efficiency. In the future, AI is likely
to play an even more significant role in areas such as fraud detection, risk management, and
customer service.
Data Analytics: The financial sector generates vast amounts of data, and data analytics is
becoming increasingly important in making sense of this data. In the future, data analytics
is likely to play an even more critical role in areas such as risk management, customer
segmentation, and product development.
Description: This integration will ensure that the modules of Fin-Maestro can be accessed
more easily and conveniently by its users with single-line commands via Telegram. For
example, to fetch the valuation of a specific stock, the following command can be
used: /valuation ticker-name.
9.3. APPLICATIONS
This problem can be classified as an NP-complete problem: if an algorithm for it existed in
polynomial time, all problems in NP would be polynomial-time solvable. It satisfies the conditions
of being both in NP and NP-hard. With the exception of the sentiment component, the algorithms
that fall under the category of NP-hard problems cannot be solved exactly in polynomial time, and
this is where we are unable to obtain definitive answers.
[2] Predicting Stock Closing Price After COVID-19 Based on Sentiment Analysis and
LSTM (Christopher Chou, Junho Park, Eric Chou), IAEAC 2021 (ISSN 2689-6621)
[3] Sentiment Stock Based Prediction (B. L. Pooja, Suvarna Kanakaraddi, Meenaxi M. Raikar), 2018
International Conference on Computational Techniques, Electronics and Mechanical Systems
(CTEMS)
[4] A Novel Twitter Sentiment Analysis Model with Baseline Correlation for Financial Market
Prediction with Improved Efficiency (Xinyi Guo, Jinfeng Li), 2019 Sixth International Conference
on Social Networks Analysis, Management and Security (SNAMS)
[5] Stock Price Prediction Using News Sentiment Analysis (Saloni Mohan, Sahitya
Mullapudi, Sudheer Sammeta), 2019 IEEE Fifth International Conference on Big Data Computing
Service and Applications (BigDataService)
[6] An Entropy-based Evaluation for Sentiment Analysis of Stock Market Prices using Twitter Data
(Andreas Kanavos, Gerasimos Vonitsanos), 2020 IEEE
[7] Machine Learning for Stock Prediction Based on Fundamental Analysis (Yuxuan Huang;
Luiz Fernando Capretz; Danny Ho) 2021 IEEE Symposium Series on Computational
Intelligence (SSCI)
[8] Stock Price Prediction using MLR on Sentiments and Fundamental Profile (Akshat Goe),
2021 12th International Conference on Computing Communication and Networking
Technologies (ICCCNT)
[9] Stock market analysis using candlestick regression and market trend prediction (CKRM)
(M. Ananthi, K. Vijayakumar), Springer-Verlag GmbH Germany, part of Springer Nature, 2020
[10] Efficient predictability of stock return volatility: The role of stock market implied
volatility (Zhifeng Dai, Huiting Zhou, Fenghua Wen, Shaoyi He), North American Journal of
Economics and Finance 52
[11] Neural networks and arbitrage in the VIX: A deep learning approach for the VIX (Joerg
Osterrieder, Daniel Kucharczyk, Silas Rudolf, Daniel Wittwer), Digital Finance (2020)
[12] Ticknor, J. L.(2013). A Bayesian regularized artificial neural network for stock
market forecasting. Expert Systems with Applications, 40(October (14)), 5501–5506.
[13] Lo, A. W., Mamaysky, H., & Wang, J. (2000). Foundations of technical
analysis: Computational algorithms, statistical inference, and empirical implementation. The Journal
of Finance, 55(August (4)), 1705–1770
[14] Oppenheimer, H. R., & Schlarbaum, G. G. (1981). Investing with Ben Graham: An Ex Ante test
of the efficient markets hypothesis. Journal of Financial and Quantitative Analysis, 16(September
(3)), 341–360
[15] Metghalchi, M., Chang, Y.-H., & Marcucci, J. (2008). Is the Swedish stock market
efficient? Evidence from some simple trading rules. International Review of Financial Analysis,
17(June (3)), 475–490.
[16] Bahtiar J. Z. , Rosnalini M, Norhayati Y, Beh Hui Sang. (2019) Classify Stock
Market Movement Based on Technical Analysis Indicators Using Logistic Regression, Journal
of Advanced Research in Business and Management Studies 14, Issue 1 (2019) 35-41
[17] Rohit C., Kumkum G. (2008) A Hybrid Machine Learning System for Stock
Market Forecasting, World Academy of Science, Engineering and Technology 15 2008
[18] Teaba W. A. K., Rana M., Wasim A. (2019) Stock Price Prediction using
Technical, Fundamental and News based Approach, 2019 2nd Scientific Conference of Computer
Sciences (SCCS)
[19] Yuxuan H., Luiz F. C., Danny H., (2019) Neural Network Models for Stock Selection Based on
Fundamental Analysis, 32nd Canadian Conference on Electrical & Computer
Engineering, Edmonton, Canada, 2019
[20] Andrea P. R., Simone M., Luca O., Yukun M., Lorenzo M., Erik C., (2018) 18-21 Nov 2018
IEEE Symposium Series on Computational Intelligence (SSCI)
[21] Isaac K. N., Adebayo F. A., Benjamin A. W., (2019) A systematic review of fundamental and
technical analysis of stock market predictions, April 2020 Artificial Intelligence Review 53(7)