Report Latest
1. INTRODUCTION
TradingView: This platform provides charts, analysis, and social networking for
traders and investors. It covers a wide range of financial instruments, including
stocks, cryptocurrencies, and commodities.
Feature engineering follows, where indicators like moving averages, RSI, and MACD
are extracted to enhance model performance. Model selection is a pivotal step, with
choices like LSTM, GRU, Random Forests, and Neural Networks, depending on the
nature of the data. The training and testing phase involves splitting the data,
hyperparameter tuning, and implementing cross-validation techniques for robustness.
Evaluation metrics, including MSE and RMSE, are used to assess model performance.
Once the models are trained, deployment in a production environment and periodic
updates with new data are essential. Continuous monitoring, improvement, and
adaptation to changing market conditions are crucial for sustained success.
Visualization tools like Matplotlib and Plotly aid in presenting predictions and
performance, while documentation ensures transparency and facilitates future
enhancements. Staying informed about the latest advancements in machine learning
and finance is imperative for ongoing model refinement.
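As a hedged illustration of the feature-engineering and evaluation steps described above, the common indicators and the RMSE metric can be sketched in plain NumPy (the function names are our own simplifications; a production pipeline would more likely use pandas or a library such as TA-Lib):

```python
import numpy as np

def sma(prices, window):
    """Simple moving average over a trailing window."""
    return np.convolve(prices, np.ones(window) / window, mode="valid")

def ema(prices, span):
    """Exponential moving average with smoothing factor 2/(span+1)."""
    alpha = 2.0 / (span + 1)
    out = np.empty(len(prices), dtype=float)
    out[0] = prices[0]
    for i in range(1, len(prices)):
        out[i] = alpha * prices[i] + (1 - alpha) * out[i - 1]
    return out

def rsi(prices, period=14):
    """Relative Strength Index over the first `period` moves (scalar sketch)."""
    deltas = np.diff(prices)
    gains = np.where(deltas > 0, deltas, 0.0)
    losses = np.where(deltas < 0, -deltas, 0.0)
    avg_gain = gains[:period].mean()
    avg_loss = losses[:period].mean()
    if avg_loss == 0:
        return 100.0
    return 100.0 - 100.0 / (1.0 + avg_gain / avg_loss)

def macd(prices, fast=12, slow=26):
    """MACD line: fast EMA minus slow EMA."""
    return ema(prices, fast) - ema(prices, slow)

def rmse(y_true, y_pred):
    """Root mean squared error, one of the evaluation metrics named above."""
    return float(np.sqrt(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2)))
```

Here `rsi` returns only a single value for brevity; real pipelines compute it as a rolling series and feed all indicators to the model as columns of a feature matrix.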
Login
Registration
Search Stocks
Display Result
1.5.2.2. Module-Admin#2
Login
Add Data
Authorize Users
1.6. Applicability
Amidst the intricacies of the contemporary financial landscape, the "FinanciAI" project
emerges as a beacon of innovation, poised to tackle the multifaceted challenges individuals
and investors face. The project's ambitions extend to the broad spectrum of financial
assets, encompassing equities and cryptocurrency. It seeks to address the intricate web of
factors that influence financial decision-making, including market trends, historical data,
risk analysis, and future predictions. The overarching goal of the "FinanciAI" project is to
develop a machine learning-based web application that serves as a comprehensive and
reliable financial forecasting tool. In this context, it is crucial to explore the applicability of
the "FinanciAI" system, shedding light on how it promises to revolutionize financial
decision-making for a diverse range of stakeholders.
2. LITERATURE SURVEY
algorithm. The analysis of financial time series aims to correlate data points over
time with a dependent output and, hence, provide a way to predict future values
from historical data points. However, the quasi-immediate information-adaptation
mechanism underlying the Efficient Market Hypothesis (EMH) severely reduces the
signal-to-noise ratio in the financial time series (Fama, 1965), and hence caps from
the outset the forecasting accuracy of any technical analysis algorithm.
The sequential LSTM outperformed all the models with the minimum RMSE (root
mean squared error) score. With its help, the closing stock price of the company can
be predicted for up to one month. The second model is for financial distress
prediction, in which various bagging and boosting techniques were integrated with
various SMOTE techniques. In this research, the Balanced Bagging method with
ADASYN outperformed all the other models with an accuracy of 93%. ADASYN
(Adaptive Synthetic Sampling Method) is a modified version of SMOTE which
performed best with our bagging and boosting models; we used ADASYN to deal
with the class imbalance problem. This empirical research is carried out based on
real-world financial data of 3,476 Chinese companies with over 84 financial and
non-financial features.
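The class-imbalance handling mentioned above can be illustrated with a minimal ADASYN-style oversampling sketch in NumPy. This is only a simplified demonstration of the density-based allocation idea (more synthetic samples for minority points surrounded by majority neighbours); the cited study would more plausibly use a library implementation such as `imblearn.over_sampling.ADASYN`:

```python
import numpy as np

def adasyn_sketch(X_min, X_maj, k=5, seed=0):
    """Generate synthetic minority samples, allocating more of them to
    minority points that have many majority-class nearest neighbours."""
    rng = np.random.default_rng(seed)
    G = len(X_maj) - len(X_min)          # synthetics needed for class balance
    X_all = np.vstack([X_min, X_maj])
    is_maj = np.array([False] * len(X_min) + [True] * len(X_maj))

    ratios = []
    for x in X_min:
        d = np.linalg.norm(X_all - x, axis=1)
        nn = np.argsort(d)[1:k + 1]       # k nearest neighbours, skipping self
        ratios.append(is_maj[nn].mean())  # fraction of majority neighbours
    ratios = np.array(ratios)
    if ratios.sum() == 0:
        weights = np.full(len(X_min), 1.0 / len(X_min))
    else:
        weights = ratios / ratios.sum()

    counts = np.round(weights * G).astype(int)
    synth = []
    for i, gi in enumerate(counts):
        for _ in range(gi):
            # interpolate towards a random minority neighbour
            xj = X_min[rng.integers(len(X_min))]
            lam = rng.random()
            synth.append(X_min[i] + lam * (xj - X_min[i]))
    return np.array(synth) if synth else np.empty((0, X_min.shape[1]))
```

The synthetic samples are then stacked onto the minority class before training each bagged estimator, which is the role ADASYN plays in the Balanced Bagging pipeline described above.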
based Bitcoin price prediction models using Bitcoin blockchain information. More
specifically, we tested the state-of-the-art deep learning models such as deep neural
networks (DNN), long short-term memory (LSTM) models, convolutional neural
networks (CNN), deep residual networks (ResNet), and their combinations. We
addressed both regression and classification problems, where the former predicts
the future Bitcoin price and the latter predicts whether the future price will go up
or down. For regression problems, LSTM slightly outperformed the other
models, whereas for classification problems, DNN slightly outperformed the other
models unlike the previous literature on Bitcoin price prediction. Although CNN and
ResNet are known to be very effective in many applications, including sequence data
analysis, their performance was not particularly good for Bitcoin price prediction.
Overall, there was no clear winner and the performance of all deep learning models
studied in this work was comparable to each other. In addition, although deep
learning models seem to predict the Bitcoin price very well in terms of the
regression analysis, it is still premature to solely use such models for algorithmic
Bitcoin trading.
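The distinction between the two problem framings described above can be sketched as follows. This is an illustrative helper of our own (not the cited paper's actual pipeline): from one price series it builds a regression target (the future price) and a classification target (whether the price goes up):

```python
import numpy as np

def make_targets(prices, horizon=1):
    """From a price series, build a regression target (price `horizon` steps
    ahead) and a classification target (1 if the price rises, else 0)."""
    X = prices[:-horizon]                                        # current price as feature
    y_reg = prices[horizon:]                                     # future price (regression)
    y_cls = (prices[horizon:] > prices[:-horizon]).astype(int)   # up/down (classification)
    return X, y_reg, y_cls
```

The same features can then be fed to an LSTM with `y_reg` or to a DNN classifier with `y_cls`, mirroring the two experimental setups compared in the study.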
1. Determinants of Credit Ratings and Comparison of the Rating Prediction Performances of Machine Learning Algorithms (2023)
Summary: In the literature, new machine learning algorithms are dynamically produced in the field of artificial intelligence engineering, and the algorithms are constantly updated with new parameter estimations. The performance of existing algorithms in various business areas is still an important topic of discussion.
Remarks: The reason for the low performance of the Artificial Neural Networks algorithm in the study is that it generally creates a learning neural network that establishes inter-layer relations with big data and imitates the human brain. This study has data limitations. In addition, there is no feature structure suitable for distance-based KNN and SVM algorithms in this study.
2. Forecasting the movements of Bitcoin prices: an application of machine learning algorithms (2023)
Summary: Cryptocurrencies, such as Bitcoin, are one of the most controversial and complex technological innovations in today's financial system. This study aims to forecast the movements of Bitcoin prices at a high degree of accuracy.
Remarks: To this aim, four different machine learning algorithms are applied, namely the Artificial Neural Network, Random Forest, Support Vector Machines, and Naïve Bayes, alongside logistic regression (LR) as a benchmark model in an out-of-sample forecasting exercise.
6. Hafiz, F., Broekaert, J., La Torre, D., and Swain, A.: A multi-criteria approach to evolve sparse neural architectures for stock market forecasting. arXiv preprint arXiv:2111.08060 (2021)
Summary: This study proposes a new framework to evolve efficacious yet parsimonious neural architectures for the movement prediction of stock market indices using technical indicators as inputs. In the light of a sparse signal-to-noise ratio under the Efficient Market Hypothesis, developing machine learning methods to predict the movement of a financial market using technical indicators has been shown to be a challenging problem.
Remarks: The problem of day-ahead prediction of NASDAQ index movement was explored from the neural design perspective. In particular, attempts have been made to clarify the issues related to implications and possible remedies of disparate market behaviors prior to and during the ongoing COVID pandemic with respect to the neural architecture design.
7. Time-Series Prediction of Cryptocurrency Market using Machine Learning Techniques (2021)
Summary: In the cryptocurrency market, Bitcoin is the first currency to have gained significant importance. To predict the market price and stability of Bitcoin in the crypto market, a machine learning-based time series analysis has been applied.
Remarks: Our dataset contains yearly, monthly, and daily timestamps with the close, open, high, low, and weighted prices of Bitcoin. We pre-processed that data according to our normalization requirements, and then applied three machine learning algorithms for time series forecasting of Bitcoin prices in the cryptocurrency market.
8. Bitcoin Price Prediction Using Machine Learning and Artificial Neural Network Model (2021)
Summary: This paper explains the working of the Multiple Linear Regression and Long Short-Term Memory models in predicting the Bitcoin price.
Remarks: The study reveals that better accuracy is achieved by Long Short-Term Memory than by Multiple Linear Regression.
12. Investment advice based on market trends and the financial distress of the company (2020)
Summary: In this day and age, "investment" has become a necessity and an important factor for companies and individuals. An investment is a monetary asset purchased with the idea that it will provide a profit in the future. Before investing, people study financial performance, background and experience in the industry, company uniqueness, the effectiveness of the business model, and the market size of a particular company, but they do not focus on the minute fingerprints of financial distress. The main aim of this research is to draw the factors of investment under a single umbrella and generate investment advice.
Remarks: The literature review of different researchers helped us understand the research work done on financial distress prediction. It shows how financial distress prediction moved from stationary statistical models, such as Beaver's univariate model and the Altman Z-score, to machine learning techniques such as SVM, decision trees, and Naïve Bayes, which perform well with small-scale data and data with fewer features. As the technology evolved, people moved towards creating hybrid models such as hybrid stepwise-SVM and LDA-SVM, and then to using multiple classifiers such as AdaBoost-SVM and AdaBoost-decision tree; while using these techniques, people focused only on increasing performance with multiple classifiers and neglected the class imbalance problem.
13. LITERATURE SURVEY ON STOCK PRICE PREDICTION USING MACHINE LEARNING (2020)
Summary: The stock market has been very successful in attracting people from various backgrounds, be it educational or business. The nonlinear nature of the stock market has made its research one of the most trending and crucial topics around the world. People decide to invest in the stock market on the basis of prior research, knowledge, or some prediction. In terms of prediction, people often look for tools or methods that would minimize their risks and maximize their profits, and hence stock price prediction takes on an influential role in the ever-challenging stock market business.
Remarks: From the research done so far, it could be concluded that the RNN and LSTM libraries are very effective in determining stock price trends relative to the actual market trend. At the same time, we found that the Python libraries used as part of the training process were not very optimal.
15. Predicting Crypto Currency Prices Using Machine Learning and Deep Learning Techniques (2020)
Summary: In the past eight years of Bitcoin's history, the economy has seen the price of Bitcoin grow rapidly due to its promising outlook on the future of cryptocurrencies. Investors have taken note of several advantages Bitcoin provides over the traditional banking system. One such trait is that Bitcoin allows for decentralized banking, meaning that Bitcoin cannot be regulated by powerful banks.
Remarks: In this paper, several approaches to price prediction for cryptocurrencies such as Bitcoin were investigated. We compared the results of prediction with Multiple Linear Regression, Multiple Linear Regression with Features, and Recurrent Neural Networks with LSTM cells.
16. Price Movement Prediction of ... (2019)
Summary: Cryptocurrencies are becoming increasingly ...
Remarks: In this paper, we proved that it is possible to ...
3. ANALYSIS
Table 3.1. Minimum Hardware Requirements
Table 3.2. Minimum Software Requirements
2. Predict/Search: The system should have search boxes for crypto and stock
predictions. The user enters the ticker name and the number of days.
3. Ticker List: The system should provide a ticker list for finding the ticker name of a
specific company or cryptocurrency.
4. Training and Validation: The system should be able to provide functionality for model
training on historical data, considering different time periods and market conditions.
5. Real-time Prediction: The system should be able to make real-time predictions based
on the latest available data.
6. User Interface: Design a user-friendly interface for users to interact with the system.
Provide visualizations, charts, and reports summarizing predictions, historical
performance, and model accuracy.
2. Scalability: The system should be scalable to handle an increasing volume of data and
user interactions over time without compromising performance.
3. Reliability: Ensure high availability of the system, minimizing downtime during critical
market periods
4. Accuracy: Define acceptable levels of accuracy for predictions and continuously monitor
and improve model performance to meet these standards.
5. Security: Implement robust security measures to protect sensitive financial data and
ensure secure communication between the system and external platforms. Enforce user
authentication and authorization mechanisms to control access to different system
functionalities.
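The authentication requirement above can be sketched with Python's standard library. This is a minimal illustration of salted password hashing and constant-time verification (function names are ours; a production system would use a vetted framework or a dedicated library such as bcrypt):

```python
import hashlib
import hmac
import secrets

def hash_password(password, salt=None, iterations=100_000):
    """Derive a PBKDF2-HMAC-SHA256 digest; store (salt, digest) per user."""
    salt = salt if salt is not None else secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, iterations)
    return salt, digest

def verify_password(password, salt, digest, iterations=100_000):
    """Recompute the digest and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, iterations)
    return hmac.compare_digest(candidate, digest)
```

Storing only the salt and digest, never the plaintext password, is what keeps the login and registration modules safe even if the database is exposed.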
4. PLANNING
Waterfall Model
The waterfall model is a sequential, plan-driven process where you must plan and
schedule all your activities before starting the project. Each activity in the
waterfall model is represented as a separate phase arranged in linear order. Each of these
phases produces one or more documents that need to be approved before the next phase
begins.
It has the following phases:
Requirement Analysis
System Design
Implementation
Testing
Deployment
Maintenance
The waterfall model is a sequential design process, often used in software development,
where progress is seen as flowing steadily downward through the phases of conception,
initiation, analysis, design, construction, testing, production/implementation, and
maintenance. This model is also called the classic life cycle model, as it suggests a
systematic, sequential approach to software development, and it is one of the oldest
models followed in software engineering. The process begins with the communication
phase, where the customer specifies the requirements, and then progresses through the
other phases: planning, modeling, construction, and deployment of the software. There
are five phases of the waterfall model:
1. Communication: The process begins with communication between the customer and the
developer, in which the customer's requirements are gathered and specified.
2. Planning: In planning, major activities such as scheduling, tracking progress, and
project-related estimation are carried out. Planning is also used to identify the types of
risks involved throughout the project. Planning describes how the technical tasks are
going to take place, what resources are needed, and how to use them.
3. Modeling: This is one of the important phases, as the architecture of the system is
designed here. Analysis is carried out, and depending on the analysis, a software model is
designed. Different models for developing the software are created depending on the
requirements gathered in the first phase and the planning done in the second phase.
4. Construction: The actual coding of the software is done in this phase, on the basis of
the model designed in the modeling phase. In this phase the software is actually
developed and tested.
5. Deployment: In this last phase, the product is rolled out, i.e., delivered and installed at
the customer's end, and support is given if required. Feedback is taken from the customer
to ensure the quality of the product. Over the last two decades, the waterfall model has
come under a lot of criticism due to its efficiency issues, so its advantages and
disadvantages should be weighed.
4.2. Estimations
Estimation is the process of predicting the most realistic amount of effort required to
develop or maintain software based on incomplete, uncertain, and noisy input. Estimation
consists of following steps:
- Estimate the size in lines of code of each module from empirical data.
- Estimate the effort in person-month or person-hours.
- Estimate the duration in calendar months.
- Estimate the number of people required.
- Estimate the cost in currency.
We have reviewed a project entitled "FinanciAI". The proposed system has somewhat
similar functionality to it. The project is modularized as shown below.
COCOMO (Constructive Cost Model) was proposed by Boehm [1981]. COCOMO
predicts the effort and schedule of a software product based on the size of the software.
According to Boehm, software cost estimation should be done through three stages: Basic
COCOMO, Intermediate COCOMO, and Detailed / Complete / Advanced COCOMO. [8,16]
- Basic COCOMO: It is a single-valued, static model that computes software development
effort (and cost) as a function of program size expressed in estimated thousand
delivered source instructions (KDSI) i.e., Lines of code (LOC).
- Intermediate COCOMO: an extension of the Basic model that computes software
development effort as a function of program size by adding a set of "cost drivers," that
will determine the effort and duration of the project, such as assessments of personnel
and hardware.
- Detailed COCOMO: an extension of the Intermediate model that adds effort multipliers
for each phase of the project to determine the cost driver’s impact on each step
(analysis, design, etc.) of the software engineering process.
In our project we are going to use “Basic COCOMO” model for estimations. Basic
COCOMO categorizes projects into three types:
- Organic Mode: (Application Programs such as: data processing, scientific, etc.)
Development projects typically are not complicated and involve small, experienced
teams. The planned software is not considered innovative (i.e., little innovation) and
requires a relatively small number of DSI (typically 2000 to 50,000 LOC). Organic
projects are those developed in a stable development environment and do not have
tight deadlines or constraints.
- Semidetached Mode: (Utility Programs such as: compilers, linkers, analyzers, etc.)
Development projects typically are more complicated than in Organic Mode and involve
teams of people with mixed levels of experience. The software requires no more than
50,000 to 300,000 DSI. The projects require minor innovations and have some deadlines
and constraints.
- Embedded Mode: (System programs such as: real-time software, operating systems, etc.)
Development projects that must operate within tight hardware, software, and operational
constraints, and typically require significant innovation.
Project Type a b c d
Organic 2.4 1.05 2.5 0.38
Semi-Detached 3.0 1.12 2.5 0.35
Embedded 3.6 1.20 2.5 0.32
Total lines of code for the proposed system will be approximately 5000.
The system falls into the embedded category. The values of a and b for the embedded
mode are a = 3.6 and b = 1.20.
Total LOC (approx.) of the project is 5000 LOC = 5.00 KLOC
Effort (E) = a(KLoC)^b
E = 3.6 × (5.00)^1.20
E = 24.83 ≈ 25 Person-Months
Duration (D) = c(E)^d = 2.5 × (24.83)^0.32 ≈ 7 Months
Persons Required (P) = E/D ≈ 25/7 ≈ 3 Persons
We assume each team member charges ₹500/- per month, ₹1000/- per month is required
for other resources and miscellaneous purposes, and ₹500/- for a microphone. Thus,
Estimated Cost of System = ((Person Charges × Persons Required) + Resource Charges) ×
Duration [+ Hardware Cost]
= ((500 × 3) + 1000) × 7 + 500 = ₹18,000/-
Estimation Value
Size of the Project: 5000 LoC
Effort Required: 25 Person-Months
Duration Required: 7 Months
Persons Required: 3
Cost Required: ₹18,000/-
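The Basic COCOMO estimate above can be reproduced with a short script. The coefficients are the embedded-mode values from the table, and the per-month charges are the assumptions stated in the text:

```python
def basic_cocomo(kloc, a, b, c, d):
    """Basic COCOMO: effort (person-months), duration (months), staffing."""
    effort = a * kloc ** b          # E = a(KLoC)^b
    duration = c * effort ** d      # D = c(E)^d
    persons = effort / duration     # P = E/D
    return effort, duration, persons

# Embedded-mode coefficients: a=3.6, b=1.20, c=2.5, d=0.32, for 5.00 KLOC
effort, duration, persons = basic_cocomo(5.0, 3.6, 1.20, 2.5, 0.32)

# Cost = ((person charge x persons) + resource charge) x duration + hardware
cost = ((500 * 3) + 1000) * round(duration) + 500
print(round(effort, 2), round(duration), cost)   # ~24.83 PM, 7 months, 18000
```

Running this confirms the figures used in the estimation: roughly 25 person-months of effort over about 7 months, for an estimated cost of ₹18,000.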
2. Thakur Rushikesh Bharat, Member, rushikeshthakur0101@gmail.com
3. Patil Urvesh Madhukar, Member, urveshpatil152003@gmail.com
5. DESIGN
The Unified Modeling Language (UML) is a graphical language for visualizing, specifying,
constructing, and documenting the artifacts of a software-intensive system. The UML gives
a standard way to write a system's blueprints, covering conceptual things, such as business
processes and system functions, as well as concrete things, such as classes written in a
specific programming language, database schemas, and reusable software components.
A use case defines behavioral features of a system. Each use case is named using a verb
phrase that expresses a goal of the system. A use case diagram shows a set of use cases and
actors and their relationships. Use case diagrams address the static use case view of a
system. These diagrams are especially important in organizing and modeling the behaviors
of a system. The diagram gives a graphical overview of the functionality provided by the
system in terms of its actors.
An activity diagram is a special kind of statechart diagram that shows the flow from
activity to activity within a system. An activity diagram addresses the dynamic view of a
system. The activity diagram is often seen as part of the functional view of a system
because it describes logical processes, or functions. Each process describes a sequence of
tasks and the decisions that govern when they are performed. The flow in an activity
diagram is driven by the completion of an action.
A class diagram shows a set of classes, interfaces and collaborations and their relationship.
These diagrams are the most common diagram found in modeling object-oriented systems.
Class diagram addressed the static design view of a system.
Class: Description
Register: The user is able to register with the application.
Login: The user is able to log in to the application.
System: The user can select which company stock to predict via the ticker, or can directly predict it.
Stocks: A user is able to see the future value of the stock that has been selected.
Cryptocurrency: A user is able to see the future value of the cryptocurrency that has been selected.
A deployment diagram shows the configuration of run-time processing nodes and the
components that live on them. Deployment diagrams address the static deployment view
of an architecture. They are related to component diagrams in that a node typically
encloses one or more components.
6 . IMPLEMENTATION
The Python 2 language was officially discontinued in 2020 (first planned for 2015), and
"Python 2.7.18 is the last Python 2.7 release and therefore the last Python 2 release." [30]
No more security patches or other improvements will be released for it. With Python 2's
end-of-life, only Python 3.6.x and later are supported.
6.1.1. Features
3. Easy to Read: As you will see, learning Python is quite simple. As already
established, Python's syntax is very straightforward; code blocks are defined by
indentation rather than by semicolons or brackets.
4. Object-Oriented Language: One of the key features of Python is object-oriented
programming. Python supports object-oriented concepts such as classes, object
encapsulation, etc.
5. GUI Programming Support: Graphical user interfaces can be made using modules
such as PyQt5, PyQt4, wxPython, or Tk in Python. PyQt5 is the most popular option
for creating graphical apps with Python.
6. High-Level Language: Python is a high-level language. When we write programs in
Python, we do not need to remember the system architecture, nor do we need to
manage the memory.
7. Large Community Support: Python has gained popularity over the years, and our
questions are constantly answered by the enormous Stack Overflow community.
These websites have already provided answers to many questions about Python, so
Python users can consult them as needed.
8. Easy to Debug: Python gives excellent information for error tracing. You will be
able to quickly identify and correct the majority of your program's issues once you
understand how to interpret Python's error traces, and often simply by glancing at
the code you can determine what it is designed to do.
9. Library Integration: Python provides seamless integration with libraries such as
OpenCV, enabling you to leverage their functionalities for tasks like image
processing and feature extraction.
Flexibility and Scalability: Python is a versatile language that can be used for both
small-scale experiments and large-scale production systems. Its flexibility allows you
to easily integrate machine learning components into your project and scale them as
needed.
Python is one of the most used programming languages due to its simplicity,
readability, and extensive library support. Python has extensive machine learning
libraries and frameworks; some of the most used libraries for machine learning models are
NumPy, Pandas, TensorFlow, etc., which are used for training models.
Python is often the first choice of beginners as well as professional developers around
the world. Why is Python the first choice of machine learning developers? Some of the most
important reasons are given above.
However, Java is also a competitive choice for machine learning developers. In the end,
it depends on the developer's choice and the kind of projects they are currently working on.
Programming languages are just the medium to tell systems the work they need to carry
out; we should feel free while deciding which programming language to go with and make a
selection based on personal research and preference.
Visual Studio Code is a source-code editor made by Microsoft for Windows, Linux, and
macOS. Features include support for debugging, syntax highlighting, intelligent code
completion, snippets, code refactoring, and embedded Git.
VS Code has a modern and minimalist user interface, which can be customized through
themes and extensions. It offers a clean and uncluttered coding experience. Eclipse, on the
other hand, has a more traditional IDE interface with multiple views, perspectives, and
toolbars. Eclipse provides a rich set of menus and options, giving you more control and
flexibility in configuring the IDE. VS Code has a vast marketplace of extensions that can
enhance its functionality. It has a strong ecosystem with support for various programming
languages and frameworks.
6.2.1. Features
9. Task Runner: It provides a task running system that allows developers to define
and run tasks, such as building projects or running tests, directly from the editor.
10. Multi-root Workspaces: VS Code supports working with multiple folders open in
the same window, allowing developers to organize their projects more efficiently.
We have selected VS Code to implement the system for the following reasons:
Both VS Code and PyCharm have active communities, but PyCharm has a longer
history and a larger user base. It has been around for many years and has a mature
ecosystem with extensive documentation, tutorials, and online resources.
VS Code has gained significant popularity in recent years and has a growing
community with a focus on web development, including Python. VS Code and
PyCharm are both popular development environments used for Python
development, but they have some key differences in terms of features,
performance, and community support.
As we are not focused on a feature-rich IDE with comprehensive Python
development tools, and are leaning more towards simplicity, extensibility, and a
lightweight environment, VS Code is the right choice for us.
PyCharm, by contrast, is an integrated, feature-rich IDE that is specifically tailored for
Python. It has a rich set of features, including advanced refactoring and debugging tools,
and supports multiple Python frameworks. PyCharm is a good choice for developers who
want an integrated, feature-rich IDE that streamlines the Python development process.
However, it is more resource-intensive than VS Code, which can result in slower startup
times and higher memory usage. PyCharm may also have a steeper learning curve for
beginners.
7. TESTING
Unit Testing: It is the testing of individual software units of the application. It is
done after the completion of an individual unit, before integration. Unit testing
involves the design of test cases that validate that the internal program logic is
functioning properly and that program inputs produce valid outputs. All decision
branches and internal code flow should be validated. This is a structural testing that
relies on knowledge of the unit's construction and is invasive. Unit tests perform
basic tests at the component level and test a specific business process, application,
and/or system configuration. Unit tests ensure that each unique path of a business
process performs accurately to the documented specifications.
7.1.1. Features
• Bug Detection: One of the primary purposes of testing is to identify and eliminate bugs
or defects in the software. This helps in enhancing the overall quality and reliability of the
software.
• Validation and Verification: Testing is used to validate that the software meets the
requirements specified by the stakeholders. It verifies that the software is built according to
the design specifications.
• Functional and Non-functional Testing: Testing can be classified into two broad
categories: functional testing, which focuses on testing the functionality of the software,
and non-functional testing, which focuses on aspects like performance, usability, security,
etc.
• Test Planning and Execution: Testing involves planning the testing activities, creating
test cases, executing the tests, and analyzing the results. It requires a systematic approach
to ensure that all aspects of the software are adequately tested.
• Quality Assurance: Testing ensures that the software meets the quality standards
expected by the stakeholders. By identifying and fixing bugs and defects, testing contributes
to the overall reliability and usability of the software.
• Compliance and Regulatory Requirements: Many industries have strict compliance and
regulatory requirements that software must meet. Testing ensures that the software
complies with these requirements, reducing the risk of legal or financial penalties for non-
compliance.
• Cost Savings: While investing in testing may seem like an additional expense, it often
results in cost savings in the long run. By identifying and fixing defects early, testing
reduces the cost of fixing issues later in the development lifecycle or after deployment.
Test Case #1
Personnel Factors: Analyst Capability (ACAP), Applications Experience (APEX),
Programmer Capability (PCAP), Platform Experience (PLEX), Language and Tool
Experience (LTEX), Personnel Continuity (PCON)
Project Factors: Use of Software Tools (TOOL), Development Schedule (SCED),
Multisite Development (SITE)
Platform Factors: Execution Time Constraint (TIME), Main Storage Constraint (STOR),
Platform Volatility (PVOL)
Product Factors: Database Size (DATA), Product Complexity (CPLX), Required
Reusability (RUSE), Required Software Reliability (RELY), Documentation Match to
Lifecycle Needs (DOCU)
After assigning rating to each of the cost drivers the ratings are multiplied together to yield
Effort Adjustment Factor (EAF).
The Detailed COCOMO formula takes the form:
Effort, E = a(KLoC)^b × EAF person-months
Duration, D = c(E)^d months
Persons, P = E/D persons
where E is the effort applied in person-months, KLoC is the estimated number of thousands
of delivered lines of code for the project, D is total time duration to develop the system in
months, and P is number of persons required to develop that system.
The coefficients a, c and the exponent b, d are given in the following table.
Table 8.2. Coefficient/Exponent Values of Detailed COCOMO
Project Type a b c d
Organic 3.2 1.05 2.5 0.38
Semi-Detached 3.0 1.12 2.5 0.35
Embedded 2.8 1.20 2.5 0.32
Each cost driver is rated on a scale of Very Low, Low, Usual (Nominal), High, Very High,
and Extra High, across the Personnel, Project, Platform, and Product factor groups. For
example, Documentation Match to Lifecycle Needs (DOCU) takes the multiplier values
0.81 (Very Low), 0.91 (Low), 1.00 (Usual), 1.11 (High), 1.23 (Very High), --- (Extra High).
Effort Adjustment Factor (EAF) = 1.19 × 1.13 × 1.00 × 0.90 × 0.95 × 1.12 × 0.91 × 1.24 ×
1.04 × 1.00 × 1.00 × 0.87 × 1.40 × 1.00 × 1.15 × 0.95 × 1.00 = 2.01
The system falls into the Semi-Detached category. The values of a and b for the
semi-detached mode are a = 3.0 and b = 1.12.
Total LOC (approx.) of the project is 5000 LOC = 5.00 KLOC
Effort (E) = a(KLoC)^b × EAF
E = 3.0 × (5.00)^1.12 × 2.01
E = 36.57 ≈ 36 Person-Months
Each team member has charged₹900/- per month with ₹1000/- spent for other resources
& miscellaneous purposes in each month, in additional Deployment Server is purchased for
₹1000/-. Thus,
Total Cost of System = ((Person Charges * Person Required) + Resource Charges) *
Duration [+ Hardware Cost]
=((900 * 4) + 1000) * 8 + 1000 = ₹37,500/-
Calculation Value
Size of the Project: 5000 LoC
Effort Required: 36 Person-Months
Duration Required: 8 Months
Persons Required: 4
Total Cost: ₹37,800/-
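The Detailed COCOMO figures above can be checked with a few lines of Python, using the EAF multipliers and semi-detached coefficients listed in the text:

```python
# Cost-driver multipliers as listed in the EAF calculation above
eaf_multipliers = [1.19, 1.13, 1.00, 0.90, 0.95, 1.12, 0.91, 1.24,
                   1.04, 1.00, 1.00, 0.87, 1.40, 1.00, 1.15, 0.95, 1.00]
eaf = 1.0
for m in eaf_multipliers:
    eaf *= m                                  # product of all ratings

# Semi-detached mode: a=3.0, b=1.12, c=2.5, d=0.35, for 5.00 KLOC
effort = 3.0 * 5.00 ** 1.12 * round(eaf, 2)   # E = a(KLoC)^b x EAF
duration = 2.5 * effort ** 0.35               # D = c(E)^d
print(round(eaf, 2), round(effort, 2))        # ~2.01, ~36.57
```

This reproduces EAF ≈ 2.01 and an effort of about 36.57 person-months, confirming the manual calculation (note that the duration comes out slightly above the 8 months used in the summary; the report rounds it down).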
9. CONCLUSION
In the future, for better accuracy, the models will be trained with a greater variety of
data sets, and other algorithms such as CNN and hybrid models (e.g., Multiple Linear
Regression combined with CNN) will be used to create more precise predictions. The
FinanciAI project will enhance predictive accuracy with advanced techniques, expand to
global markets and diverse assets, and offer personalized insights, real-time processing,
seamless platform integration, and educational resources. Adding commodities and index
modules will provide comprehensive market analysis for strategic investment decisions.