Ba Term 2 Project
This project focuses on applying neural networks to predict stock prices from historical financial data. The approach involves building and evaluating a feedforward neural network that takes historical stock prices as input features; the dataset includes key indicators such as the opening price, high, low, close, and trading volume. Using these inputs, the model predicts the stock's adjusted closing price, and its performance is evaluated with metrics such as mean squared error (MSE) and R-squared. The overall goal is a robust, accurate model that offers useful insight into stock price movements.
Chapter 1: Introduction
Predicting stock prices has long been a significant challenge in finance due to the inherent volatility of
the market and the multitude of factors that influence it. Machine learning techniques, particularly
neural networks, provide the capability to capture nonlinear relationships and hidden patterns in
historical data that traditional methods may fail to recognize. This project demonstrates the feasibility and effectiveness of implementing a neural network in Keras to predict stock prices, using historical market data to model financial trends and produce accurate forecasts.
Chapter 4: Methodology
1. Data Preprocessing:
o The numerical features (Open, High, Low, Close, and Volume) were normalized using
MinMaxScaler to scale them to a range between 0 and 1.
o The target variable (Adj Close) was scaled in the same way, so that the inputs and target share a common 0–1 range during training.
o The dataset was divided into training (80%) and testing (20%) subsets to enable the
evaluation of the model’s generalization performance.
2. Model Design:
o The model architecture was implemented as a sequential feedforward neural network with the following structure:
  ▪ Input Layer: Accepts the 5 features as input.
  ▪ Hidden Layers:
    ▪ A first hidden layer of 64 neurons with ReLU activation to capture complex patterns.
    ▪ A second hidden layer of 32 neurons with ReLU activation to refine the model's predictive capabilities.
  ▪ Output Layer: A single neuron producing a continuous output for regression.
o Compilation involved the following settings:
  ▪ Loss Function: Mean Squared Error (MSE) to measure prediction error.
  ▪ Optimizer: Adam optimizer for efficient learning.
  ▪ Metrics: Mean Absolute Error (MAE) to track prediction accuracy.
3. Model Training:
o The model was trained over 50 epochs with a batch size of 32, allowing it to
iteratively learn from the training data.
o Validation data was employed during training to monitor for overfitting and to help ensure the model generalizes to unseen data.
4. Evaluation:
o Predictions generated by the model were evaluated on the test set using MSE and R-squared to gauge accuracy and goodness of fit. A code sketch consolidating steps 1–4 follows this list.
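To make the methodology concrete, below is a minimal end-to-end sketch of steps 1–4 using Keras and scikit-learn. The column names and hyperparameters (5 input features, hidden layers of 64 and 32 ReLU neurons, a single linear output, Adam, MSE loss, MAE metric, 50 epochs, batch size 32, 80/20 split) follow the methodology above; the file name stock_prices.csv, the random_state, and the use of validation_split=0.2 to supply validation data are illustrative assumptions rather than details taken from the report.

    # Hypothetical sketch of the full pipeline; file name and split details are assumptions.
    import pandas as pd
    from sklearn.preprocessing import MinMaxScaler
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import mean_squared_error, r2_score
    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import Dense, Input

    # 1. Data preprocessing: scale features and target to [0, 1], then split 80/20.
    df = pd.read_csv("stock_prices.csv")                     # hypothetical file name
    features = df[["Open", "High", "Low", "Close", "Volume"]].values
    target = df[["Adj Close"]].values

    feature_scaler = MinMaxScaler()
    target_scaler = MinMaxScaler()
    X = feature_scaler.fit_transform(features)
    y = target_scaler.fit_transform(target)

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42)                # assumed split strategy

    # 2. Model design: 5 inputs -> 64 ReLU -> 32 ReLU -> 1 linear output.
    model = Sequential([
        Input(shape=(5,)),
        Dense(64, activation="relu"),
        Dense(32, activation="relu"),
        Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse", metrics=["mae"])

    # 3. Model training: 50 epochs, batch size 32, with held-out validation data.
    history = model.fit(
        X_train, y_train,
        epochs=50, batch_size=32,
        validation_split=0.2,                                # assumed validation setup
        verbose=1,
    )

    # 4. Evaluation: MSE and R-squared on the test set (computed in scaled units here).
    y_pred = model.predict(X_test)
    print("MSE:", mean_squared_error(y_test, y_pred))
    print("R^2:", r2_score(y_test, y_pred))

    # To report errors in price units, invert the target scaling.
    y_pred_price = target_scaler.inverse_transform(y_pred)
    y_test_price = target_scaler.inverse_transform(y_test)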
The adjusted close price graph provides a visual representation of the price trend over time, highlighting the significant peaks and troughs that the neural network must learn from in order to predict future prices.
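A minimal sketch of how such a figure could be produced is shown below. It assumes the DataFrame df loaded in the pipeline sketch above and a 'Date' column in the dataset, which is an assumption since the report lists only price and volume fields; the figure styling is illustrative.

    import matplotlib.pyplot as plt
    import pandas as pd

    # Plot the adjusted close price over time (assumes a 'Date' column exists).
    df["Date"] = pd.to_datetime(df["Date"])
    plt.figure(figsize=(10, 4))
    plt.plot(df["Date"], df["Adj Close"], label="Adj Close")
    plt.xlabel("Date")
    plt.ylabel("Adjusted close price")
    plt.title("Adjusted Close Price Over Time")
    plt.legend()
    plt.tight_layout()
    plt.show()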
o Residual Plot:
The residual plot shows the differences between the true and predicted values. The residuals are predominantly centered around zero, indicating that the errors are roughly symmetric and the model's predictions are not systematically biased.
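A minimal sketch of such a residual plot, reusing y_test_price and y_pred_price from the pipeline sketch above; the choice of predicted values on the horizontal axis and the styling are illustrative assumptions.

    import matplotlib.pyplot as plt

    # Residuals in price units: true minus predicted on the test set.
    residuals = (y_test_price - y_pred_price).ravel()
    plt.figure(figsize=(8, 4))
    plt.scatter(y_pred_price, residuals, s=10, alpha=0.6)
    plt.axhline(0.0, color="red", linewidth=1)   # reference line at zero residual
    plt.xlabel("Predicted adjusted close price")
    plt.ylabel("Residual (true - predicted)")
    plt.title("Residuals of Test-Set Predictions")
    plt.tight_layout()
    plt.show()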