CVDL Assignment 2: Planar Data Classification with One Hidden Layer
In this assignment you will train a neural network with a single hidden layer to capture the non-linear "flower" pattern that logistic regression could not capture.
OBJECTIVE:
1. Define the neural network structure (number of input units, number of hidden units, etc.).
2. Initialize the model's parameters.
3. Loop:
– Implement forward propagation
– Compute the loss
– Implement backward propagation to get the gradients
– Update the parameters (gradient descent; see the update rule below)
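For reference, the gradient descent update in step 3 follows the usual rule: for each parameter θ in {W[1], b[1], W[2], b[2]}, set θ := θ - α ∂J/∂θ, where J is the cost computed after the forward pass and α is the learning rate (its value is chosen later in the assignment).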
THEORY:
Neural Network model: Logistic regression did not work well on the "flower" dataset, so you are going to train a neural network with a single hidden layer. Here is our model:
Figure: Neural Network Model
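For reference, assuming the usual choices for this assignment (a tanh activation in the hidden layer and a sigmoid activation on the output unit), the model computes, for one example x:

z[1] = W[1] x + b[1]
a[1] = tanh(z[1])
z[2] = W[2] a[1] + b[2]
a[2] = sigmoid(z[2]) = y_hat

and predicts class 1 if a[2] > 0.5, otherwise class 0. Over all m training examples, the cross-entropy cost is

J = -(1/m) * sum_i [ y(i) log(a[2](i)) + (1 - y(i)) log(1 - a[2](i)) ]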
Packages:
• numpy is the fundamental package for scientific computing with Python.
• sklearn provides simple and efficient tools for data mining and data analysis.
• matplotlib is a well-known library for plotting graphs in Python.
• testCases provides some test examples to assess the correctness of your functions.
• planar_utils provides various useful functions used in this assignment.
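A minimal import block along these lines is typically placed at the top of the notebook (the module names testCases and planar_utils, and the helpers imported from them, are assumed to match the files shipped with the assignment):

import numpy as np
import matplotlib.pyplot as plt
import sklearn
import sklearn.linear_model
from testCases import *                 # test inputs for the graded functions (assumed module name)
from planar_utils import plot_decision_boundary, sigmoid, load_planar_dataset

np.random.seed(1)                       # fix the seed so results are reproducible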
Dataset:
First, let's get the dataset you will work on. The following code loads a "flower" 2-class dataset into the variables X and Y:

X, Y = load_planar_dataset()

Visualize the dataset using matplotlib. The data looks like a "flower" with some red (label y=0) and some blue (label y=1) points. Your goal is to build a model that fits this data.

# Visualize the data:
plt.scatter(X[0, :], X[1, :], c=Y.ravel(), s=40, cmap=plt.cm.Spectral)  # Y is flattened so each point gets its own color
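As a quick sanity check (a small sketch, not graded code), you can inspect the shapes: X stores one column per example, so X is (2, m) and Y is (1, m):

shape_X = X.shape            # expected (2, m): two input features per example
shape_Y = Y.shape            # expected (1, m): one binary label per example
m = X.shape[1]               # number of training examples
print("The shape of X is:", shape_X)
print("The shape of Y is:", shape_Y)
print("I have m =", m, "training examples")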
Defining the neural network structure:
Define the sizes of the input layer, the hidden layer and the output layer; use the shapes of X and Y, and refer to the model figure above if needed.
Initialize the model's parameters (a sketch of both steps follows below):
– Use np.random.randn(a, b) * 0.01 to randomly initialize a matrix of shape (a, b).
– Use np.zeros((a, b)) to initialize a matrix of shape (a, b) with zeros.
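A minimal sketch of these two steps, assuming numpy is imported as np, the conventional helper names layer_sizes and initialize_parameters, and a hidden layer of 4 units:

def layer_sizes(X, Y):
    n_x = X.shape[0]    # size of the input layer (number of features)
    n_h = 4             # size of the hidden layer (hard-coded here)
    n_y = Y.shape[0]    # size of the output layer
    return (n_x, n_h, n_y)

def initialize_parameters(n_x, n_h, n_y):
    W1 = np.random.randn(n_h, n_x) * 0.01   # small random values break symmetry
    b1 = np.zeros((n_h, 1))                 # biases may safely start at zero
    W2 = np.random.randn(n_y, n_h) * 0.01
    b2 = np.zeros((n_y, 1))
    return {"W1": W1, "b1": b1, "W2": W2, "b2": b2}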
The Loop:
Implement the forward propagation of your classifier. You can use the function sigmoid(); it is built-in (imported) in the notebook. The steps you have to implement are (a sketch follows the list):
1. Retrieve each parameter from the dictionary "parameters" (which is the output of initialize_parameters()) by using parameters["..."].
2. Implement forward propagation: compute Z[1], A[1], Z[2] and A[2] (the vector of all your predictions on all the examples in the training set).
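A minimal sketch of this step, assuming the tanh/sigmoid model described above and keeping the intermediate values in a cache for backpropagation later:

def forward_propagation(X, parameters):
    # 1. Retrieve each parameter from the dictionary "parameters"
    W1, b1 = parameters["W1"], parameters["b1"]
    W2, b2 = parameters["W2"], parameters["b2"]
    # 2. Compute Z[1], A[1], Z[2] and A[2]
    Z1 = np.dot(W1, X) + b1
    A1 = np.tanh(Z1)
    Z2 = np.dot(W2, A1) + b2
    A2 = 1 / (1 + np.exp(-Z2))   # sigmoid output: one prediction per example
    cache = {"Z1": Z1, "A1": A1, "Z2": Z2, "A2": A2}
    return A2, cache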
SAMPLE CODE:
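What follows is a non-authoritative sketch of the remaining pieces of the loop, assuming the cross-entropy cost, the tanh/sigmoid model above, a learning rate of 1.2, and the helpers sketched earlier (layer_sizes, initialize_parameters, forward_propagation):

def compute_cost(A2, Y):
    # Cross-entropy cost averaged over the m training examples
    m = Y.shape[1]
    logprobs = Y * np.log(A2) + (1 - Y) * np.log(1 - A2)
    return float(np.squeeze(-np.sum(logprobs) / m))

def backward_propagation(parameters, cache, X, Y):
    # Gradients of the cost with respect to W1, b1, W2, b2
    m = X.shape[1]
    W2 = parameters["W2"]
    A1, A2 = cache["A1"], cache["A2"]
    dZ2 = A2 - Y
    dW2 = np.dot(dZ2, A1.T) / m
    db2 = np.sum(dZ2, axis=1, keepdims=True) / m
    dZ1 = np.dot(W2.T, dZ2) * (1 - np.power(A1, 2))   # derivative of tanh
    dW1 = np.dot(dZ1, X.T) / m
    db1 = np.sum(dZ1, axis=1, keepdims=True) / m
    return {"dW1": dW1, "db1": db1, "dW2": dW2, "db2": db2}

def update_parameters(parameters, grads, learning_rate=1.2):
    # One gradient descent step on every parameter
    for key in ("W1", "b1", "W2", "b2"):
        parameters[key] = parameters[key] - learning_rate * grads["d" + key]
    return parameters

def nn_model(X, Y, n_h, num_iterations=10000, print_cost=False):
    # Initialize, then loop: forward propagation, cost, backward propagation, update
    n_x, _, n_y = layer_sizes(X, Y)
    parameters = initialize_parameters(n_x, n_h, n_y)
    for i in range(num_iterations):
        A2, cache = forward_propagation(X, parameters)
        cost = compute_cost(A2, Y)
        grads = backward_propagation(parameters, cache, X, Y)
        parameters = update_parameters(parameters, grads)
        if print_cost and i % 1000 == 0:
            print("Cost after iteration %i: %f" % (i, cost))
    return parameters

def predict(parameters, X):
    # Predict label 1 where the estimated probability exceeds 0.5
    A2, _ = forward_propagation(X, parameters)
    return (A2 > 0.5)

A typical usage, again only as a sketch: parameters = nn_model(X, Y, n_h=4, num_iterations=10000, print_cost=True), followed by predictions = predict(parameters, X) and a comparison against Y to report the training accuracy.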
PLATFORM REQUIRED:
Operating System: Windows
Software or Tools: Google Colab
CONCLUSION:
Hence, we studied the concept of planar data classification with one hidden layer and achieved 91% accuracy.