
How to build a Neural Network from scratch

04 July 2020 15:56

STEPS TO MAKE A SHALLOW NEURAL NETWORK IN PYTHON FROM SCRATCH.

1. Import all the required modules and libraries.
2. Load the Dataset into your IPython notebook.
3. Plot the data using a scatter plot to visualize it (if possible).
4. Now, let's get a better sense of how our data is present: how many variables do we have, the shape of variables X and Y, and the number of training and cross-validation examples.
5. Normalize the data in the dataset (like: mean normalization) and make it ready to be fed into the NN.
6. Design the neural network you want to make (in rough - pen and paper). Visualize this network. How many input units, how many output units, how many hidden layers, and how many units in each hidden layer?
7. Decide what activation functions you want to use for every layer.
8. Start making the NN (a code sketch of these functions follows this list):
   a. Layer_sizes(X, Y) ----> (n_x, n_h, n_y)  ## Initialize the layer sizes for the shallow NN
   b. Initialize_parameters(n_x, n_h, n_y) ----> parameters  ## parameters is a dictionary of W1, b1, W2, b2, ...
   c. Forward_propagation(X, parameters) ----> (A2, cache)  ## cache is a dictionary of Z1, A1, Z2, A2
   d. Compute_cost(A2, Y) ----> cost  ## Compute cost using the log-loss (sum/mean) formula
   e. Back_prop(parameters, cache, X, Y) ----> grads  ## A dictionary of all the gradients: dW1, db1, ...
   f. Update_params(parameters, grads, learning_rate) ----> parameters
   g. NN_model(X, Y, n_h, epochs, print_cost = False) ----> parameters
9. Make a [predict(parameters, X) ----> Y] function.
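Below is a minimal sketch of how the shallow-network functions listed above might fit together, assuming a binary-classification setup: X of shape (n_x, m), Y of shape (1, m), tanh in the single hidden layer and sigmoid at the output. Function names are written in lowercase Python style, and the default values for n_h, epochs and learning_rate are illustrative placeholders, not values from the original notes.

import numpy as np

def layer_sizes(X, Y, n_h=4):
    # n_x = input features, n_h = hidden units (assumed default), n_y = output units
    return X.shape[0], n_h, Y.shape[0]

def initialize_parameters(n_x, n_h, n_y):
    # Small random weights break symmetry; biases start at zero
    return {"W1": np.random.randn(n_h, n_x) * 0.01, "b1": np.zeros((n_h, 1)),
            "W2": np.random.randn(n_y, n_h) * 0.01, "b2": np.zeros((n_y, 1))}

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def forward_propagation(X, parameters):
    # Assumption: tanh hidden layer, sigmoid output (the notes leave the choice open)
    Z1 = parameters["W1"] @ X + parameters["b1"]
    A1 = np.tanh(Z1)
    Z2 = parameters["W2"] @ A1 + parameters["b2"]
    A2 = sigmoid(Z2)
    return A2, {"Z1": Z1, "A1": A1, "Z2": Z2, "A2": A2}

def compute_cost(A2, Y):
    # Log-loss (cross-entropy), summed over examples and averaged by m
    m = Y.shape[1]
    return float(-np.sum(Y * np.log(A2) + (1 - Y) * np.log(1 - A2)) / m)

def back_prop(parameters, cache, X, Y):
    m = X.shape[1]
    A1, A2 = cache["A1"], cache["A2"]
    dZ2 = A2 - Y                                    # gradient of the cost w.r.t. Z2
    dW2 = dZ2 @ A1.T / m
    db2 = np.sum(dZ2, axis=1, keepdims=True) / m
    dZ1 = parameters["W2"].T @ dZ2 * (1 - A1 ** 2)  # tanh'(Z1) = 1 - A1^2
    dW1 = dZ1 @ X.T / m
    db1 = np.sum(dZ1, axis=1, keepdims=True) / m
    return {"dW1": dW1, "db1": db1, "dW2": dW2, "db2": db2}

def update_params(parameters, grads, learning_rate=1.2):
    # One plain gradient-descent step on every parameter
    return {k: parameters[k] - learning_rate * grads["d" + k] for k in parameters}

def nn_model(X, Y, n_h, epochs=10000, learning_rate=1.2, print_cost=False):
    n_x, n_h, n_y = layer_sizes(X, Y, n_h)
    parameters = initialize_parameters(n_x, n_h, n_y)
    for i in range(epochs):
        A2, cache = forward_propagation(X, parameters)
        grads = back_prop(parameters, cache, X, Y)
        parameters = update_params(parameters, grads, learning_rate)
        if print_cost and i % 1000 == 0:
            print(f"Cost after epoch {i}: {compute_cost(A2, Y):.4f}")
    return parameters

def predict(parameters, X):
    # Threshold the output activation at 0.5 to get 0/1 labels
    A2, _ = forward_propagation(X, parameters)
    return (A2 > 0.5).astype(int)

A quick way to exercise it on toy data (purely illustrative):

X = np.random.randn(2, 400)                     # 2 features, 400 examples
Y = (X[0:1, :] * X[1:2, :] > 0).astype(int)     # made-up binary labels
parameters = nn_model(X, Y, n_h=4, print_cost=True)
print("train accuracy:", float((predict(parameters, X) == Y).mean()))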
STEPS TO MAKE A DEEP NEURAL NETWORK IN PYTHON FROM SCRATCH.

1. Import all the required modules and libraries.
2. Load the Dataset into your IPython notebook.
3. Plot the data using a scatter plot to visualize it (if possible).
4. Now, let's get a better sense of how our data is present: how many variables do we have, the shape of variables X and Y, and the number of training and cross-validation examples.
5. Normalize the data in the dataset (like: mean normalization) and make it ready to be fed into the NN.
6. Design the neural network you want to make (in rough - pen and paper). Visualize this network. How many input units, how many output units, how many hidden layers, and how many units in each hidden layer?
7. Decide what activation functions you want to use for every layer.
8. Start making the NN (a code sketch of these functions follows this list):
   1. Initialize_parameters_deep(layer_dims) ----> parameters  ## parameters is a dictionary of W1, b1, W2, b2, ...
   2. Linear_forward(A_prev, W, b) ----> (Z, cache)  ## cache is a dictionary with (A_prev, W, b)
   3. Linear_activation_fwd(A_prev, W, b, activation_type) ----> (A, cache)  ## Dictionary with all cache from linear_fwd and activation values
   4. L_model_fwd(X, parameters) ----> (A_L, caches)  ## caches = dict. of all cache values in the network
   5. Compute_cost(A_L, Y) ----> cost
   6. Linear_backward(dZ, cache) ----> (dA_prev, dW, db)
   7. Linear_activation_backward(dA, cache, activation_type) ----> (dA_prev, dW, db)
   8. L_model_backward(A_L, Y, caches) ----> grads
   9. Update_params(parameters, grads, learning_rate) ----> parameters
   10. NN_model(X, Y, layer_dims, epochs, print_cost = False) ----> parameters
9. Make a [predict(parameters, X) ----> Y] function.
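A corresponding sketch for the L-layer case, under the same assumptions as before (binary classification, cross-entropy cost, sigmoid output) plus ReLU in the hidden layers. layer_dims is a list such as [n_x, 20, 7, 1], names are lowercase Python versions of the list above, and the epochs/learning_rate defaults are again placeholders.

import numpy as np

def initialize_parameters_deep(layer_dims):
    # layer_dims = [n_x, n_h1, ..., n_y]; one W and b per layer
    params = {}
    for l in range(1, len(layer_dims)):
        params["W" + str(l)] = np.random.randn(layer_dims[l], layer_dims[l - 1]) * 0.01
        params["b" + str(l)] = np.zeros((layer_dims[l], 1))
    return params

def sigmoid(Z):
    return 1 / (1 + np.exp(-Z))

def linear_forward(A_prev, W, b):
    # Affine step; the cache keeps what the backward pass will need
    Z = W @ A_prev + b
    return Z, (A_prev, W, b)

def linear_activation_fwd(A_prev, W, b, activation_type):
    Z, linear_cache = linear_forward(A_prev, W, b)
    A = sigmoid(Z) if activation_type == "sigmoid" else np.maximum(0, Z)  # else ReLU
    return A, (linear_cache, Z)

def L_model_fwd(X, params):
    # Assumption: ReLU for hidden layers, sigmoid for the output layer
    caches, A = [], X
    L = len(params) // 2
    for l in range(1, L):
        A, cache = linear_activation_fwd(A, params["W" + str(l)], params["b" + str(l)], "relu")
        caches.append(cache)
    AL, cache = linear_activation_fwd(A, params["W" + str(L)], params["b" + str(L)], "sigmoid")
    caches.append(cache)
    return AL, caches

def compute_cost(AL, Y):
    m = Y.shape[1]
    return float(-np.sum(Y * np.log(AL) + (1 - Y) * np.log(1 - AL)) / m)

def linear_backward(dZ, linear_cache):
    A_prev, W, b = linear_cache
    m = A_prev.shape[1]
    return W.T @ dZ, dZ @ A_prev.T / m, np.sum(dZ, axis=1, keepdims=True) / m

def linear_activation_backward(dA, cache, activation_type):
    linear_cache, Z = cache
    if activation_type == "sigmoid":
        s = sigmoid(Z)
        dZ = dA * s * (1 - s)
    else:                       # ReLU: pass gradient only where Z > 0
        dZ = dA * (Z > 0)
    return linear_backward(dZ, linear_cache)

def L_model_backward(AL, Y, caches):
    grads = {}
    L = len(caches)
    dA = -(Y / AL - (1 - Y) / (1 - AL))   # derivative of the cross-entropy w.r.t. AL
    dA, grads["dW" + str(L)], grads["db" + str(L)] = linear_activation_backward(dA, caches[L - 1], "sigmoid")
    for l in reversed(range(1, L)):
        dA, grads["dW" + str(l)], grads["db" + str(l)] = linear_activation_backward(dA, caches[l - 1], "relu")
    return grads

def update_params(params, grads, learning_rate):
    return {k: params[k] - learning_rate * grads["d" + k] for k in params}

def NN_model(X, Y, layer_dims, epochs=3000, learning_rate=0.0075, print_cost=False):
    params = initialize_parameters_deep(layer_dims)
    for i in range(epochs):
        AL, caches = L_model_fwd(X, params)
        grads = L_model_backward(AL, Y, caches)
        params = update_params(params, grads, learning_rate)
        if print_cost and i % 100 == 0:
            print(f"Cost after epoch {i}: {compute_cost(AL, Y):.4f}")
    return params

def predict(params, X):
    AL, _ = L_model_fwd(X, params)
    return (AL > 0.5).astype(int)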
