
LAB EIGHT: ARTIFICIAL NEURAL NETWORKS

Lab Task: Building a Simple Neural Network for Image Classification

In this lab, you will work through a complete neural network pipeline: data loading, parameter initialization, forward propagation, cost calculation, backpropagation, parameter updates, and model evaluation.

Objectives

By the end of this lab, you will be able to:

1. Load and visualize image data using NumPy and Matplotlib.
2. Implement the forward and backward propagation functions.
3. Train a simple neural network for image classification.
4. Evaluate the accuracy of your model.

Task 1: Data Loading and Visualization

1. Load train_X.csv, train_label.csv, test_X.csv, and test_label.csv using np.loadtxt, and print their shapes to understand the data dimensions.

import numpy as np
import matplotlib.pyplot as plt
import random

# Each row of the CSV is one flattened 28x28 image; transpose so that
# examples run along axis 1 (one column per example)
X_train = np.loadtxt('train_X.csv', delimiter=',').T
Y_train = np.loadtxt('train_label.csv', delimiter=',').T
X_test = np.loadtxt('test_X.csv', delimiter=',').T
Y_test = np.loadtxt('test_label.csv', delimiter=',').T

print(X_train.shape, Y_train.shape, X_test.shape, Y_test.shape)

2. Visualize a random sample from X_train by reshaping it to a 28x28 grayscale image and
displaying it with matplotlib.pyplot.imshow.

# Pick a random example (column) and display it as a 28x28 grayscale image
index = random.randrange(0, X_train.shape[1])
plt.imshow(X_train[:, index].reshape(28, 28), cmap='gray')
plt.show()

Model

Initialize parameters randomly:

W1 = np.random.randn(n1, n0)
b1 = np.zeros((n1, 1))
W2 = np.random.randn(n2, n1)
b2 = np.zeros((n2, 1))

Repeat the steps below many times:

Forward propagation:
Z1 = W1·X + b1
A1 = f(Z1)
Z2 = W2·A1 + b2
A2 = softmax(Z2)

Updating parameters:
W2 = W2 − α · ∂Cost/∂W2
b2 = b2 − α · ∂Cost/∂b2
W1 = W1 − α · ∂Cost/∂W1
b1 = b1 − α · ∂Cost/∂b1
Task 2: Parameter Initialization

1. Implement the initialize_parameters function to initialize weights and biases for a two-layer neural network (a sketch follows this list). Ensure the parameters are:
   o W1 (of shape (n_h, n_x)) initialized randomly.
   o b1 (of shape (n_h, 1)) initialized as zeros.
   o W2 (of shape (n_y, n_h)) initialized randomly.
   o b2 (of shape (n_y, 1)) initialized as zeros.
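A minimal sketch of initialize_parameters matching the shapes above. The dictionary keys use the lowercase names ('w1', 'b1', ...) that forward_propagation reads later, and the 0.01 scale on the random weights is an assumption (a common choice to keep tanh activations away from saturation), not something the lab mandates:

def initialize_parameters(n_x, n_h, n_y):
    # Random weights as in the Model section; the 0.01 scale is an
    # assumed tweak, not required by the lab
    w1 = np.random.randn(n_h, n_x) * 0.01
    b1 = np.zeros((n_h, 1))
    w2 = np.random.randn(n_y, n_h) * 0.01
    b2 = np.zeros((n_y, 1))

    parameters = {
        "w1" : w1,
        "b1" : b1,
        "w2" : w2,
        "b2" : b2
    }

    return parameters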

# Activation functions
def tanh(x):
    return np.tanh(x)

def relu(x):
    return np.maximum(x, 0)

def softmax(x):
    # Subtracting x.max(axis=0) before exponentiating is a common
    # numerical-stability tweak; omitted here to match the lab's formula
    expX = np.exp(x)
    return expX / np.sum(expX, axis=0)

def derivative_tanh(x):
    return 1 - np.power(np.tanh(x), 2)

def derivative_relu(x):
    return np.array(x > 0, dtype=np.float32)
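A quick sanity check on softmax (the input values here are arbitrary): each column of the output should be a probability distribution that sums to 1.

z = np.array([[1.0, 2.0],
              [3.0, 0.5]])
print(softmax(z))               # per-column probabilities
print(softmax(z).sum(axis=0))   # [1. 1.]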

Task 3: Forward Propagation

1. Implement the forward_propagation function using the formulas given in the code:
o Compute the intermediate values Z1, A1, Z2, and A2 using tanh for the hidden
layer and softmax for the output layer.

Forward propagation:
Z1 = W1·X + b1
A1 = f(Z1)
Z2 = W2·A1 + b2
A2 = softmax(Z2)

def forward_propagation(x, parameters):

w1 = parameters['w1']
b1 = parameters['b1']
w2 = parameters['w2']
b2 = parameters['b2']

z1 = np.dot(w1, x) + b1
a1 = tanh(z1)

z2 = np.dot(w2, a1) + b2
a2 = softmax(z2)

forward_cache = {
"z1" : z1,
"a1" : a1,
"z2" : z2,
"a2" : a2
}

return forward_cache

Task 4: Cost Function

1. Implement the cost_function that calculates the cross-entropy loss between the
predicted and actual labels.

Cost function:
Cost = −(1/m) · Σ_{i=1}^{m} Σ_{k=1}^{n} [ y_k · log(a_k) ]
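A minimal cost_function sketch implementing the cross-entropy formula above; the small epsilon inside the log is an assumed guard against log(0), not part of the formula:

def cost_function(a2, y):
    # a2 and y have shape (n_classes, m); average the per-example loss
    m = y.shape[1]
    cost = -(1/m) * np.sum(y * np.log(a2 + 1e-8))
    return cost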

Task 5: Backward Propagation

1. Implement the backward_prop function to compute gradients of the loss with respect to
each parameter (dW1, db1, dW2, and db2).
2. Use the derivative functions provided (derivative_tanh and derivative_relu) where
necessary.
Backpropagation:
dZ2 = A2 − Y
dW2 = (1/m) · dZ2 · A1^T
dB2 = (1/m) · sum(dZ2, axis=1)
dZ1 = W2^T · dZ2 ∗ f′(Z1)
dW1 = (1/m) · dZ1 · X^T
dB1 = (1/m) · sum(dZ1, axis=1)

def backward_prop(x, y, parameters, forward_cache):

    w2 = parameters['w2']

    z1 = forward_cache['z1']
    a1 = forward_cache['a1']
    a2 = forward_cache['a2']

    m = x.shape[1]

    dz2 = a2 - y
    dw2 = (1/m)*np.dot(dz2, a1.T)
    db2 = (1/m)*np.sum(dz2, axis = 1, keepdims = True)

    # dZ1 = W2^T . dZ2 * f'(Z1): the derivative is evaluated at z1
    # (not a1), and the (1/m) factor belongs in dw1/db1, not here
    dz1 = np.dot(w2.T, dz2)*derivative_tanh(z1)

    dw1 = (1/m)*np.dot(dz1, x.T)
    db1 = (1/m)*np.sum(dz1, axis = 1, keepdims = True)

    gradients = {
        "dw1" : dw1,
        "db1" : db1,
        "dw2" : dw2,
        "db2" : db2
    }

    return gradients

Task 6: Update Parameters


1. Implement the update_parameters function to adjust parameters based on the gradients
calculated from backpropagation and a learning rate.

def update_parameters(parameters, gradients, learning_rate):

    w1 = parameters['w1']
    b1 = parameters['b1']
    w2 = parameters['w2']
    b2 = parameters['b2']

    dw1 = gradients['dw1']
    db1 = gradients['db1']
    dw2 = gradients['dw2']
    db2 = gradients['db2']

    # Gradient descent step: move each parameter against its gradient
    w1 = w1 - learning_rate*dw1
    b1 = b1 - learning_rate*db1
    w2 = w2 - learning_rate*dw2
    b2 = b2 - learning_rate*db2

    parameters = {
        "w1" : w1,
        "b1" : b1,
        "w2" : w2,
        "b2" : b2
    }

    return parameters
Task 7: Model Training and Cost Visualization

1. Complete the model function to train the neural network over a given number of
iterations.
def model(x, y, n_h, learning_rate, iterations):

    n_x = x.shape[0]
    n_y = y.shape[0]

    cost_list = []

    parameters = initialize_parameters(n_x, n_h, n_y)

    for i in range(iterations):

        forward_cache = forward_propagation(x, parameters)

        cost = cost_function(forward_cache['a2'], y)

        gradients = backward_prop(x, y, parameters, forward_cache)

        parameters = update_parameters(parameters, gradients, learning_rate)

        cost_list.append(cost)

        # Report progress ten times over the course of training
        if i % (iterations // 10) == 0:
            print("Cost after", i, "iterations is :", cost)

    return parameters, cost_list

2. Track the cost at each iteration and plot the cost values using matplotlib.pyplot.plot, as in the sketch below.
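A usage sketch for training and plotting. The hidden-layer size, learning rate, and iteration count here are illustrative assumptions, not values fixed by the lab:

# Illustrative hyperparameters (assumptions; tune as needed)
parameters, cost_list = model(X_train, Y_train, n_h=100,
                              learning_rate=0.02, iterations=100)

# Plot cost vs. iteration number
plt.plot(range(len(cost_list)), cost_list)
plt.xlabel('Iteration')
plt.ylabel('Cost')
plt.show()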
Task 8: Model Evaluation

1. Implement the accuracy function to compute the classification accuracy for both the
training and test datasets.

def accuracy(inp, labels, parameters):

    forward_cache = forward_propagation(inp, parameters)

    a_out = forward_cache['a2']  # probabilities with shape (10, m)

    # Find the class with the highest probability for each column (example)
    a_out = np.argmax(a_out, axis=0)

    # Find the index of the true label for each column
    labels = np.argmax(labels, axis=0)

    # Accuracy as the percentage of correct predictions
    acc = np.mean(a_out == labels) * 100

    return acc

2. Display the accuracy for X_train and X_test, as in the sketch below.
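A short usage sketch for reporting both accuracies:

print("Accuracy on train dataset:", accuracy(X_train, Y_train, parameters), "%")
print("Accuracy on test dataset:", accuracy(X_test, Y_test, parameters), "%")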

Task 9: Test Prediction Visualization

1. Select a random test image, visualize it, and use the trained model to predict its label. Print the predicted label and compare it to the true label (see the sketch below).
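A minimal sketch of this step; reshaping the selected column to (-1, 1) keeps the matrix shapes that forward_propagation expects:

# Pick and display a random test image
idx = random.randrange(0, X_test.shape[1])
plt.imshow(X_test[:, idx].reshape(28, 28), cmap='gray')
plt.show()

# Predict its label with the trained parameters
cache = forward_propagation(X_test[:, idx].reshape(-1, 1), parameters)
predicted = np.argmax(cache['a2'], axis=0)[0]
true_label = np.argmax(Y_test[:, idx])

print("Predicted label:", predicted)
print("True label:", true_label)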
Submission

Submit the following:

1. A completed code file with each function implemented.
2. A plot of the cost vs. iteration number.
3. Accuracy of the model on training and test datasets.
4. Screenshots of at least two visualized test images and their corresponding predictions.
