Sample Final Q1

The document discusses how to update weights in the backpropagation algorithm when the activation function is not linear, specifically using a sigmoid function. It provides a numerical example with initial weights and a learning rate, asking for a step-by-step solution to calculate the new weights after one update. Additionally, it addresses the calculation of initial error and error after the update while noting that biases remain unchanged during the process.



How to update the weights in the backpropagation algorithm when the activation function is not linear?

+3 votes · 4.3k views
asked Aug 10, 2020 in Machine Learning by AskDataScience (116k points)

The goal of backpropagation is to optimize the weights so that the neural network can learn
how to correctly map arbitrary inputs to outputs.

Assume for the following neural network that the inputs are [i1, i2] = [0.05, 0.10], the desired outputs are [o1, o2] = [0.01, 0.99], and the learning rate is α = 0.5.
In addition, the activation function for the hidden layer (both h1 and h2) is the sigmoid (logistic) function:

$$S(x) = \frac{1}{1 + e^{-x}}$$

Hint:

$$w_{\text{new}} = w_{\text{old}} - \alpha \, \frac{\partial E}{\partial w}$$

$$E_{\text{total}} = \frac{1}{2} \sum (\text{target} - \text{output})^2$$
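As a sketch of how the hint is applied, the gradient of an output-layer weight such as w5 expands by the chain rule (the net/out notation below is introduced here for illustration, and it assumes the output units also use the sigmoid):

$$\frac{\partial E_{\text{total}}}{\partial w_5} = \frac{\partial E_{\text{total}}}{\partial out_{o1}} \cdot \frac{\partial out_{o1}}{\partial net_{o1}} \cdot \frac{\partial net_{o1}}{\partial w_5} = (out_{o1} - target_{o1}) \cdot out_{o1}(1 - out_{o1}) \cdot out_{h1}$$

where the middle factor uses the sigmoid derivative $S'(x) = S(x)\,(1 - S(x))$.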

a) Show a step-by-step solution that calculates the weights w1 to w8 after one update, filling in the table below.
b) Calculate the initial error and the error after one update (assume the biases [b1, b2] do not change during the updates).

Updating weights in backpropagation algorithm

Weight | Initialization | New weight after one step
------ | -------------- | --------------------------
w1     | 0.15           | ?
w2     | 0.20           | ?
w3     | 0.25           | ?
w4     | 0.30           | ?
w5     | 0.40           | ?
w6     | 0.45           | ?
w7     | 0.50           | ?
w8     | 0.55           | ?

machine-learning deep-learning neural-network backpropagation back-propagation ml-exercise ml-final ele888-final


1 Answer

+2 votes
answered Aug 10, 2020 by AskDataScience (116k points)
selected Aug 10, 2020 by AskDataScience

A step-by-step solution is provided here.
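For completeness, a minimal Python sketch of one full forward/backward pass and update is below. The wiring (w1 to w4 into h1 and h2, w5 to w8 into o1 and o2), the use of the sigmoid at the output layer, and the bias values b1 and b2 (which the question does not state, only that they stay fixed) are assumptions, so treat the printed numbers as a sanity check rather than the official solution.

```python
import math

def sigmoid(x):
    """Logistic activation S(x) = 1 / (1 + e^-x)."""
    return 1.0 / (1.0 + math.exp(-x))

def forward(w, i1, i2, b1, b2):
    """Forward pass; w = [w1..w8]. Returns hidden and output activations."""
    w1, w2, w3, w4, w5, w6, w7, w8 = w
    out_h1 = sigmoid(w1 * i1 + w2 * i2 + b1)
    out_h2 = sigmoid(w3 * i1 + w4 * i2 + b1)
    out_o1 = sigmoid(w5 * out_h1 + w6 * out_h2 + b2)  # sigmoid at the output is an assumption
    out_o2 = sigmoid(w7 * out_h1 + w8 * out_h2 + b2)
    return out_h1, out_h2, out_o1, out_o2

def total_error(out_o1, out_o2, t1, t2):
    """E_total = 1/2 * sum (target - output)^2."""
    return 0.5 * ((t1 - out_o1) ** 2 + (t2 - out_o2) ** 2)

# Data from the question
i1, i2 = 0.05, 0.10                                    # inputs
t1, t2 = 0.01, 0.99                                    # target outputs
alpha = 0.5                                            # learning rate
w = [0.15, 0.20, 0.25, 0.30, 0.40, 0.45, 0.50, 0.55]   # w1..w8
b1, b2 = 0.35, 0.60                                    # placeholders: the question does not give bias values

# Part b) initial error
out_h1, out_h2, out_o1, out_o2 = forward(w, i1, i2, b1, b2)
print("initial error:", total_error(out_o1, out_o2, t1, t2))

# Backward pass: output-layer deltas, dE/dnet_o = (out - target) * out * (1 - out)
d_o1 = (out_o1 - t1) * out_o1 * (1 - out_o1)
d_o2 = (out_o2 - t2) * out_o2 * (1 - out_o2)

# Hidden-layer deltas: back-propagate through the old w5..w8, then through the hidden sigmoid
w1_, w2_, w3_, w4_, w5_, w6_, w7_, w8_ = w
d_h1 = (d_o1 * w5_ + d_o2 * w7_) * out_h1 * (1 - out_h1)
d_h2 = (d_o1 * w6_ + d_o2 * w8_) * out_h2 * (1 - out_h2)

# Gradients dE/dw1..dE/dw8, in the same order as w
grads = [d_h1 * i1, d_h1 * i2, d_h2 * i1, d_h2 * i2,
         d_o1 * out_h1, d_o1 * out_h2, d_o2 * out_h1, d_o2 * out_h2]

# Part a) one update step: w_new = w_old - alpha * dE/dw
w = [wi - alpha * gi for wi, gi in zip(w, grads)]
print("new weights w1..w8:", [round(wi, 4) for wi in w])

# Part b) error after one update (biases unchanged)
_, _, out_o1, out_o2 = forward(w, i1, i2, b1, b2)
print("error after one update:", total_error(out_o1, out_o2, t1, t2))
```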


commented Jul 3, 2021 by Raha (100 points)

While the question specifies the activation function for the hidden layer, the solution applies the same activation function to the output layer as well. Do we also need to apply the activation function to the output layer?
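As a brief sketch of the difference the comment raises (using the squared-error setup above, with the out/target notation from the chain-rule expansion in the question): a sigmoid output unit keeps the sigmoid-derivative factor in the output delta, while a linear (identity) output drops it:

$$\delta_{o1}^{\text{sigmoid}} = (out_{o1} - target_{o1}) \, out_{o1}(1 - out_{o1}), \qquad \delta_{o1}^{\text{linear}} = out_{o1} - target_{o1}$$

Either convention gives a valid update as long as the forward pass and the derivative used in backpropagation are consistent; per the comment, the linked solution uses the sigmoid form at the output as well.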


