Sample Final Q1
asked Aug 10, 2020 in Machine Learning by AskDataScience (116k points)
The goal of backpropagation is to optimize the weights so that the neural network can learn
how to correctly map arbitrary inputs to outputs.
Assume for the following neural network that the inputs are [i1, i2] = [0.05, 0.10], the desired
outputs are [o1, o2] = [0.01, 0.99], and the learning rate is α = 0.5.
In addition, the activation function for the hidden layer (both h1 and h2 ) is sigmoid (logistic):
S(x) = 1 / (1 + e^(-x))
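As a quick sanity check, the sigmoid and its derivative S'(x) = S(x)(1 − S(x)), which the backward pass relies on, can be sketched as:

```python
import math

def sigmoid(x):
    """Logistic function S(x) = 1 / (1 + e^(-x))."""
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_prime(x):
    """Derivative S'(x) = S(x) * (1 - S(x)), used when backpropagating."""
    s = sigmoid(x)
    return s * (1.0 - s)

print(sigmoid(0.0))        # 0.5
print(sigmoid_prime(0.0))  # 0.25
```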
Hint:
w_new = w_old − α · ∂E/∂w

E_total = (1/2) Σ (target − output)²
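The two hint formulas amount to: sum the halved squared errors over the outputs, then move each weight against its gradient. A minimal numeric sketch (the outputs and the gradient value here are hypothetical, just to show the arithmetic):

```python
targets = [0.01, 0.99]
outputs = [0.75, 0.77]  # hypothetical network outputs, for illustration only

# E_total = 1/2 * sum((target - output)^2)
E_total = sum(0.5 * (t - o) ** 2 for t, o in zip(targets, outputs))

# w_new = w_old - alpha * dE/dw, with a hypothetical gradient value
alpha = 0.5
w_old, dE_dw = 0.40, 0.08
w_new = w_old - alpha * dE_dw  # 0.40 - 0.5 * 0.08 = 0.36
```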
a) Show a step-by-step solution for calculating weights w1 to w8 after one update, filling in the table below.
b) Calculate the initial error and the error after one update (assume the biases [b1, b2] do not
change during the updates).
Weight   Initial value   After one update
w1       0.15            ?
w2       0.20            ?
w3       0.25            ?
w4       0.30            ?
w5       0.40            ?
w6       0.45            ?
w7       0.50            ?
w8       0.55            ?
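The whole one-step update can be sketched in Python. Two caveats: the bias values b1 = 0.35 and b2 = 0.60 are assumptions (the question does not state them; they are the values used in the widely circulated version of this exercise), and sigmoid is applied at the output layer as well, matching what the posted solution does.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Inputs, targets, and learning rate (from the question)
i1, i2 = 0.05, 0.10
t1, t2 = 0.01, 0.99
alpha = 0.5

# Initial weights (from the table); biases are ASSUMED values,
# not given in the question
w1, w2, w3, w4 = 0.15, 0.20, 0.25, 0.30
w5, w6, w7, w8 = 0.40, 0.45, 0.50, 0.55
b1, b2 = 0.35, 0.60

def forward(w1, w2, w3, w4, w5, w6, w7, w8):
    out_h1 = sigmoid(w1 * i1 + w2 * i2 + b1)
    out_h2 = sigmoid(w3 * i1 + w4 * i2 + b1)
    out_o1 = sigmoid(w5 * out_h1 + w6 * out_h2 + b2)  # sigmoid at output too
    out_o2 = sigmoid(w7 * out_h1 + w8 * out_h2 + b2)
    return out_h1, out_h2, out_o1, out_o2

out_h1, out_h2, out_o1, out_o2 = forward(w1, w2, w3, w4, w5, w6, w7, w8)
E_initial = 0.5 * (t1 - out_o1) ** 2 + 0.5 * (t2 - out_o2) ** 2

# Output-layer deltas: dE/dnet_o = (out - target) * out * (1 - out)
d_o1 = (out_o1 - t1) * out_o1 * (1 - out_o1)
d_o2 = (out_o2 - t2) * out_o2 * (1 - out_o2)

# Hidden-layer deltas, backpropagated through the ORIGINAL w5..w8
d_h1 = (d_o1 * w5 + d_o2 * w7) * out_h1 * (1 - out_h1)
d_h2 = (d_o1 * w6 + d_o2 * w8) * out_h2 * (1 - out_h2)

# Gradient-descent update: w_new = w_old - alpha * dE/dw
w1n = w1 - alpha * d_h1 * i1
w2n = w2 - alpha * d_h1 * i2
w3n = w3 - alpha * d_h2 * i1
w4n = w4 - alpha * d_h2 * i2
w5n = w5 - alpha * d_o1 * out_h1
w6n = w6 - alpha * d_o1 * out_h2
w7n = w7 - alpha * d_o2 * out_h1
w8n = w8 - alpha * d_o2 * out_h2

# Error after one update, with the (unchanged) biases
_, _, o1n, o2n = forward(w1n, w2n, w3n, w4n, w5n, w6n, w7n, w8n)
E_after = 0.5 * (t1 - o1n) ** 2 + 0.5 * (t2 - o2n) ** 2
```

Under these assumed biases the sketch gives E_initial ≈ 0.2984, w1 ≈ 0.1498, w5 ≈ 0.3589, and an error after one update that is slightly lower than the initial one, matching part b)'s expectation that one gradient step reduces E_total.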
1 Answer
A step-by-step solution is provided here.
While the question specifies the activation function only for the hidden layer, the solution applies the same
activation function to the output layer as well. Do we also need to apply the activation function to the output layer?