Tasks On Neurons and ANN
Assignment tasks - Week 6 2021
Activation function
Problem #4 (Correct: 2 marks). Theme: Feed Forward Multiple Layer ANN
What will the final weight vector look like when all data items are processed?
A. (-0.25, 0, 0, 0.25)
B. (0, -0.25, 0.25, 0)
C. (0.5, 0, 0.75, -0.35)
D. (-0.25, 0.5, -0.25, 0.25)
E. None of the above
Answer: D
Perceptron
The perceptron is an algorithm for learning a linear binary classifier in the form of a threshold function that maps its input X, a real-valued input vector, to a single binary output value Y.
The threshold can also be learned by transforming it into a bias = -threshold and completing the data items with an extra input x0j = 1 and a corresponding weight w0j = bias.
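For instance, under this transformation a data item (x1, x2) with threshold 0.1 is completed as in the minimal sketch below (the concrete numbers are taken from the worked example on the following slides):

```python
threshold = 0.1
x = [0.8, 0.3]            # original data item (x1, x2)
w = [0.4, -0.2]           # original weights (w1, w2)

# Complete the item with x0 = 1 and the weights with w0 = bias = -threshold:
x_aug = [1] + x           # [1, 0.8, 0.3]
w_aug = [-threshold] + w  # [-0.1, 0.4, -0.2]
```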
For each example j in the dataset, perform the following steps until total training set error ceases to improve:
• calculate the output
• calculate the new weights
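A minimal runnable sketch of this procedure in Python. The step activation, the learning rate a, and the AND-function data items are illustrative assumptions, not from the slides, and the stopping test is simplified to "zero total error" rather than "error ceases to improve":

```python
def step(s):
    # Threshold is already folded into w[0], so compare against 0
    return 1 if s > 0 else 0

def train(data, a=0.1, max_epochs=100):
    # data: list of (x, t) pairs, each x with x[0] = 1 prepended
    w = [0.0] * len(data[0][0])
    for _ in range(max_epochs):
        total_error = 0
        for x, t in data:
            # calculate the output
            y = step(sum(wi * xi for wi, xi in zip(w, x)))
            # calculate the new weights: wi <- wi + a*(t - y)*xi
            w = [wi + a * (t - y) * xi for wi, xi in zip(w, x)]
            total_error += abs(t - y)
        if total_error == 0:  # simplified stopping criterion
            break
    return w

# Hypothetical data items encoding logical AND:
data = [([1, 0, 0], 0), ([1, 0, 1], 0), ([1, 1, 0], 0), ([1, 1, 1], 1)]
print(train(data))
```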
Perceptron
Example: inputs X1 = 0.8, X2 = 0.3 with weights W1 = 0.4, W2 = -0.2 and threshold 0.1.
Threshold form: 0.4*0.8 + (-0.2)*0.3 = 0.26 > 0.1 -> Y1 = 1
Bias form: 0.4*0.8 + (-0.2)*0.3 - 0.1 = 0.16 > 0 -> Y1 = 1
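The same arithmetic in a few lines of Python, checking that the threshold form and the bias form excite the neuron identically (numbers as on the slide):

```python
x = [0.8, 0.3]
w = [0.4, -0.2]
threshold = 0.1

s = sum(wi * xi for wi, xi in zip(w, x))      # 0.32 - 0.06 = 0.26
y_threshold_form = 1 if s > threshold else 0  # 0.26 > 0.1 -> 1

s_bias = s + (-threshold) * 1                 # add w0*x0 = -0.1*1
y_bias_form = 1 if s_bias > 0 else 0          # 0.16 > 0 -> 1

assert y_threshold_form == y_bias_form == 1
```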
What will the final weight vector look like when all data items are processed?
[Figure: a single ANN unit with inputs X0 = 1, X1, X2, ..., Xn, weights W0 = -Threshold, W1, W2, ..., Wn, and output Y]
Sum = Sum(Wi*Xi), i = 1..n
Y = if Sum > 0 then f(Sum) else 0, where f is the activation function
The difference between the target value T for the output and the actual output Y is the basis for an error estimate E.
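A direct transcription of this unit into Python; the activation f is passed in as a parameter, and the example call uses the numbers from the excitation slide below with the identity function standing in for f (an assumption for illustration):

```python
def unit_output(x, w, f):
    # x and w include the bias input: x[0] = 1, w[0] = -Threshold
    s = sum(wi * xi for wi, xi in zip(w, x))  # Sum = Sum(Wi*Xi)
    return f(s) if s > 0 else 0               # Y = if Sum>0 then f(Sum) else 0

print(unit_output([1, 0.8, 0.3], [-0.1, 0.4, -0.2], lambda s: s))  # ~0.16
```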
The core computation of an ANN unit
1. We identify the studied Neuron by j. All parameters have Real values.
5. Typically the Threshold is remodelled as its negation and named Bias. An extra input x0 is added with constant value = 1. The weight w0j is set to the Bias, i.e. -Threshold. This move enables the Bias to be adapted in the same fashion as the weights.
Wij = Wij + a * (Tj - Yj) * Xi, where a is the learning rate.
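The update rule as a one-line function (a sketch; the sample call reuses W11 = 0.4, a = 1, T1 = 0.26, Y1 = 0.16, X1 = 0.8 from the slide below):

```python
def updated_weight(w_ij, a, t_j, y_j, x_i):
    # Wij = Wij + a * (Tj - Yj) * Xi
    return w_ij + a * (t_j - y_j) * x_i

print(updated_weight(0.4, 1, 0.26, 0.16, 0.8))  # ~0.48
```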
Single Neuron Excitation
The activation function f is ReLU; its segments are differentiable.
X1 = 0.8, X2 = 0.3, T1 = 0.26, W11 = 0.4, W21 = -0.2, W01 = -0.1
a = 1, Threshold = 0.1 -> Bias = -0.1
Excitation: 0.4*0.8 + (-0.2)*0.3 + (-0.1)*1 = 0.16, so Y1 = ReLU(0.16) = 0.16
Weight updates:
W11 = 0.4 + 1*(0.26 - 0.16)*0.8 = 0.48
W21 = -0.2 + 1*(0.26 - 0.16)*0.3 = -0.17
W01 = -0.1 + 1*(0.26 - 0.16)*1 = 0
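Putting the slide together in one short script that recomputes the excitation and all three weight updates (ReLU activation, a = 1; expect small floating-point noise around the hand-computed values):

```python
def relu(s):
    return max(0.0, s)

x = [1, 0.8, 0.3]      # x0 = 1 (bias input), x1, x2
w = [-0.1, 0.4, -0.2]  # w01 = -Threshold, w11, w21
t, a = 0.26, 1

s = sum(wi * xi for wi, xi in zip(w, x))
y = relu(s) if s > 0 else 0                          # excitation: ~0.16
w = [wi + a * (t - y) * xi for wi, xi in zip(w, x)]  # update all weights
print(y, w)                                          # ~0.16, [~0.0, ~0.48, ~-0.17]
```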