Week 06 Assignment Solution
1. Artificial neural networks can be used for which of the following tasks?
A. Pattern Recognition
B. Classification
C. Clustering
D. All of the above
Ans: D
Explanation: Artificial neural networks are used for all of the tasks listed in the options.
2. A perceptron can correctly classify instances into two classes where the classes are:
A. Overlapping
B. Linearly separable
C. Non-linearly separable
Ans: B
Explanation: A perceptron defines a linear decision boundary, so it can only separate classes that are linearly separable.
3. Which logic function cannot be implemented by a perceptron having two inputs?
A. AND
B. OR
C. NOR
D. XOR
Ans: D
Explanation: XOR is not linearly separable, so a single two-input perceptron cannot implement it.
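Why XOR fails while AND, OR, and NOR succeed can be checked by brute force: a single threshold unit draws one line in the input plane, and no line separates the XOR outputs. The sketch below (Python, written for this write-up; the weight grid and function names are purely illustrative) searches a small grid of integer weights and biases and finds units that compute AND, OR, and NOR, but none that compute XOR.

```python
from itertools import product

def threshold_unit(w1, w2, b, x1, x2):
    """Binary threshold neuron: output 1 iff w1*x1 + w2*x2 + b > 0."""
    return 1 if w1 * x1 + w2 * x2 + b > 0 else 0

def realizable(truth_table):
    """True if some (w1, w2, b) on a small integer grid reproduces the table."""
    for w1, w2, b in product(range(-3, 4), repeat=3):
        if all(threshold_unit(w1, w2, b, x1, x2) == t
               for (x1, x2), t in truth_table.items()):
            return True
    return False

AND = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1}
OR  = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 1}
NOR = {(0, 0): 1, (0, 1): 0, (1, 0): 0, (1, 1): 0}
XOR = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

for name, table in [("AND", AND), ("OR", OR), ("NOR", NOR), ("XOR", XOR)]:
    print(name, realizable(table))   # AND/OR/NOR: True, XOR: False
```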
4. In the perceptron training rule, the weight w_i is updated according to:
A. w_i ← w_i + η(t − o)
B. w_i ← w_i + η(t − o)x_i
C. w_i ← η(t − o)x_i
D. w_i ← w_i + (t − o)x_i
Ans: B
Explanation: The perceptron training rule updates each weight by Δw_i = η(t − o)x_i, where t is the target output for the current training example, o is the output generated by the perceptron, x_i is the input value associated with weight w_i, and η is a positive constant called the learning rate.
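A minimal sketch of this rule in code (Python/NumPy; the training data, learning rate, and epoch count below are made up for illustration and are not part of the assignment):

```python
import numpy as np

def perceptron_train(X, t, eta=0.1, epochs=20):
    """Train a single perceptron with the rule w_i <- w_i + eta * (t - o) * x_i."""
    X = np.hstack([np.ones((X.shape[0], 1)), X])   # prepend a constant input for the bias weight
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x, target in zip(X, t):
            o = 1 if w @ x > 0 else 0               # binary threshold output
            w += eta * (target - o) * x             # perceptron training rule
    return w

# Illustrative example: learn the (linearly separable) OR function.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
t = np.array([0, 1, 1, 1])
w = perceptron_train(X, t)
print(w)   # learned weights, e.g. [0.  0.1 0.1], which separate the OR classes
```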
5. A neuron with 3 inputs has the weight vector [0.2 -0.1 0.1]^T and a bias θ = 0. If
the input vector is X = [0.2 0.4 0.2]^T , then the total input to the neuron is:
A. 0.2
B. 0.02
C. 0.4
D. 0.10
Ans: B
Explanation: The total input is w^T X + θ = (0.2)(0.2) + (−0.1)(0.4) + (0.1)(0.2) + 0 = 0.04 − 0.04 + 0.02 = 0.02.
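The same dot product can be verified in a few lines (a NumPy check added here only as an illustration):

```python
import numpy as np

w = np.array([0.2, -0.1, 0.1])   # weight vector
x = np.array([0.2, 0.4, 0.2])    # input vector
theta = 0.0                      # bias

print(w @ x + theta)             # 0.02, up to floating-point rounding
```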
6. The error function E minimized by gradient descent is defined as:
A. E ≡ (1/2) Σ_{i=1..n} (t_i − o_i)
B. E ≡ (1/2) Σ_{i=1..n} (t_i − o_i)^2
C. E ≡ (1/2) Σ_{i=1..n} (t_i + o_i)^2
D. E ≡ (1/2) Σ_{i=1..n} (t_i + o_i)
Ans: B
Explanation: The error function is E ≡ (1/2) Σ_{i=1..n} (t_i − o_i)^2, where t_i is the target output for training example i and o_i is the output generated by the perceptron.
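For completeness, the same error can be evaluated numerically; the sketch below (NumPy, with made-up target and output values) only illustrates the formula and is not part of the official solution.

```python
import numpy as np

def squared_error(t, o):
    """E = (1/2) * sum_i (t_i - o_i)^2 over the training examples."""
    return 0.5 * np.sum((t - o) ** 2)

t = np.array([1.0, 0.0, 1.0])   # target outputs (illustrative)
o = np.array([0.8, 0.1, 0.7])   # perceptron outputs (illustrative)
print(squared_error(t, o))      # 0.5 * (0.04 + 0.01 + 0.09) ≈ 0.07
```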
7. The tanh activation function h(z) = 2/(1 + e^(−2z)) − 1 is:
Ans: D
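As a quick sanity check of the formula, 2/(1 + e^(−2z)) − 1 is algebraically identical to tanh(z), whose outputs lie strictly between −1 and 1. The short NumPy sketch below (added here for illustration) confirms this numerically.

```python
import numpy as np

z = np.linspace(-5.0, 5.0, 101)
h = 2.0 / (1.0 + np.exp(-2.0 * z)) - 1.0   # activation as given in the question

print(np.allclose(h, np.tanh(z)))          # True: same function as tanh
print(h.min(), h.max())                    # outputs stay within (-1, 1)
```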
8. The neural network given below takes two binary-valued inputs x1, x2 ∈ {0, 1}, and the activation function is the binary threshold function (h(z) = 1 if z > 0; 0 otherwise). Which of the following logical functions does it compute?
[Network diagram: a single threshold unit h(X) receiving X1 with weight 5, X2 with weight 5, and a constant input +1 with weight −1.]
A. OR
B. AND
C. NAND
D. NOR
Ans: A
Explanation: The weighted sum is 5x1 + 5x2 − 1, which is positive for every input combination except x1 = x2 = 0. Evaluating h(X) for all four input pairs therefore reproduces the truth table of OR.
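The truth table can be reproduced mechanically; the short Python sketch below (written for this write-up, not part of the assignment) evaluates the unit for all four input pairs using weights 5 and 5 and bias weight −1.

```python
def h(z):
    """Binary threshold activation: 1 if z > 0, else 0."""
    return 1 if z > 0 else 0

def network(x1, x2, w1=5, w2=5, bias=-1):
    return h(w1 * x1 + w2 * x2 + bias)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, network(x1, x2))   # 0 0 0 / 0 1 1 / 1 0 1 / 1 1 1 -> OR
```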
9. The neural network given below takes two binary-valued inputs x1, x2 ∈ {0, 1}, and the activation function is the binary threshold function (h(z) = 1 if z > 0; 0 otherwise). Which of the following logical functions does it compute?
[Network diagram: a single threshold unit h(X) receiving X1 with weight 5, X2 with weight 5, and a constant input +1 with weight −8.]
A. OR
B. AND
C. NAND
D. NOR
Ans: B
Explanation: The weighted sum is 5x1 + 5x2 − 8, which is positive only when x1 = x2 = 1. Evaluating h(X) for all four input pairs therefore reproduces the truth table of AND.
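Re-running the sketch from question 8 with bias = −8 in place of −1 gives outputs 0, 0, 0, 1 for the four input pairs, which is exactly the AND truth table and is consistent with the answer above.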
10. Which of the following is observed when a neural network overfits the training data?
A. With training iterations, error on training set as well as test set decreases
B. With training iterations, error on training set decreases but test set increases
C. With training iterations, error on training set as well as test set increases
D. With training iterations, training set as well as test set error remains constant
Ans: B
Explanation: Overfitting occurs when the error on the training set keeps decreasing with further training while the error on the test set increases.