Tutorial Sheet for Units 1, 2 and 3

Uploaded by Deepak singh

Chapter 1:

Q. 1. Draw the structure of a biological Neuron and develop the Artificial Neural Network
from it.
Q. 2. Explain Threshold Logic Unit architectures with an example.
Q. 3. Realize the OR and Ex-OR gates using the McCulloch-Pitts model of ANN.
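The gate realizations asked for above can be checked numerically. A minimal sketch (the weights and thresholds below are one common choice, not the only valid one):

```python
# McCulloch-Pitts unit: fires (output 1) when the weighted sum of its
# binary inputs reaches a threshold theta.

def mp_neuron(inputs, weights, theta):
    """McCulloch-Pitts unit: binary output via a hard threshold."""
    net = sum(x * w for x, w in zip(inputs, weights))
    return 1 if net >= theta else 0

def OR(x1, x2):
    # Both weights 1, threshold 1: fires if at least one input is 1.
    return mp_neuron((x1, x2), (1, 1), theta=1)

def XOR(x1, x2):
    # XOR is not linearly separable, so a single unit cannot realize it.
    # One standard construction: XOR = (x1 ANDNOT x2) OR (x2 ANDNOT x1).
    z1 = mp_neuron((x1, x2), (1, -1), theta=1)   # x1 ANDNOT x2
    z2 = mp_neuron((x1, x2), (-1, 1), theta=1)   # x2 ANDNOT x1
    return mp_neuron((z1, z2), (1, 1), theta=1)  # OR of the two

for a in (0, 1):
    for b in (0, 1):
        print(a, b, OR(a, b), XOR(a, b))
```

The same `mp_neuron` helper with weights (1, 1) and threshold 2 realizes AND, which is why the AND/OR questions in this sheet reduce to choosing weights and a threshold.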
Q. 4. What is the Hebbian hypothesis for ANN? Also, explain the basic building blocks
(processing elements) of an artificial neural network with network architectures.
Q. 5. Realize the AND and Ex-OR gates using the McCulloch-Pitts model of ANN.
Q. 6. Explain different Learning Rules for the Artificial Neural Network.
Q. 7. Draw the structure of a biological Neuron and develop the Artificial Neural Network
from it.
Q. 8. What is the XOR problem? Draw and explain the equivalent circuit to resolve it.
Q. 9. Define bias and threshold in the context of ANN. How does the threshold help to obtain
the output? Explain with mathematical expressions.
Q. 10. Explain different network architectures with example.
Q. 11. Realize the ANDNOT logic using the McCulloch-Pitts model of ANN.
Q. 12. Draw the structure of a biological Neuron and develop the Artificial Neural Network
from it.
Q. 13. Define bias and threshold in the context of ANN. How does the threshold help to obtain
the output? Explain with mathematical expressions.
Q. 14. If the net input to an output neuron is 0.64, calculate its output when the activation
function is
a. Binary Sigmoidal
b. Bipolar Sigmoidal
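The two values in Q.14 can be checked numerically (the slope parameter is assumed to be 1):

```python
# Numeric check for Q.14: output of a neuron with net input y_in = 0.64
# under the two sigmoidal activation functions (slope parameter = 1).
import math

y_in = 0.64

# Binary sigmoidal: f(x) = 1 / (1 + e^(-x)), range (0, 1)
binary = 1.0 / (1.0 + math.exp(-y_in))

# Bipolar sigmoidal: f(x) = (1 - e^(-x)) / (1 + e^(-x)) = tanh(x/2), range (-1, 1)
bipolar = (1.0 - math.exp(-y_in)) / (1.0 + math.exp(-y_in))

print(round(binary, 4))   # ≈ 0.6548
print(round(bipolar, 4))  # ≈ 0.3095
```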
Q. 15. Realize the OR and Ex-OR gates using the McCulloch-Pitts model of ANN.
Q. 16. Draw the structure of a biological Neuron and develop the Artificial Neural Network
from it.
Chapter 2:

Q. 1. What is the Hebbian hypothesis for ANN? Also, explain the basic building blocks
(processing elements) of an artificial neural network with network architectures.
Q. 2. Is it possible to implement OR logic with the Hebbian learning rule? Justify your
answer.
Q. 3. Write the major differences between the Perceptron and the McCulloch-Pitts model.
Q. 4. Implement a bipolar AND gate for a Hebb network having synaptic weights w1 = w2
= 1. Also, suggest an appropriate activation function to convert the actual output f(net)
into the desired output Y.
Q. 5. Develop a Perceptron for the AND function for 2 epochs. Consider bipolar inputs
and targets to train the network.
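A minimal sketch of the training loop Q.5 asks for (learning rate 1 and zero initial weights/bias are assumptions; the question fixes neither):

```python
# Perceptron trained on the bipolar AND function for 2 epochs.

def step(net):
    # Bipolar step activation: +1 above threshold 0, -1 below, 0 on it.
    if net > 0:
        return 1
    if net < 0:
        return -1
    return 0

# Bipolar AND truth table: target is +1 only for (+1, +1).
samples = [((1, 1), 1), ((1, -1), -1), ((-1, 1), -1), ((-1, -1), -1)]

w = [0.0, 0.0]
b = 0.0
lr = 1.0

for epoch in range(2):
    for x, t in samples:
        net = w[0] * x[0] + w[1] * x[1] + b
        y = step(net)
        if y != t:  # perceptron rule: update only on error
            w[0] += lr * t * x[0]
            w[1] += lr * t * x[1]
            b += lr * t
    print(f"epoch {epoch + 1}: w = {w}, b = {b}")
# → converges to w = [1, 1], b = -1 within the first epoch;
#   epoch 2 makes no further updates.
```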
Q. 6. Develop a Perceptron for the AND function with binary inputs and bipolar targets
without bias up to the second epoch. Comment on the result for the following cases:
a. Take (0, 0) into consideration
b. Without (0, 0).
Q. 7. Write short notes on the following:
a. Types of Learning Rules
b. Steepest Descent
c. Linear Regression
Q. 8. What is a Perceptron? Explain the architecture and algorithm of the Perceptron.
Q. 9. Develop a Perceptron for the OR function with binary inputs and bipolar targets
without bias up to the second epoch. Comment on the result for the following cases:
a. Take (0, 0) into consideration
b. Without (0, 0).
Q. 10. Using the Perceptron learning rule, find the weights required to perform the following
classification. Vectors (1, -1, 1, -1) and (-1, 1, -1, -1) are members of the first class
(having target value 1); vectors (1, 1, 1, -1) and (1, -1, -1, 1) are members of the second
class (having target value -1). Use a learning rate of 1 and starting weights of 0. Using
each of the training vectors as input, test the response of the net.
Q. 11. Write short notes on the following:
a. Adaline
b. Steepest Descent
Q. 12. Develop an Adaline network for ANDNOT function with bipolar inputs and targets.
Consider learning up to second epoch.
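A minimal sketch of the Adaline training Q.12 asks for (learning rate 0.2 and zero initial weights/bias are assumptions; the question fixes neither):

```python
# Adaline trained with the delta (LMS) rule on bipolar ANDNOT for 2 epochs.
# Adaline updates on the *net input*, not on the thresholded output.

# Bipolar ANDNOT: target is +1 only for x = (+1, -1).
samples = [((1, 1), -1), ((1, -1), 1), ((-1, 1), -1), ((-1, -1), -1)]

w = [0.0, 0.0]
b = 0.0
lr = 0.2

for epoch in range(2):
    for x, t in samples:
        net = w[0] * x[0] + w[1] * x[1] + b
        err = t - net                # delta rule: w += lr * (t - net) * x
        w[0] += lr * err * x[0]
        w[1] += lr * err * x[1]
        b += lr * err
    print(f"epoch {epoch + 1}: w = {w}, b = {b}")

# After epoch 2 the sign of the net input already matches every target.
for x, t in samples:
    net = w[0] * x[0] + w[1] * x[1] + b
    print(x, t, 1 if net >= 0 else -1)
```

With these assumed hyperparameters the weights after the second epoch come out near w ≈ (0.530, -0.348), b ≈ -0.450; with a different learning rate the numbers differ but the classification behaviour is the same.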
Q. 13. Derive the Delta Rule for a single output unit. How can it be extended to multiple
output units?
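A compact version of the derivation Q.13 asks for, assuming the usual squared-error cost on the net input of a single output unit:

```latex
% Delta rule for one output unit: minimize E = (1/2)(t - y_{in})^2,
% where y_{in} = \sum_i x_i w_i is the net input.
\frac{\partial E}{\partial w_i}
  = (t - y_{in}) \cdot \frac{\partial}{\partial w_i}\bigl(-y_{in}\bigr)
  = -(t - y_{in})\, x_i
% Gradient descent with learning rate \alpha therefore gives
\Delta w_i = -\alpha \,\frac{\partial E}{\partial w_i}
           = \alpha \,(t - y_{in})\, x_i
% For multiple output units the error is summed over the outputs j,
% and the same step applies independently to each weight w_{ij}:
\Delta w_{ij} = \alpha \,(t_j - y_{in,j})\, x_i
```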
Q. 14. Write the major differences between the Perceptron and the McCulloch-Pitts model.
Q. 15. Develop an Adaline network for ANDNOT function with bipolar inputs and targets.
Consider learning up to second epoch.
Q. 16. Explain the architecture and algorithm of Adaline in detail. Also, derive the delta
learning rule for a single output unit.
Q. 17. Develop an Adaline network for ANDNOT function with bipolar inputs and targets.
Also, find the final weights after second epoch.
Q. 18. Explain the architecture and algorithm of MR-I (Madaline) in detail.
Q. 19. Explain the architecture and algorithm of MR-II (Madaline) in detail.
Q. 20. Explain Backpropagation network in detail.
Q. 21. The network shown below is trained with the input vector = [0.55, 0.45] and output
vector = [0.2, 0.9]. Find the weight vectors after the second epoch, if the learning rate of
the network is 0.1.

Q. 22. The network shown below is trained with the input vector = [0.55, 0.45] and output
vector = [0.6]. Find the weight vectors after the second epoch, if the learning rate of
the network is 0.1.

Q. 23. Write short notes on the following activation functions:
(a). Identity/Linear Activation Function
(b). ReLU
(c). Bipolar Sigmoidal/Hyperbolic Tangent
(d). Sigmoid
(e). Binary and Bipolar
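For quick reference, the functions listed in Q.23 can be sketched as follows (slope/steepness parameters are taken as 1 wherever one would normally appear, and the thresholds default to 0):

```python
# The activation functions from Q.23, in their textbook forms.
import math

def identity(x):                 # (a) f(x) = x; output range (-inf, inf)
    return x

def relu(x):                     # (b) f(x) = max(0, x)
    return max(0.0, x)

def bipolar_sigmoid(x):          # (c) tanh-shaped; output range (-1, 1)
    return (1 - math.exp(-x)) / (1 + math.exp(-x))

def sigmoid(x):                  # (d) logistic; output range (0, 1)
    return 1 / (1 + math.exp(-x))

def binary_step(x, theta=0.0):   # (e) binary hard threshold: {0, 1}
    return 1 if x >= theta else 0

def bipolar_step(x, theta=0.0):  # (e) bipolar hard threshold: {-1, 1}
    return 1 if x >= theta else -1
```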
Chapter 3:

Q. 1. Write the differences between auto-associative and hetero-associative memory.
Q. 2. Draw the structure of Auto-associative memory network and explain its training and
testing.
Q. 3. Draw the structure of Hetero-associative memory network and explain its training and
testing.
Q. 4. Consider an Auto-associative net storing a vector, with weight matrix

        [  1  1  1 -1 ]
    w = [  1  1  1 -1 ]
        [  1  1  1 -1 ]
        [ -1 -1 -1  1 ]

Prove that the network recognizes the pattern (1 1 1 1) as a known vector.
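The recall step in Q.4 can be verified numerically. A minimal sketch (the weight matrix above is the Hebb outer product of the stored pattern s = (1, 1, 1, -1) with itself, diagonal included, and a bipolar hard threshold is assumed on the output):

```python
# Auto-associative recall for the weight matrix in Q.4.

s = [1, 1, 1, -1]
W = [[si * sj for sj in s] for si in s]  # outer product s^T s

def recall(x):
    # net_j = sum_i x_i * w_ij, then a bipolar hard threshold.
    net = [sum(x[i] * W[i][j] for i in range(4)) for j in range(4)]
    return [1 if v >= 0 else -1 for v in net]

print(recall([1, 1, 1, -1]))  # the stored pattern is a fixed point
print(recall([1, 1, 1, 1]))   # the test pattern is mapped onto it
```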
Q. 5. A Hetero-associative net, shown below, is trained using the Hebb rule.
Find the updated weight matrix of the network, if it has been trained on the input and
output vectors given below.

    Inputs              Outputs
    X1  X2  X3  X4      Y1  Y2
     1   0   0   0       1   0
     1   1   0   0       1   0
     0   0   0   1       0   1
     0   0   1   1       0   1
Q. 6. Train a Hetero-associative net to store the given bipolar input vectors s = (s1, s2, s3, s4)
to the output vectors t = (t1, t2). Also, test the network with the missing-entry vector
v1 = [1, 1, 0, 0] and the mistaken-entry test vector v2 = [-1, 1, 1, -1].

Q. 7. Explain the architecture and working of a Bidirectional Associative Memory (BAM)
network.
Q. 8. Explain the application of BAM network for character recognition.
Q. 9. Construct an Auto-associative discrete Hopfield network with input vector [1 1 1 -1].
Test the network with missing entries in the first and second components of the stored
vector.
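Q.9's recall can be checked with a short script. Assumptions: the Hebb weight matrix has a zeroed diagonal (as is standard for discrete Hopfield nets), a missing entry is coded as 0, and one asynchronous sweep updates the units in index order:

```python
# Discrete Hopfield recall for the stored pattern s = (1, 1, 1, -1).

s = [1, 1, 1, -1]
n = len(s)
# Hebb weights with zero self-connections: w_ij = s_i * s_j for i != j.
W = [[0 if i == j else s[i] * s[j] for j in range(n)] for i in range(n)]

def recall(x):
    y = list(x)
    for i in range(n):  # one asynchronous sweep, unit by unit
        net = sum(y[j] * W[j][i] for j in range(n))
        if net > 0:
            y[i] = 1
        elif net < 0:
            y[i] = -1   # net == 0 would leave y[i] unchanged
    return y

# Probe with the first two components missing (coded as 0):
print(recall([0, 0, 1, -1]))  # → [1, 1, 1, -1], the stored pattern
```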
Q. 10. Explain training and testing of Auto-associative discrete Hopfield network with a
suitable example.
