09-Neural Networks

The document is a JNTU Hyderabad M.Tech examination paper on neural networks (parallel computing). It poses eight questions on concepts such as the McCulloch and Pitts neuron model, correlation matrix memory, learning techniques, perceptrons, backpropagation, vector quantization with SOM and its learning algorithm, and Hopfield networks and their stable states, plus a choice question covering the travelling salesperson problem, the Kohonen model of SOM, and the stability of dynamic systems.


Code No: D109117404 R09


JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY HYDERABAD
M.Tech I Semester Regular Examinations March/April 2010
NEURAL NETWORKS
(PARALLEL COMPUTING)
Time: 3 hours                                   Max. Marks: 60
Answer any five questions
All questions carry equal marks
---
1. Mention the characteristics of the McCulloch and Pitts model of an artificial neuron.
Show that linearly separable problems can be represented by a perceptron.
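For illustration (not part of the paper), a McCulloch–Pitts unit is simply a hard threshold on a weighted sum of binary inputs. A minimal Python sketch showing such a unit realising the linearly separable AND function:

```python
def mcculloch_pitts(inputs, weights, threshold):
    """Fires (returns 1) iff the weighted sum of binary inputs reaches the threshold."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0

# AND is linearly separable, so one threshold unit suffices:
for a in (0, 1):
    for b in (0, 1):
        assert mcculloch_pitts((a, b), (1, 1), 2) == (a and b)
```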

2. Consider the following orthonormal key patterns applied to a correlation matrix memory:
X1 = [1,0,0,0]^T, X2 = [0,1,0,0]^T, X3 = [0,0,1,0]^T
The respective output patterns are
Y1 = [5,1,0]^T, Y2 = [-2,1,6]^T, Y3 = [-2,4,3]^T
i) Calculate the weight matrix.
ii) Show that the memory associates perfectly.
iii) Obtain the output for the input X = [0.8, -0.15, 0.15, -2.0]^T.
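The requested computation can be sketched in NumPy (an illustrative aid, not part of the paper): the memory is the sum of outer products W = Σ_k Y_k X_k^T, and orthonormal keys give perfect recall.

```python
import numpy as np

# Orthonormal key patterns (rows X1..X3) and their output patterns (rows Y1..Y3).
X = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0],
              [0, 0, 1, 0]], dtype=float)
Y = np.array([[ 5, 1, 0],
              [-2, 1, 6],
              [-2, 4, 3]], dtype=float)

# i) Weight matrix as a sum of outer products: W = sum_k Yk Xk^T
W = sum(np.outer(Y[k], X[k]) for k in range(3))

# ii) Perfect association: W Xk = Yk because the keys are orthonormal.
for k in range(3):
    assert np.allclose(W @ X[k], Y[k])

# iii) Response to the probe vector.
x = np.array([0.8, -0.15, 0.15, -2.0])
print(W @ x)  # ≈ [4.0, 1.25, -0.45]
```

Since W's fourth column is zero (no key has a fourth component), the probe's -2.0 entry contributes nothing, and the response is the key-weighted mixture 0.8·Y1 - 0.15·Y2 + 0.15·Y3.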

3. What is learning? Write any FOUR learning techniques and in each case give
the expression for weight updating.
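As an illustrative aid (the choice of rules and the learning rate η = 0.1 are assumptions, not part of the paper), four classical weight-update expressions can be sketched as:

```python
import numpy as np

eta = 0.1  # learning rate (illustrative value)

def hebbian(w, x, y):
    """Hebbian rule: delta_w = eta * y * x"""
    return w + eta * y * x

def perceptron_rule(w, x, d):
    """Perceptron rule: delta_w = eta * (d - y) * x, with y = sign(w . x)"""
    y = np.sign(w @ x)
    return w + eta * (d - y) * x

def delta_lms(w, x, d):
    """Delta (Widrow-Hoff / LMS) rule: delta_w = eta * (d - w . x) * x"""
    return w + eta * (d - w @ x) * x

def competitive(w, x):
    """Competitive rule (winning unit only): delta_w = eta * (x - w)"""
    return w + eta * (x - w)
```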

4. a) Show that the perceptron cannot resolve problems which are not linearly
separable.
b) Explain the Bayes classifier in a Gaussian environment.
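A brute-force sketch supporting part a) (illustrative only, not a formal proof): over a grid of candidate weights and thresholds, no single linear threshold unit reproduces XOR, the standard non-linearly-separable example.

```python
import itertools

# XOR truth table: not linearly separable, so no single threshold unit fits it.
xor = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

grid = [v / 2 for v in range(-8, 9)]  # candidate values in [-4, 4], step 0.5
found = False
for w1, w2, t in itertools.product(grid, repeat=3):
    if all((1 if w1 * a + w2 * b >= t else 0) == y
           for (a, b), y in xor.items()):
        found = True
print(found)  # False: no linear threshold unit computes XOR
```

A two-layer network (e.g. one hidden layer of two threshold units) does solve XOR, which is the usual resolution of this limitation.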

5. Derive the expressions for backpropagation training for updating the weights of
the output-layer neurons and the hidden-layer neurons.
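A minimal NumPy sketch of the resulting update rules for a one-hidden-layer sigmoid network (layer sizes, the seed, and η are illustrative assumptions):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
x = rng.random(3)          # input vector
d = np.array([1.0])        # target output
W1 = rng.random((4, 3))    # input  -> hidden weights
W2 = rng.random((1, 4))    # hidden -> output weights
eta = 0.5                  # learning rate

# Forward pass
h = sigmoid(W1 @ x)
y = sigmoid(W2 @ h)

# Output-layer local gradient: delta_o = (d - y) * y * (1 - y)
delta_o = (d - y) * y * (1 - y)
# Hidden-layer local gradient: delta_h = h * (1 - h) * (W2^T delta_o)
delta_h = h * (1 - h) * (W2.T @ delta_o)

# Weight updates: delta_W = eta * (local gradient) (presynaptic activation)^T
W2 += eta * np.outer(delta_o, h)
W1 += eta * np.outer(delta_h, x)
```

One such step moves the output toward the target; repeating it over a training set is the full algorithm.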

6. What is vector quantization? Explain the SOM and its learning algorithm.
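A toy sketch of the SOM learning loop (a 1-D map quantizing 2-D data; all sizes, seeds, and decay schedules are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# A line of 10 units quantizing 2-D data drawn from the unit square.
n_units, dim = 10, 2
weights = rng.random((n_units, dim))
data = rng.random((500, dim))

eta0, sigma0, T = 0.5, 3.0, len(data)
for t, x in enumerate(data):
    eta = eta0 * (1 - t / T)                    # decaying learning rate
    sigma = max(sigma0 * (1 - t / T), 0.5)      # shrinking neighbourhood width
    # Competition: the best-matching unit wins.
    winner = int(np.argmin(np.linalg.norm(weights - x, axis=1)))
    # Cooperation + adaptation: neighbours move toward x, weighted by distance
    # to the winner on the map lattice.
    for j in range(n_units):
        h = np.exp(-((j - winner) ** 2) / (2 * sigma ** 2))
        weights[j] += eta * h * (x - weights[j])
```

Each update is a convex combination of the old weight and the input, so the codebook vectors stay inside the data's range while spreading along it.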

7. Obtain the weight matrix of a symmetric Hopfield network with FOUR neurons
that stores the TWO memory states
ξ1 = (-1,-1,-1,+1), ξ2 = (+1,+1,+1,+1).

Obtain the stable states for the inputs (-1,-1,-1,+1) and (+1,+1,+1,+1).
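The requested weight matrix and stability check can be sketched in NumPy (illustrative; the unnormalized Hebbian outer-product rule with zero self-connections is assumed):

```python
import numpy as np

xi1 = np.array([-1, -1, -1, +1])
xi2 = np.array([+1, +1, +1, +1])

# Hebbian storage: W = sum of outer products of the stored patterns,
# with self-connections removed.
W = np.outer(xi1, xi1) + np.outer(xi2, xi2)
np.fill_diagonal(W, 0)
# W = [[0,2,2,0], [2,0,2,0], [2,2,0,0], [0,0,0,0]]

def update(state, W, steps=5):
    """Synchronous sign update; a zero local field leaves the unit unchanged."""
    s = state.copy()
    for _ in range(steps):
        h = W @ s
        s = np.where(h > 0, 1, np.where(h < 0, -1, s))
    return s

# Both stored patterns are fixed points of the dynamics, i.e. stable states.
for xi in (xi1, xi2):
    assert np.array_equal(update(xi, W), xi)
```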

8. Answer any TWO of the following:
i) Travelling salesperson problem using the Hopfield network
ii) Kohonen model of SOM
iii) Stability of dynamic systems
*********
