The document is an examination paper for the B.Tech. 7th semester course on Neural Networks and Deep Learning, dated December 2022. It consists of two parts: Part-A requires short answers to compulsory questions, while Part-B includes detailed answers to selected questions. Topics covered include perceptrons, associative memory, activation functions, deep learning, and various neural network architectures.

Sr. No 011703
December 2022
B.Tech. 7th Sem., December 2022
Neural Networks and Deep Learning (PEC-CSD-703)
Time: 3 Hours                                              Max. Marks: 75
Instructions: 1. It is compulsory to answer all the questions (1.5 marks each) of Part-A in short.
2. Answer any four questions from Part-B in detail.
3. Different sub-parts of a question are to be attempted adjacent to each other.

PART-A
Q1 (a) Why can a single perceptron not be used for XOR-type classification? (1.5)
(b) What is the condition for perfect recall in an associative memory? (1.5)
(c) Name any three functions used as signal (activation) functions by artificial neural networks and draw their graphs. (1.5)
(d) What is a radial basis function? Why is the Gaussian function chosen as the default radial basis function? (1.5)
(e) What is a support vector? (1.5)
(f) Define Hebbian learning. (1.5)
(g) Differentiate between classification and regression. (1.5)
(h) Define clustering. (1.5)
(i) Define deep learning. (1.5)
(j) What is least mean square error? Where is it used? (1.5)

PART-B
Q2 (a) Explain how a single perceptron can be used as an AND-type classifier. Also write the algorithm. (7)
(b) Explain the architecture of a Multilayer Perceptron model. How can it be trained using a feedback mechanism? (8)

Q3 (a) Explain the architecture of an Adaptive Resonance Theory network. How does it overcome the stability-plasticity dilemma? (7)
(b) Differentiate between autoassociation and heteroassociation. (3)
(c) Encode the following vectors in an autoassociative memory and recall one of them. (5)
[-1 1 -1 1], [1 1 1 -1], [-1 -1 -1 1]
Q4 (a) Explain the following processes in reference to self-organizing maps: competition, cooperation, and adaptation. (7)
(b) Describe three to four applications for which a neural network is the best tool. (4)
(c) What is the gradient descent method? How is it used for error reduction and weight optimization? (4)
Q5 (a) What is principal component analysis? How is it used for dimensionality reduction? (5)
(b) Write a short note on vector quantization and its applicability in self-organizing maps. (7)
(c) State and briefly describe Cover's theorem. (3)

Q6 (a) What is a support vector machine? How does it help in obtaining a better decision plane? (8)
(b) Describe the architecture and working of the Boltzmann machine. (7)

Q7 Write short notes on the following: (15)
I. Reinforcement Learning
II. Supervised and Unsupervised Learning
III. Radial Basis Networks
