
Question Bank New

The document discusses various concepts related to convolutional neural networks and recurrent neural networks. It provides 20 questions on topics such as convolution operations, pooling, parameter sharing, convolutional network architectures, transfer learning in deep learning, recurrent neural network architectures like LSTM, and other gated RNNs.

Uploaded by savitaannu07

DEEP LEARNING

UNIT - III: CONVOLUTIONAL NETWORKS


PART – A

1 Write an example function for the convolution operation and explain it in detail.
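As a companion sketch for this question — a minimal NumPy implementation of a valid 2-D convolution (implemented as cross-correlation, as deep learning libraries typically do). The function name `conv2d` and the example kernel are illustrative assumptions, not part of any prescribed answer:

```python
import numpy as np

def conv2d(x, k):
    """Valid 2-D cross-correlation of input x with kernel k."""
    H, W = x.shape
    kh, kw = k.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Each output element is the sum of an input patch
            # weighted elementwise by the kernel.
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

x = np.arange(16, dtype=float).reshape(4, 4)
k = np.array([[1.0, 0.0], [0.0, -1.0]])  # simple difference kernel
y = conv2d(x, k)
print(y.shape)  # (3, 3)
```

A 2x2 kernel on a 4x4 input yields a 3x3 output, illustrating how the valid convolution shrinks the feature map by (kernel size - 1) in each dimension.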

2 Explain the following with a suitable diagram: (i) Sparse interactions (ii) Parameter sharing

3 Describe pooling with a suitable example.
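A minimal sketch of the pooling operation this question asks about — 2x2 max pooling with stride 2, which also illustrates pooling with downsampling; the helper name `max_pool` is an illustrative assumption:

```python
import numpy as np

def max_pool(x, size=2, stride=2):
    """Max pooling: each output element is the maximum of a window."""
    H, W = x.shape
    oh = (H - size) // stride + 1
    ow = (W - size) // stride + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = x[i * stride:i * stride + size,
                          j * stride:j * stride + size].max()
    return out

x = np.arange(16, dtype=float).reshape(4, 4)
p = max_pool(x)
print(p)  # [[ 5.  7.] [13. 15.]]
```

With stride equal to the window size, the spatial resolution is halved, which is the downsampling effect pooling is typically used for.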

4 Write an expression for unshared convolution with an explanation, and explain tiled convolution.
5 Discuss in detail the variants of the Basic Convolution Function.

6 Construct an architecture that shows complex-layer terminology and simple-layer terminology in a convolutional neural network.
7 Discuss local connections, convolution, and full connections with a diagram.

8 Develop a table with examples of different formats of data that can be used with convolutional networks.

9 Describe the following in detail.
i. Parameter sharing.
ii. Equivariant representations.
10 Differentiate locally connected layers, tiled convolution, and standard convolution with suitable examples and diagrams.
11 i. Write short notes on max pooling.
ii. Explain pooling with downsampling.
12 Explain random or Unsupervised Features.

13 Illustrate unshared convolution with suitable examples.

14 i. Show three properties of V1 that a convolutional network layer is designed to capture.
ii. Prove the working of learned invariances with a necessary example and diagram.

15 Construct a convolutional network to demonstrate the effect of zero padding on network size.


16 Explain the neuroscientific basis for convolutional networks.

UNIT - IV
1 Compare echo state network and liquid state machines.

2 Distinguish content-based addressing and location-based addressing in memory networks.
3 Classify the different strategies for Multiple Time Scales.

4 Develop a block diagram for LSTM.

5 Illustrate the block diagram of an LSTM recurrent network “cell”.
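Alongside the block-diagram questions above, a minimal NumPy sketch of one forward step through an LSTM cell may help; the parameter layout (gates stacked into one weight matrix) and all names here are illustrative assumptions, not a prescribed answer:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM step; W, U, b stack the four gate parameter blocks
    (input, forget, output, candidate), each of hidden size n."""
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b       # pre-activations, shape (4n,)
    i = sigmoid(z[0:n])              # input gate
    f = sigmoid(z[n:2 * n])          # forget gate
    o = sigmoid(z[2 * n:3 * n])      # output gate
    g = np.tanh(z[3 * n:4 * n])      # candidate cell state
    c = f * c_prev + i * g           # cell state: gated self-loop
    h = o * np.tanh(c)               # hidden state output
    return h, c

rng = np.random.default_rng(0)
n, d = 3, 2
W = rng.normal(size=(4 * n, d))
U = rng.normal(size=(4 * n, n))
b = np.zeros(4 * n)
h, c = lstm_step(rng.normal(size=d), np.zeros(n), np.zeros(n), W, U, b)
print(h.shape, c.shape)  # (3,) (3,)
```

The line `c = f * c_prev + i * g` is the gated self-loop that the LSTM “cell” diagram is built around: the forget gate controls how much old state is kept and the input gate controls how much new candidate is written.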

6 Explain the concept of transfer learning in the context of deep learning. Provide examples to illustrate your points.

7 Describe how transfer learning can be beneficial for training deep neural networks, and discuss some common scenarios where it can be applied.

8 i. Describe Unfolding Computational Graphs.
ii. Explain Bidirectional RNNs.

9 Describe the following.
i. Teacher Forcing in Recurrent Neural Networks.
ii. Networks with Output Recurrence.

10 i. Describe Echo State Networks.
ii. Explain the challenge of Long-Term Dependencies.
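The long-term dependency challenge can be shown numerically — a minimal sketch (the specific values are illustrative assumptions) of how a gradient propagated back through many recurrent steps shrinks exponentially when each step multiplies it by a factor of magnitude below one:

```python
import numpy as np

# Backpropagating through h_t = tanh(w * h_{t-1} + ...) multiplies the
# gradient by w * tanh'(a) at every step; if |w * tanh'(a)| < 1, the
# gradient vanishes exponentially with sequence length.
w = 0.9
grad = 1.0
for t in range(50):
    grad *= w * (1.0 - np.tanh(0.5) ** 2)  # tanh'(a) = 1 - tanh(a)^2

print(grad)  # on the order of 1e-8 after 50 steps
```

This is why architectures with gated self-loops (LSTM, GRU) or leaky units are introduced in the questions that follow: they keep a path through time whose effective multiplier can stay near one.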

11 Discuss Recurrent Neural Networks in detail.

12 Describe Deep Recurrent Networks in detail.

13 Illustrate Encoder-Decoder sequence-to-sequence Architecture.

14 Explain Leaky Units and Other Strategies for Multiple Time Scales.

15 Describe the following.
i. Long Short-Term Memory.
ii. Other Gated RNNs.


16 Develop an example of Unfolding Computational Graphs and describe the major advantages of the unfolding process.
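A minimal sketch of what unfolding means in code — the recurrence is expanded into one explicit step per time index, producing a finite acyclic graph with shared parameters; the function name `rnn_unfold` and the random example are illustrative assumptions:

```python
import numpy as np

def rnn_unfold(xs, h0, Wxh, Whh, b):
    """Unfold h_t = tanh(Wxh @ x_t + Whh @ h_{t-1} + b) over a sequence.
    The same parameters (Wxh, Whh, b) are reused at every step."""
    hs = []
    h = h0
    for x in xs:                      # one graph node per time step
        h = np.tanh(Wxh @ x + Whh @ h + b)
        hs.append(h)
    return hs

rng = np.random.default_rng(1)
n, d, T = 3, 2, 4
Wxh = rng.normal(size=(n, d))
Whh = rng.normal(size=(n, n))
b = np.zeros(n)
hs = rnn_unfold([rng.normal(size=d) for _ in range(T)],
                np.zeros(n), Wxh, Whh, b)
print(len(hs))  # 4
```

The advantages the question asks about fall out of this picture: the unfolded graph has a fixed set of parameters regardless of sequence length, and ordinary backpropagation on the unrolled graph yields the gradient.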
17 Explain how to compute the gradient in a Recurrent Neural Network.
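For the gradient question, a minimal back-propagation-through-time sketch on a scalar RNN, checked against a finite difference; the model (h_t = tanh(w·h_{t-1} + x_t), squared loss on the final state) is an illustrative assumption chosen to keep the example small:

```python
import numpy as np

def bptt(xs, y, w, h0=0.0):
    """Forward pass, then backpropagation through time for dL/dw,
    with L = 0.5 * (h_T - y)^2."""
    hs = [h0]
    for x in xs:                      # forward: unroll the recurrence
        hs.append(np.tanh(w * hs[-1] + x))
    loss = 0.5 * (hs[-1] - y) ** 2

    dh = hs[-1] - y                   # dL/dh_T
    dw = 0.0
    for t in range(len(xs), 0, -1):   # backward through time
        da = dh * (1.0 - hs[t] ** 2)  # through tanh: tanh' = 1 - tanh^2
        dw += da * hs[t - 1]          # w is shared across all steps
        dh = da * w                   # propagate to the previous state
    return loss, dw

xs, y, w = [0.5, -0.3, 0.8], 0.2, 0.7
loss, dw = bptt(xs, y, w)

# Sanity check against a central finite difference.
eps = 1e-6
lp, _ = bptt(xs, y, w + eps)
lm, _ = bptt(xs, y, w - eps)
num = (lp - lm) / (2 * eps)
print(abs(dw - num) < 1e-6)  # True
```

The key point for the answer is the accumulation `dw += ...` inside the backward loop: because the weight is shared across time steps, its gradient is a sum of per-step contributions.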

18 Explain modeling sequences conditioned on context with RNNs.

19 Prepare an example of an Encoder-Decoder or sequence-to-sequence RNN architecture.

20 Explain various Gated RNNs.
