QB3RDIA
SEMESTER – VII
1 Evaluate the architecture of LeNet and its significance in shaping the evolution of convolutional
neural networks (CNNs).
2 Design the architecture of a deep recurrent network and determine how it improves upon traditional RNNs for handling complex sequential data.
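For reference, a minimal sketch of a deep (stacked) recurrent network, assuming TensorFlow/Keras is available; the layer sizes, two-layer depth, and dummy data are illustrative choices only, not prescribed by the question.

# Illustrative only: a two-layer (deep) recurrent network in Keras.
# Stacking recurrent layers lets the upper layer operate on the hidden-state
# sequence produced by the lower layer, capturing longer-range structure.
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(None, 16)),                # variable-length sequences of 16-dim vectors
    tf.keras.layers.SimpleRNN(64, return_sequences=True),   # lower recurrent layer emits the full sequence
    tf.keras.layers.SimpleRNN(32),                           # upper recurrent layer consumes that sequence
    tf.keras.layers.Dense(1)                                 # task-specific output head
])
model.compile(optimizer="adam", loss="mse")

# Dummy data just to show the expected shapes: (batch, time, features).
x = np.random.rand(8, 20, 16).astype("float32")
y = np.random.rand(8, 1).astype("float32")
model.fit(x, y, epochs=1, verbose=0)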
3 Evaluate the impact of efficient convolution algorithms on the design and training of CNN
architectures, such as AlexNet.
4 Design a natural language processing pipeline that integrates large-scale deep learning models for
tasks such as machine translation or sentiment analysis.
5 Evaluate the contributions of LeNet and AlexNet to the development of modern deep learning systems. Interpret the roles of LeNet and AlexNet as milestones in the development of CNN research.
6 Design a deep learning framework for large-scale image processing tasks and determine its impact on computer vision applications.
7 Explain the key differences between an RNN and a feedforward neural network. Why are RNNs well-suited for sequential data?
8 Design strategies to address vanishing gradient issues in deep recurrent networks and discuss how
techniques like LSTMs or GRUs contribute to effective learning.
9 Design an RNN-based solution for real-time stock price prediction using a given dataset. Outline the
key steps involved, including data preprocessing, model architecture, training, and evaluation.
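For reference, a compressed sketch of the pipeline this question asks for (preprocessing, model architecture, training, evaluation), assuming Keras; the synthetic price series, the window length of 30, and the single-LSTM model are illustrative placeholders only.

# Illustrative sketch of an RNN pipeline for next-step price prediction.
import numpy as np
import tensorflow as tf

# 1) Preprocessing: scale the series and slice it into fixed-length windows.
prices = np.sin(np.linspace(0, 50, 500)).astype("float32")   # stand-in for real closing prices
prices = (prices - prices.min()) / (prices.max() - prices.min())
window = 30
X = np.stack([prices[i:i + window] for i in range(len(prices) - window)])[..., None]
y = prices[window:]

# 2) Model architecture: a single LSTM layer followed by a regression head.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(window, 1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1)
])
model.compile(optimizer="adam", loss="mse")

# 3) Training and 4) evaluation on a chronological hold-out split.
split = int(0.8 * len(X))
model.fit(X[:split], y[:split], epochs=5, verbose=0)
print("test MSE:", model.evaluate(X[split:], y[split:], verbose=0))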
10 Develop an understanding of the Long Short-Term Memory (LSTM) network architecture. Illustrate the effectiveness of LSTMs in addressing the vanishing gradient problem and describe the roles of the different gates in the LSTM.
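For reference, a minimal NumPy sketch of one LSTM time step showing the forget, input, and output gates and the additive cell-state update; the dimensions and the helper name lstm_step are illustrative assumptions.

# Illustrative NumPy sketch of a single LSTM time step.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    """W, U, b each hold parameters for the forget, input, output gates
    and the candidate cell update, stacked in that order."""
    Wf, Wi, Wo, Wc = W
    Uf, Ui, Uo, Uc = U
    bf, bi, bo, bc = b
    f_t = sigmoid(Wf @ x_t + Uf @ h_prev + bf)      # forget gate: what to erase from c_prev
    i_t = sigmoid(Wi @ x_t + Ui @ h_prev + bi)      # input gate: how much new content to write
    o_t = sigmoid(Wo @ x_t + Uo @ h_prev + bo)      # output gate: how much of the cell to expose
    c_hat = np.tanh(Wc @ x_t + Uc @ h_prev + bc)    # candidate cell content
    c_t = f_t * c_prev + i_t * c_hat                # additive cell update: a near-linear gradient path
    h_t = o_t * np.tanh(c_t)                        # hidden state passed to the next step
    return h_t, c_t

# Tiny usage example with 3-dim inputs and a 4-dim hidden state.
rng = np.random.default_rng(0)
W = [rng.standard_normal((4, 3)) for _ in range(4)]
U = [rng.standard_normal((4, 4)) for _ in range(4)]
b = [np.zeros(4) for _ in range(4)]
h, c = lstm_step(rng.standard_normal(3), np.zeros(4), np.zeros(4), W, U, b)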
11 Design an RNN-based approach for natural language processing tasks such as machine translation or
text generation. Illustrate the key components and steps involved, providing a relevant example.
12 Develop an understanding of Bidirectional RNNs (BRNNs). Discuss their advantages in processing sequential data and provide examples of how they are applied in natural language processing (NLP) tasks.
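For reference, a minimal sketch of a bidirectional LSTM text classifier (e.g. for sentiment), assuming Keras; the vocabulary size, embedding size, and single-output head are illustrative assumptions.

# Illustrative sketch of a Bidirectional LSTM classifier in Keras.
# The forward and backward passes read the sequence in opposite directions,
# so each position sees both left and right context.
import tensorflow as tf

vocab_size, embed_dim = 10000, 64   # placeholder vocabulary and embedding sizes
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(None,), dtype="int32"),
    tf.keras.layers.Embedding(vocab_size, embed_dim),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(32)),  # concatenates both directions
    tf.keras.layers.Dense(1, activation="sigmoid")            # e.g. positive/negative sentiment
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()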
13 Explain the concept of deep recurrent networks. How do they differ from standard RNNs, and what challenges arise when training them?
14 Develop an in-depth explanation of the concept of unfolding computational graphs in recurrent neural networks (RNNs), and demonstrate how this technique facilitates the processing of sequential data.
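For reference, a minimal NumPy sketch of unfolding: the same weights are applied at every step of the recurrence h_t = tanh(Wxh x_t + Whh h_{t-1} + b), so the loop below is the unrolled computational graph; all sizes are illustrative.

# Illustrative NumPy sketch of unfolding an RNN over a length-T sequence.
import numpy as np

rng = np.random.default_rng(0)
input_dim, hidden_dim, T = 3, 5, 4
Wxh = rng.standard_normal((hidden_dim, input_dim)) * 0.1
Whh = rng.standard_normal((hidden_dim, hidden_dim)) * 0.1
b = np.zeros(hidden_dim)

xs = rng.standard_normal((T, input_dim))   # a length-T input sequence
h = np.zeros(hidden_dim)                   # initial hidden state h_0
states = []
for x_t in xs:                             # the loop *is* the unfolded graph
    h = np.tanh(Wxh @ x_t + Whh @ h + b)   # the same shared weights are applied at every step
    states.append(h)

print(np.stack(states).shape)              # (T, hidden_dim): one hidden state per time step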
15 Design a comprehensive explanation of the architecture of a standard Recurrent Neural Network (RNN) and highlight its differences from traditional feedforward neural networks in processing sequential data.
16 Develop a comprehensive explanation of the working of recursive neural networks, emphasizing their
application in structured data tasks such as sentiment analysis and tree parsing.
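For reference, a minimal NumPy sketch of a recursive neural network applied bottom-up over a hypothetical binary parse of "the movie was great"; the shared composition weights, dimensions, and tree are illustrative assumptions.

# Illustrative NumPy sketch of a recursive neural network: one shared composition
# function is applied bottom-up over a binary parse tree, so each phrase vector
# is built from the vectors of its children.
import numpy as np

dim = 4
rng = np.random.default_rng(0)
W = rng.standard_normal((dim, 2 * dim)) * 0.1   # shared composition weights
b = np.zeros(dim)
embeddings = {w: rng.standard_normal(dim) for w in ["the", "movie", "was", "great"]}

def encode(node):
    """A node is either a word (leaf) or a (left, right) pair of subtrees."""
    if isinstance(node, str):
        return embeddings[node]
    left, right = node
    child_vec = np.concatenate([encode(left), encode(right)])
    return np.tanh(W @ child_vec + b)            # same weights at every internal node

# Hypothetical parse of "the movie was great": ((the movie) (was great))
root_vec = encode((("the", "movie"), ("was", "great")))
print(root_vec)   # phrase vector that a sentiment classifier could consume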
17 Given a text sentiment analysis task, explain how a Bidirectional RNN can be implemented to improve accuracy.
18 Design a Recursive Neural Network for parsing the syntactic structure of a sentence. Explain how the tree structure influences the forward and backward passes.
20 Illustrate the role of Encoder-Decoder architectures in sequence-to-sequence tasks. How does this architecture handle variable-length input and output sequences?
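For reference, a minimal Keras sketch of an encoder-decoder: the encoder compresses a variable-length source sequence into its final states, and the decoder is initialised with those states to generate a target sequence of a different length; the vocabulary and latent sizes are illustrative assumptions.

# Illustrative Keras sketch of an encoder-decoder for sequence-to-sequence learning.
import tensorflow as tf

src_vocab, tgt_vocab, latent = 5000, 6000, 128   # placeholder sizes

# Encoder: only the final hidden and cell states are kept.
enc_in = tf.keras.Input(shape=(None,), dtype="int32")
enc_emb = tf.keras.layers.Embedding(src_vocab, 64)(enc_in)
_, state_h, state_c = tf.keras.layers.LSTM(latent, return_state=True)(enc_emb)

# Decoder: starts from the encoder states and predicts one target token per step.
dec_in = tf.keras.Input(shape=(None,), dtype="int32")
dec_emb = tf.keras.layers.Embedding(tgt_vocab, 64)(dec_in)
dec_out, _, _ = tf.keras.layers.LSTM(latent, return_sequences=True,
                                     return_state=True)(dec_emb, initial_state=[state_h, state_c])
logits = tf.keras.layers.Dense(tgt_vocab, activation="softmax")(dec_out)

model = tf.keras.Model([enc_in, dec_in], logits)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()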
21 Design a pipeline for a speech recognition system using RNNs or LSTMs, and explain the key steps.
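For reference, a minimal Keras sketch of the acoustic-model stage of such a pipeline; the feature and character counts are illustrative assumptions, and a real system would add a CTC or attention-based objective to align audio frames with transcripts.

# Illustrative sketch: framed audio features (e.g. log-mel energies) are read by a
# bidirectional LSTM, and a softmax predicts a character distribution per frame.
import tensorflow as tf

n_features, n_chars = 40, 29   # placeholder: 40 filterbank features, 26 letters plus space/apostrophe/blank
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(None, n_features)),                              # (time, features)
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(128, return_sequences=True)),
    tf.keras.layers.Dense(n_chars, activation="softmax")                          # per-frame character probabilities
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()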
22 What are random or unsupervised features in the context of convolutional neural networks? Explain their advantages and limitations in feature representation.