
RECURRENT NEURAL NETWORK

A Recurrent Neural Network (RNN) is a type of neural network in which the
output from the previous step is fed as input to the current step. In traditional
neural networks, all inputs and outputs are independent of each other, but this
assumption breaks down in tasks such as predicting the next word of a sentence,
where the earlier words are needed. RNNs solve this problem with the help of a
hidden layer. The main and most important feature of an RNN is its hidden state,
which remembers information about the sequence seen so far. This state is also
referred to as the memory state, since it retains information about the previous
inputs to the network.

ARCHITECTURE OF RNN:
RNNs have the same input and output architecture as any other deep neural
architecture. However, differences arise in the way information flows from
input to output. Unlike deep feed-forward networks, where each dense layer has
its own weight matrices, an RNN shares the same weights across the whole
network. It computes a hidden state ht for every input xt using the following
formulas:

ht = σ(U·xt + W·ht-1 + b)
yt = O(V·ht + c)

Hence yt = f(xt, ht-1, W, U, V, b, c)

Here the state matrix S has element st equal to the state of the network at
time step t. The parameters W, U, V, b, c are shared across time steps.
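A minimal sketch of these equations in NumPy is given below (the layer sizes, random initialisation, and the function name rnn_step are illustrative assumptions, not taken from the text); note that the same U, W, V, b, c are reused at every time step:

import numpy as np

# Illustrative sizes -- assumptions for this sketch only.
input_size, hidden_size, output_size = 4, 8, 3

rng = np.random.default_rng(0)
U = rng.normal(scale=0.1, size=(hidden_size, input_size))   # input-to-hidden weights
W = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # hidden-to-hidden weights
V = rng.normal(scale=0.1, size=(output_size, hidden_size))  # hidden-to-output weights
b = np.zeros(hidden_size)                                   # hidden bias
c = np.zeros(output_size)                                   # output bias

def rnn_step(xt, h_prev):
    # ht = sigma(U·xt + W·ht-1 + b), with tanh as sigma; yt = V·ht + c (linear output O).
    ht = np.tanh(U @ xt + W @ h_prev + b)
    yt = V @ ht + c
    return ht, yt

# The same parameters are shared across every step of a toy 5-step sequence.
h = np.zeros(hidden_size)
for xt in rng.normal(size=(5, input_size)):
    h, y = rnn_step(xt, h)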
WORKING OF RECURRENT NEURAL NETWORK:

The recurrent neural network consists of multiple units with a fixed activation
function, one for each time step. Each unit has an internal state called the
hidden state of the unit. This hidden state signifies the past knowledge that the
network holds at a given time step, and it is updated at every time step to
reflect the change in the network's knowledge about the past. The hidden state is
updated using the following recurrence relation.

The formula for calculating the current state:

ht = f(ht-1, xt)

where:
ht -> current state
ht-1 -> previous state
xt -> input at the current time step

The formula for applying the activation function (tanh):

ht = tanh(whh·ht-1 + wxh·xt)

where:

whh -> weight at the recurrent neuron

wxh -> weight at the input neuron
Backpropagation Through Time (BPTT):


In an RNN the computation is ordered: each hidden state is computed one at a
time, in a specified order (first h1, then h2, then h3, and so on). Backpropagation
is therefore applied through all of these hidden time states sequentially, working
backwards from the last time step to the first.
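A minimal BPTT sketch under simplifying assumptions (NumPy, a squared-error loss placed only on the final output, no biases; all names here are illustrative, not from the text): the forward pass stores every hidden state, and the backward pass walks the time steps in reverse, passing the gradient from ht back to ht-1.

import numpy as np

def bptt(xs, target, wxh, whh, why):
    # Forward pass: keep all hidden states, since the backward pass needs them.
    hs = [np.zeros(whh.shape[0])]
    for xt in xs:
        hs.append(np.tanh(wxh @ xt + whh @ hs[-1]))
    y = why @ hs[-1]
    loss = 0.5 * np.sum((y - target) ** 2)

    # Backward pass: from the last time step back to the first.
    dwxh, dwhh = np.zeros_like(wxh), np.zeros_like(whh)
    dwhy = np.outer(y - target, hs[-1])
    dh = why.T @ (y - target)               # dL/dh at the final step
    for t in reversed(range(len(xs))):
        dz = dh * (1.0 - hs[t + 1] ** 2)    # back through tanh
        dwxh += np.outer(dz, xs[t])
        dwhh += np.outer(dz, hs[t])
        dh = whh.T @ dz                     # pass the gradient to the previous hidden state
    return loss, dwxh, dwhh, dwhy

rng = np.random.default_rng(0)
wxh = rng.normal(scale=0.1, size=(8, 4))
whh = rng.normal(scale=0.1, size=(8, 8))
why = rng.normal(scale=0.1, size=(3, 8))
loss, dwxh, dwhh, dwhy = bptt(rng.normal(size=(6, 4)), np.zeros(3), wxh, whh, why)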

Types of RNN:

1. One-to-One RNN:

A one-to-one RNN has the structure of a vanilla neural network: a single input
is mapped to a single output. It is used to solve general machine learning
problems that have only one input and one output.
Example: classification of images.

2. One-to-Many RNN:
A one-to-many RNN maps a single input to several outputs.
Example: image captioning, where an image is fed in and the network generates a
sentence of words.
3. Many-to-One RNN:

This RNN produces a single output from a given series of inputs.
Example: sentiment analysis, in which a text is classified as expressing positive
or negative feelings.
4. Many-to-Many RNN:

This RNN receives a sequence of inputs and produces a sequence of outputs.
Example: machine translation, in which the RNN reads English text and converts
it to French.
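The four patterns above differ only in how many inputs are fed in and how many of the per-step outputs are kept. A rough sketch of two of them, reusing the rnn_step function and toy sizes assumed in the architecture sketch (the variable sequence is a hypothetical stand-in for real data):

sequence = rng.normal(size=(5, input_size))   # toy stand-in for a real input sequence

# Many-to-one (e.g. sentiment analysis): feed the whole sequence, keep only the last output.
h = np.zeros(hidden_size)
for xt in sequence:
    h, y = rnn_step(xt, h)
sentiment_score = y                           # one output for the whole sequence

# Many-to-many: keep the output produced at every time step (one output per input).
h, outputs = np.zeros(hidden_size), []
for xt in sequence:
    h, y = rnn_step(xt, h)
    outputs.append(y)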
Applications of Recurrent Neural Networks:

• Prediction problems.
• Machine Translation.
• Speech Recognition.
• Language Modelling and Generating Text.
• Video Tagging.
• Generating Image Descriptions.
• Text Summarization.
• Call Center Analysis.

Machine Translation:
RNNs can be used to build a deep learning model that translates text from one
language to another without the need for human intervention. You can, for
example, translate a text from your native language to English.
Text Creation:
RNNs can also be used to build a deep learning model for text generation. Based
on the previous sequence of words or characters in the text, a trained model
learns the likelihood of occurrence of the next word or character.
Captioning of images:
The process of creating text that describes the content of an image is known as
image captioning. The caption can describe the objects in the image as well as
the actions they are performing.
Recognition of Speech:
This is also known as Automatic Speech Recognition (ASR), and it converts human
speech into written text. Speech recognition should not be confused with voice
recognition, which identifies the speaker.
Forecasting of Time Series:
After being trained on historical time-stamped data, an RNN can be used to build
a time-series model that predicts future values. Stock-market forecasting is a
good example.

Advantages of Recurrent Neural Network:

1. An RNN remembers information through time. This ability to take previous
inputs into account is what makes it useful for time-series prediction.
(Extending this memory over long ranges is the idea behind Long Short-Term
Memory networks.)
2. Recurrent neural networks are even used with convolutional layers to
extend the effective pixel neighborhood.

Disadvantages of Recurrent Neural Network:

1. Gradient vanishing and exploding problems (a small numerical sketch of this
effect follows this list).
2. Training an RNN is a very difficult task.
3. It cannot process very long sequences when tanh or ReLU is used as the
activation function.
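To illustrate point 1 numerically (a toy NumPy sketch, with sizes and weight scales chosen only for demonstration): the gradient reaching the early time steps is a product of one per-step Jacobian per time step, so it either shrinks towards zero or blows up as the sequence gets longer.

import numpy as np

rng = np.random.default_rng(1)
hidden_size, steps = 16, 50

for scale in (0.05, 0.5):                     # small vs. large recurrent weights
    W = rng.normal(scale=scale, size=(hidden_size, hidden_size))
    grad = np.eye(hidden_size)
    for _ in range(steps):
        # Each extra time step multiplies the gradient by another Jacobian
        # (approximated here as W transposed, ignoring the tanh derivative).
        grad = W.T @ grad
    print(scale, np.linalg.norm(grad))        # vanishes for 0.05, explodes for 0.5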

Conclusion:

RNNs allow us to model a sequence, or chain, of vectors. These sequences can
appear as the input, the output, or both. Whenever you have data of a sequential
nature, recurrent neural networks are a natural choice.
