Unit 3 Chapter 1 RNN
Chapter 1
Sequence Modeling: Recurrent and Recursive Nets
Unfolding Computational Graphs
• In deep learning, unfolding a computational graph is the
process of turning a recursive or recurrent computation into
a repetitive chain of operations, with the same parameters
shared at every step of the resulting deep structure.
• Computational graphs are a way to represent
mathematical operations that machines use to learn from
data.
• They are similar to flowcharts, where each node represents
an operation and the lines between nodes show how the
results flow from one step to the next.
• A computational graph is a way to formalize the
structure of a set of computations, such as
those involved in mapping inputs and
parameters to outputs and loss.
• The key idea behind unfolding computational graphs is
the sharing of parameters across a deep network
structure.
Consider a recurrent definition of a system's state,
s(t) = f(s(t-1); θ).
Unfolding this equation by repeatedly applying the definition
yields an expression that does not involve recurrence; for
example, s(3) = f(f(s(1); θ); θ). Such an expression can now be
represented by a traditional directed acyclic computational
graph: the unfolded computational graph of the equation.
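To make this concrete, here is a minimal NumPy sketch (the update function `f` and the scalar parameter `theta` are hypothetical choices for illustration) showing that the recurrent form and the fully unfolded form compute the same value:

```python
import numpy as np

# Recurrent definition of a state: s(t) = f(s(t-1); theta).
# Here f is a hypothetical tanh update with one shared parameter theta.
def f(s, theta):
    return np.tanh(theta * s)

theta = 0.5
s0 = 1.0

# Recurrent form: apply the same function, with the same shared
# parameter, at every time step.
s = s0
for t in range(3):
    s = f(s, theta)

# Unfolded form: the identical computation written without recurrence,
# as a fixed directed acyclic graph of repeated applications of f.
s_unfolded = f(f(f(s0, theta), theta), theta)

print(s, s_unfolded)  # the two forms produce the same value
```

The unfolded expression has no loop: it is three nodes in a chain, each applying `f` with the same `theta`, which is exactly the parameter sharing described above.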
[Figure: a Recurrent Neural Network represented as a computational
graph (recurrent view), and the same network represented as an
unfolded computational graph.]
• Because each internal state depends on the previous
one, information from the beginning of the sequence is
propagated forward through every step of the network,
like an old memory passed on to future generations.
• In the case of a Recurrent Neural Network, these
memories are information about the computations
applied to the sequence so far.
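As an illustrative sketch of this "memory" (dimensions, weight scales, and the random inputs below are hypothetical), the following NumPy code runs a minimal RNN forward pass in which each hidden state is computed from the previous one using the same shared parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes, chosen only for illustration.
input_dim, hidden_dim, seq_len = 3, 4, 5

# One set of parameters, shared across every time step.
W_xh = rng.normal(scale=0.1, size=(hidden_dim, input_dim))
W_hh = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
b_h = np.zeros(hidden_dim)

def rnn_forward(xs, h0):
    """Propagate the hidden state (the network's 'memory') through xs."""
    h = h0
    states = []
    for x in xs:
        # h depends on the previous h, so information from the start
        # of the sequence keeps influencing every later state.
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)
        states.append(h)
    return states

xs = rng.normal(size=(seq_len, input_dim))
states = rnn_forward(xs, np.zeros(hidden_dim))
print(len(states), states[-1].shape)
```

Note that `W_xh`, `W_hh`, and `b_h` appear once and are reused at every step: unrolling this loop over `seq_len` steps gives exactly the unfolded computational graph described earlier.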
Teacher forcing and networks
with output recurrence
What is Teacher Forcing?