Unit V Deep Generative Models - Part 01
• where:
• P(h_j = 1 | v) = σ(b_j + ∑_i v_i W_ij)
• P(v_i = 1 | h) = σ(a_i + ∑_j h_j W_ij)
Restricted Boltzmann machine (RBM)
• where σ(x) is the logistic sigmoid function:
• σ(x) = 1 / (1 + e^(−x))
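A minimal NumPy sketch of these two conditionals and the sigmoid (the names W, a, b, p_h_given_v, and p_v_given_h are illustrative assumptions, not from the source):

import numpy as np

def sigmoid(x):
    # logistic sigmoid: 1 / (1 + e^(-x))
    return 1.0 / (1.0 + np.exp(-x))

def p_h_given_v(v, W, b):
    # P(h_j = 1 | v) = sigmoid(b_j + sum_i v_i W_ij)
    return sigmoid(b + v @ W)

def p_v_given_h(h, W, a):
    # P(v_i = 1 | h) = sigmoid(a_i + sum_j h_j W_ij)
    return sigmoid(a + h @ W.T)

Here W is the n_visible × n_hidden weight matrix, a the visible biases, and b the hidden biases.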
Training RBMs
• Forward pass: the RBM accepts multiple inputs.
• The inputs are multiplied by the weights and then added to the bias; the result is passed through the sigmoid to give the hidden activation probabilities (see the sketch after this list).
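A minimal sketch of this forward pass, assuming a binary RBM and NumPy (the names rbm_forward, n_visible, and n_hidden are illustrative, not from the source):

import numpy as np

rng = np.random.default_rng(0)
n_visible, n_hidden = 6, 3
W = rng.normal(scale=0.1, size=(n_visible, n_hidden))   # weight matrix
b = np.zeros(n_hidden)                                   # hidden-unit biases

def rbm_forward(v):
    # multiply the inputs by the weights and add the bias
    pre_activation = v @ W + b
    # squash with the sigmoid to get hidden activation probabilities
    p_hidden = 1.0 / (1.0 + np.exp(-pre_activation))
    # sample binary hidden states from those probabilities
    h_sample = (rng.random(n_hidden) < p_hidden).astype(float)
    return p_hidden, h_sample

v = rng.integers(0, 2, size=n_visible).astype(float)     # one binary input vector
p_h, h = rbm_forward(v)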
Deep belief network (DBN)
• The top two layers form an associative memory, and the bottom layer contains the visible units.
• The "Input Layer" is the initial layer, with one neuron for each component of the input vector.
• After the DBN has been assembled through the training of its
RBMs, it can be fine-tuned for supervised learning tasks.
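A compact sketch of how such a DBN could be assembled and prepared for fine-tuning, assuming binary units, one-step contrastive divergence (CD-1) updates, and NumPy; all names (train_rbm, layer_sizes, X) are illustrative, and the supervised fine-tuning step is only indicated in a comment:

import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

def train_rbm(data, n_hidden, lr=0.1, epochs=5):
    # train one RBM on `data` with CD-1, returning its weights and hidden biases
    n_visible = data.shape[1]
    W = rng.normal(scale=0.01, size=(n_visible, n_hidden))
    a, b = np.zeros(n_visible), np.zeros(n_hidden)
    for _ in range(epochs):
        for v0 in data:
            ph0 = sigmoid(b + v0 @ W)                      # P(h=1|v) for the data
            h0 = (rng.random(n_hidden) < ph0).astype(float)
            v1 = sigmoid(a + h0 @ W.T)                      # reconstruction
            ph1 = sigmoid(b + v1 @ W)                       # P(h=1|v) for the reconstruction
            W += lr * (np.outer(v0, ph0) - np.outer(v1, ph1))
            a += lr * (v0 - v1)
            b += lr * (ph0 - ph1)
    return W, b

# Greedy layer-wise pretraining: each RBM is trained on the hidden
# activations produced by the layer below it.
X = rng.integers(0, 2, size=(100, 20)).astype(float)       # toy binary data
layer_sizes = [10, 5]
weights, layer_input = [], X
for n_hidden in layer_sizes:
    W, b = train_rbm(layer_input, n_hidden)
    weights.append((W, b))
    layer_input = sigmoid(b + layer_input @ W)

# Fine-tuning would now use `weights` to initialize a feed-forward network
# (plus an output layer) trained with backpropagation on labelled data.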