Variational AutoEncoder
• They then sample from this distribution during the forward pass, using the re-parametrization trick so that backpropagation can flow through the sampling step
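The trick above can be sketched as follows; this is a minimal NumPy illustration (not the slides' own code), where the randomness is isolated in a noise variable `eps` so the mean and variance stay differentiable:

```python
import numpy as np

rng = np.random.default_rng(0)

def reparameterize(mu, log_var, rng):
    # Re-parametrization trick: instead of sampling z ~ N(mu, sigma^2)
    # directly, sample eps ~ N(0, I) and compute z = mu + sigma * eps.
    # Gradients can then flow through mu and log_var, since the
    # stochastic part is confined to eps.
    eps = rng.standard_normal(mu.shape)
    sigma = np.exp(0.5 * log_var)
    return mu + sigma * eps

mu = np.array([0.0, 1.0])
log_var = np.array([0.0, 0.0])  # log_var = 0 means sigma = 1
z = reparameterize(mu, log_var, rng)
```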
• The sum in the KL term is taken over all the dimensions of the latent space
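As a sketch of that sum, the standard closed-form KL divergence between a diagonal-Gaussian posterior and a unit-Gaussian prior can be computed per latent dimension and summed:

```python
import numpy as np

def kl_to_unit_gaussian(mu, log_var):
    # Closed-form KL( N(mu, sigma^2) || N(0, I) ) for a diagonal
    # Gaussian, summed over all latent dimensions:
    #   KL = -0.5 * sum(1 + log(sigma^2) - mu^2 - sigma^2)
    return -0.5 * np.sum(1.0 + log_var - mu**2 - np.exp(log_var))

# The KL is zero exactly when the posterior equals the prior
# (mu = 0, sigma = 1, i.e. log_var = 0).
kl = kl_to_unit_gaussian(np.zeros(4), np.zeros(4))
```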
• Optimizing the KL term alone would end up describing every observation with the same unit Gaussian, ignoring the input
• However, when the two terms are optimized simultaneously, the latent state for an observation is described with distributions close to the prior, deviating only when necessary to capture salient features of the input
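This trade-off comes from the combined objective. As a minimal sketch (using mean squared error as a simple stand-in for the reconstruction term, an assumption not fixed by the slides):

```python
import numpy as np

def vae_loss(x, x_recon, mu, log_var):
    # Reconstruction term: here plain squared error between the input
    # and its reconstruction (a simple illustrative choice).
    recon = np.sum((x - x_recon) ** 2)
    # KL term: pulls each posterior toward the unit-Gaussian prior.
    kl = -0.5 * np.sum(1.0 + log_var - mu**2 - np.exp(log_var))
    # Minimizing the sum lets posteriors stay near the prior while
    # still deviating where needed to reconstruct the input well.
    return recon + kl
```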
Summary
• The encoder network takes raw input data and transforms it into a
probability distribution within the latent space
• The probabilistic nature of the latent space also enables the generation
of novel samples by drawing random points from the learned
distribution
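Generation then amounts to drawing latent points from the prior and decoding them. A minimal sketch, where `decode` is a hypothetical stand-in for a trained decoder network:

```python
import numpy as np

rng = np.random.default_rng(0)

def decode(z):
    # Toy linear "decoder" mapping a 2-D latent point to a 4-D output;
    # in a real VAE this would be the trained decoder network.
    W = np.array([[1.0, 0.0],
                  [0.0, 1.0],
                  [1.0, 1.0],
                  [1.0, -1.0]])
    return W @ z

# Draw latent points from the prior N(0, I) and decode each one
# to produce novel samples.
samples = [decode(rng.standard_normal(2)) for _ in range(3)]
```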
References
• https://www.linkedin.com/pulse/understanding-variational-autoencoders-vaes-how-useful-raja
• https://www.analyticsvidhya.com/blog/2021/04/generate-your-own-dataset-using-gan/
• https://www.geeksforgeeks.org/variational-autoencoders/
• https://medium.com/retina-ai-health-inc/variational-inference-derivation-of-the-variational-autoencoder-vae-loss-function-a-true-story-3543a3dc67ee
• https://towardsdatascience.com/reparameterization-trick-126062cfd3c3