Tutorial 1 - Un/Self-supervised Learning Methods - Neuromatch Academy - Deep Learning
Contents
Tutorial 1: Un/Self-supervised learning methods
Tutorial Objectives
Setup
Section 0: Introduction
Section 1: Representations are important
Section 2: Supervised learning induces invariant representations
Section 3: Random projections don’t work as well
Section 4: Generative approaches to representation learning can fail
Section 5: The modern approach to self-supervised training for invariance
Section 6: How to train for invariance to transformations with a target network
Section 7: Ethical considerations for self-supervised learning from biased datasets
Summary
Daily survey
Bonus 1: Self-supervised networks learn representation invariance
Bonus 2: Avoiding representational collapse
Bonus 3: Good representations enable few-shot learning
By Neuromatch Academy
Content creators: Arna Ghosh, Colleen Gillon, Tim Lillicrap, Blake Richards
Content reviewers: Atnafu Lambebo, Hadi Vafaei, Khalid Almubarak, Melvin Selim Atay, Kelson Shilling-Scrivo
In this tutorial, you will learn about the importance of learning good representations of data.
- Train logistic regressions (A) directly on input data and (B) on representations learned from the data.
- Compare the classification performances achieved by the different networks.
- Compare the representations learned by the different networks.
- Identify the advantages of self-supervised learning over supervised or traditional unsupervised methods.
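To make objective (A) vs. (B) concrete, here is a minimal, self-contained sketch (not the tutorial's own code) comparing a logistic regression fit on raw inputs against one fit on a stand-in "representation" of those inputs. The data, the random linear encoder `W`, and all variable names are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical synthetic data standing in for the tutorial's dataset
rng = np.random.default_rng(0)
n, d = 200, 64
X = rng.normal(size=(n, d))                       # raw input data
y = (X[:, 0] + 0.1 * rng.normal(size=n) > 0).astype(int)

# (A) logistic regression trained directly on the raw inputs
clf_raw = LogisticRegression(max_iter=1000).fit(X, y)

# (B) logistic regression trained on a representation of the inputs;
# a random projection plus nonlinearity stands in for a learned encoder
W = rng.normal(size=(d, 16))
Z = np.tanh(X @ W)                                # "representation" of the data
clf_rep = LogisticRegression(max_iter=1000).fit(Z, y)

print("accuracy on raw inputs:      ", clf_raw.score(X, y))
print("accuracy on representations: ", clf_rep.score(Z, y))
```

In the tutorial itself, the encoder is learned (supervised, random, generative, or self-supervised) rather than a fixed random projection; the comparison of classification performance is the same.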
# Imports
import torch                      # deep learning framework
import torchvision                # datasets and image transforms
import numpy as np                # array operations
import matplotlib.pyplot as plt   # plotting
Figure settings
Plotting functions
Function to plot a histogram of RSM values: plot_rsm_histogram(rsms, colors)
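The helper `plot_rsm_histogram(rsms, colors)` is provided by the tutorial as a hidden cell; its implementation is not shown here. A plausible sketch of such a function, overlaying histograms of representational similarity matrix (RSM) values for several networks, might look like the following (the `labels` parameter and all internals are assumptions, not the tutorial's actual code):

```python
import numpy as np
import matplotlib.pyplot as plt

def plot_rsm_histogram(rsms, colors, labels=None):
    """Hypothetical sketch: overlay histograms of RSM values.

    rsms   : list of 2D arrays (one RSM per network)
    colors : list of matplotlib colors, one per RSM
    labels : optional list of legend labels, one per RSM
    """
    fig, ax = plt.subplots()
    for i, rsm in enumerate(rsms):
        vals = np.asarray(rsm).ravel()            # flatten RSM to values
        label = labels[i] if labels else f"network {i}"
        ax.hist(vals, bins=50, alpha=0.5, color=colors[i], label=label)
    ax.set_xlabel("RSM value")
    ax.set_ylabel("count")
    ax.legend()
    return fig
```

Overlaying the distributions this way makes it easy to see whether one network's representations are, on the whole, more similar across stimuli than another's.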
Helper functions
Video 0: Introduction
Note on dataset: For convenience, we will be using a subset of the original, full dataset, which is available on GitHub.
The images in the dataset can be described using different combinations of latent dimension values, sampled from: