
Tutorial 1: Un/Self-supervised learning methods
Contents
Tutorial 1: Un/Self-supervised learning methods
Tutorial Objectives
Setup
Section 0: Introduction
Section 1: Representations are important
Section 2: Supervised learning induces invariant representations
Section 3: Random projections don’t work as well
Section 4: Generative approaches to representation learning can fail
Section 5: The modern approach to self-supervised training for invariance
Section 6: How to train for invariance to transformations with a target network
Section 7: Ethical considerations for self-supervised learning from biased datasets
Summary
Daily survey
Bonus 1: Self-supervised networks learn representation invariance
Bonus 2: Avoiding representational collapse
Bonus 3: Good representations enable few-shot learning


Week 3, Day 3: Unsupervised and self-supervised learning

By Neuromatch Academy

Content creators: Arna Ghosh, Colleen Gillon, Tim Lillicrap, Blake Richards

Content reviewers: Atnafu Lambebo, Hadi Vafaei, Khalid Almubarak, Melvin Selim Atay, Kelson Shilling-Scrivo

Content editors: Anoop Kulkarni, Spiros Chavlis

Production editors: Deepak Raya, Gagana B, Spiros Chavlis

In this tutorial, you will learn about the importance of learning good representations of data.

Specific objectives for this tutorial:

Train logistic regressions (A) directly on input data and (B) on representations learned from the data (a schematic sketch of this comparison follows the list below).
Compare the classification performances achieved by the different networks.
Compare the representations learned by the different networks.
Identify the advantages of self-supervised learning over supervised or traditional unsupervised methods.
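
To make objectives (A) and (B) concrete, here is a rough sketch of such a "linear probe" comparison using scikit-learn. It is not the tutorial's own code: `X`, `y`, and `encode` are hypothetical placeholders for the image data, the class labels, and a trained encoder's feature-extraction function, which the notebook provides through its bundled modules below.

# Illustrative sketch only, not the tutorial's implementation.
# `X`, `y`, and `encode` are hypothetical placeholders.
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def linear_probe_accuracy(features, labels, seed=2021):
    # Fit a logistic regression on the given features and report test accuracy.
    X_train, X_test, y_train, y_test = train_test_split(
        features, labels, random_state=seed
    )
    classifier = LogisticRegression(max_iter=1000)
    classifier.fit(X_train, y_train)
    return classifier.score(X_test, y_test)

# (A) classify directly from flattened pixel values
# acc_pixels = linear_probe_accuracy(X.reshape(len(X), -1), y)
# (B) classify from the representations produced by a trained encoder
# acc_features = linear_probe_accuracy(encode(X), y)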

Install dependencies

Downloading and unzipping the file... Please wait.

Download completed!

Install and import feedback gadget



# Imports
import importlib

import torch
import torchvision
import numpy as np
import matplotlib.pyplot as plt

# Import modules designed for use in this notebook.
from neuromatch_ssl_tutorial.modules import data, load, models, plot_util

# Reload the tutorial modules so that any edits to them are picked up
# without restarting the kernel.
importlib.reload(data)
importlib.reload(load)
importlib.reload(models)
importlib.reload(plot_util)

<module 'neuromatch_ssl_tutorial.modules.plot_util' from '/home/runner/work/course-content-dl/course-content-dl/tutorials/W3D3_UnsupervisedAndSelfSupervisedLearning/student/neuromatch_ssl_tutorial/modules/plot_util.py'>

Figure settings

Plotting functions
Function to plot a histogram of RSM values: plot_rsm_histogram(rsms, colors)

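The actual helper is defined in the hidden plotting cell. As a rough illustration only, a minimal version might look like the sketch below, which assumes `rsms` is a list of square representational similarity matrices and `colors` is a matching list of matplotlib colors (both are assumptions about the signature, not confirmed by the hidden source).

# Illustrative sketch of a plot_rsm_histogram helper (assumed signature).
import numpy as np
import matplotlib.pyplot as plt

def plot_rsm_histogram(rsms, colors, labels=None, bins=100):
    fig, ax = plt.subplots(figsize=(6, 4))
    for i, (rsm, color) in enumerate(zip(rsms, colors)):
        rsm = np.asarray(rsm)
        # Keep only off-diagonal entries: self-similarities are trivially maximal.
        values = rsm[~np.eye(rsm.shape[0], dtype=bool)]
        name = labels[i] if labels is not None else f"RSM {i}"
        ax.hist(values, bins=bins, alpha=0.5, color=color, label=name)
    ax.set_xlabel("RSM similarity value")
    ax.set_ylabel("Count")
    ax.legend()
    return fig, ax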

Helper functions

Set random seed


Executing `set_seed(seed=seed)` sets the random seed so that results are reproducible.

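The helper itself is provided by the hidden setup cell. A minimal stand-in with the behavior described above (assumed, not the tutorial's exact implementation) could be:

import random
import numpy as np
import torch

def set_seed(seed=None):
    # Seed the Python, NumPy, and PyTorch random number generators for reproducibility.
    if seed is None:
        seed = np.random.randint(2 ** 32 - 1)
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)  # no-op when CUDA is unavailable
    print(f"Random seed {seed} has been set.")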

Set the device (GPU or CPU) by executing `set_device()`.
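Again, the helper comes from the hidden setup cell; a minimal stand-in (assumed behavior, not the tutorial's exact implementation) could be:

import torch

def set_device():
    # Return "cuda" when a GPU is visible to PyTorch, otherwise "cpu".
    device = "cuda" if torch.cuda.is_available() else "cpu"
    if device == "cpu":
        print("GPU is not enabled in this notebook; falling back to the CPU.")
    return device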

# Set global variables
SEED = 2021
set_seed(seed=SEED)    # seed all random number generators for reproducibility
DEVICE = set_device()  # use the GPU if one is available, otherwise the CPU

Random seed 2021 has been set.


WARNING: For this notebook to perform best, if possible, select `GPU` in the menu under `Runtime` -> `Change runtime type`.

Pre-load variables (allows each section to be run independently)



Section 0: Introduction

Video 0: Introduction


Section 1: Representations are important

Time estimate: ~30 mins

Video 1: Why do representations matter?


Section 1.1: Introducing the dSprites dataset


In this tutorial, we will be using a subset of the openly available dSprites dataset to investigate the importance of learning good representations.

Note on dataset: For convenience, we will be using a subset of the original, full dataset, which is available here, on GitHub.
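
The tutorial downloads its pre-packaged subset through the bundled `data` module. If you instead want to inspect the full dataset, it is distributed as a single .npz archive; a rough sketch of loading it directly is shown below (the file name and array keys follow the public dSprites repository, so verify them against the copy you download).

import numpy as np

# Illustrative sketch: load the full dSprites archive from the official repository.
# The tutorial itself uses a smaller pre-packaged subset via its `data` module.
dsprites_path = "dsprites_ndarray_co1sh3sc6or40x32y32_64x64.npz"
with np.load(dsprites_path, allow_pickle=True, encoding="latin1") as archive:
    imgs = archive["imgs"]                        # binary 64x64 images
    latents_values = archive["latents_values"]    # real-valued latent dimensions
    latents_classes = archive["latents_classes"]  # integer-coded latent dimensions

print(imgs.shape, latents_values.shape, latents_classes.shape)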

Interactive Demo 1.1.1: Exploring the dSprites dataset


In this first demo, we will get to know the dSprites dataset. This dataset is made up of black and white images (20,000 images total in the subset we are using).

The images in the dataset can be described using different combinations of latent dimension values, sampled from:

