PyTorch Tutorial
Chongruo Wu
Agenda
1. Popular Frameworks
2. PyTorch Basics
3. Helpful Skills
Popular Deep Learning Frameworks
[Framework comparison figure: language bindings include Lua; C++, Python, R, Julia, Perl; Scala]
Sources: Stanford cs231n; MXNet Tutorial, CVPR 2017; MXNet online documentation, https://goo.gl/UZ2byD
PyTorch
Stanford cs231n.
PyTorch Tensors
https://transfer.d2.mpi-inf.mpg.de/rshetty/hlcv/Pytorch_tutorial.pdf
Stanford cs231n.
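The tensor slides above are code screenshots; a short sketch of the basics they cover (creating tensors, math, GPU use, NumPy interop):

import torch
import numpy as np

# create tensors
x = torch.rand(3, 4)              # uniform random values in [0, 1)
y = torch.ones(3, 4)

# elementwise math and matrix multiply
z = x + y
w = x.mm(y.t())                   # (3, 4) x (4, 3) -> (3, 3)

# move computation to the GPU if one is available
if torch.cuda.is_available():
    x, y = x.cuda(), y.cuda()
    z = x * y                     # runs on the GPU

# convert between NumPy arrays and tensors
a = np.zeros((2, 2))
t = torch.from_numpy(a)
b = t.numpy()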
Variable
The autograd package provides automatic differentiation for all operations on Tensors.
“autograd.Variable is the central class of the package. It wraps a Tensor, and supports nearly all of the operations defined on it. Once you finish your computation you can call .backward() and have all the gradients computed automatically.”
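A minimal sketch of the workflow described in the quote (in later PyTorch releases Variable was merged into Tensor, so requires_grad can also be set on the tensor directly):

import torch
from torch.autograd import Variable

# wrap a tensor so autograd tracks the operations applied to it
x = Variable(torch.ones(2, 2), requires_grad=True)
y = x + 2
z = (y * y * 3).mean()

# backward() fills x.grad with dz/dx
z.backward()
print(x.grad)        # a (2, 2) tensor of gradients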
Other layers: Dropout, Linear, normalization layers
Module, network
http://book.paddlepaddle.org/03.image_classification/
Module, sub-network
https://github.com/junyanz/pytorch-CycleGAN-and-pix2pix
Module
Stanford cs231n.
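A minimal sketch of a custom nn.Module combining the layer types mentioned above (Linear, Dropout, a normalization layer); the class name and sizes are illustrative:

import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallNet(nn.Module):
    def __init__(self, in_dim=784, hidden=256, num_classes=10):
        super(SmallNet, self).__init__()
        self.fc1 = nn.Linear(in_dim, hidden)
        self.bn1 = nn.BatchNorm1d(hidden)   # normalization layer
        self.drop = nn.Dropout(p=0.5)
        self.fc2 = nn.Linear(hidden, num_classes)

    def forward(self, x):
        x = F.relu(self.bn1(self.fc1(x)))
        x = self.drop(x)
        return self.fc2(x)

net = SmallNet()
scores = net(torch.randn(8, 784))           # batch of 8 -> (8, 10) scores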
When starting a new project
3. Training Strategy
Train a Simple Network
Stanford cs231n.
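The slides walk through this with cs231n code screenshots; a sketch of the same forward / loss / backward / update loop, using random data for illustration:

import torch
import torch.nn as nn

# random data standing in for a real dataset
N, D_in, H, D_out = 64, 1000, 100, 10
x = torch.randn(N, D_in)
y = torch.randn(N, D_out)

model = nn.Sequential(
    nn.Linear(D_in, H),
    nn.ReLU(),
    nn.Linear(H, D_out),
)
criterion = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-4)

for step in range(500):
    pred = model(x)                  # forward pass
    loss = criterion(pred, y)
    optimizer.zero_grad()            # clear old gradients
    loss.backward()                  # backprop
    optimizer.step()                 # update weights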
MNIST Example
https://goo.gl/mQEw15
MNIST Example: Data Loading
https://goo.gl/mQEw15
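A sketch of the data-loading step, along the lines of the official MNIST example (assumes torchvision is installed):

import torch
from torchvision import datasets, transforms

# normalize with the usual MNIST mean/std
transform = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize((0.1307,), (0.3081,)),
])

train_set = datasets.MNIST('./data', train=True, download=True, transform=transform)
test_set = datasets.MNIST('./data', train=False, transform=transform)

train_loader = torch.utils.data.DataLoader(train_set, batch_size=64, shuffle=True)
test_loader = torch.utils.data.DataLoader(test_set, batch_size=1000, shuffle=False)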
MNIST Example: Define Network
https://goo.gl/mQEw15
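A sketch of the network definition, modeled on the small convolutional net in the official MNIST example:

import torch.nn as nn
import torch.nn.functional as F

class MnistNet(nn.Module):
    def __init__(self):
        super(MnistNet, self).__init__()
        self.conv1 = nn.Conv2d(1, 10, kernel_size=5)
        self.conv2 = nn.Conv2d(10, 20, kernel_size=5)
        self.conv2_drop = nn.Dropout2d()
        self.fc1 = nn.Linear(320, 50)
        self.fc2 = nn.Linear(50, 10)

    def forward(self, x):
        x = F.relu(F.max_pool2d(self.conv1(x), 2))
        x = F.relu(F.max_pool2d(self.conv2_drop(self.conv2(x)), 2))
        x = x.view(-1, 320)              # 20 channels * 4 * 4 spatial positions
        x = F.relu(self.fc1(x))
        x = F.dropout(x, training=self.training)
        x = self.fc2(x)
        return F.log_softmax(x, dim=1)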
MNIST Example: Training
https://goo.gl/mQEw15
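A sketch of the training loop, assuming the MnistNet model and train_loader from the previous steps:

import torch.nn.functional as F
import torch.optim as optim

model = MnistNet()
optimizer = optim.SGD(model.parameters(), lr=0.01, momentum=0.5)

def train(epoch):
    model.train()                           # enable Dropout / BatchNorm training behavior
    for batch_idx, (data, target) in enumerate(train_loader):
        optimizer.zero_grad()
        output = model(data)
        loss = F.nll_loss(output, target)   # pairs with the log_softmax output
        loss.backward()
        optimizer.step()
        if batch_idx % 100 == 0:
            print('epoch {} batch {} loss {:.4f}'.format(epoch, batch_idx, loss.item()))

for epoch in range(1, 11):
    train(epoch)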
MNIST Example: Inference
eval() mode affects layers that behave differently at train and test time:
* Dropout layers
* BatchNorm layers
https://goo.gl/mQEw15
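A sketch of the inference step, assuming the model and test_loader from the previous steps; model.eval() switches Dropout and BatchNorm to their test-time behavior, and torch.no_grad() turns off gradient tracking:

import torch

def test():
    model.eval()                          # Dropout off, BatchNorm uses running stats
    correct = 0
    with torch.no_grad():                 # no gradients needed for inference
        for data, target in test_loader:
            output = model(data)
            pred = output.argmax(dim=1)
            correct += (pred == target).sum().item()
    print('accuracy: {:.2f}%'.format(100.0 * correct / len(test_loader.dataset)))

test()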
When starting a new project
3. Training Strategy
Data Loading
https://goo.gl/mQEw15
Stanford cs231n.
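For data beyond the built-in torchvision datasets, a Dataset subclass only needs __len__ and __getitem__; a minimal sketch (the folder layout and class name here are hypothetical):

import os
from PIL import Image
from torch.utils.data import Dataset, DataLoader

class FolderDataset(Dataset):
    def __init__(self, root, transform=None):
        self.paths = sorted(os.path.join(root, f) for f in os.listdir(root))
        self.transform = transform

    def __len__(self):
        return len(self.paths)

    def __getitem__(self, idx):
        img = Image.open(self.paths[idx]).convert('RGB')
        if self.transform is not None:
            img = self.transform(img)
        return img

# num_workers loads batches in background worker processes
# loader = DataLoader(FolderDataset('./images'), batch_size=32, shuffle=True, num_workers=4)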
Data Processing
Pix2pix Code
https://github.com/junyanz/pytorch-CycleGAN-and-pix2pix
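The pix2pix/CycleGAN code builds its preprocessing from torchvision transforms; a sketch of that resize, crop, flip, normalize pattern (sizes and statistics are illustrative):

from torchvision import transforms

# typical train-time augmentation pipeline
train_transform = transforms.Compose([
    transforms.Resize(286),
    transforms.RandomCrop(256),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
    transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5)),
])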
When starting a new project
Stanford cs231n.
Learning Rate Scheduler
torch.optim.lr_scheduler
● StepLR: LR is decayed by gamma every step_size epochs
● MultiStepLR: LR is decayed by gamma once the epoch count reaches one of the milestones
● ExponentialLR
● CosineAnnealingLR
● ReduceLROnPlateau
https://github.com/Jiaming-Liu/pytorch-lr-scheduler
http://pytorch.org/docs/master/optim.html#how-to-adjust-learning-rate
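A sketch of how a scheduler wraps an optimizer (StepLR shown; the other schedulers follow the same pattern, and the model here is a placeholder):

import torch
import torch.optim as optim
from torch.optim import lr_scheduler

model = torch.nn.Linear(10, 2)                 # placeholder model
optimizer = optim.SGD(model.parameters(), lr=0.1)

# multiply the LR by gamma every step_size epochs
scheduler = lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.1)

for epoch in range(90):
    # ... run one epoch of training (forward, backward, optimizer.step()) ...
    scheduler.step()                           # update the learning rate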
Pretrained Model
Load Model
https://gist.github.com/panovr/2977d9f26866b05583b0c40d88a315bf
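A sketch of loading an ImageNet-pretrained torchvision model and replacing its last layer for fine-tuning (the referenced gist follows a similar pattern; the 10-class head is illustrative):

import torch
import torch.nn as nn
from torchvision import models

# download ImageNet-pretrained weights
net = models.resnet18(pretrained=True)

# replace the final layer for a new task with, say, 10 classes
net.fc = nn.Linear(net.fc.in_features, 10)

# saving / restoring weights via the state dict
torch.save(net.state_dict(), 'resnet18_finetune.pth')
net.load_state_dict(torch.load('resnet18_finetune.pth'))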
Weights Initialization
net.apply(weights_init_normal)
https://goo.gl/bqeW1K
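A sketch of the apply() pattern: the function is called once per submodule, so it can dispatch on layer type. weights_init_normal is named to match the call on the slide; its body here is an assumption modeled on the CycleGAN/pix2pix code:

import torch.nn as nn

def weights_init_normal(m):
    # called once for every submodule of the network
    classname = m.__class__.__name__
    if classname.find('Conv') != -1:
        nn.init.normal_(m.weight.data, 0.0, 0.02)
    elif classname.find('BatchNorm2d') != -1:
        nn.init.normal_(m.weight.data, 1.0, 0.02)
        nn.init.constant_(m.bias.data, 0.0)

net = nn.Sequential(nn.Conv2d(3, 16, 3), nn.BatchNorm2d(16), nn.ReLU())
net.apply(weights_init_normal)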
Hooks
http://pytorch.org/tutorials/beginner/former_torchies/nn_tutorial.html
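A minimal sketch of a forward hook used to inspect intermediate activations:

import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))

def print_shape(module, input, output):
    # runs every time the hooked module finishes its forward pass
    print(module.__class__.__name__, 'output shape:', output.shape)

handle = net[0].register_forward_hook(print_shape)
net(torch.randn(3, 4))          # hook prints: Linear output shape: torch.Size([3, 8])
handle.remove()                 # detach the hook when done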
Hooks (Backward)
http://pytorch.org/tutorials/beginner/former_torchies/nn_tutorial.html
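A similar sketch for a backward hook, which sees the gradients flowing through a module (more recent PyTorch releases prefer register_full_backward_hook over register_backward_hook):

import torch
import torch.nn as nn

layer = nn.Linear(4, 2)

def print_grad(module, grad_input, grad_output):
    # grad_output holds dLoss / d(layer output)
    print('grad_output shape:', grad_output[0].shape)

handle = layer.register_backward_hook(print_grad)
out = layer(torch.randn(3, 4))
out.sum().backward()            # triggers the hook
handle.remove()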
cudnn.benchmark flag
https://goo.gl/5gzj8F
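Enabling it is one line; cuDNN then benchmarks convolution algorithms and caches the fastest one, which pays off when input sizes stay fixed across iterations:

import torch

# autotune conv algorithms; best when input shapes do not change between iterations
torch.backends.cudnn.benchmark = True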
Visualization, PyTorch Visdom
https://github.com/facebookresearch/visdom
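A minimal sketch of logging a loss curve with Visdom (assumes the visdom package is installed and a server is running via python -m visdom.server):

import numpy as np
import visdom

vis = visdom.Visdom()            # connects to http://localhost:8097 by default

win = None
for step in range(100):
    loss = np.exp(-step / 30.0)  # stand-in for a real training loss
    if win is None:
        win = vis.line(X=np.array([step]), Y=np.array([loss]))
    else:
        vis.line(X=np.array([step]), Y=np.array([loss]), win=win, update='append')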
Visualization, TensorBoard
https://www.tensorflow.org/get_started/summaries_and_tensorboard
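TensorBoard can also be used from PyTorch; a sketch with SummaryWriter (available as torch.utils.tensorboard in recent PyTorch releases, or via the tensorboardX package for older ones):

from torch.utils.tensorboard import SummaryWriter

writer = SummaryWriter('runs/experiment1')   # log directory is illustrative
for step in range(100):
    loss = 1.0 / (step + 1)                  # stand-in for a real training loss
    writer.add_scalar('train/loss', loss, step)
writer.close()
# then view the curves with:  tensorboard --logdir runs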
Other Resources
● Official examples: https://goo.gl/Q6Z2k8
● Official documents: https://goo.gl/gecKC4
● Pix2pix code: https://github.com/phillipi/pix2pix
● PyTorch Zero to All (HKUST): https://goo.gl/S3vEUN
Thank You