Keras - MAML

Part 1. Introduction

This is an experimental code base

The biggest difference between MAML and ordinary pre-trained weights: pre-training minimizes the loss of a single original task only, while MAML learns an initialization from which the loss of every task in the distribution can be driven down with just a few gradient steps of adaptation.
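
As a rough illustration of that inner/outer-loop structure, here is a minimal sketch of one first-order MAML (FOMAML) meta-update in TensorFlow 2.x. The function name, the task tuple format, and the hyperparameters are assumptions for illustration only; the repository's actual loop lives in train.py and may implement the full second-order update.

import tensorflow as tf

# Sketch of one first-order MAML meta-update (assumption: TF 2.x, a compiled
# Keras classifier `model`; not the repository's actual train.py code).
def meta_train_step(model, tasks, meta_optimizer,
                    inner_lr=0.01, inner_steps=1, loss_fn=None):
    """tasks: list of (x_support, y_support, x_query, y_query) tensors."""
    loss_fn = loss_fn or tf.keras.losses.SparseCategoricalCrossentropy()

    meta_weights = [w.numpy() for w in model.trainable_variables]
    meta_grads = [tf.zeros_like(w) for w in model.trainable_variables]

    for x_s, y_s, x_q, y_q in tasks:
        # Inner loop: reset to the meta weights, then adapt on the support set.
        for w, init in zip(model.trainable_variables, meta_weights):
            w.assign(init)
        for _ in range(inner_steps):
            with tf.GradientTape() as tape:
                loss = loss_fn(y_s, model(x_s, training=True))
            grads = tape.gradient(loss, model.trainable_variables)
            for w, g in zip(model.trainable_variables, grads):
                w.assign_sub(inner_lr * g)

        # Outer loss: query loss of the adapted weights (first-order gradient).
        with tf.GradientTape() as tape:
            q_loss = loss_fn(y_q, model(x_q, training=True))
        q_grads = tape.gradient(q_loss, model.trainable_variables)
        meta_grads = [mg + qg / len(tasks) for mg, qg in zip(meta_grads, q_grads)]

    # Meta update: restore the meta weights, apply the averaged query gradients.
    for w, init in zip(model.trainable_variables, meta_weights):
        w.assign(init)
    meta_optimizer.apply_gradients(zip(meta_grads, model.trainable_variables))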

Part 2. Quick Start

  1. Clone the repository.
git clone https://github.com/verages/MAML.git
  2. Install the dependencies.
cd MAML
pip install -r requirements.txt
  3. Download the Omniglot dataset and the pretrained MAML weights.
wget https://github.com/verages/MAML/releases/download/v0.1/Omniglot.tar
wget https://github.com/verages/MAML/releases/download/v0.1/maml.h5
tar -xvf Omniglot.tar
  4. Run evaluate.py; it shows the difference between the MAML weights and randomly initialized weights (the underlying idea is sketched after this list).
python evaluate.py
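
For intuition, the comparison in evaluate.py boils down to this: take the support set of a sampled task, fine-tune for a few gradient steps, and measure accuracy on the held-out query set; a good MAML initialization recovers much faster than random weights. The sketch below only illustrates that idea; `few_shot_accuracy`, the task tensors, and the hyperparameters are placeholders, not the repository's actual API.

import tensorflow as tf

# Fine-tune briefly on the support set, then score the query set
# (illustrative sketch, not the repo's evaluate.py).
def few_shot_accuracy(model, x_support, y_support, x_query, y_query,
                      lr=0.01, steps=5):
    loss_fn = tf.keras.losses.SparseCategoricalCrossentropy()
    opt = tf.keras.optimizers.SGD(lr)
    for _ in range(steps):
        with tf.GradientTape() as tape:
            loss = loss_fn(y_support, model(x_support, training=True))
        grads = tape.gradient(loss, model.trainable_variables)
        opt.apply_gradients(zip(grads, model.trainable_variables))
    preds = tf.argmax(model(x_query, training=False), axis=-1)
    correct = tf.cast(preds == tf.cast(y_query, preds.dtype), tf.float32)
    return tf.reduce_mean(correct)

# Usage idea: call this once with maml.h5 loaded and once with freshly
# initialized weights, on the same sampled task, and compare the accuracies.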

Part 3. Train your own dataset

  1. Set the corresponding parameters in config.py (illustrative values are sketched after this list).
n_way = "number of classes per task"
k_shot = "number of support samples per class"
q_query = "number of query samples per class"
  2. Start training.
python train.py
  3. Run TensorBoard to monitor the training process.
tensorboard --logdir=./summary
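
For concreteness, a standard 5-way 1-shot Omniglot setup would look like the values below. These are illustrative assumptions, not defaults taken from this repository's config.py.

# Illustrative config.py values for a 5-way 1-shot setup (assumption, not the repo's defaults).
n_way = 5     # classes sampled per task
k_shot = 1    # support samples per class, used for the inner-loop adaptation
q_query = 1   # query samples per class, used for the outer (meta) loss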

(Screenshot of the TensorBoard training curves: tensorboard.png)

Part 4. Paper and other implementations

The original MAML paper: Chelsea Finn, Pieter Abbeel, Sergey Levine, "Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks", ICML 2017.
