Distiller - Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research (a minimal distillation-loss sketch follows this list). https://intellabs.github.io/distiller
FKD - A Fast Knowledge Distillation Framework for Visual Recognition
simpleAICV-pytorch-ImageNet-COCO-training - SimpleAICV: PyTorch training examples on the ImageNet (ILSVRC2012), COCO2017, and VOC2007+2012 datasets. Includes ResNet, DarkNet, RetinaNet, FCOS, CenterNet, TTFNet, YOLOv3, YOLOv4, YOLOv5, and YOLOX.
bert-squeeze - 🛠️ Tools for Transformers compression using PyTorch Lightning ⚡
roberta-wwm-base-distill - A distilled RoBERTa-wwm-base model, distilled from RoBERTa-wwm-large.
CCL - PyTorch implementation of the CVPR 2021 paper "Distilling Audio-Visual Knowledge by Compositional Contrastive Learning"
ZAQ-code - CVPR 2021: Zero-shot Adversarial Quantization (ZAQ)
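Most of the projects above implement variations of the same core recipe. As a minimal sketch in plain PyTorch (not taken from any listed repository; `T` and `alpha` are illustrative hyperparameters), the classic soft-target distillation loss combines a temperature-softened KL term against the teacher with ordinary cross-entropy on the labels:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Illustrative helper, not from any repo above: the classic soft-target
    # distillation loss (Hinton et al., 2015).
    # Soft targets: KL divergence between temperature-softened teacher and
    # student distributions, scaled by T^2 to keep gradients comparable
    # across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

During training the teacher runs in eval mode with gradients disabled; only the student's parameters are updated with this loss.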