Is Label Smoothing Truly Incompatible with Knowledge Distillation: An Empirical Study

Published: 12 Jan 2021, Last Modified: 05 May 2023, ICLR 2021 Poster
Keywords: label smoothing, knowledge distillation, image classification, neural machine translation, binary neural networks
Abstract: This work aims to empirically clarify the recently discovered perspective that label smoothing is incompatible with knowledge distillation. We begin by introducing the motivation behind this claimed incompatibility, i.e., that label smoothing erases the relative information between teacher logits. We then draw a novel connection between label smoothing and the distributions of semantically similar and dissimilar classes, and propose a metric to quantitatively measure the degree of erased information in a sample's representation. After that, we examine the one-sidedness and imperfection of the incompatibility view through extensive analyses, visualizations, and comprehensive experiments on image classification, binary networks, and neural machine translation. Finally, we broadly discuss several circumstances in which label smoothing does indeed lose its effectiveness.
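Since the abstract centers on the interaction between label smoothing and knowledge distillation, the following is a minimal PyTorch sketch of the two standard losses involved. This is not the authors' released code; the hyperparameter names (`eps`, `T`) and defaults are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def label_smoothing_ce(logits, target, eps=0.1):
    """Cross-entropy against a smoothed target distribution:
    (1 - eps) mass on the true class, eps spread uniformly over all K classes."""
    log_probs = F.log_softmax(logits, dim=-1)
    nll = -log_probs.gather(-1, target.unsqueeze(-1)).squeeze(-1)  # hard-label term
    smooth = -log_probs.mean(dim=-1)                               # uniform term (eps/K each)
    return ((1.0 - eps) * nll + eps * smooth).mean()

def kd_loss(student_logits, teacher_logits, T=4.0):
    """Hinton-style distillation: KL divergence between temperature-softened
    teacher and student distributions, scaled by T^2 to keep gradient magnitudes stable."""
    p_teacher = F.softmax(teacher_logits / T, dim=-1)
    log_p_student = F.log_softmax(student_logits / T, dim=-1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T * T)
```

A student is typically trained on a weighted sum of the hard-label loss and `kd_loss`; the question the paper studies is whether training the teacher with `label_smoothing_ce` degrades the inter-class information that `kd_loss` is meant to transfer.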
One-sentence Summary: This work empirically clarifies a recently discovered perspective that label smoothing is incompatible with knowledge distillation.
Project page: http://zhiqiangshen.com/projects/LS_and_KD/index.html
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Data: [CUB-200-2011](https://paperswithcode.com/dataset/cub-200-2011), [ImageNet](https://paperswithcode.com/dataset/imagenet), [ImageNet-LT](https://paperswithcode.com/dataset/imagenet-lt), [Places](https://paperswithcode.com/dataset/places), [iNaturalist](https://paperswithcode.com/dataset/inaturalist)