DistilBERT, a distilled version of BERT

Victor Sanh, Lysandre Debut, Julien Chaumond, Thomas Wolf

2019-10-08
distillation, nlp

Abstract

This paper introduces DistilBERT, a smaller general-purpose language representation model obtained by applying knowledge distillation to BERT during pre-training. The reported results show that the distilled model retains most of BERT's language understanding performance while being roughly 40% smaller and about 60% faster at inference, findings that shaped subsequent work on distillation in NLP.
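
For context, the sketch below illustrates the generic soft-target knowledge-distillation objective on which this line of work builds: a temperature-softened KL term against the teacher's predictions combined with the usual hard-label cross-entropy. It is a minimal PyTorch sketch under stated assumptions, not the paper's exact training recipe (DistilBERT additionally combines the distillation loss with masked-language-modeling and cosine-embedding losses); the function name and the hyperparameter values T and alpha are illustrative, not taken from the paper.

```python
# Minimal knowledge-distillation loss sketch (not the paper's exact recipe).
# The student is trained to match the teacher's softened output distribution
# while still fitting the hard labels. T and alpha are illustrative values.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Soft targets: KL divergence between temperature-softened distributions.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # scale to keep gradient magnitudes comparable across T
    # Hard targets: standard cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

A usage note: in practice the teacher's logits are computed with `torch.no_grad()` so that only the student receives gradient updates.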