XLNet: Generalized Autoregressive Pretraining for Language Understanding

Zhilin Yang et al.

2019-06-19
nlp, transformers

Abstract

This paper introduces XLNet, a generalized autoregressive pretraining method that learns bidirectional context by maximizing the expected likelihood over all permutations of the factorization order. This objective avoids the pretrain-finetune discrepancy introduced by BERT's masked-token corruption while still capturing dependencies in both directions, and the model additionally integrates ideas from Transformer-XL into pretraining. The reported empirical results, in which XLNet outperforms BERT on a range of language-understanding benchmarks, helped shape subsequent work in nlp and transformers.
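The core of the permutation objective can be illustrated with a small sketch: sample a random factorization order over the sequence, then let each position attend only to positions that come earlier in that order. This is a minimal, illustrative reconstruction of the masking idea, not code from the paper; the function name and return format are assumptions.

```python
import random

def permutation_mask(seq_len, seed=None):
    """Sketch of a content-attention mask for permutation language
    modeling: sample a factorization order z, then allow position i
    to attend to position j only if j precedes i in that order.
    Returns (order, mask), where mask[i][j] is True when attention
    from i to j is permitted. Illustrative only, not XLNet's code."""
    rng = random.Random(seed)
    order = list(range(seq_len))
    rng.shuffle(order)  # a random factorization order z
    # rank[pos] = step at which `pos` appears in the order
    rank = {pos: t for t, pos in enumerate(order)}
    mask = [[rank[j] < rank[i] for j in range(seq_len)]
            for i in range(seq_len)]
    return order, mask

order, mask = permutation_mask(6, seed=0)
```

Averaged over many sampled orders, every position is predicted from every possible subset of the other positions, which is how the objective captures bidirectional context while remaining autoregressive.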