BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension

Mike Lewis et al.

2019-10-29
nlp, seq2seq

Abstract

BART is a denoising autoencoder for pretraining sequence-to-sequence models. It is trained by corrupting text with an arbitrary noising function and learning a standard Transformer encoder-decoder to reconstruct the original text. The paper evaluates a range of noising schemes and finds the best performance from randomly shuffling sentence order combined with text infilling, where spans of text are replaced with a single mask token. BART matches RoBERTa on GLUE and SQuAD, achieves new state-of-the-art results on abstractive dialogue, question answering, and summarization tasks (with gains of up to 6 ROUGE), and provides a 1.1 BLEU increase over a back-translation system for machine translation.
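
The denoising objective is easy to state in code. Below is a minimal Python sketch of the text-infilling corruption: span lengths are drawn from a Poisson(λ = 3) distribution and each span is replaced with a single mask token, with roughly 30% of tokens corrupted, matching the configuration the paper reports. The function name, the `<mask>` string, and the per-position sampling rule are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def text_infilling(tokens, mask_token="<mask>", mask_ratio=0.3,
                   poisson_lambda=3.0, seed=0):
    """Corrupt a token list with BART-style text infilling: replace
    sampled spans (lengths ~ Poisson(poisson_lambda)) with a single
    mask token, until roughly mask_ratio of the tokens are corrupted.
    A length-0 span inserts a mask without removing any token."""
    rng = np.random.default_rng(seed)
    budget = int(round(len(tokens) * mask_ratio))  # tokens left to corrupt
    out, i = [], 0
    while i < len(tokens):
        if budget > 0 and rng.random() < mask_ratio:
            span = int(rng.poisson(poisson_lambda))
            out.append(mask_token)   # the whole span collapses to one mask
            i += span                # skip the tokens covered by the span
            budget -= max(span, 1)   # length-0 spans still spend budget
        else:
            out.append(tokens[i])
            i += 1
    return out

# Training pairs for the denoising objective: the model sees the
# corrupted sequence and learns to reconstruct the original one.
original = "the quick brown fox jumps over the lazy dog".split()
print(text_infilling(original))
```

Because a length-0 span still inserts a mask token, the model must also learn how many tokens each mask stands for, which the paper credits for text infilling's strong performance relative to simple token masking.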