BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension
We present BART, a denoising autoencoder for pretraining sequence-to-sequence models. BART is trained by (1) corrupting text with an arbitrary noising function, and (2) learning a model to reconstruct the original text.
https://arxiv.org/abs/1910.13461
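
Below is a minimal sketch of that corrupt-then-reconstruct objective, assuming the Hugging Face transformers library and the facebook/bart-base checkpoint. The simple span-masking noiser is an illustrative stand-in for the paper's noising functions (text infilling, sentence permutation, etc.), not the authors' training code.

```python
# Sketch of BART-style denoising pretraining: corrupt the input with a noising
# function, then compute the loss for reconstructing the original text.
# Assumption: Hugging Face `transformers` is installed and the
# `facebook/bart-base` checkpoint is available; the noiser below is a rough
# analogue of text infilling, for illustration only.
import random

from transformers import BartForConditionalGeneration, BartTokenizer

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")


def corrupt(text: str, mask_token: str, span_len: int = 3) -> str:
    """Replace one random contiguous span of words with a single mask token."""
    words = text.split()
    if len(words) <= span_len:
        return mask_token
    start = random.randrange(len(words) - span_len)
    return " ".join(words[:start] + [mask_token] + words[start + span_len:])


original = "BART is trained by corrupting text and learning to reconstruct it."
corrupted = corrupt(original, tokenizer.mask_token)

# The encoder sees the corrupted text; the labels are the original text,
# so the decoder is trained to reconstruct what the noise removed.
inputs = tokenizer(corrupted, return_tensors="pt")
labels = tokenizer(original, return_tensors="pt").input_ids

outputs = model(**inputs, labels=labels)
outputs.loss.backward()  # one denoising reconstruction step
print(f"reconstruction loss: {outputs.loss.item():.3f}")
```

In an actual pretraining loop this step would run over large batches of corrupted/original pairs with an optimizer update after each backward pass; the snippet only shows how the denoising loss is formed.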