Deep Learning Summary
Rudi
2020. 8. 27. 19:48
Text Summarization
BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension
We present BART, a denoising autoencoder for pretraining sequence-to-sequence models. BART is trained by (1) corrupting text with an arbitrary noising function, and (2) learning a model to reconstruct the original text. It uses a standard Transformer-based neural machine translation architecture.
www.groundai.com
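
The denoising setup described in the abstract (corrupt the input text, then train the model to reconstruct the original) can be illustrated with a small sketch. The snippet below is a minimal, unofficial approximation of BART's text-infilling noiser, assuming whitespace tokens and a literal "<mask>" token; the paper samples span lengths from a Poisson distribution (lambda = 3) and also uses other corruptions (token deletion, sentence permutation) not shown here.

```python
# Minimal, unofficial sketch of BART-style text infilling (not the authors' code).
# Assumptions: whitespace tokenization and a literal "<mask>" token.
import numpy as np

MASK = "<mask>"

def corrupt(tokens, mask_ratio=0.3, poisson_lam=3.0, rng=None):
    """Replace randomly chosen spans with a single <mask> token each.

    Span lengths are drawn from Poisson(poisson_lam), as in the paper.
    Simplification: each masked span removes at least one token, whereas
    BART also allows 0-length spans that merely insert a mask.
    """
    rng = rng or np.random.default_rng(0)
    out, i = [], 0
    while i < len(tokens):
        if rng.random() < mask_ratio:
            span = max(1, int(rng.poisson(poisson_lam)))
            out.append(MASK)   # one mask token stands in for the whole span
            i += span          # skip the tokens covered by the span
        else:
            out.append(tokens[i])
            i += 1
    return out

# The seq2seq model is then trained to reconstruct `original` from `noisy`.
original = "the quick brown fox jumps over the lazy dog".split()
noisy = corrupt(original)
print(noisy)  # e.g. ['the', '<mask>', 'fox', 'jumps', '<mask>', 'lazy', 'dog']
```

During pre-training, the encoder reads the corrupted sequence and the autoregressive decoder is trained with a cross-entropy loss to regenerate the original document.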
ProphetNet: Predicting Future N-gram for Sequence-to-Sequence Pre-training