
BART

BART is a denoising autoencoder that combines a bidirectional encoder with an autoregressive decoder for text generation and comprehension tasks.

Developed by Facebook AI Research (FAIR), BART (Bidirectional and Auto-Regressive Transformers) bridges the gap between BERT-style understanding and GPT-style generation. It uses a standard sequence-to-sequence Transformer architecture with a 12-layer encoder and 12-layer decoder (in the large variant), trained to reconstruct original text from corrupted inputs. By applying noising schemes such as sentence permutation and text infilling, the model learns contextual representations that transfer well to abstractive summarization and machine translation. On the CNN/Daily Mail summarization dataset, BART achieved a 2.12 ROUGE-L gain over previous benchmarks, a state-of-the-art result at the time of publication.

https://arxiv.org/abs/1910.13461
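The pretraining idea above can be sketched in plain Python. This is a simplified illustration of two of BART's noising schemes, not the paper's exact procedure: real text infilling masks Poisson-sampled spans with a single `<mask>` token inside a subword vocabulary, whereas here single whitespace tokens are masked; the function names and ratios are illustrative assumptions.

```python
import random

MASK = "<mask>"

def mask_tokens(tokens, ratio=0.3, rng=None):
    # Simplified text-infilling-style corruption: replace a fraction of
    # tokens with a mask symbol. (BART proper masks contiguous spans whose
    # lengths are drawn from a Poisson distribution.)
    rng = rng or random.Random(0)
    n = max(1, int(len(tokens) * ratio))
    masked_positions = set(rng.sample(range(len(tokens)), n))
    return [MASK if i in masked_positions else t
            for i, t in enumerate(tokens)]

def shuffle_sentences(sentences, rng=None):
    # Sentence permutation: present sentences in random order; the
    # decoder must regenerate the document in its original order.
    rng = rng or random.Random(0)
    shuffled = sentences[:]
    rng.shuffle(shuffled)
    return shuffled

# Corrupted input for the encoder; the decoder's target is the clean text.
doc = ["the cat sat down .", "it purred loudly .", "then it slept ."]
corrupted = [" ".join(mask_tokens(s.split())) for s in shuffle_sentences(doc)]
```

During pretraining, the encoder reads `corrupted` while the decoder is trained autoregressively to emit the uncorrupted `doc`, which is what lets the same model later generate fluent summaries conditioned on full input documents.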
