
Technology

T5

Google's Text-to-Text Transfer Transformer unifies all NLP tasks into a single sequence-to-sequence framework.

T5 redefines NLP by treating every problem (summarization, translation, classification, and more) as a text-generation task. Built on the standard encoder-decoder Transformer architecture and pre-trained on the 745GB Colossal Clean Crawled Corpus (C4), it uses task prefixes to select the task: a single 11-billion-parameter model switches from "translate English to German" to "summarize" with no changes to the architecture. This approach simplifies the NLP pipeline while achieving state-of-the-art performance on the GLUE and SuperGLUE benchmarks.
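The prefix convention can be sketched in a few lines. This is a minimal illustration, not code from the T5 release: `make_t5_input` is a hypothetical helper, though the prefixes shown are the ones used in the T5 paper.

```python
# Minimal sketch of T5's text-to-text convention: every task is expressed
# as "prefix: input text" -> generated output text, so one model serves
# all tasks. `make_t5_input` is an assumed helper for illustration.

def make_t5_input(prefix: str, text: str) -> str:
    """Prepend a task prefix, which is how T5 selects the task."""
    return f"{prefix}: {text}"

# The same model handles different tasks purely via the prefix:
translation_input = make_t5_input("translate English to German",
                                  "The house is wonderful.")
summary_input = make_t5_input("summarize",
                              "T5 casts every NLP problem as text generation.")

print(translation_input)  # translate English to German: The house is wonderful.
print(summary_input)      # summarize: T5 casts every NLP problem as text generation.
```

Strings like these are fed directly to the model's tokenizer; because both input and output are plain text, no task-specific heads or output layers are needed.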

https://arxiv.org/abs/1910.10683