T5
Google's Text-to-Text Transfer Transformer unifies all NLP tasks into a single sequence-to-sequence framework.
T5 reframes every NLP problem (summarization, translation, classification, and more) as a text-generation task. Built on the standard encoder-decoder Transformer architecture and pre-trained on the 745GB Colossal Clean Crawled Corpus (C4), it uses a task prefix prepended to the input to select the desired function. A single 11-billion-parameter model can switch from 'translate English to German' to 'summarize' with no architectural changes. This unification simplifies the NLP pipeline while achieving state-of-the-art results on the GLUE and SuperGLUE benchmarks.
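The task-prefix idea can be sketched as a simple string transformation: every task becomes a plain (prefixed input, target text) pair fed to the same model. A minimal illustration, assuming the prefix strings shown in the T5 paper; the helper function itself is hypothetical, not part of any library:

```python
# Sketch of T5's text-to-text framing: each task is selected purely by a
# textual prefix on the input, with no change to the model architecture.
# The prefix wording follows examples from the T5 paper; to_text_to_text
# is an illustrative helper, not a real library function.

PREFIXES = {
    "summarize": "summarize: ",
    "translate_en_de": "translate English to German: ",
    "cola": "cola sentence: ",  # GLUE linguistic-acceptability task
}

def to_text_to_text(task: str, text: str) -> str:
    """Prepend the prefix that tells the model which task to perform."""
    return PREFIXES[task] + text

print(to_text_to_text("translate_en_de", "That is good."))
# -> translate English to German: That is good.
```

In practice the same prefixed strings are passed to a pre-trained T5 checkpoint (for example via a sequence-to-sequence generation API), and the model emits the answer as ordinary text, whether that answer is a summary, a translation, or a class label.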