
Technology

ELMo

ELMo provides deep contextualized word representations from a bidirectional LSTM trained with a language modeling objective on a large text corpus.

Developed by the Allen Institute for AI (AI2) in 2018, ELMo (Embeddings from Language Models) generates vector embeddings that change based on a word's surrounding context. Unlike static embedding models such as word2vec, which assign each word a single fixed vector, ELMo runs a two-layer BiLSTM over the full sentence and can therefore distinguish different senses of the same word (e.g., 'bank' as a river edge versus a financial institution). Adding ELMo representations improved the state of the art on six major NLP benchmarks, including SQuAD and SNLI, by capturing both complex syntax and nuanced semantics.
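A key detail of how downstream tasks use ELMo is that the final embedding for a token is a learned, task-specific weighted combination of the model's internal layers: ELMo_k = γ · Σ_j s_j · h_{k,j}, where the s_j are softmax-normalized scalar weights. The sketch below illustrates just that combination step with toy vectors and weights (the layer values here are illustrative placeholders, not real model outputs):

```python
import math

def softmax(weights):
    """Normalize raw scalar weights into a probability distribution."""
    exps = [math.exp(w) for w in weights]
    total = sum(exps)
    return [e / total for e in exps]

def elmo_combine(layer_vectors, scalar_weights, gamma):
    """Collapse per-layer hidden states for one token into a single
    ELMo vector: gamma * sum_j s_j * h_j, with s = softmax(weights)."""
    s = softmax(scalar_weights)
    dim = len(layer_vectors[0])
    combined = [0.0] * dim
    for s_j, h_j in zip(s, layer_vectors):
        for i in range(dim):
            combined[i] += s_j * h_j[i]
    return [gamma * c for c in combined]

# Toy example: 3 layers (token embedding + 2 biLSTM layers), 4-dim vectors.
layers = [
    [1.0, 0.0, 0.0, 0.0],   # layer 0: context-independent token representation
    [0.0, 1.0, 0.0, 0.0],   # layer 1: first biLSTM layer (more syntactic)
    [0.0, 0.0, 1.0, 0.0],   # layer 2: second biLSTM layer (more semantic)
]
vec = elmo_combine(layers, scalar_weights=[0.0, 0.0, 0.0], gamma=1.0)
print(vec)  # equal raw weights -> each layer contributes 1/3
```

In practice the scalar weights and γ are learned per downstream task, which lets a syntax-heavy task (e.g., parsing) lean on lower layers while a semantics-heavy task (e.g., entailment) leans on upper ones.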

https://allennlp.org/elmo