ESM-1b
Meta AI's 650-million-parameter transformer model, trained on 250 million protein sequences to predict structure and function from amino-acid patterns.
ESM-1b is a milestone in protein language modeling: a BERT-style transformer trained with masked language modeling on sequences from the UniRef50 database. Without explicit supervision, the model's representations capture biological properties such as secondary structure, residue contacts, and thermal stability. Researchers use its high-dimensional embeddings to accelerate drug discovery and protein engineering, and on several structure-related benchmarks, including tasks drawn from the Critical Assessment of Structure Prediction (CASP), these learned representations have outperformed traditional alignment-based methods.