
sentence-transformers

Sentence-Transformers (SBERT) is a Python framework for generating high-quality dense vector embeddings for sentences, paragraphs, and images.

Sentence-Transformers addresses the quadratic cost of using a standard BERT cross-encoder for pairwise tasks such as semantic search: comparing $N$ sentences requires $N(N-1)/2$ inference passes, which is prohibitive at scale. Instead, the framework fine-tunes Transformer models (e.g., BERT, RoBERTa) in a Siamese or triplet network structure so that each sentence is encoded once into a fixed-size, semantically meaningful vector; similarity then reduces to fast vector comparisons such as cosine similarity. This makes key applications practical at scale: semantic search, clustering, semantic textual similarity, and paraphrase mining. The library also supports reranker (cross-encoder) models, provides access to over 10,000 pre-trained models on Hugging Face, and allows straightforward fine-tuning for custom use cases.

https://www.sbert.net/
17 projects · 13 cities

