
Embeddings

Embeddings are dense, low-dimensional vectors that translate complex data (text, images, audio) into a mathematical space, enabling machines to process and understand semantic relationships.

Embeddings are a core machine learning technique for transforming discrete data (such as words or images) into continuous, dense vectors of real numbers. This vectorization maps objects into a shared vector space where proximity correlates with semantic similarity: the closer two vectors are, the more related the objects they represent. For example, a BERT model generates a 768-dimensional vector for a sentence, capturing its context. Embeddings are foundational for AI applications, including semantic search, where a query vector retrieves document vectors that are mathematically near it, and recommendation systems, which use vector distance to suggest similar items. The vectors themselves are learned by neural networks (e.g., Word2Vec, GloVe, BERT), which automatically capture complex, nuanced relationships in the data.
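The nearest-vector idea behind semantic search can be sketched in a few lines. The toy 4-dimensional vectors below are hand-picked for illustration (a real model such as BERT would produce learned, much higher-dimensional embeddings); similarity is measured with cosine similarity, a common choice for comparing embeddings:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors: 1.0 means identical direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy "embeddings" (hypothetical values for illustration only).
embeddings = {
    "cat": np.array([0.9, 0.8, 0.1, 0.0]),
    "dog": np.array([0.8, 0.9, 0.2, 0.1]),
    "car": np.array([0.1, 0.0, 0.9, 0.8]),
}

def nearest(query: str) -> str:
    """Semantic search in miniature: return the most similar other item."""
    q = embeddings[query]
    others = {k: v for k, v in embeddings.items() if k != query}
    return max(others, key=lambda k: cosine_similarity(q, others[k]))

print(nearest("cat"))  # -> "dog": its vector points in a similar direction
```

Production systems apply the same principle at scale, typically with approximate nearest-neighbor indexes rather than an exhaustive comparison.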

https://www.ibm.com/topics/embedding
