Embedding space

The multi-dimensional vector space where complex, high-dimensional data (e.g., words, images, users) are mapped to dense numerical vectors, positioning semantically similar items closer together for efficient processing.

This is the core engine for modern AI: an N-dimensional space that transforms discrete data into continuous vectors. The process uses models like Word2Vec or BERT to project entities—a word, a user profile, an image—into a space of, say, 512 or 768 dimensions. The key is geometric proximity: vectors for 'king' and 'queen' are closer than 'king' and 'banana', enabling vector arithmetic like the classic analogy king – man + woman ≈ queen. This structure powers critical applications: semantic search, personalized recommendations (Netflix, Spotify), and advanced Natural Language Understanding (NLU). It converts raw information into a quantifiable, relational format that machine learning algorithms can process at scale.
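The two properties above — geometric proximity and vector arithmetic — can be sketched with a few hand-crafted toy vectors. These 4-dimensional embeddings are invented for illustration only (real models like Word2Vec or BERT learn hundreds of dimensions from data); the point is how cosine similarity measures closeness and how the analogy falls out of simple addition:

```python
import math

# Toy 4-dimensional embeddings, hand-crafted for illustration.
# Real embedding models learn these vectors from large corpora.
embeddings = {
    "king":   [0.9, 0.8, 0.1, 0.0],
    "queen":  [0.9, 0.1, 0.8, 0.0],
    "man":    [0.1, 0.9, 0.1, 0.0],
    "woman":  [0.1, 0.1, 0.9, 0.0],
    "banana": [0.0, 0.0, 0.1, 0.9],
}

def cosine(a, b):
    """Cosine similarity: 1.0 = same direction, 0.0 = orthogonal."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Geometric proximity: 'king' sits closer to 'queen' than to 'banana'.
assert cosine(embeddings["king"], embeddings["queen"]) > \
       cosine(embeddings["king"], embeddings["banana"])

# Vector arithmetic: king - man + woman lands nearest to queen.
target = [k - m + w for k, m, w in
          zip(embeddings["king"], embeddings["man"], embeddings["woman"])]
candidates = [w for w in embeddings if w not in {"king", "man", "woman"}]
nearest = max(candidates, key=lambda w: cosine(target, embeddings[w]))
print(nearest)  # → queen
```

In production, the same nearest-neighbor search runs over millions of learned vectors, typically with an approximate index rather than an exhaustive scan.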

https://developers.google.com/machine-learning/glossary/embedding-space