
Embedding

Also known as: Vector Embedding
A way to represent text (or images, or audio) as a list of numbers (a vector) that captures its meaning. Similar concepts end up close together in this number space — "cat" and "kitten" are nearby, while "cat" and "economics" are far apart.
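"Close together" is usually measured with cosine similarity: the cosine of the angle between two vectors, where 1.0 means they point the same way. A minimal sketch, using made-up 4-dimensional toy vectors (real embedding models produce vectors with hundreds or thousands of dimensions):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 = same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Illustrative toy embeddings, not output from any real model.
cat       = [0.8, 0.6, 0.1, 0.0]
kitten    = [0.7, 0.7, 0.2, 0.1]
economics = [0.0, 0.1, 0.9, 0.8]

print(cosine_similarity(cat, kitten))     # high: related concepts
print(cosine_similarity(cat, economics))  # low: unrelated concepts
```

The actual numbers in a real embedding come from a trained model; the point here is only that related concepts score high and unrelated ones score low.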

Why it matters

Embeddings are the foundation of semantic search and RAG. They're how AI understands that a search for "fix login bug" should match a document about "authentication error resolution" even though no words overlap.
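That search works by comparing vectors rather than words. A minimal sketch of the idea, using hypothetical precomputed embeddings (a real system would obtain these from an embedding model):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

# Hypothetical document embeddings (illustrative values only).
docs = {
    "authentication error resolution": [0.9, 0.4, 0.1],
    "quarterly revenue report":        [0.1, 0.2, 0.9],
}
query_vec = [0.8, 0.5, 0.2]  # pretend embedding of "fix login bug"

# Rank documents by similarity to the query; no word overlap is needed.
ranked = sorted(docs, key=lambda d: cosine(query_vec, docs[d]), reverse=True)
print(ranked[0])  # the authentication document ranks first
```

RAG systems do the same thing at scale: embed every chunk of a document collection once, embed each incoming query, and retrieve the nearest chunks to feed into the model.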
