Glossary

Embedding

Also known as: Vector embedding, Text embedding

Definition

An embedding is a high-dimensional vector that represents a piece of content (for example a sentence or a document) as a point in a continuous space. Embedding models are trained so that semantically similar inputs end up close together (for example at a small cosine distance). Embeddings are the foundation for semantic search, retrieval in RAG systems, clustering, and recommender systems. Typical dimensionalities range from 384 to 4,096.
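The notion of "small cosine distance" can be made concrete with a short sketch. The vectors below are hand-made toy examples, not output of a real embedding model; real embeddings have hundreds to thousands of dimensions.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity: 1.0 for identical directions, near 0.0 for unrelated."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy 4-dimensional "embeddings" (real models use 384 to 4,096 dimensions).
cat = np.array([0.9, 0.1, 0.0, 0.2])
kitten = np.array([0.8, 0.2, 0.1, 0.3])
invoice = np.array([0.0, 0.9, 0.8, 0.1])

print(cosine_similarity(cat, kitten))   # semantically close: high similarity
print(cosine_similarity(cat, invoice))  # unrelated: much lower similarity
```

Cosine distance is simply 1 minus cosine similarity, so "close" inputs have similarity near 1.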

How Swiss Knowledge Hub uses this term

Swiss Knowledge Hub generates an embedding for every chunk and stores it in the configured vector store (LanceDB, Pinecone, or ChromaDB). At query time, the question is embedded as well, and the semantically nearest chunks are passed as context to the language model.
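The retrieval step described above can be sketched as a nearest-neighbor lookup. This is a minimal brute-force illustration, not Swiss Knowledge Hub's actual implementation; the chunk texts and vectors are invented, and a real vector store (LanceDB, Pinecone, or ChromaDB) would use an index instead of a full scan.

```python
import numpy as np

# Hand-made unit-length "chunk embeddings", standing in for what the
# vector store holds; a real embedding model would produce these.
chunk_texts = ["chunk about cats", "chunk about invoices", "chunk about kittens"]
chunk_vecs = np.array([
    [0.97, 0.10, 0.22],
    [0.05, 0.99, 0.13],
    [0.90, 0.20, 0.39],
])
chunk_vecs /= np.linalg.norm(chunk_vecs, axis=1, keepdims=True)

def top_k(query_vec: np.ndarray, k: int = 2) -> list[str]:
    """Return the k chunks nearest to the query by cosine similarity."""
    q = query_vec / np.linalg.norm(query_vec)
    scores = chunk_vecs @ q            # dot product == cosine for unit vectors
    order = np.argsort(scores)[::-1][:k]
    return [chunk_texts[i] for i in order]

# A cat-like query embedding retrieves the two cat-related chunks,
# which would then be passed as context to the language model.
context = top_k(np.array([0.95, 0.12, 0.28]))
```

In production the same idea runs inside the vector store: the query embedding is compared against all chunk embeddings, and only the top-scoring chunks reach the model's context window.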

Sources

  1. Wikipedia: Word embedding (https://en.wikipedia.org/wiki/Word_embedding)

Last updated: April 22, 2026