
Glossary

Hallucination

Also known as: AI hallucination, Confabulation

Definition

In language AI, a hallucination (sometimes called a confabulation) is an output that is fluent and convincingly worded yet factually wrong, misleading, or fabricated. Common causes include gaps in the training data, probabilistic text generation without access to ground-truth facts, and ambiguous or imprecise prompts. Common mitigations include retrieval-augmented generation (RAG) with mandatory citations, confidence estimates, human review steps, and structured outputs.

How Swiss Knowledge Hub uses this term

Swiss Knowledge Hub reduces hallucinations by binding every answer to retrieved source chunks and surfacing the page number and source link alongside it. When relevant evidence is missing, the system prefers an explicit "I don't know" response over free-form generation.
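This evidence-gating pattern can be sketched in a few lines. The code below is a minimal illustration of the idea, not Swiss Knowledge Hub's actual implementation; the names `Chunk`, `answer_with_citations`, and the stub `generate` function are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Chunk:
    """A retrieved source passage with its provenance (illustrative)."""
    text: str
    source_url: str
    page: int

def answer_with_citations(question, chunks, generate, min_chunks=1):
    """Evidence-gated answering: refuse rather than generate freely.

    If too few relevant chunks were retrieved, return an explicit
    "I don't know" instead of letting the model improvise.
    """
    if len(chunks) < min_chunks:
        return {"answer": "I don't know", "citations": []}
    answer = generate(question, [c.text for c in chunks])
    return {
        "answer": answer,
        "citations": [
            {"source": c.source_url, "page": c.page} for c in chunks
        ],
    }

# Stub generator standing in for an LLM call (hypothetical).
result = answer_with_citations(
    "What is the notice period?",
    [Chunk("Notice period is 30 days.", "https://example.com/contract.pdf", 4)],
    generate=lambda q, ctx: f"Per the source: {ctx[0]}",
)
```

The key design choice is that citations are attached from the retrieved chunks themselves, never generated by the model, so every answer can be traced back to a page and source link.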

Sources

  1. Wikipedia: Hallucination (artificial intelligence), https://en.wikipedia.org/wiki/Hallucination_(artificial_intelligence)

Last updated: April 22, 2026