Glossary

Hallucination

Reviewed 20 March 2026 · Canonical definition

A hallucination occurs when an AI model generates information that sounds plausible but is factually incorrect or fabricated. In agentic systems, hallucinated tool calls, data references, or decisions can trigger real-world consequences.
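One common mitigation in agentic systems is to validate every model-proposed tool call before executing it, so an invented tool name or malformed argument set is rejected rather than acted on. The sketch below is a minimal, hypothetical example: the tool names, argument schemas, and `validate_tool_call` helper are all illustrative assumptions, not a specific framework's API.

```python
import json

# Hypothetical allowlist: tools the agent may call, with required argument keys.
ALLOWED_TOOLS = {
    "get_weather": {"city"},
    "send_email": {"to", "subject", "body"},
}

def validate_tool_call(raw: str):
    """Return the parsed call if valid, else None.

    Rejects hallucinated calls: malformed JSON, unknown tool names,
    or argument sets that do not match the tool's schema.
    """
    try:
        call = json.loads(raw)
    except json.JSONDecodeError:
        return None  # model emitted malformed JSON
    name = call.get("name")
    args = call.get("arguments", {})
    if name not in ALLOWED_TOOLS:
        return None  # model invented a tool that does not exist
    if set(args) != ALLOWED_TOOLS[name]:
        return None  # argument names do not match the schema
    return call

# A hallucinated call referencing a non-existent tool is rejected:
validate_tool_call('{"name": "delete_database", "arguments": {}}')  # → None
```

Checks like this do not prevent hallucination itself, but they confine its consequences: a fabricated call fails validation instead of reaching a system with real-world effects.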