AI Dictionary
Hallucination
Definition
A phenomenon where an AI generates output that is confident but factually incorrect or nonsensical.
Deep Dive
In the context of Artificial Intelligence, particularly with large language models (LLMs) and generative AI, "hallucination" refers to a phenomenon where the AI generates output that is confident and often plausible-sounding but is factually incorrect, nonsensical, or entirely made up. This isn't a simple error or inaccuracy; rather, it's the AI presenting fabricated information as truth, often without any grounding in its training data or the real world. Hallucinations can range from subtle inaccuracies to wildly fictitious narratives.
Examples & Use Cases
- A chatbot providing a detailed historical account of an event that never happened, citing non-existent research papers or fictitious individuals
- An image generation AI creating an image of a person with an extra limb, distorted facial features, or an anatomically impossible body part, despite the prompt being for a normal human
- An AI coding assistant generating code that looks syntactically correct but contains logical flaws, references non-existent libraries, or provides functions that don't solve the intended problem, leading to runtime errors or incorrect outputs
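One practical defense against the third case is to verify that every library an AI-generated snippet imports actually exists in the current environment before running it. The sketch below uses Python's standard `importlib.util.find_spec`; the module name `fancy_ai_toolkit` is a made-up stand-in for a hallucinated import, not a real package.

```python
import importlib.util

def check_imports(module_names):
    """Report which module names resolve to installed modules.

    A lightweight guard against hallucinated library references:
    before executing AI-generated code, confirm each imported
    module can actually be found in the current environment.
    """
    results = {}
    for name in module_names:
        # find_spec returns None when no installed module matches the name
        results[name] = importlib.util.find_spec(name) is not None
    return results

# "fancy_ai_toolkit" is a fictitious name illustrating a hallucinated import
print(check_imports(["json", "fancy_ai_toolkit"]))
```

This only checks that a module exists, not that the generated code uses its API correctly, so it catches fabricated libraries but not fabricated functions within real ones.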
Related Terms
- Factual Error
- Bias in AI
- AI Ethics