What is a hallucination?
An AI hallucination occurs when a large language model (LLM) generates false or fabricated information and presents it as fact. Hallucinations largely stem from three causes: poor training data quality, the model's training and generation methods, and unclear or ambiguous input context.
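The generation-method cause can be illustrated with a toy example. LLMs pick each next token by sampling from a probability distribution, and decoding settings such as temperature flatten that distribution, raising the odds of selecting a low-probability (and possibly incorrect) token. The sketch below is a minimal illustration with made-up numbers, not any real model's decoding code; the logits, temperature value, and "correct token at index 0" setup are all assumptions for demonstration.

```python
import math
import random

def sample_with_temperature(logits, temperature, rng):
    """Sample a token index from logits after temperature scaling.

    Higher temperatures flatten the distribution, making unlikely
    tokens more probable -- one generation-time source of hallucinations.
    """
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    r = rng.random()
    cum = 0.0
    for i, p in enumerate(probs):
        cum += p
        if r < cum:
            return i
    return len(probs) - 1

# Toy next-token distribution: pretend index 0 is the factually
# correct token and the model strongly prefers it.
logits = [5.0, 1.0, 0.5]
rng = random.Random(0)

greedy = logits.index(max(logits))  # low temperature behaves greedily
hot = [sample_with_temperature(logits, 5.0, rng) for _ in range(1000)]
wrong_rate = sum(1 for s in hot if s != 0) / len(hot)
print(greedy, round(wrong_rate, 2))
```

Even though the correct token dominates the raw distribution, a high temperature makes the model pick one of the other tokens a large fraction of the time, which is one mechanistic path to confidently stated falsehoods.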