Hallucination


A hallucination occurs when an AI tool produces information that is false, misleading, or entirely fabricated. These errors often sound convincing but are not grounded in factual data. For example, an AI tool like Copilot might invent sources, misstate facts, or confidently give incorrect answers. Hallucinations happen because AI systems generate responses based on statistical patterns in their training data, not verified truth. Users should fact-check AI outputs, especially in educational or professional settings, where accuracy and credibility are essential.