LLMs hallucinate more when fine-tuned on new factual knowledge: they fit facts absent from pretraining more slowly than facts consistent with what they already know, and hallucination rises once those new facts are eventually learned
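One way to see the known/new split concretely is to probe, before fine-tuning, whether the base model already produces each fact. A minimal sketch, assuming GPT-2 via Hugging Face transformers; the QA pairs and the greedy substring check are illustrative placeholders, not the original paper's exact categorization protocol.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical fine-tuning facts: (prompt, gold answer) pairs.
FACTS = [
    ("The capital of France is", "Paris"),                # likely already known
    ("The Zorblax-7 satellite was launched in", "2091"),  # invented, i.e. new
]

tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2").eval()

def already_known(prompt: str, answer: str) -> bool:
    # Greedy-decode a short continuation and check whether the gold answer
    # appears; a crude stand-in for a sampling-based Known/Unknown split.
    ids = tok(prompt, return_tensors="pt").input_ids
    with torch.no_grad():
        out = model.generate(ids, max_new_tokens=8, do_sample=False,
                             pad_token_id=tok.eos_token_id)
    continuation = tok.decode(out[0, ids.shape[1]:])
    return answer.lower() in continuation.lower()

for prompt, answer in FACTS:
    bucket = "known" if already_known(prompt, answer) else "new"
    print(f"[{bucket}] {prompt!r} -> {answer!r}")
```

Tracking the two buckets separately during fine-tuning is what exposes the slower learning curve on the "new" bucket.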
Bigger AI chatbots are more inclined to spew nonsense: larger, more instruction-tuned models answer a greater share of questions, but give plausible-sounding wrong answers more often instead of avoiding questions beyond their ability
Masking retrieval heads (or the relevant induction heads) can induce hallucinations: ablating the attention heads that copy facts from the context removes the model's pathway for grounding its output, as in the sketch below
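A minimal sketch of this kind of head ablation, assuming GPT-2 via Hugging Face transformers; the layer/head indices are arbitrary placeholders, not heads identified as retrieval heads in the original work.

```python
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

model = GPT2LMHeadModel.from_pretrained("gpt2").eval()
tok = GPT2Tokenizer.from_pretrained("gpt2")

LAYER, HEAD = 5, 1  # hypothetical head to ablate
head_dim = model.config.n_embd // model.config.n_head

def ablate_head(module, inputs):
    # c_proj's input is the concatenation of all head outputs:
    # (batch, seq, n_head * head_dim). Zeroing the slice for HEAD
    # removes exactly that head's contribution to the residual stream.
    hidden = inputs[0].clone()
    hidden[..., HEAD * head_dim:(HEAD + 1) * head_dim] = 0.0
    return (hidden,)

handle = model.transformer.h[LAYER].attn.c_proj.register_forward_pre_hook(ablate_head)

prompt = "The Eiffel Tower is located in"
ids = tok(prompt, return_tensors="pt").input_ids
with torch.no_grad():
    out = model.generate(ids, max_new_tokens=10, do_sample=False,
                         pad_token_id=tok.eos_token_id)
print(tok.decode(out[0]))

handle.remove()  # restore normal behavior
```

Hooking before c_proj rather than on the mixed attention output is what makes the ablation clean: after the projection, all heads' contributions are entangled in the hidden dimension.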