
The LLM is always operating as designed. All LLM outputs are "hallucinations".

The LLM is always operating as designed, but humans call its outputs "hallucinations" when they don't align with factual reality, regardless of why that happens or whether it should be considered a bug or a feature. (I don't like the term much, by the way, but at this point it's a de facto standard.)
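To make the mechanism concrete, here is a minimal sketch of temperature sampling, the standard decoding step, using a toy vocabulary and made-up logits (plain NumPy, not any real model's API): the model draws the next token from a probability distribution, and that code path is identical whether the resulting statement turns out to be true or false.

    import numpy as np

    # Toy vocabulary and made-up next-token scores for the prompt
    # "The capital of France is ..." -- purely illustrative values.
    vocab = ["Paris", "Lyon", "Berlin", "Rome"]
    logits = np.array([3.2, 1.1, 0.9, 0.4])

    def sample_next_token(logits, temperature=1.0, rng=None):
        """Softmax over temperature-scaled logits, then draw one token index."""
        rng = rng or np.random.default_rng()
        scaled = logits / temperature
        probs = np.exp(scaled - scaled.max())  # subtract max for numerical stability
        probs /= probs.sum()
        return rng.choice(len(probs), p=probs)

    # Nothing here consults factual reality: "Berlin" comes out of exactly the
    # same procedure as "Paris", just with lower probability.
    for _ in range(5):
        print(vocab[sample_next_token(logits, temperature=1.5)])

Run it a few times and the occasional "Berlin" is not a malfunction; it is the same sampling procedure working as designed.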


