Hacker News

"Hallucination" has always seemed like a misnomer to me anyway considering LLMs don't know anything. They just impressively get things right enough to be useful assuming you audit the output.

If anything, I think all of their output should be called a hallucination.





We don't know if anything knows anything because we don't know what knowing is.

On the other hand, once you're operating under the model of not knowing if anything knows anything, there's really no point in posting about it here, is there?

This is just something that sounds profound but really isn’t.

Knowing is actually the easiest part to define and explain. Intelligence / understanding is much more difficult to define.


I took a semester-long 500-level class back in college on the theory of knowledge. It is not easy to define; the entire branch of epistemology in philosophy deals with that question.

... To that end, I'd love to be able to revisit my classes from back then (computer science, philosophy (two classes from a double major), and a smattering of linguistics) in light of today's technologies.


Others have suggested "bullshit". A bullshitter does not care (and may not know) whether what they say is truth or fiction. A bullshitter's goal is just to be listened to and seem convincing.

The awareness of the bullshitter is used to differentiate between 'hard' and 'soft' bullshit. https://eprints.gla.ac.uk/327588/1/327588.pdf

> "Hallucination" has always seemed like a misnomer to me anyway considering LLMs don't know anything. They just impressively get things right enough to be useful assuming you audit the output.

If you pick up a dictionary and review the definition of "hallucination", you'll see something along the lines of "something that you see, hear, feel or smell that does not exist".

https://dictionary.cambridge.org/dictionary/english/hallucin...

Your own personal definition arguably reinforces the standard definition of hallucination. Models don't always get things right. Why? Because their output can contradict the content covered by their corpus, producing things that don't exist, were never referenced in it, or outright conflict with factual content.

> If anything, I think all of their output should be called a hallucination.

No. Only the outputs that conflict with reality, namely with factual information.

Hence the term hallucination.



