
LLMs do not lie. That implies agency and intentionality that they do not have.

LLMs are approximately right. That means they're sometimes wrong, which sucks. But they can do things for which no 100% accurate tool exists, and perhaps no such tool could ever exist. So take it or leave it.





There's no way to know in advance whether "approximately right" will be good enough for a given task, and no way to gauge how accurate the output is without checking it yourself, so you have to babysit it. "Can do things" is carrying a lot of load in your statement. It's like it builds you a car with no brakes, you tell it not to do that, and it builds you one with no accelerator either.

>That implies agency and intentionality that they do not have.

No, but the companies behind them have agency. LLMs lie, and they only get fixed when the companies get sued. Close enough.



