One thing it can't really do is admit that it doesn't actually know stuff

That's because it "doesn't understand knowing". It's ultimately a search engine. When a smart, wise person knows something, they have something of a model of the limitations of their knowledge - indeed, a serious person's model of their knowledge grows along with the knowledge itself.

And certainly some people don't have anything like a model of their knowledge either. But the strength of humans shows when they do have this.



> It's ultimately a search engine.

Not even that. It's a make-stuff-up engine. Search engines are able to return "no results found".

The current incarnation of ChatGPT can tell you it doesn't know something, like the current net worth of Elon Musk, but it's definitely a canned answer that crudely patches over the underlying model's response, which is never empty.
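A minimal sketch of why the raw model can never come back empty: decoding is just sampling from a softmax over the vocabulary, and a softmax assigns nonzero probability to every token, so there is no "no results found" state. (The toy vocabulary and function names below are invented for illustration, not any real API.)

    import math, random

    def softmax(logits):
        # exp() of any finite logit is > 0, so every token keeps
        # some probability mass -- an empty answer isn't representable
        m = max(logits)
        exps = [math.exp(x - m) for x in logits]
        total = sum(exps)
        return [e / total for e in exps]

    def sample_next_token(logits, vocab):
        probs = softmax(logits)
        # random.choices always returns a token; the model is
        # structurally forced to say *something*
        return random.choices(vocab, weights=probs, k=1)[0]

    vocab = ["Paris", "London", "I", "don't", "know"]
    print(sample_next_token([2.0, 1.0, 0.1, 0.1, 0.1], vocab))

A search index, by contrast, can return an empty result set; a decoder-only language model has no analogous output.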


I'm not convinced that it's a canned response.


Try asking it anything about real-life people. You'll often get a sentence or two of generated text, but then a very similarly phrased excuse for why it can't know any better, as the "shepherd", however it is implemented, kicks in.
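Nobody outside OpenAI has published how that "shepherd" works; a minimal sketch of one way such a patch could be bolted on is a post-hoc filter that lets the first bit of the draft through and then splices in a template. Everything below - the trigger list, the template, and the function names - is invented for illustration:

    REFUSAL_TEMPLATE = ("As an AI language model, I don't have access "
                        "to real-time information about {topic}.")

    # Hypothetical trigger list -- the real system is not public
    SENSITIVE_TOPICS = ["net worth", "real life people", "current events"]

    def shepherd(prompt, draft_response):
        # Let a sentence or two of the draft through, then patch
        # over the rest with the canned excuse if a topic matches
        for topic in SENSITIVE_TOPICS:
            if topic in prompt.lower():
                first_sentence = draft_response.split(". ")[0]
                return first_sentence + ". " + REFUSAL_TEMPLATE.format(topic=topic)
        return draft_response

A filter like this would produce exactly the pattern described: a little genuine generated text followed by a near-identical boilerplate excuse each time.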



