I feel like the interface in this case has caused us to fool ourselves into thinking there's more there than there is.

Before 2022 (i.e., most of history), if you had a long, seemingly sensible conversation with something, you could safely assume the other party was a real, thinking human mind.

It's like a duck call.

Edit: I want to add that, because this is a neural net trained to output sensible text, language isn't just the interface.

Unlike a website, there's no separation between layers: with LLMs, the back end and front end are all one blob.
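
To make that concrete, here's a toy sketch in Python (a bigram chain, not a real neural net; the corpus and all names are made up for illustration). The point it shows: the text generator is the entire system. There's no separate backend the words are reporting on; sampling the next token is all there is.

    import random
    from collections import defaultdict

    # Toy next-token generator: the output text and the "model" are the
    # same object. No separate reasoning layer sits behind the words.
    corpus = ("the duck call sounds like a duck but the duck call "
              "is not a duck").split()

    # Count which word tends to follow which (bigram transitions).
    transitions = defaultdict(list)
    for a, b in zip(corpus, corpus[1:]):
        transitions[a].append(b)

    def generate(start, n_tokens=10):
        word, out = start, [start]
        for _ in range(n_tokens):
            followers = transitions.get(word)
            if not followers:
                break
            word = random.choice(followers)  # sampling is the whole system
            out.append(word)
        return " ".join(out)

    print(generate("the"))  # e.g. "the duck call is not a duck ..."

The output looks duck-shaped because the training data was duck-shaped, not because anything behind it is thinking about ducks.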

Edit 2: seems I have upset the ducks who think the duck call is a real duck.
