
I don’t quite follow. Just because a tool has the potential for misuse, doesn’t make it not a tool.

Anthropomorphizing LLMs, on the other hand, clearly leads to a multitude of evident problems.

Or is your focus on the "just" part of the statement? With that I very much agree. Genuinely asking to understand; I'm not a native speaker.



When you have "a tool" that's capable of carrying out complex long-term tasks, and also capable of who knows what undesirable behaviors?

It's no longer "just a tool".



