It's harmful because, in this context, it leads to an incorrect conclusion. There's no reason to believe that LLMs' "averaging" behavior would cause a suicidal person to be "pulled toward normal."




It's a philosophical argument more than anything, I think. And it does raise the question: does your mind form itself around the humans (entities?) you converse with? If you talk with a lot of smart people, you'll end up a bit smarter yourself, and if you talk with a lot of dull people, you'll end up duller. If you agree with that, I can see how someone would believe that LLMs would pull people closer to the material they were trained on.


