
There are a lot of misconceptions here, but LLMs and Stable Diffusion have spat out copyrighted material verbatim.

So that's not accurate.



What is not accurate? They are still not storing any material internally, even if the patterns they have learned can cause them to output copyrighted material verbatim. People need to break out of the mental model that an LLM is just a bunch of pointers fetching data from an internal data store.
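
To make that concrete, here's a minimal sketch (PyTorch, not from this thread) of the distinction: a toy character-level model whose entire state is a few weight tensors, with no lookup table holding the text, yet after overfitting on a single training string, greedy decoding reproduces that string verbatim. The model shape, context length, and step count are arbitrary choices for illustration.

    import torch
    import torch.nn as nn

    text = "the quick brown fox jumps over the lazy dog"
    chars = sorted(set(text))
    stoi = {c: i for i, c in enumerate(chars)}
    itos = {i: c for c, i in stoi.items()}
    data = torch.tensor([stoi[c] for c in text])

    CTX = 8  # long enough that every context in this string is unique

    class TinyLM(nn.Module):
        """Predicts the next character from the previous CTX characters.
        Its state is just an embedding table and a linear head: weights,
        not a copy of the training text."""
        def __init__(self, vocab, dim=32):
            super().__init__()
            self.emb = nn.Embedding(vocab, dim)
            self.head = nn.Linear(dim * CTX, vocab)
        def forward(self, x):            # x: (batch, CTX)
            e = self.emb(x)              # (batch, CTX, dim)
            return self.head(e.flatten(1))

    # Build (context, next-char) training pairs from the one string.
    X = torch.stack([data[i:i + CTX] for i in range(len(data) - CTX)])
    y = data[CTX:]

    model = TinyLM(len(chars))
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    for _ in range(1000):  # overfit deliberately, i.e. memorize
        opt.zero_grad()
        loss = nn.functional.cross_entropy(model(X), y)
        loss.backward()
        opt.step()

    # Greedy decoding from the first CTX characters regenerates the
    # training string verbatim, even though nothing in the model is
    # a stored copy of it.
    out = data[:CTX].tolist()
    for _ in range(len(data) - CTX):
        logits = model(torch.tensor([out[-CTX:]]))
        out.append(int(logits.argmax()))
    print("".join(itos[i] for i in out))

Both sides of the argument are visible here: the model really is "just weights" rather than a data store, and it really can emit its training data verbatim anyway.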


Have a read through the other comments in this thread; you'll see some good examples.



