I can't think of a single time that we've ever willingly put down a technology that a single person could deploy and appear to be highly productive. You may as well try to ban fire.

Looking at some of the most successful historical pushbacks against technology, taxes and compensation for displaced workers are about as much as we can expect.

Even trying to put restrictions on AI is going to be practically challenging. But the most basic restrictions, like mandating watermarks or some kind of tracing material, might be possible, and that alone might do a lot to mitigate the worst problems.



> But the most basic restrictions, like mandating watermarks or some kind of tracing material, might be possible, and that alone might do a lot to mitigate the worst problems.

Watermarking output -- anything detectable that is part of the structure of the primary output itself, whether text, an image (even if imperceptibly), or something else -- will make concealing use take a bit more effort, but people and tooling will adapt to it very quickly. Tracing material distinct from watermarking -- i.e., accompanying metadata that can be removed without any impact on the text, image, or whatever else the primary output is -- will do the same, but it is even easier to strip, and so will have less impact.
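To make the "even easier to strip" point concrete, here is a toy sketch (stdlib only, not any real provenance standard): provenance stored in a PNG's ancillary text chunk survives only until someone re-serializes the critical chunks. The `ai-provenance` keyword is made up for illustration; the chunk types themselves (IHDR, tEXt, IDAT, IEND) are from the PNG format.

```python
import struct
import zlib

PNG_SIG = b"\x89PNG\r\n\x1a\n"
# Critical chunks carry the actual pixel data; everything else
# (tEXt, eXIf, etc.) is ancillary metadata a stripper can drop.
CRITICAL = {b"IHDR", b"PLTE", b"IDAT", b"IEND"}

def chunk(ctype, body):
    # length + type + body + CRC, per the PNG chunk layout
    return (struct.pack(">I", len(body)) + ctype + body
            + struct.pack(">I", zlib.crc32(ctype + body)))

def chunks(data):
    pos = len(PNG_SIG)
    while pos < len(data):
        (length,) = struct.unpack(">I", data[pos:pos + 4])
        yield data[pos + 4:pos + 8], data[pos + 8:pos + 8 + length]
        pos += 12 + length  # 4 length + 4 type + body + 4 CRC

def strip_metadata(data):
    # Re-emit only the critical chunks; the image is untouched,
    # the provenance tag is gone.
    out = bytearray(PNG_SIG)
    for ctype, body in chunks(data):
        if ctype in CRITICAL:
            out += chunk(ctype, body)
    return bytes(out)

# Build a minimal 1x1 grayscale PNG carrying a provenance tag.
ihdr = chunk(b"IHDR", struct.pack(">IIBBBBB", 1, 1, 8, 0, 0, 0, 0))
idat = chunk(b"IDAT", zlib.compress(b"\x00\x00"))  # filter byte + pixel
tag = chunk(b"tEXt", b"ai-provenance\x00model-x")  # hypothetical label
png = PNG_SIG + ihdr + tag + idat + chunk(b"IEND", b"")

stripped = strip_metadata(png)
print(b"ai-provenance" in png)       # True
print(b"ai-provenance" in stripped)  # False
```

The pixel data is byte-for-byte identical before and after; only the sidecar tag disappears, which is why a mandate built on metadata alone does so little.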


And also, mandates for either are mainly going to affect use of public, hosted services; but the proliferation of increasingly capable open models, where fine-tuning and inference can be done locally on consumer hardware, will continue, and that is an additional problem for anything that relies on such a mandate.



