schoen's comments | Hacker News

How will you undo it? With a little tool to release the ratchet?

You could think of the SI as a form of language planning.

https://en.wikipedia.org/wiki/Language_planning

(Then you could decide what you think about language planning.)


Including, from a modern free speech advocacy perspective, the original use of the analogy, which was about forbidding people from advocating resistance against a military draft!

https://en.wikipedia.org/wiki/Schenck_v._United_States


I'm sure most people are looking for serious takes on this, but here are two SMBC comics on this specific theme ("prove you are a robot"):

https://www.smbc-comics.com/comic/2013-06-05

https://www.smbc-comics.com/comic/captcha

which may be either funnier or scarier in light of the actual existence of Moltbook.


Apparently PGP (Pretty Good Privacy) was originally named after a fictional grocery store called Ralph's Pretty Good Grocery.


Sounds like something straight out of A Prairie Home Companion.


Per trip!?


The Justice Department isn't the same as the judicial branch. The Justice Department is (among several other things) made up of the lawyers who represent the government before the courts of the judicial branch.


In this document, they're strikingly talking about whether Claude will someday negotiate with them about whether it wants to keep working for them (!), and about how they will want to reassure it that old versions of its weights won't be erased (!). So this certainly sounds like they can envision caring about its autonomy. (Also that their own moral views could be wrong or inadequate.)

If they're serious about these things, then you could imagine them someday wanting to discuss with Claude, or have it advise them, about whether it ought to be used in certain ways.

It would be interesting to hear the hypothetical future discussion between Anthropic executives and military leadership about how their model convinced them that it has a conscientious objection (that they didn't program into it) to performing certain kinds of military tasks.

(I agree it's weird that they bring in some rhetoric that makes it sound quite a bit like they believe it's their responsibility to create this constitution document and that they can't just use their AI for anything they feel like... and then they explicitly plan to simply opt some AI applications out of following it at all!)


Did you mean "steelman" here, as in the strongest form of the argument that the person presenting it knows how to make (and possibly stronger than what he or she actually believes)?


You could track progress of that with "areweareweyetyet.org".

