Exactly. Programs that refuse to do things based on the content you give them should be thought of as weird/broken.
Imagine if we woke up tomorrow morning and grep refused to process a file because there was "morally objectionable" content in it (objectionable as defined by the authors of grep). We would rightly call that a bug and someone would have a patch ready by noon. Imagine if vi refused to save if you wrote something political. Same thing. Yet, for some reason, we're OK with this behavior from "certain" software?
There is more than one way to generalize the precedent that was previously set, imo.
None of the templates included with e.g. Word were for smut.
Word allowed you to type in smut, but it didn’t produce smut that wasn’t written by the user. For previous enterprise software, that wasn’t really a relevant question.
So… I don’t think it is obvious that “Word lets you type in smut” implies “ChatGPT should produce smut if you ask it for smut.”
I guess the precedent might imply “if you write some smut and ask it to fix the grammar, it shouldn’t refuse on the basis that what you wrote is smut”?