I understand the moderators working for the big social networks have a terrible job and often see the worst the internet has to offer.
Who is going to do that job as a volunteer? Or is that expected to be solved by technology? Hard to imagine either achieving what Google, Facebook, etc. could not do reliably.
Some people seem to get immense satisfaction and pleasure out of censoring other people online.
It's something I've seen time and time again, in a wide variety of discussion forums, for decades now.
Such people will happily do it for free, and they're willing to dedicate many hours per day to it, too.
I don't understand their motivation(s), but perhaps it simply gives them a sense of power, control, or influence that they otherwise don't have in their lives outside of the Internet.
Praying he doesn't take this the wrong way, but perhaps /u/dang would be so kind as to weigh in? I don't equate what he does on a daily basis to censoring, but I'm certain it constitutes a part of the job (after all, this is the Internet, and I'm sure there's all manner of trash making an appearance on occasion). Furthermore, I would posit that there's a bit of overlap between censorship and moderation -- even excellent moderation -- although I welcome any nuance I'm missing on this topic.
Moreover, while I hope he is compensated well enough, I imagine this was, at least initially if not still, a job that demanded effort disproportionate to the monetary reward. What would keep someone interested in such a job and naturally driven to perform it well?
Coming from a place of curiosity, meaning no offense, and happy to let this comment slip quietly away to a back room to sit alone if that's what it merits.
> Furthermore, I would posit that there's a bit of overlap between censorship and moderation -- even excellent moderation -- although I welcome any nuance I'm missing on this topic.
You aren't missing anything. Many people have oppositional defiant disorder and have never used an unmoderated forum; such forums are completely unusable because they're full of spam.
If there are no repercussions, businesses won't do jack sh*t, but if there were, FB/Google/… would solve any issue like this by the time the teapot started whistling…

It is one thing to say they "can't" vs. "they won't because they have no reason to".
I mean they cannot do it _automatically_ with high certainty, which is why they hire companies to do it for them, which then have employees review suspected problematic content / reported content.
I'm sure Google and Facebook wouldn't pay these companies if they could achieve similar results without them.