Aside from the privacy nightmare, what about someone who is 18 and just doesn't have the traditional adult facial features? Same thing for someone who's 15 and hit puberty early? I can imagine that on the edges, it becomes really hard to discern.
If they get it wrong, are you locked out? Do you have to send an image of your ID? So many questions. Not a huge fan of these recent UK changes (looking at the Apple E2E situation as well). I understand what they're going for, but I'm not sure this is the best course of action. What do I know though :shrug:.
Wise (née Transferwise) requires a passport-style photo taken by a web app for KYC when transferring money. I was recently unable to complete that process over a dozen tries, because the image processing didn't like something about my face. (The photos met all the criteria.)
On contacting their support, I learned that they refused to use any other process. It also became apparent that they had outsourced it to some other company and had no insight into the process, and so no way to help. Apparently closing one's account will cause an escalation to a team who determines where to send the money, which would presumably put some human flexibility back into the process.
(In the end I was able to get their web app to work by trying several other devices; one had a camera that, for whatever reason, satisfied their checks that my face was within the required oval, etc.)
> On contacting their support, I learned that they refused to use any other process.
I suspect this won't help you, but I think it's worth noting that the GDPR gives people the right to contest any automated decision-making that was made on a solely algorithmic basis. So this wouldn't be legal in the EU (or the UK).
Hah, indeed, a similar experience here. The desktop option is worse: trying to get a webcam to focus on an ID card took forever. The next step wanted a 3rd-party company to do a live webcam session; no thanks! Closed the account. Or at least tried to: after a several-step nag process, they still keep the email address locked to that account, in case you change your mind...
There seems to be no way to push back against these technologies. Next it will be an AI interview for 'why do you transfer the money?'
Also, a key point in the framing: when was it decided that Discord is supposed to be the one enforcing this? A pop-up saying "you really should be 18+" is one thing, but this sounds like a genuine effort to lock out young people. Neither Discord nor a government ratings agency should be taking final responsibility for how children get brought up; that seems like something parents should be responsible for.
When a corner shop sells cigarettes to minors, who's breaking the law?
When a TV channel broadcasts porn, who gets fined?
These are accepted laws that protect kids from "harm", which are relatively uncontroversial.
Now, the privacy angle is very much the right question. But as Discord are the ones who are going to get fined, they totally need to make sure kids aren't being exposed to shit they shouldn't be seeing until they are old enough, in the same way the corner shop needs to make sure they don't sell booze to 16-year-olds.
Now, what mechanism should or could Discord use? That's the bigger question.
Can government provide foolproof, secure, private, and scalable proof-of-age services? How can private industry do it? (Hint: they won't, because it's a really good source of profile information for advertising.)
At least the ways that a corner shop verifies age don't have the same downsides as typical online age verifiers. They just look at an ID document; verify that it's on the official list of acceptable ID documents, seems to be genuine and valid and unexpired, appears to relate to the person buying the product, and shows an old enough age; and hand the document back.
The corner shop has far fewer false negatives, far lower data privacy risk, and clear rules that if applied precisely won't add any prejudice about things like skin color or country of origin to whatever prejudice already exists in the person doing the verification.
That's exactly how a digital ID system would work, and yet people argue against those all the time as well.
Additionally, the corner shop does not have far lower data privacy risks; it's actually worse. They have you on camera and have a witness who can corroborate that you are the person on camera, alongside a paper trail for your order. There is no privacy there, only the illusion of it.
By data privacy risks I meant the risk of a breach, compromise, or other leak of the database of verified IDs. No information about the IDs is generally collected in a corner shop, at least when there's no suspicion of fraud; they're just viewed temporarily and returned. Not only do online service providers retain a lot of information about their required verifications, they do so for hugely more people than a typical corner shop.
Also, corner shop cameras don't generally retain data for nearly as long as typical online age verification laws would require. Depending on the country and the technical configuration, physical surveillance cameras retain data for anywhere from 48 hours to 1 year. Are you really saying that most online age verification laws worldwide require or allow comparably short retention periods? (This might actually be the case for the UK law, if I'm correctly reading Ofcom's corresponding guidance, but I doubt that's true for most of the similar US state laws.)
A lot of these shops have cameras which could similarly be compromised. In fact, the camera is likely to be more vulnerable, and has probably already been hijacked into a DDoS botnet.
I hate sites asking for photo verification, but for me it's more about convenience/reliability. My bigger fear is being locked out by an AI with no one to go to for support.
Where I live they scan the barcode on the back of the ID into their POS. I don't know what data that exposes, or exactly what's retained, but I suspect it's enough to thoroughly compromise the privacy of that transaction - with no pesky, gumshoe witness-statement and camera-footage steps necessary.
At least the US laws I've looked at have all specifically mandated that data shall not be retained, some with rather steep penalties for retention (IIRC ~$10k/affected user).
When the corner shop checks your ID, they won't take a photo of it. Digital IDs can easily be stored without the user's knowledge. That's a privacy nightmare.
The corner shop does not have access to your friend graph. Also, if you pay by card, the digital ID only provides corroboration; your payment already acts as a much more traceable indicator.
The risk of "digital ID" is that it'll leak grossly disproportionate amounts of data on the holder.
For age verification, you only need a binary "old enough" flag from a system that verifies the holder's ID (a minimal sketch of what that could look like is below).
The problem is, companies like Google and other adtech players want to be the ones providing those checks, so they can tie your every action to a profile with a 1:1 link, then combine it with card transactions to get a much clearer ad-impression-to-purchase signal.
The risk here comes much less from government than from private companies.
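To make that concrete, here's a minimal sketch of the "binary flag" idea in Python. The token layout, the names, and the use of Ed25519 via the third-party cryptography package are illustrative assumptions on my part, not anything Discord or any real ID scheme actually does:

    import json, time
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # Issuer side (e.g. a government ID service): check the holder's real ID
    # once, then sign nothing but a boolean claim and a short expiry.
    issuer_key = Ed25519PrivateKey.generate()
    issuer_pub = issuer_key.public_key()

    claim = json.dumps({"over_18": True, "exp": int(time.time()) + 3600}).encode()
    signature = issuer_key.sign(claim)

    # Relying party side (e.g. a chat service): verify the signature and the
    # expiry. It never sees a name, date of birth, photo, or document number.
    def is_old_enough(claim_bytes: bytes, sig: bytes, issuer_public_key) -> bool:
        try:
            issuer_public_key.verify(sig, claim_bytes)
        except InvalidSignature:
            return False
        data = json.loads(claim_bytes)
        return data.get("over_18") is True and data.get("exp", 0) > time.time()

    print(is_old_enough(claim, signature, issuer_pub))  # True

A real scheme would also have to stop tokens being shared or linked across sites (blind signatures or zero-knowledge proofs are the usual answers), and that linkage is exactly where the adtech players would love to insert themselves.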
Expiry places a bound on duplication and forcing additional duplication allows you to update standards. It's a tradeoff to produce a strictness ratchet.
Broadcasting porn isn't an age ID issue, it's public airwaves and they're regulated.
These aren't primarily "think of the children" arguments, the former is a major public health issue that's taken decades to begin to address, and the latter is about ownership.
I don't think that chat rooms are in the same category as either public airwaves or drugs. Besides, what's the realistic outcome here? Under-18s aren't stupid; what would you have done as a kid if Discord was suddenly blocked off? Shrug and never talk to your friends again?
Or would you figure out how to bypass the checks, use a different service, or just use IRC? Telegram chats? Something even less moderated and far more open to abuse, because that's what can slip under the radar.
So no I don't think this is about protecting kids, I think it's about normalizing the loss of anonymity online.
> These aren't primarily "think of the children" arguments
Are you kidding me? The V-chip, Mary Whitehouse, and sex on TV are all the result of "think of the children" moral panics. It's fuck all to do with ownership.
> I don't think that chat rooms are in the same category as either public airwaves
Discord are making cash from underage kids, in the same way that Meta and Google are, and in the same way that Disney and Netflix are by offering kids' channels.
Look, I'm not saying that Discord should be banned for kids, but I really do think there is a better option than the binary "ban it all" / "fuck it, let them eat porn".
Kids need to be able to talk to each other, but they should also be able to do that without being preyed upon by nonces, extremists, state actors, and (more likely) bored trolls.
It's totally possible to provide anonymous age gating, but it's almost certainly going to be provided by an adtech company unless we, the community, provide something cheaper and better.
> This is over-reach. Both in the UK and Australia
2/3 of Australians support minimum age restrictions for social media [1], and it was particularly popular amongst parents. Putting the responsibility solely on parents shows ignorance of the complexities of how children are growing up these days.
Many parents have tried to ban social media, only for those children to experience ostracisation amongst their peer group, leading to poorer educational and social developmental outcomes at a critical time in their lives.
That's why you need governments and platform owners to be heavily involved.
You realize that is how EVERY law works, right? The person you're replying to says the public overall supports the idea/law. If following that law is a deal-breaker for you, you either need to persuade those people to your view or move.
Maybe it's "puritan", or maybe it's a normal view that only looks puritan from where you stand. How do you know which? One bit of evidence is the 2/3 to 1/3 split.
It almost certainly is overreach, but locking young people out of porn is hardly a new concern. We have variants of this argument continuously for decades. I'm not sure there is a definitive answer.
There's a SCOTUS case, FSC v. Paxton, that could very well decide whether age verification is enforced in the US as well, so sadly this is just the beginning.
It's a good thing to think about. I knew a guy in high school who had male pattern baldness that started at 13 or 14. Full blown by the time he was 16. Dude looked like one of the teachers.
Same in my driver's ed class at 16: one guy had a man's face, a large stocky build, and a thick full beard. I once was talking to a tall, pretty woman who turned out to be a 12-year-old girl. And I have a friend who for most of his 20s could pass for 13-14 and had a hell of a time getting into bars.
This facial thing feels like a loaded attempt to both check a box and get more of that sweet, sweet data to mine. Massive privacy invasion and exploitation of children dressed up as security theater.
It's not even edge cases - I was a pretty young-looking woman and was mistaken for a minor until I was about 24-25. My mother had her first child (me) at 27 and tells me how she and my father would get dirty looks because people assumed he was some dirty old man who had impregnated a teenager. (He was 3 years older than her.)
I think, ironically, the best way to fight this would be to lean on identity politics: there are probably certain races that ping as older or younger. In addition, trans people who were on puberty blockers are in a situation where they might be 'of age' but not necessarily look like an automated system expects them to, and there might be discrepancies between their face as scanned and the face/information shown on their ID. Discord has a large trans userbase. Nobody cares about privacy, but people make at least some show of caring about transphobia and racism.
> So many questions.
Do they keep a database of facial scans even though they say they don't? If not, what's to stop one older looking friend (or an older sibling/cousin/parent/etc.) from being the 'face' of everyone in a group of minors? Do they have a reliable way to ensure that a face being scanned isn't AI generated (or filtered) itself? What prevents someone from sending in their parent's/sibling's/a stolen ID?
Seems like security theater more than anything else.
I don't think they make much of a show of caring about trans rights in the UK right about now, unfortunately. In the US, though, I think you can make a strong case that a big database of faces and IDs could be really dangerous.
It's mostly about the service's audience. Discord is a huge trans/queer/etc. hub. If Discord were X or Instagram etc. it wouldn't matter. Users of Discord are, as a group, more likely to be antagonistic to anything that could be transphobic or racist than the general populace. (Whereas they don't care about disability rights, which is why people with medically delayed puberty aren't a concern.)
I witnessed the Better Off Ted water fountain skit play out in real life once; it was incredibly awkward. I was helping my buddy, his black friend, and the friend's wife set up accounts on online casinos in Michigan for the promos/refer-a-friend rewards. Some of the sites require live video facial verification, and we were doing it in a dimly lit space at night. It worked instantly and without issue for my friend and me, but oh man, it took many, many attempts and a lot of additional lighting to get it to work for his friends.
With the UK currently battling Apple, Discord has no chance of not getting a lawsuit.
Ofcom is serious about enforcing its rules, especially against a multi-national like Discord that even "normies" know and use.
And if they got a "we will let you off this time" slap on the wrist, they would still have to create some sort of verification service to satisfy the regulator next time.
You might as well piss off your customers, lose some of them, whatever, and still hold centre stage, rather than fight the case for not complying. Nothing is stopping Ofcom from launching another lawsuit thereafter.
> Is there a market for leaked facial scans?
There's a market for everything. Fake driver licenses with fake pictures have been around for decades, that would be no different.
> what about someone who is 18 and just doesn't have the traditional adult facial features?
This can be challenging even with humans. My ex got carded when buying alcohol well into her mid thirties, and staff at the schools she taught at mistook her for a student all the time.
I grew a beard when I was younger because I was tired of being mistaken for a high schooler; it's quite annoying to have people assume you are 15 when you're 20. Still regularly carded in my 30s.
No it's just nonsense you invented because you were unwilling to do any research.
The actual situation was that the board refused classification where an adult was intentionally pretending to be an underage child, not that they merely looked like one.
I added an edit to correct myself; however, this was not something I invented. This story goes back to 2009-2010. I will confess I didn't do any research to confirm it, though, and that was my bad.
FWIW, I can confirm that user 9283409232 didn't make that up. I heard that multiple reputable places, years ago.
And it was believable, given a history of genuine but inept attempts by some to address real societal problems. (As well as given the history of fake attempts to solve problems for political points for "doing something". And also given the history of "won't someone think of the children" disingenuous pretexts often used by others to advance unrelated goals.) Basically, no one is surprised when many governments do something that seems nonsensical.
So, accusing someone of making up a story of a government doing something odd in this space might be hasty.
I suspect better would be to give a quick check and then "I couldn't find a reference to that; do you have a link?"
Interestingly, I actually heard this in Australia many years ago. I assumed it was real (as an Australian), but the answer is more complicated (with the actual answer being no).