“Let us be clear, this technology is limited to detecting CSAM [child sexual abuse material] stored in iCloud and we will not accede to any government’s request to expand it,”
That could be a carefully worded truth that says absolutely nothing about government demands. What is Apple going to do when they get a national security letter demanding that they scan devices for other things and never speak of it?
> It’s worth noting that despite Apple’s assurances, the company has made concessions to governments in the past in order to continue operating in their countries. It sells iPhones without FaceTime in countries that don’t allow encrypted phone calls, and in China it’s removed thousands of apps from its App Store, as well as moved to store user data on the servers of a state-run telecom.
In many jurisdictions they can be forced to accede to government requests and they have already done so many times to the detriment of their users' rights.
Do you have any examples of this happening in the United States, where the CSAM scanning is being deployed? My understanding is that Apple has generally declined requests like building backdoored firmware for the FBI.
After Snowden blew the lid off the NSA's mass surveillance program, why are we here asking questions about whether some particular type of surveillance has or hasn't happened in the USA?
The USA is absolutely capable of expanding this far beyond its stated scope.
No, the 'technology' knows nothing of the nature of what it processes. It can be adapted for any purpose.
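As a rough sketch of what "adapted for any purpose" means in practice (toy code, not Apple's implementation; every hash value and list name below is made up):

```python
# Toy illustration: a hash matcher is purpose-agnostic. It compares opaque
# fingerprints against whatever list it is handed; the list alone decides
# what the system hunts for. (All hash values below are invented.)

def matches_blocklist(image_hash: str, blocklist: set[str]) -> bool:
    # The matcher cannot tell what the entries in `blocklist` depict.
    return image_hash in blocklist

csam_hashes = {"a3f1c0de", "77be9912"}   # hypothetical hashes from NCMEC
protest_meme_hashes = {"5d21aa04"}       # hypothetical hashes a government could supply instead

photo_hash = "5d21aa04"
print(matches_blocklist(photo_hash, csam_hashes))          # False
print(matches_blocklist(photo_hash, protest_meme_hashes))  # True: same code, new purpose
```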
The slippery-slope problem I see with privacy is that it starts with a clear-cut "yes, we should" use case, then moves on to another clear-cut case (maybe terrorism, whatever the second one is), and then it's a pattern. I don't trust the process, the people who administer it, or the other people who influence it.
This addresses none of the concerns. "We have designed processes" means absolutely nothing since the code is entirely capable of scanning any file on your phone and sending matches up through their existing infrastructure.
This feature must not exist on customers' devices at all. Continue running it in iCloud like you (and others) have been doing, but leave our devices free of backdoors.
Apple, I really don't care about your "design". I don't want you to scan my pictures or my kids' pictures to compare them against anything or report them to anybody. It's like a rapist telling his victim it will feel good. Fuck off.
This ahole move, presuming all parents are incapable of knowing what their kids are exposed to and need Apple to be their big brother, is ridiculous. This would have been a welcome feature had it been part of Screen Time, with an option for parents to turn it on.
Anyone know how large this hash set is? Hopefully they only download it on WiFi…
The human review aspect is scary as well. Imagine some stranger reviewing your private photos after the system mistakes one of your photos for illegal content.
"Can the CSAM detection system in iCloud Photos be used to detect things other than CSAM?
Our process is designed to prevent that from happening. CSAM detection for iCloud Photos is built so that the system only works with CSAM image hashes provided by NCMEC and other child safety organizations."
What a load of nonsense. "We're not technically capable of tweaking this system in any way". Why would they say this instead of "We pledge not to do that", or at least "We have no intention to do that"?
All, Apple said to trust them. Case closed. Nothing to see here.
We should always trust trillion+ market cap megacorps. They always have our best interests at heart and would never lie or tell us something that isn't true.
> The same set of hashes is stored in the operating system of every iPhone and iPad user, so targeted attacks against only specific individuals are not possible under our design.
I can't figure out how they are able to guarantee this. It seems like they could take any image (e.g., one making fun of a government official), generate its NeuralHash, and work out what the on-device encryption would have output if that hash had been in the database at the time of publication. That would let them target a particular individual.
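For what it's worth, the "anyone can hash an arbitrary image" part is trivially true of perceptual hashing in general. Here's a toy average-hash sketch to show the shape of the idea; NeuralHash is a neural-network-based hash and works differently, and the file names below are invented:

```python
# Toy perceptual hash (average hash). Not NeuralHash -- just an illustration
# that anyone holding an image can compute a fingerprint for it and compare
# fingerprints by Hamming distance. Requires Pillow (pip install Pillow).
from PIL import Image

def average_hash(path: str, hash_size: int = 8) -> int:
    """Downscale to hash_size x hash_size grayscale, then set one bit per
    pixel according to whether it is brighter than the mean."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

# Hypothetical usage: whoever controls the target list can put the hash of
# any image into it, e.g. a meme about an official, and flag close matches.
# target = average_hash("meme_about_official.png")
# print(hamming_distance(average_hash("my_photo.jpg"), target) <= 5)
```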
Can't say I care what Apple says, but I suspect there are actually good solutions out there. They'd probably cost a few cents, though, so Apple said, "Nope, we'll just go totalitarian instead; we can do that for almost free."
How? Does Apple have a private army? Does Apple have nukes? Will Apple fight to the death, until the last man standing? I'm serious, because in order to refuse a government's request, you'd need all of those.
Apple, do your scanning bullshit on US and Israeli politicians, government officials, and bank employees.
And political/financial elites, of course <-- that's where you'll find child abuse, a la the friends of Jeffrey Epstein.
Target the people who need targeting.
No one needed a CSAM scanner to know Bill Gates, Bill Clinton, et al. were (and probably still are) diddling children. Their friends and colleagues knew it and some even engaged in it too, for years.
This is a power grab. There's zero concern on Apple executives' part for children. If there were, they'd be leading the charge against the Jeffrey Epstein & Israeli Mossad honey pot conspiracy.
https://appleprivacyletter.com/