Apple defends anti-child abuse imagery tech after claims of ‘hash collisions’
epistasis 2021-08-18 20:08:41 +0000 UTC [ - ]
For a company that marketed itself as one of the few digital service providers that consumers could trust, I just don't understand how they acted this way at all.
Either there will be heads rolling at management, or Apple takes a permanent hit to consumer trust.
helen___keller 2021-08-18 20:48:39 +0000 UTC [ - ]
There's a simple answer to this, right? Despite everyone's reaction, Apple genuinely believes this is a novel and unique method to catch CSAM without invading people's privacy. And if you look at it from Apple's point of view that's correct: other major cloud providers catch CSAM content on their platform by inspecting every file uploaded, i.e. total invasion of privacy. Apple found a way to preserve that privacy but still catch the bad people doing very bad things to children.
dabbledash 2021-08-19 02:51:10 +0000 UTC [ - ]
chrismorgan 2021-08-19 02:59:33 +0000 UTC [ - ]
There’s this bizarre notion that using end-to-end encryption can absolve you of responsibility, that the authorities will have to accept an answer of “we literally can’t access it”.
That's just not the case for centralised things: you're deliberately facilitating some service, the government will find you liable for some things in its operation, and if you don't comply, they'll fine you or shut you down. E2EE doesn't absolve you from the law; law is all about saying you're not allowed to do things that are physically possible.
(Decentralised things, now they can be banned but not truly stopped because there’s no central party to shut down.)
Zak 2021-08-19 03:57:52 +0000 UTC [ - ]
A state that does not accept it might retaliate against the entity giving that answer or forbid future use of end-to-end encryption without backdoors, but the truth of the answer doesn't depend on anyone's acceptance.
onethought 2021-08-19 04:52:06 +0000 UTC [ - ]
This is what has happened in many countries already.
3np 2021-08-19 05:09:26 +0000 UTC [ - ]
Legal terms such as "murder", "fraud" and "rape" do change as an effect of regulatory changes. "Encryption" and "privacy" do not.
There's a limit to how much you can bend semantics in your PR before it breaks and you get backlash.
onethought 2021-08-19 05:35:06 +0000 UTC [ - ]
3np 2021-08-19 06:30:03 +0000 UTC [ - ]
They're deliberately misrepresenting what's happening, appearing surprised when people misunderstand, and bundling together legitimate criticism with misunderstandings.
I can draw some parallels to how Google went out with FLoC.
Honestly I can't tell where Hanlon's razor should cut here.
onethought 2021-08-19 11:08:07 +0000 UTC [ - ]
Zak 2021-08-19 11:15:23 +0000 UTC [ - ]
onethought 2021-08-19 14:08:19 +0000 UTC [ - ]
dabbledash 2021-08-19 03:04:10 +0000 UTC [ - ]
Retric 2021-08-19 03:16:46 +0000 UTC [ - ]
pengaru 2021-08-19 03:48:17 +0000 UTC [ - ]
supertrope 2021-08-19 03:55:18 +0000 UTC [ - ]
raxxorrax 2021-08-19 07:00:01 +0000 UTC [ - ]
The problem is that the law is self-contradictory and it is up to the judicial institutions to fix it as soon as possible.
UncleMeat 2021-08-19 12:29:00 +0000 UTC [ - ]
Retric 2021-08-19 03:11:28 +0000 UTC [ - ]
You can still do secure backups of your phone without using iCloud, but there isn't a way for Apple to do end-to-end encryption of backups transparently like you can with real-time communication. The only way end-to-end encryption of backups works is to require people to keep separate secure key(s) to avoid losing their data, which means a universal implementation has real, direct risk for users.
As long as Apple has access to these files the FBI can legally require them to do these searches. From a pure PR perspective they should have communicated what was already going on before releasing this system because people assume something significant changed.
simfree 2021-08-19 05:56:38 +0000 UTC [ - ]
There is no reason the password can't be the encryption key, with backup keys stored with a trusted third party (eg: your credit union or bank) without notation as to what these backup keys are tied to.
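A minimal sketch of that idea: a key derived from the user's password, plus an independent recovery key that could sit with the third party. This assumes the third-party `cryptography` package for the symmetric encryption; names and parameters are illustrative, not Apple's design.

    import base64, hashlib, os
    from cryptography.fernet import Fernet

    def key_from_password(password: str, salt: bytes) -> bytes:
        # Derive a 32-byte key from the password and wrap it for Fernet.
        raw = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
        return base64.urlsafe_b64encode(raw)

    salt = os.urandom(16)
    data_key = Fernet.generate_key()          # key that actually encrypts the backup
    password_key = key_from_password("correct horse battery staple", salt)
    recovery_key = Fernet.generate_key()      # a copy of this goes to the bank / credit union

    # Wrap the data key twice: once under the password, once under the recovery key.
    wrapped_for_user = Fernet(password_key).encrypt(data_key)
    wrapped_for_escrow = Fernet(recovery_key).encrypt(data_key)

    # Either path recovers the same data key.
    assert Fernet(password_key).decrypt(wrapped_for_user) == data_key
    assert Fernet(recovery_key).decrypt(wrapped_for_escrow) == data_key

The escrow side never sees the password; it only holds a second wrapped copy of the data key. Whether users would actually manage that second key is the part the reply below pushes back on.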
Retric 2021-08-19 14:01:27 +0000 UTC [ - ]
Trusting third parties with the password in unencrypted form is either systematic, in which case the FBI now just needs to collect data from 2 different organizations, or on a case-by-case basis, in which case users will mess it up. Apple etc. would have no way to verify users actually did something to back up their keys.
Apple's current approach is to let users set up their own backups if they want security, which allows for privacy just fine without providing a service with fundamental issues.
noapologies 2021-08-18 21:54:20 +0000 UTC [ - ]
> other major cloud providers catch CSAM content on their platform by inspecting every file uploaded, i.e. total invasion of privacy.
> Apple found a way to preserve that privacy ...
So scanning for CSAM in a third-party cloud is "total invasion of privacy", while scanning your own personal device is "preserving privacy"?
The third-party clouds can only scan what one explicitly chooses to share with the third party, while on-device scanning is a short slippery slope away from having its scope significantly expanded (to include non-shared and non-CSAM content).
spullara 2021-08-19 03:51:36 +0000 UTC [ - ]
helen___keller 2021-08-18 22:01:51 +0000 UTC [ - ]
> is a short slippery slope away
Obviously people who trust Apple aren't concerned about slippery slopes. What's the point of your post?
stjohnswarts 2021-08-19 07:16:17 +0000 UTC [ - ]
carnitas 2021-08-18 21:05:26 +0000 UTC [ - ]
squarefoot 2021-08-19 02:53:10 +0000 UTC [ - ]
spullara 2021-08-19 03:52:25 +0000 UTC [ - ]
stjohnswarts 2021-08-19 07:17:32 +0000 UTC [ - ]
onethought 2021-08-19 04:53:41 +0000 UTC [ - ]
stjohnswarts 2021-08-19 07:18:19 +0000 UTC [ - ]
squarefoot 2021-08-19 12:39:37 +0000 UTC [ - ]
That wouldn't be a problem in an ideal world, but the one in which we live is far from even resembling one. Mining data is already a huge business, and governments everywhere would love tools they could use to gain an advantage over people they don't like. There's huge motivation and demand for those tools at all levels, and at least governments have the resources to buy them and the power to force whoever implements them to stay silent. I'm not implying that spyware tools exist in every phone, PC, smart TV, car, etc. just because we can't prove they don't; that's the argument used for UFOs, witches and unicorns, no thanks. But we had better think like they do, because the technology, resources and demand for their adoption are real, and the rest is probability.
onethought 2021-08-19 11:05:39 +0000 UTC [ - ]
Apple are not uploading anything to government organisations; in fact they are uploading less than Google, by their own (and Google's) admission.
You have 0 proof that they are uploading to gov orgs… right?
slg 2021-08-18 21:06:52 +0000 UTC [ - ]
codeecan 2021-08-18 21:20:15 +0000 UTC [ - ]
Apple has already proven it will concede to China's demands.
They are building the world's most pervasive surveillance system and when the world's governments come knocking to use it ... they will throw their hands up and feed you the "Apple complies with all local laws etc.."
tshaddox 2021-08-19 03:15:09 +0000 UTC [ - ]
bsql 2021-08-18 21:49:15 +0000 UTC [ - ]
GeekyBear 2021-08-18 21:44:14 +0000 UTC [ - ]
They control what goes into the on-device database that is used.
>The on-device encrypted child abuse database contains only data independently submitted by two or more child safety organizations, located in separate jurisdictions, and thus not under the control of the same government
https://www.techwarrant.com/apple-will-only-scan-abuse-image...
Invictus0 2021-08-19 02:33:43 +0000 UTC [ - ]
threeseed 2021-08-19 02:27:25 +0000 UTC [ - ]
It's bizarre to me that people are freaking out about governments adding client-side hashes but show no concern that they could be doing server-side checks.
babesh 2021-08-19 03:01:13 +0000 UTC [ - ]
I tried disabling iCloud keychain and it just flips back on. Sometimes it asks for a login first. Sometimes it shows a cancel/continue modal. Either way, it magically flips back on. No error message.
I tried backing up my device to my hard drive (with Photos already on iCloud) and it kept complaining that there wasn’t enough space. It throws error message after warning message that your content will be deleted. It created additional copies of my photos each time my phone synced.
To properly back up, I had to copy the photos directory to an external hard drive, delete the original, mark the external hard drive one as the system one and then finally free up enough space to back up. iCloud and the device backup weren't smart enough to free up space for my backup. In fact, I backed up all my photos to iCloud first because they said that it would free up space on my hard drive as necessary. LOL.
BTW, iCloud keychain is still on for me. Fuck Apple.
babesh 2021-08-19 03:13:42 +0000 UTC [ - ]
CoolGuySteve 2021-08-18 21:17:16 +0000 UTC [ - ]
I think I'm not the only one who'd rather not have my devices call the cops on me in a country where the cops are already way too violent.
UncleMeat 2021-08-19 12:31:31 +0000 UTC [ - ]
slg 2021-08-18 21:28:48 +0000 UTC [ - ]
CoolGuySteve 2021-08-18 22:24:59 +0000 UTC [ - ]
simondotau 2021-08-19 02:24:46 +0000 UTC [ - ]
lupire 2021-08-18 23:53:31 +0000 UTC [ - ]
croutonwagon 2021-08-19 02:31:48 +0000 UTC [ - ]
Further, this is step 1 of a process they have explicitly said they are looking to expand on [1], even going so far as to state it in bold font with a standout color.
So there's no telling that they won't expand it by simply scanning everything, regardless of iCloud usage, or pivot it to combat "domestic terrorism" or "gun violence epidemics" or whatever else they feel like.
It's an erosion of trust; even if not a full-stop erosion, it's something they intend to expand upon and won't be taking back.
[1] https://www.apple.com/child-safety/pdf/Expanded_Protections_...
fsflover 2021-08-18 21:12:27 +0000 UTC [ - ]
slg 2021-08-18 21:18:21 +0000 UTC [ - ]
lokedhs 2021-08-19 04:39:09 +0000 UTC [ - ]
fsflover 2021-08-18 21:21:16 +0000 UTC [ - ]
This is the key point.
1. What if I change my mind and decide not to upload the picture?
2. This is a new mechanism for scanning private pictures on the device. What could go wrong?
> If we can't trust Apple to follow their promise, their products should already have been considered compromised before this change was announced.
Many people did trust Apple to keep their files private until now.
zepto 2021-08-18 21:44:09 +0000 UTC [ - ]
No it isn’t. It’s a mechanism for scanning pictures as they are uploaded to iCloud Photo Library.
Private pictures on the device are not scanned.
fsflover 2021-08-18 21:59:01 +0000 UTC [ - ]
zepto 2021-08-18 22:23:49 +0000 UTC [ - ]
int_19h 2021-08-19 02:21:41 +0000 UTC [ - ]
zepto 2021-08-19 04:45:29 +0000 UTC [ - ]
Also, what apps are you talking about?
int_19h 2021-08-19 06:09:29 +0000 UTC [ - ]
As for apps - WhatsApp, for example, saves everything to Camera Roll by default. Which then gets auto-uploaded to iCloud.
simondotau 2021-08-19 02:30:07 +0000 UTC [ - ]
slg 2021-08-18 21:32:28 +0000 UTC [ - ]
>Many people did trust Apple to keep their files private until now.
And that was my original point. If a pinky promise from Apple is not enough to trust them, then Apple should have never been trusted.
fsflover 2021-08-18 22:00:36 +0000 UTC [ - ]
You can choose to upload many pictures. They will start uploading. Then, you change your mind. Some pictures were not uploaded yet. But they were scanned by the new algorithm.
kelnos 2021-08-19 02:41:43 +0000 UTC [ - ]
ravenstine 2021-08-18 22:01:25 +0000 UTC [ - ]
mox1 2021-08-18 21:00:54 +0000 UTC [ - ]
I don't mind my OneDrive being scanned for "bad stuff"; I very much mind my personally owned data stores being scanned, with no opt-out.
GeekyBear 2021-08-18 21:07:35 +0000 UTC [ - ]
If you turn off iCloud Photos, nothing is scanned.
Microsoft scans everything.
>The system that scans cloud drives for illegal images was created by Microsoft and Dartmouth College and donated to NCMEC. The organization creates signatures of the worst known images of child pornography, approximately 16,000 files at present. These file signatures are given to service providers who then try to match them to user files in order to prevent further distribution of the images themselves, a Microsoft spokesperson told NBC News. (Microsoft implemented image-matching technology in its own services, such as Bing and SkyDrive.)
https://www.nbcnews.com/technolog/your-cloud-drive-really-pr...
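The matching described there boils down to comparing file signatures against a set of known ones. A toy version with a plain cryptographic hash might look like the sketch below; the signature value is made up, and real systems such as PhotoDNA use perceptual hashes instead, since an exact hash misses re-encoded or resized copies.

    import hashlib
    from pathlib import Path

    # Hypothetical signature list; real providers receive signatures from NCMEC.
    known_signatures = {
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    }

    def sha256_of(path: Path) -> str:
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def flag_matches(upload_dir: Path):
        """Return uploaded files whose signature matches a known one."""
        return [p for p in upload_dir.glob("*")
                if p.is_file() and sha256_of(p) in known_signatures]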
erklik 2021-08-19 02:44:21 +0000 UTC [ - ]
According to Apple, and for now. The Patriot Act was only for terrorists. Apple makes concessions for China. Creating this technology makes it very easy for China to go, "Look at all photos, always". If they only want to scan stuff on iCloud Photos, no worries, just implement it on their end. This tech does not need to exist in that case.
> Microsoft scans everything.
Everything uploaded to the Cloud. The Cloud bit is fairly important here. That's me, willingly, putting information on their property. Perfectly acceptable and fine for them to ensure that it's nothing unethical.
However, someone else snooping through your drawers looking for something to pin on you is not private, nor the same as checking their own property.
tshaddox 2021-08-19 03:17:29 +0000 UTC [ - ]
kelnos 2021-08-19 02:43:47 +0000 UTC [ - ]
By your definition of "everything", both Microsoft and Apple scan "everything". (Or Apple will, after this new system is rolled out.)
Waterluvian 2021-08-19 02:12:50 +0000 UTC [ - ]
ffhhj 2021-08-19 03:58:23 +0000 UTC [ - ]
nonbirithm 2021-08-19 00:15:48 +0000 UTC [ - ]
Nobody is arguing about the legality of CSAM itself. It's an issue that is not popular to discuss, and of course being on the wrong side of it results in near-universal, justifiable derision. Stopping its spread is absolutely the right thing to do.
So at the point that companies will always be held liable for storing it, they will have to put up countermeasures of some kind or find themselves sued out of existence. Server-side scanning is one method, and Apple's on-device scanning is another.
There are certainly ways that Apple can go too far with whatever it happens to come up with as its solution to stopping CSAM, but there still seems to be a line behind which nobody particularly cares how the detection is implemented and life continues as usual. If Apple had chosen not to cross that line, maybe many of the arguments being made here would never have been brought up at all.
nine_k 2021-08-18 21:43:15 +0000 UTC [ - ]
What it takes to accept this is the "nothing to hide" mentality: your files are safe to scan (locally) because they can't be known CSAM files. You have to trust the scanner. You allow the scanner to touch your sensitive files because you're not the bad guy, and you want the bad guy to be caught (or at least forced off the platform).
And this is, to my mind, the part Apple wasn't very successful at communicating. The whole thing should have started with an educational campaign well ahead of time. The privacy advantage should have been explained again and again: "unlike every other vendor, we won't siphon your files unencrypted for checking; we do everything locally and are unable to compromise your sensitive bits". Getting one's files scanned should have become a badge of honor among the users.
But, for some reason, they tried to do it in a low-key and somehow hasty manner.
int_19h 2021-08-19 00:01:07 +0000 UTC [ - ]
chii 2021-08-19 03:38:02 +0000 UTC [ - ]
Everybody knows what goes on behind that door. And yet, everyone would much prefer to close it. Apple's method is equivalent to the toilet door being removed, so that you cannot do anything nefarious behind that door.
Your phone also (on average) has nothing to hide, but that's also why you need privacy on your devices.
ravenstine 2021-08-18 21:58:13 +0000 UTC [ - ]
jjjensen90 2021-08-19 02:32:13 +0000 UTC [ - ]
ravenstine 2021-08-19 04:20:22 +0000 UTC [ - ]
That's purely specious, and you say it as if I asserted there aren't "dumb people of every type," which I never said.
Yes, there are dumb people of every "type". Are there as many intelligent people as those of low intelligence who struggle to think abstractly, struggle to read and write, have poor motor control, rely heavily on dogma for a moral compass, or have trouble with impulse control?
If you think there are as many people with a high IQ as a low IQ with those traits, I don't think you're being honest.
Here's some data for you:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4478390/
https://pubmed.ncbi.nlm.nih.gov/14744183/
https://pubmed.ncbi.nlm.nih.gov/961457/
https://www.dw.com/en/scientists-find-brain-differences-in-p...
https://sci-hubtw.hkvisa.net/10.1023/a:1018754004962
https://www.livescience.com/4671-study-pedophiles-tend-short...
https://pubmed.ncbi.nlm.nih.gov/29625377/
Are most pedophiles of low intelligence overall? It's doubtful, but there's room for more study.
Are the pedophiles that are offenders of low intelligence? Significantly more offending pedophiles are of low intelligence than the general population. Yes, the factors at play may also apply to other forms of crime. But when you factor out non-offending pedophiles, those we aren't as concerned with in regards to CSAM, the ones that are caught aren't exactly the cream of the crop.
Furthermore, I've met a handful of pedophiles, both convicted and not. Every one of them was either clearly mentally retarded or bordering on it, or had a severe personality disorder. Yes, it's a small sample size, but the ones that were caught were caught doing the dumbest of shit (putting aside the horrific nature of child exploitation in the first place). This even includes a female pedophile, who was both stupid and psychopathic, and I find it funny that one of those studies questions the very existence of female pedophiles.
There are many pedophiles that are of above-average intelligence. Most may not even be considered "dumb". But to imply that there's no difference in the distribution of, frankly, dumb people versus smart people across different behaviors is absurd, in that it ignores any relationship that intelligence has with said behaviors.
As a group, the data does not suggest that pedophiles are exceptionally intelligent, but rather the opposite.
So yes, while I believe there is a privacy concern (I said earlier I would not want such software or hardware running on my own computer), I stand by my conclusion that CSAM detection would catch a significant number of pedophiles and probably continue to do so for some time. The point of such systems isn't to be a superweapon to catch every pedophile. The ones that it might catch happen to be the ones who are most likely to offend. If you play the stupid game of uploading CSAM to Apple iCloud, you win the stupid prize of getting convicted, and it just so happens that a ton of pedophiles are likely to do something that boneheaded.
kelnos 2021-08-19 02:48:04 +0000 UTC [ - ]
dragonwriter 2021-08-19 02:56:28 +0000 UTC [ - ]
Maybe that's true of the ones that get caught, but if so there's a good chance that at least part of that is inverse survivorship bias. (Similar to “Why are most of the CIA covert missions you hear about failures?”)
simondotau 2021-08-19 02:51:06 +0000 UTC [ - ]
ravenstine 2021-08-19 04:28:39 +0000 UTC [ - ]
simondotau 2021-08-19 05:42:39 +0000 UTC [ - ]
Spivak 2021-08-18 21:07:49 +0000 UTC [ - ]
speleding 2021-08-18 22:11:06 +0000 UTC [ - ]
You may catch a few perverts looking at the stuff, but I'm not convinced this will lead to catching the producers. How would that happen?
kelnos 2021-08-19 02:50:23 +0000 UTC [ - ]
Compare this to the war on drugs. There is huge demand for drugs. Many drugs are highly illegal to possess, and a lot of people get jailed for possessing them. And yet most of us consider the war on drugs to be an abject failure that hasn't done much more than create inequity.
Why should something like CSAM possession be any different? I wish it was different, and these sorts of tactics would reduce this awful problem, but I'm just not convinced it does.
shuckles 2021-08-18 22:23:55 +0000 UTC [ - ]
mirker 2021-08-18 22:23:06 +0000 UTC [ - ]
What’s novel about this? The technique and issues with it are fairly obvious to anyone with experience in computer vision. It seems not too different from research from 1993 https://proceedings.neurips.cc/paper/1993/file/288cc0ff02287...
The issues are also well known and encompass the whole subfield of adversarial examples.
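For anyone unfamiliar with the idea: a perceptual hash is designed so that visually similar images produce similar or identical hashes, which is exactly what makes both legitimate matching and adversarial collisions possible. A toy "average hash" sketch, assuming a grayscale image as a NumPy array (NeuralHash itself is a learned neural-network hash, not this):

    import numpy as np

    def average_hash(img: np.ndarray, hash_size: int = 8) -> int:
        """Toy perceptual hash: block-average down to hash_size x hash_size,
        then emit one bit per cell that is brighter than the mean."""
        h, w = img.shape
        img = img[: h - h % hash_size, : w - w % hash_size]   # crop to a multiple
        blocks = img.reshape(hash_size, img.shape[0] // hash_size,
                             hash_size, img.shape[1] // hash_size).mean(axis=(1, 3))
        bits = (blocks > blocks.mean()).ravel()
        return int("".join("1" if b else "0" for b in bits), 2)

    def hamming(a: int, b: int) -> int:
        return bin(a ^ b).count("1")

    rng = np.random.default_rng(0)
    original = rng.random((64, 64))                       # stand-in for a grayscale photo
    near_copy = original + rng.normal(0, 0.01, (64, 64))  # slight re-encode / noise
    print(hamming(average_hash(original), average_hash(near_copy)))  # small distance

Because nearby images intentionally map to nearby hashes, an attacker can nudge pixels until an innocuous image lands on a target hash, which is the class of collision the article describes.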
shuckles 2021-08-18 22:56:02 +0000 UTC [ - ]
mirker 2021-08-19 01:04:46 +0000 UTC [ - ]
unyttigfjelltol 2021-08-18 21:32:03 +0000 UTC [ - ]
I'm saddened by the privacy-adverse functionality on handsets, but Apple seems to be hitting the nail on the head that poor communication is principally what's driving the outrage.
Syonyk 2021-08-18 21:51:23 +0000 UTC [ - ]
So it's not even accomplishing "keeping it off the servers."
simondotau 2021-08-19 02:56:48 +0000 UTC [ - ]
Apple's actions will be effective at making sure every CSAM aficionado will not trust their device. Or, if they're tech savvy, they will at least know not to co-mingle their deepest darkest secrets alongside photos of their mum and last night's dinner.
If you think Apple's very big, very public blow-up is making waves in the hacker/security/libertarian crowd, just imagine how big it's blowing up in the CSAM community right now. I dare say it's probably all they've been talking about for the past two weeks. Unlike places like Hacker News where there's plenty of people desperate to drag Apple through the coals, the CSAM community will be highly motivated to have a precise understanding of what Apple is doing. I dare say that they'll be extremely well informed about exactly what Apple's plans entail and how to work around them.
lixtra 2021-08-18 20:58:31 +0000 UTC [ - ]
That is very unlikely. Most likely they compare some hash against a database - just like apple.
jowsie 2021-08-18 21:01:12 +0000 UTC [ - ]
GiorgioG 2021-08-18 21:05:04 +0000 UTC [ - ]
nine_k 2021-08-18 21:51:35 +0000 UTC [ - ]
Eventually it could lead to the Apple platform not having any CSAM content, or any known legally objectionable content, because any sane perpetrator would migrate to other platforms, and the less sane would be caught.
This, of course, requires the user base to overwhelmingly agree that keeping legally objectionable content is a bad thing and should be exposed to the police, and to trust whatever body defines the hashes of objectionable content. I'm afraid this order is not as tall as we could imagine.
heavyset_go 2021-08-18 22:31:17 +0000 UTC [ - ]
And yet photos that get scanned are still uploaded to iCloud Photos, so they do end up on Apple's servers.
nine_k 2021-08-19 10:47:08 +0000 UTC [ - ]
Doing so right while activating a new iDevice is the way to prevent its private keys from ending up in iCloud, and so preventing Apple, or law enforcement, or some malicious hackers from cracking into your device.
GeekyBear 2021-08-18 22:35:33 +0000 UTC [ - ]
I'm failing to see the issue.
heavyset_go 2021-08-18 22:50:21 +0000 UTC [ - ]
GeekyBear 2021-08-19 01:59:42 +0000 UTC [ - ]
The results of the scan looking for kiddie porn cannot be read by Apple until the device finds 30 examples of photos that match known kiddie porn, whereupon Apple gets the decryption key so they can see which images on their server need review, and a human review is triggered to make sure there haven't been 30 false positives.
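The "Apple can't read anything below the threshold" property comes from threshold secret sharing (Apple's technical summary describes it layered under a private set intersection protocol). A toy Shamir-style sketch of just the threshold part, with illustrative parameters and not Apple's actual code:

    import random  # fine for a demo; a real system would use a CSPRNG

    P = 2**127 - 1   # a Mersenne prime; the field we work in
    T = 30           # matches needed before the key can be reconstructed

    def make_shares(secret: int, n: int, t: int = T):
        """Split `secret` into n shares; any t of them reconstruct it (Shamir)."""
        coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
        poly = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
        return [(x, poly(x)) for x in range(1, n + 1)]

    def reconstruct(shares):
        """Lagrange interpolation at x = 0 recovers the secret."""
        secret = 0
        for i, (xi, yi) in enumerate(shares):
            num = den = 1
            for j, (xj, _) in enumerate(shares):
                if i != j:
                    num = num * -xj % P
                    den = den * (xi - xj) % P
            secret = (secret + yi * num * pow(den, -1, P)) % P
        return secret

    account_key = random.randrange(P)          # stands in for the per-account decryption key
    shares = make_shares(account_key, n=1000)  # one share rides along with each "voucher"

    assert reconstruct(shares[:T]) == account_key       # 30 matches: key recovered
    assert reconstruct(shares[:T - 1]) != account_key   # 29 matches: still unreadable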
heavyset_go 2021-08-19 02:35:17 +0000 UTC [ - ]
I implore you to read the comment I originally replied to in order to understand the context of my replies. Personally, I don't care what you're shocked about or not, as my OP wasn't directed at you at all.
Spivak 2021-08-18 21:09:03 +0000 UTC [ - ]
JimBlackwood 2021-08-18 21:21:06 +0000 UTC [ - ]
GeekyBear 2021-08-18 22:34:04 +0000 UTC [ - ]
After that, they get the decryption key and trigger a human review to make sure there haven't been 30 false positives at once.
That is better, and more private, in a very meaningful way.
False positive scan data sitting on the server is open to malicious misuse by anyone who can get a warrant.
>Innocent man, 23, sues Arizona police for $1.5million after being arrested for murder and jailed for six days when Google's GPS tracker wrongly placed him at the scene of the 2018 crime
https://www.dailymail.co.uk/news/article-7897319/Police-arre...
Also, I sincerely doubt that Google has ended its practice of refusing to hire a human being to check for false positives when a poorly performing algorithm is cheaper.
kelnos 2021-08-19 02:59:50 +0000 UTC [ - ]
I'm fine with the idea that if I upload stuff to someone else's server, they may take a look at it and maybe even punish me for what I've uploaded. Certainly if I encrypt the data before I upload it, they can't do that. But if I don't, then it's fine with me if they do.
But my device should not be snitching on me. Yes, this device-side scanning is supposedly gated on enabling uploads to iCloud, but that doesn't really make for much of a distinction to me. And since Apple certainly has the capability, they are likely one secret court order away from being required to scan even photos that aren't being uploaded, at least on some targeted subset of devices.
tshaddox 2021-08-19 03:22:17 +0000 UTC [ - ]
Apple has the ability to upload literally any software to iPhones, so this argument applies equally to literally any conceivable bad thing that software could do on iPhones.
GeekyBear 2021-08-19 14:03:44 +0000 UTC [ - ]
orangeoxidation 2021-08-18 21:05:14 +0000 UTC [ - ]
Spivak 2021-08-18 21:14:30 +0000 UTC [ - ]
1. The TSA agent opens your luggage and searches everything for banned items.
2. The TSA agent hands you a scanner for you to wave over your luggage in private, it prints out a receipt of banned items it saw, and you present that receipt to the agent.
Which one is more invasive?
IncRnd 2021-08-19 03:07:06 +0000 UTC [ - ]
3. The home builder installs a TSA scanner in all newly built homes. The scanners will scan all items as they are put away into drawers and cupboards, attempting to detect the presence of banned items.
Then the TSA Agents say they won't report you until at least 30 banned items are detected, even though you haven't flown in 10 years and don't have banned materials. As the TSA Agents walk back to their car, you overhear words like warrant, trouble, and tap amidst their chuckles. What could go wrong with that?
Aaargh20318 2021-08-18 21:28:24 +0000 UTC [ - ]
shuckles 2021-08-18 22:25:45 +0000 UTC [ - ]
rfd4sgmk8u 2021-08-18 21:47:24 +0000 UTC [ - ]
Which one is more invasive?
I don't think there is much to not understand. Apple is proposing putting a cop in your house. On-device scanning is vastly, vastly different from server-side scanning. One is over Apple's property, the other is over YOUR property. It's a big deal, and a huge change in policy. And as I and others have indicated from years of observation, it will grow and grow...
smnrchrds 2021-08-18 21:28:47 +0000 UTC [ - ]
cma 2021-08-18 21:26:27 +0000 UTC [ - ]
The scanner technology can also be used for detecting dissident journalism, but they say it won't be, it is just for preventing terrorism, but still they say, they really need to install it in your house even though it is only for scanning when you intend on going to the airport.
bsql 2021-08-18 21:57:03 +0000 UTC [ - ]
This has always been possible with server side scanning. Yeah we have to trust Apple to not enable it for users who don’t use iCloud photos but we’ve always had to trust Apple given their closed source system.
varispeed 2021-08-19 09:12:17 +0000 UTC [ - ]
This is Orwellian doublespeak.
You don't preserve someone's privacy by snooping on their devices.
mschuster91 2021-08-18 21:04:46 +0000 UTC [ - ]
They're not going to find predators; all they are going to find is people incompetent enough to have years-old CSAM on their iPhones (otherwise how would it end up at NCMEC?) and dumb enough to have iCloud backup turned on. To catch actual predators they would need to run AI scanning on phones for all photos and risk a lot of false positives from parents taking photos of their children on a beach.
Plus a load of people who will inadvertently have stuff that "looks" like CSAM on their devices because some 4chan trolls will inevitably create colliding material and spread it in ads etc. so it ends up downloaded in browser caches.
All of this "let's use technology to counter pedos" is utter, utter crap that is not going to save one single child and will only serve to tear down our freedoms, one law at a time. Want to fight against pedos? Take photos of your Airbnb, hotel and motel rooms to help identify abuse recording locations and timeframes, and teach children from an early age about their bodies, sexuality and consent so that they actually have the words to tell you that they are being molested.
zepto 2021-08-18 21:45:59 +0000 UTC [ - ]
Who do you think has 30 or more CSAM images on their phone?
atq2119 2021-08-18 22:27:01 +0000 UTC [ - ]
But let's also be real about something else. Think of the Venn diagram of child abusers and people who share CSAM online. Those circles are not identical. Not everybody with CSAM is necessarily a child abuser. Worse, how many child abusers don't share CSAM online? Those can't ever be found by Apple-style invasions of privacy, so one has to wonder if we're being asked to give up significant privacy for a crime fighting strategy that may not even be all that effective.
The cynic in me is also wondering what fraction of people working at places like NCMEC are pedophiles. It'd be a rather convenient job for them. And after all, there's a long history of ostensibly child-serving institutions harboring the worst kind of offenders.
zepto 2021-08-18 22:32:48 +0000 UTC [ - ]
> a crime fighting strategy
Is it a crime fighting strategy? I thought it was just about not making their devices and services into aids for these crimes.
> The cynic in me is also wondering what fraction of people working at places like NCMEC are pedophiles.
Good question. What has that to do with this?
atq2119 2021-08-18 22:45:48 +0000 UTC [ - ]
And how many children can really be helped in this way? Surely the majority of child abuse happens independently of the online sharing of CSAM.
And is that really worth the suspension of important civil rights? That is really what this discussion is about. Nobody is arguing against fighting child abuse, the question is about how, and about what means can be justified.
zepto 2021-08-19 00:05:26 +0000 UTC [ - ]
Does it? I see no reason to assume that.
> And is that really worth the suspension of important civil rights?
No civil rights are being suspended. It’s not about that at all.
mschuster91 2021-08-18 23:21:38 +0000 UTC [ - ]
CSAM scanning is sold as beneficial, while in reality it won't do shit and it opens dangerous precedent-setting backdoors!
zepto 2021-08-19 00:08:58 +0000 UTC [ - ]
Why do you assume it’s extremely old?
mapgrep 2021-08-19 02:29:03 +0000 UTC [ - ]
So does Apple.
EDIT: some people don't like that answer, but "inspecting" in this context clearly means "digitally inspecting" (Google does not physically look at every file) and Apple does this with files that are uploaded. They do it on device, but it's still inspected. That's the whole point of this controversy: there's not much difference to people WHERE Apple inspects, and on device is actually arguably worse. Your sentence does not in any way distinguish what Apple does from what others do.
jiocrag 2021-08-19 05:38:39 +0000 UTC [ - ]
Conjecture, admittedly:
1. Apple cannot lose the Chinese market. Huge and, more important, the fastest-growing geo for the company.
2. China is deprecating elements of its own tech sector (aggressive crackdowns on established companies like Tencent and Alibaba as well as individual websites). They are clearly cleaning house from a surveillance and control perspective. Apple is not immune, but it's an American behemoth, so open crackdowns are impossible.
I don’t think Apple’s CSAM push and China’s crackdown are purely coincidental.
Who can argue with stemming child abuse? It’s the type of hot button issue that affords broad acceptance for intrusive tech.
The leap from scanning for abuse to scanning for anti regime content is more like a tiny step.
It seems obvious from afar that the company adamant about refusing to unlock a potential terrorist's iPhone on privacy principles (with the attendant marketing benefits) would not so suddenly force push such a boldly invasive feature addition (and thereby ensure the collection of massive training data, with or without opt-in, for China's v2.0) without a reason.
Turns out vertical integration is both a gift and a curse (dependent on the whims of the integrator) for on-device privacy and autonomy.
strogonoff 2021-08-19 06:01:15 +0000 UTC [ - ]
If Apple happens to use similar ToS in China (no idea if true), you bet CCP would be all over this clause.
A worst-case wild guess from the pessimist in me: they had to add this clause in 2019 to appease CCP, and then they got thinking if they could make use of it for the greater good (tm), too. Hopefully it’s very incorrect.
stjohnswarts 2021-08-19 07:14:31 +0000 UTC [ - ]
GeekyBear 2021-08-18 20:46:29 +0000 UTC [ - ]
Apple announces that it is going to start scanning iCloud Photos only, and that their system is set to ignore anything below a threshold of ~30 positives before triggering a human review, and people lose their minds.
mightybyte 2021-08-18 21:10:11 +0000 UTC [ - ]
This is the difference between putting CSAM on a sign in your front yard (maybe not quite the front yard, but I can't come up with quite the same physical equivalent to a cloud provider) and keeping it in a password-protected vault in your basement. One of those things is protected in the U.S. by laws against unreasonable search and seizure. Cloud and on your device are two very different things, and consumers are right to be alarmed.
I'll say it again, if you are concerned with this privacy violation, sell your Apple stock and categorically refuse to purchase Apple devices. Also go to https://www.nospyphone.com/ and make your voice heard there.
onethought 2021-08-19 04:57:34 +0000 UTC [ - ]
I honestly can't find the uproar here. Google devices can face match photos offline... so they are applying a neural net (scanning) ON THE DEVICE! How is that not worse than what apple do?
mightybyte 2021-08-19 11:08:29 +0000 UTC [ - ]
The difference can also be seen from a customer service perspective. One is a feature that lets you sort according to which friends you were with. The other is a feature that puts you in jail. No thanks. Not gonna pay money for that.
onethought 2021-08-19 14:05:40 +0000 UTC [ - ]
Literally no difference.
If you have illegal stuff only on your phone neither google or Apple will be notified or notify anyone else.
GeekyBear 2021-08-18 21:14:38 +0000 UTC [ - ]
kelnos 2021-08-19 03:16:22 +0000 UTC [ - ]
Let's say the TSA were to install air-travel-contraband scanners in everyone's homes, but promise only to scan things that are being put into your luggage as you prepare to go to the airport. And let's say that this became a requirement if you want to board a plane.
That's what this feels like. I'm fine with Google scanning through everything in my GMail account, or everything I've uploaded to GDrive, or created in GDocs. That stuff is on their servers, unencrypted, and I explicitly put it there.
But I'm sure as hell not going to let Google install something on my laptop (or phone!) that lets them look at my stuff, even if they pinky-promise that they'll only scan stuff that I intend to upload.
mightybyte 2021-08-18 22:40:03 +0000 UTC [ - ]
First of all, you have to be able to read it to do the comparison that can increment the counter to 30. So regardless of whether it is or is not encrypted there, they're accessing the unencrypted plaintext to calculate the hash.
And yes, on my device is definitively more private than on someone else's server--just like in my bedside drawer is more private than in an office I rent in a co-working space.
heavyset_go 2021-08-18 22:20:41 +0000 UTC [ - ]
This doesn't matter because Apple can read iCloud data, including iCloud Photos. They hold the encryption keys, and they hand over customers' data for about 150,000 users/accounts a year in response to requests from the government[1].
GeekyBear 2021-08-19 02:03:37 +0000 UTC [ - ]
How do you think Google and Microsoft scan everything in your account? They all have the capability to read your cloud data.
What Apple cannot read are the results of your device scanning your iCloud Photos. Those results are encrypted and stay that way until your device finds 30 matches for known kiddie porn.
Once you pass the threshold, Apple gets the decryption key and a human review is triggered to make sure there weren't just 30 false positives.
pasquinelli 2021-08-18 21:18:58 +0000 UTC [ - ]
maybe you mean to say that apple says they won't read it until that threshold has been crossed.
zepto 2021-08-18 21:46:46 +0000 UTC [ - ]
The kind Apple has built. You should read the docs. This is literally how it works.
mightybyte 2021-08-18 22:37:08 +0000 UTC [ - ]
zepto 2021-08-19 00:04:48 +0000 UTC [ - ]
GeekyBear 2021-08-18 21:25:39 +0000 UTC [ - ]
>Apple is unable to process individual vouchers; instead, all the properties of our system mean that it’s only once an account has accumulated a collection of vouchers associated with illegal, known CSAM images that we are able to learn anything about the user’s account.
Now, as for why to do it: as you said, this is something that will provide that detection capability while preserving user privacy.
https://techcrunch.com/2021/08/10/interview-apples-head-of-p...
Meanwhile, a single false positive from an on-server scan is open to malicious use by anyone who can get a subpoena.
telside 2021-08-18 21:49:08 +0000 UTC [ - ]
Just going to respond to every post on here with these absurd points? K apple guy.
brandon272 2021-08-18 21:08:11 +0000 UTC [ - ]
When that scanning gets moved from the cloud to being on your device, a boundary is violated.
When that boundary is violated by a company who makes extreme privacy claims like saying that privacy is a "fundamental human right"[1], yes, people will "lose their minds" over it. This shouldn't be shocking at all.
GeekyBear 2021-08-18 21:34:30 +0000 UTC [ - ]
You would have to have 30 false positives before Apple can see anything, which is unlikely, but the next step is still a human review, since it's not impossible.
OnlineGladiator 2021-08-18 21:56:40 +0000 UTC [ - ]
GeekyBear 2021-08-18 22:24:31 +0000 UTC [ - ]
If anything, you should be outraged that Google and Microsoft have been scanning much more of your data, and doing so in a much more intrusive way.
Apple only scans iCloud Photos and they do so in a way that they can't see the results until they can be reasonably sure it's not just a false positive.
OnlineGladiator 2021-08-18 23:13:15 +0000 UTC [ - ]
If you think Apple's approach is the best you're allowed to think that. I disagree.
brandon272 2021-08-18 22:02:08 +0000 UTC [ - ]
OnlineGladiator 2021-08-18 20:54:46 +0000 UTC [ - ]
GeekyBear 2021-08-18 21:02:43 +0000 UTC [ - ]
Apple can't decrypt the results of the scan until the ~30 image threshold is crossed and a human review is triggered.
Given Google's reluctance to hire humans when a poorly performing algorithm is cheaper, are they turning over every single false positive without a human review?
lifty 2021-08-18 22:06:31 +0000 UTC [ - ]
GeekyBear 2021-08-18 22:26:27 +0000 UTC [ - ]
Apple isn't scanning that.
Google and Microsoft are.
lifty 2021-08-18 22:36:11 +0000 UTC [ - ]
GeekyBear 2021-08-18 23:07:45 +0000 UTC [ - ]
They don't cross that line like Google and Microsoft do.
With Apple, nothing but files you upload to iCloud Photos get scanned.
GuB-42 2021-08-18 22:19:41 +0000 UTC [ - ]
Facebook, even more so, they are explicitly anti-privacy to the point of being insulting.
Microsoft will happily show you everything they may send when you install Windows, you can sometimes refuse, but not always. They are a bit less explicit than Google, but privacy is rarely on the menu.
As for Amazon, their cloud offerings are mostly for businesses, a different market, but still, for consumers, they don't really insist on privacy either.
So if any of these companies scan your pictures for child porn, it won't shock anyone, because we know that's what they do.
But Apple claims privacy as a core value; half of their ads are along the lines of "we are not like the others, we respect your privacy, everything on your device stays on your device, etc.", and they announce every (often legitimate) privacy feature with great fanfare. So much so that people start to believe it. But with this, people realize that Apple is not so different from the others after all, and if they bought an overpriced device based on that promise, I understand why they are pissed off.
heavyset_go 2021-08-18 20:55:44 +0000 UTC [ - ]
GeekyBear 2021-08-18 21:05:19 +0000 UTC [ - ]
https://techcrunch.com/2014/08/06/why-the-gmail-scan-that-le...
You don't consider the contents of your email account or the files you mirror to a cloud drive to be your own private data?
WA 2021-08-18 21:21:42 +0000 UTC [ - ]
I expect that my ISP tracks and stores my DNS resolutions (if I use their DNS) and has a good understanding of the websites I visit.
I expect that an app that I grant access to my contacts uploads as much data as it can to their servers.
I expect WhatsApp and similar apps to collect and upload meta data of my entire photo library such as GPS info the second I give them access.
Hence, I don’t give access. And hence, it’s a problem if there is no opt-out of local file scanning in the future.
_trampeltier 2021-08-18 21:17:52 +0000 UTC [ - ]
GeekyBear 2021-08-18 21:30:47 +0000 UTC [ - ]
>So if iCloud Photos is disabled, the system does not work, which is the public language in the FAQ. I just wanted to ask specifically, when you disable iCloud Photos, does this system continue to create hashes of your photos on device, or is it completely inactive at that point?
If users are not using iCloud Photos, NeuralHash will not run
https://techcrunch.com/2021/08/10/interview-apples-head-of-p...
heavyset_go 2021-08-18 22:12:34 +0000 UTC [ - ]
bobthepanda 2021-08-18 20:59:50 +0000 UTC [ - ]
https://protectingchildren.google/intl/en/
> CSAI Match is our proprietary technology, developed by the YouTube team, for combating child sexual abuse imagery (CSAI) in video content online. It was the first technology to use hash-matching to identify known violative videos and allows us to identify this type of violative content amid a high volume of non-violative video content. When a match of violative content is found, it is then flagged to partners to responsibly report in accordance to local laws and regulations. Through YouTube, we make CSAI Match available for free to NGOs and industry partners like Adobe, Reddit, and Tumblr, who use it to counter the spread of online child exploitation videos on their platforms as well.
> We devote significant resources—technology, people, and time—to detecting, deterring, removing, and reporting child sexual exploitation content and behavior. Since 2008, we’ve used “hashing” technology, which creates a unique digital ID for each known child sexual abuse image, to identify copies of images on our services that may exist elsewhere.
arsome 2021-08-18 21:02:44 +0000 UTC [ - ]
heavyset_go 2021-08-18 21:03:01 +0000 UTC [ - ]
bobthepanda 2021-08-18 21:06:59 +0000 UTC [ - ]
The Google page has a section later down that also says they use hashing of images.
GeekyBear 2021-08-18 21:10:55 +0000 UTC [ - ]
When Google scans on server, a single false positive result can be abused by anyone who can get a warrant.
>Innocent man, 23, sues Arizona police for $1.5million after being arrested for murder and jailed for six days when Google's GPS tracker wrongly placed him at the scene of the 2018 crime
https://www.dailymail.co.uk/news/article-7897319/Police-arre...
Apple's method is more private.
websites2023 2021-08-18 20:51:18 +0000 UTC [ - ]
__blockcipher__ 2021-08-18 21:08:17 +0000 UTC [ - ]
OrvalWintermute 2021-08-18 21:24:21 +0000 UTC [ - ]
Mainland China will probably be the first chip to fall. Can't imagine the Ministry of State Security not actively licking their lips, waiting for this functionality to arrive.
websites2023 2021-08-18 21:09:51 +0000 UTC [ - ]
commoner 2021-08-18 21:30:41 +0000 UTC [ - ]
Even though NCMEC describes itself as "private", it was established by and has been heavily funded by the U.S. government.
From an archive of NCMEC's own history page, cited on Wikipedia (https://web.archive.org/web/20121029010231/http://www.missin...):
> In 1984, the U.S. Congress passed the Missing Children’s Assistance Act which established a National Resource Center and Clearinghouse on Missing and Exploited Children. The National Center for Missing & Exploited Children was designated to fulfill this role.
> On June 13, 1984, the National Center for Missing & Exploited Children was opened by President Ronald Reagan in a White House Ceremony. The national 24-hour toll-free missing children’s hotline 1-800-THE-LOST opened as well.
$40 million/year of U.S. government funding from a 2013 bill (https://en.wikipedia.org/wiki/Missing_Children%27s_Assistanc...):
> The Missing Children's Assistance Reauthorization Act of 2013 (H.R. 3092) is a bill that was introduced into the United States House of Representatives during the 113th United States Congress. The Missing Children's Assistance Reauthorization Act of 2013 reauthorizes the Missing Children's Assistance Act and authorizes $40 million a year to fund the National Center for Missing and Exploited Children.
tjfl 2021-08-18 21:28:56 +0000 UTC [ - ]
> The National Center for Missing & Exploited Children® was established in 1984 as a private, nonprofit 501(c)(3) organization. Today, NCMEC performs the following 15 specific programs of work, funded in part by federal grants (34 U.S.C. § 11293): Source: https://www.missingkids.org/footer/about
US DOJ OJJDP lists recent grants totaling $84,446,366 in FY19 and FY20. Source: https://ojjdp.ojp.gov/funding/awards/list?awardee=NATIONAL%2...
__blockcipher__ 2021-08-18 21:42:16 +0000 UTC [ - ]
https://www.law.cornell.edu/uscode/text/18/2258A
You must report to them and only them.
For the GP to claim they’re not government “owned” is a rhetorical trick at best and outright ignorant absurdity at worst.
__blockcipher__ 2021-08-18 21:40:11 +0000 UTC [ - ]
justin_oaks 2021-08-18 20:56:38 +0000 UTC [ - ]
Imagine if Apple had done this on the client side without telling anyone, and later it was discovered. I think things would be a whole lot worse for Apple in that case.
squarefoot 2021-08-19 02:42:36 +0000 UTC [ - ]
websites2023 2021-08-18 22:57:57 +0000 UTC [ - ]
rblatz 2021-08-18 21:47:39 +0000 UTC [ - ]
Copernicron 2021-08-18 21:17:59 +0000 UTC [ - ]
xdennis 2021-08-19 01:39:51 +0000 UTC [ - ]
In part because people didn't know.
And if you were one of the innocent people caught by them, you wouldn't want people to know.
floatingatoll 2021-08-18 21:17:38 +0000 UTC [ - ]
"Apple does not have my permission to use my device to scan my iCloud uploads for CSAM"
and
"This is a slippery slope that could result in Apple enforcing thoughtcrimes"
Neither of these viewpoints are particularly agreeable to the general public in the US, as far as I can determine from my non-tech farming city. Once the fuss in tech dies down, I expect Apple will see a net increase in iCloud adoption — all the fuss we're generating is free advertising for their efforts to stop child porn, and the objections raised are too domain-specific to matter.
It's impossible to say for certain which of your outcomes will occur, but there's definitely two missing from your list. Corrected, it reads:
"Either there will be heads rolling at management, or Apple takes a permanent hit to consumer trust, or Apple sees no effect whatsoever on consumer trust, or Apple sees a permanent boost in consumer trust."
I expect it'll be "no effect", but if I had to pick a second guess, it would be "permanent boost", well offsetting any losses among the tech/free/lib crowd.
drvdevd 2021-08-19 00:04:31 +0000 UTC [ - ]
"Apple has created a system for detecting CSAM on local devices which has already proven vulnerable to cheap perceptual hash collision attacks. It's now highly inconceivable Apple will be able to deploy this technology as-is without having their users exploited."
In other words it's not just about privacy or thoughtcrimes anymore but should be viewed as actually dangerous to use their devices. I feel a bit dramatic even typing that out but I.. think it's true?
floatingatoll 2021-08-19 15:25:03 +0000 UTC [ - ]
How dramatic is too dramatic? When does something that hasn’t happened to you or anyone you know become a risk you’re willing to sacrifice personal convenience to mitigate? Will you be divesting yourself of all wireless radio hardware? If not, then why would you be worried about users being exploited through a more clumsy and less effective process such as CSAM signature hacking?
The piece of information you’re taking for granted, that few in free/tech/lib are confronting, is the assumption that this process can be exploited at scale to harm millions of people.
So far as I can tell, there will probably be zero or one false positive CSAM matches that pass the known algo, the unknown algo, the human blurred comparison, and the human unblurred comparison (all steps that must occur before law enforcement is invoked to collect digital evidence) in the first year.
How many false positives (to the nearest 10^X) do you think the system will generate in the first year that result in law enforcement actions? Your words suggest that everyone is vulnerable, and there are 10^9 users, so do you believe there will be 10^9 false positives in the first year? Do you think only a thousand people will be affected, so 10^3? How do you judge which is more likely correct?
It is unlikely that this system will generate 10^9 false positives, or else it never would have passed QA. I encourage you to consider how you would personally quantify this risk, and then also look up the quantified risks for killing someone while driving a car or getting struck by lightning while indoors. I don’t know what the actual reality will be, but I don’t think it's a very large X.
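For a sense of scale, here is a back-of-envelope version of that estimate. The per-image false match rate below is a made-up illustrative figure (Apple's published claim was reportedly on the order of one in a trillion per account per year), and this only models accidental matches, not adversarially crafted collisions like the ones the article covers:

    import math

    def poisson_tail(lam: float, k: int, terms: int = 100) -> float:
        """P(X >= k) for X ~ Poisson(lam), summing successive pmf terms."""
        term = math.exp(-lam) * lam**k / math.factorial(k)   # P(X = k)
        total = 0.0
        for i in range(k, k + terms):
            total += term
            term *= lam / (i + 1)                            # P(X = i+1) from P(X = i)
        return total

    per_image_fp = 1e-6      # assumed chance an innocent photo matches the database
    photos = 20_000          # photos in one library
    threshold = 30           # matches needed before anything is revealed

    p_account = poisson_tail(per_image_fp * photos, threshold)
    print(f"P(an innocent account trips the threshold): {p_account:.1e}")
    print(f"Expected such accounts among 10^9 users:    {p_account * 1e9:.1e}")

Under assumptions like these the accidental false positive count is effectively zero; the live question in this thread is the adversarial case, where the per-image probability is no longer tiny.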
duxup 2021-08-18 20:54:40 +0000 UTC [ - ]
I mentioned I was thinking of moving from an Android phone to Apple soon, somewhat privacy related.
My friends lectured me on "they're scanning your photos" ... meanwhile they share their google photos albums with me and marvel about how easy they are to search ...
Maybe we (humans) only get outraged based on more specific narratives and not so much the general topics / issues?
I don't know but they didn't seem to notice the conflict.
kzrdude 2021-08-18 21:01:42 +0000 UTC [ - ]
Psychologically, you'll feel a difference in what you accept between the two, I think
whoknowswhat11 2021-08-18 21:07:19 +0000 UTC [ - ]
Don't want your photos scanned? Don't sync them to iCloud. Seriously! Please include the actual system when discussing this system, not your bogeyman system.
"To help address this, new technology in iOS and iPadOS* will allow Apple to detect known CSAM images stored in iCloud Photos."
To increase privacy - they perform the scan on device prior to upload.
Johnny555 2021-08-18 21:23:00 +0000 UTC [ - ]
"for now"
Which is the part most people have a problem with -- they say that they are only scanning iCloud uploads now, but it's a simple extension of the scanner to scan all files.
I don't care if Apple scans my iCloud uploads on iCloud servers, I don't want them scanning photos on my device.
smnrchrds 2021-08-18 21:37:17 +0000 UTC [ - ]
kcb 2021-08-18 21:44:39 +0000 UTC [ - ]
smnrchrds 2021-08-18 22:19:57 +0000 UTC [ - ]
TimTheTinker 2021-08-19 03:16:27 +0000 UTC [ - ]
Johnny555 2021-08-18 21:38:38 +0000 UTC [ - ]
whoknowswhat11 2021-08-19 03:25:18 +0000 UTC [ - ]
I don't have a problem if it scans everything, but it's not. Let's stick to what it is doing. Android could do this as well, so talking about what companies like Google could do is not so interesting - they could do almost anything.
duxup 2021-08-18 21:10:02 +0000 UTC [ - ]
I'm not sure there's a real difference unless you want to watch your settings all the time. In google land they tend to reset ... and really that happens a lot of places.
I think for most people if you use google, you're in their cloud.
gmueckl 2021-08-18 21:07:03 +0000 UTC [ - ]
shapefrog 2021-08-18 21:06:48 +0000 UTC [ - ]
No ... They scan everything that I have released to apple photos that exists on my device.
Same scan - different place.
__blockcipher__ 2021-08-18 21:10:00 +0000 UTC [ - ]
shapefrog 2021-08-18 21:48:30 +0000 UTC [ - ]
I get to select the issue and it was in response to the previous claim that 'they' are scanning everything that exists on the device right now.
A year ago this was a possible 'future development'. 10 years from now I could be living on Mars. This is all hypothetical.
heavyset_go 2021-08-18 22:54:45 +0000 UTC [ - ]
Exactly. However, people don't seem to have an issue when hypothesizing that CSAM detection is good actually, because Apple might implement E2EE, despite no evidence of such intentions.
For some reason, though, people take issue when others hypothesize that CSAM detection is bad actually, because Apple might expand it further and violate users' privacy even more. And there's actually precedent for this hypothesis, given Apple's actions here and their own words[1]:
> This program is ambitious, and protecting children is an important responsibility. These efforts will evolve and expand over time.
robertoandred 2021-08-18 21:12:41 +0000 UTC [ - ]
shapefrog 2021-08-18 21:52:29 +0000 UTC [ - ]
Must have been absolutely nuts round here back in 2011 when they announced the particular slippery slope that is iCloud backup.
__blockcipher__ 2021-08-18 21:37:53 +0000 UTC [ - ]
Now we have client-side scanning, and it's actually being led by the [note the scare quotes] "pro-privacy" two-ton gorilla. It's a whole different ballgame.
Multicomp 2021-08-18 21:28:24 +0000 UTC [ - ]
But this appears to be beyond the pale, such that people are saying "this far, no farther!"
antpls 2021-08-19 14:25:49 +0000 UTC [ - ]
Most people won't have any idea about the meaning of "hashes" and "databases". Not everyone is trying to actively fight the system and shit on everything, most people just want to live happily with their friends and family, they won't care that Apple scans their devices.
> "Either there will be heads rolling at management, or Apple takes a permanent hit to consumer trust."
Oh god! How wasn't all of this obvious to the top Apple management, but so obvious to epistasis! Damn, thanks man for correcting and leading Apple to the right track!
daxuak 2021-08-18 20:17:12 +0000 UTC [ - ]
floatingatoll 2021-08-18 20:46:53 +0000 UTC [ - ]
How did you determine that their intentions contradict their words? Please share the framework for your belief, so that we're able to understand how you arrived at that belief and to evaluate your evidence with an open mind.
(Or, if your claim is unsupported conjecture, please don't misrepresent your opinions and beliefs as facts here at HN.)
camillomiller 2021-08-18 20:31:11 +0000 UTC [ - ]
bhawks 2021-08-18 20:37:23 +0000 UTC [ - ]
Definitely would appreciate a link to anything substantial indicating that this was a bunch of eng in over their heads.
emptysongglass 2021-08-18 20:47:53 +0000 UTC [ - ]
sorrytim 2021-08-18 20:59:38 +0000 UTC [ - ]
orangeoxidation 2021-08-18 21:53:44 +0000 UTC [ - ]
Wow. That feels like an overreach. 'We don't just buy your labor, but also your private opinion and tell you how to talk to your family'.
sorrytim 2021-08-19 02:57:25 +0000 UTC [ - ]
dylan604 2021-08-18 21:23:27 +0000 UTC [ - ]
stingraycharles 2021-08-18 20:48:58 +0000 UTC [ - ]
I'm not buying the "engineers were left unbridled" argument; I think there must have been a level of obliviousness in a much wider part of the organization for something like this to happen.
FabHK 2021-08-18 20:52:56 +0000 UTC [ - ]
Maybe because they underestimated people's ignorance.
I think they saw (and still do see) it as a better, more privacy preserving technique than what everyone else is doing.
fraa-orolo 2021-08-18 21:05:05 +0000 UTC [ - ]
Their hubris is in thinking that they will be able to stand up to all the kinds of abuse that the mere existence of this system will invite; their hubris is also in thinking that they will be able to manage and oversee, perfectly and without mistakes, a system making accusations so heinous that even the mere act of accusing destroys people's lives and livelihoods.
shuckles 2021-08-18 21:29:01 +0000 UTC [ - ]
Since the announcement, I can think of a dozen ways Apple could be easily forced into scanning all the contents of your device by assembling features they’ve already shipped. Yet they haven’t. At some point, people need to produce evidence that Apple cannot hold the line they’ve said they will.
int_19h 2021-08-19 02:37:55 +0000 UTC [ - ]
What we have now is Apple, with its "strong privacy" record, normalizing this. If it succeeds, it would be that much easier for governments to tack other stuff onto it. Or, say, lower the threshold needed to submit images for review. I can easily picture some senator ranting about how unacceptable it is that somebody with only 20 CSAM photos won't be flagged, and won't somebody please think of the children?
And yes, if it comes to that, Apple definitely cannot hold the line. After all, they already didn't hold it on encrypted cloud storage - and that wasn't even legally forced on them, merely "not recommended".
candiodari 2021-08-18 22:13:14 +0000 UTC [ - ]
justapassenger 2021-08-18 20:13:44 +0000 UTC [ - ]
Because the privacy stance is mostly PR to differentiate from Google. And while there are invalid reasons to get at users' data, there are also valid ones (at least from a legal-requirement point of view - let's not get into the weeds here about personal freedom and whether the laws and their implementations need to be changed).
Their PR was writing checks they cannot cash without going to war with governments.
n8cpdx 2021-08-18 20:56:22 +0000 UTC [ - ]
I assumed they pivoted to focus on privacy, but clearly it was just a marketing department innovation rather than a core value (as their marketing department claimed).
istingray 2021-08-18 20:33:32 +0000 UTC [ - ]
websites2023 2021-08-18 21:00:01 +0000 UTC [ - ]
If your threat model includes being the target of someone who will plant child pornography on your phone, you are already fucked. And no, Apple isn’t suddenly going to scan Chinese iPhones for Winnie the Pooh memes. They don’t have to. China already has the 50 cent party to do that for them, on WeChat.
Basically everything everyone seems to think is just around the corner has already been possible for years.
shuckles 2021-08-18 21:30:46 +0000 UTC [ - ]
websites2023 2021-08-18 22:48:38 +0000 UTC [ - ]
whoknowswhat11 2021-08-18 21:03:04 +0000 UTC [ - ]
What I have seen is a selling point for Apple products.
I'd encourage folks to get out of the HN bubble on this - talk to a wife, a family, especially those with kids.
__blockcipher__ 2021-08-18 21:14:57 +0000 UTC [ - ]
Why stop there? Get out of the HN bubble on the patriot act, instead ask your neighbor’s wife her thoughts on it. Get out of the HN bubble on immigration, go ask a stereotypical boomer conservative about it.
I think my sarcasm already made it overtly obvious, but this is horrible advice you are giving, and the fact that you don’t seem to be aware that pedophilia and terrorism are the two most classic “this gives us an excuse to exert totalitarian control” topics betrays your own ignorance (or, worse, you are aware and just don’t care).
ravenstine 2021-08-18 21:53:13 +0000 UTC [ - ]
zepto 2021-08-18 21:41:11 +0000 UTC [ - ]
The only people who are bothered are people claiming this is going to be misused by authoritarian governments.
teclordphrack2 2021-08-19 02:27:45 +0000 UTC [ - ]
This is not me agreeing or disagreeing.
nojito 2021-08-18 20:33:17 +0000 UTC [ - ]
There are at least 2-3 further checks to account for this.
fraa-orolo 2021-08-18 21:13:22 +0000 UTC [ - ]
Salvador Dali could do something similar by hand in 1973 in Gala Contemplating the Mediterranean Sea [1]
__blockcipher__ 2021-08-18 21:17:16 +0000 UTC [ - ]
The mere accusation of possessing CSAM can be life-ruining if it gets to that stage. More importantly, a collision will effectively allow warrantless searches, at least of the collided images.
fraa-orolo 2021-08-18 21:19:19 +0000 UTC [ - ]
nojito 2021-08-18 23:39:18 +0000 UTC [ - ]
SXX 2021-08-19 07:53:02 +0000 UTC [ - ]
And whoever is going to check images for Apple will see that yes, there is porn in the picture. Flag it. Then you'll have an unlimited amount of time to explain to the FBI why some porn on your device matches a CSAM hash.
geoah 2021-08-18 19:32:33 +0000 UTC [ - ]
“Cryptographic representations of images”. That’s not the case though, right? These are “NeuralHashes” afaik, which are nowhere close to cryptographic hashes but rather locality-sensitive hashes, which is fancy speak for “the more alike two images look, the more similar the hash”.
Vice and others keep calling them cryptographic. Am I missing something here?
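To make the distinction concrete, here's a toy sketch (assuming nothing about NeuralHash itself - the "average hash" below is a generic textbook perceptual hash, not Apple's algorithm): a cryptographic hash changes completely when a single pixel changes, while a perceptual hash barely moves.

    # Toy contrast between a cryptographic hash and a perceptual hash.
    # The "average hash" here is a generic textbook scheme, not NeuralHash.
    import hashlib

    def average_hash(pixels):
        # One bit per pixel of a tiny grayscale image: set if that pixel
        # is brighter than the image's mean brightness.
        mean = sum(pixels) / len(pixels)
        return sum(1 << i for i, p in enumerate(pixels) if p > mean)

    original = [(7 * x + 13 * y) % 256 for x in range(8) for y in range(8)]
    tweaked = original[:]
    tweaked[0] += 1  # change a single pixel by 1/255

    # Cryptographic hash: completely different after the one-pixel change.
    print(hashlib.sha256(bytes(original)).hexdigest()[:16])
    print(hashlib.sha256(bytes(tweaked)).hexdigest()[:16])

    # Perceptual hash: essentially unchanged.
    print(hex(average_hash(original)), hex(average_hash(tweaked)))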
crooked-v 2021-08-18 19:45:21 +0000 UTC [ - ]
0x5f3759df-i 2021-08-18 21:32:57 +0000 UTC [ - ]
legostormtroopr 2021-08-19 03:46:53 +0000 UTC [ - ]
I know that "think of the children" is a meme, but I think this illustrates the point for Apple. If you have a cropped or modified image of CSA, the system will identify it. As long as your image is different enough from CSA, you are safe.
The point here is that Apple is specifically looking for matches against known CSA material.
If someone can demonstrate that a legal NSFW image (eg. regular old-fashioned pornography), can be collided with a legal, completely 100% SFW image then I'll be concerned.
But until then, this looks like a reasonable and supportable way for finding CSAM in real time.
hartator 2021-08-19 04:34:03 +0000 UTC [ - ]
Look at this other collision: https://twitter.com/SarahJamieLewis/status/14282060881181491... An attacker can send you an innocent-looking picture that embeds some CSA material, and you get swatted the next day.
legostormtroopr 2021-08-19 05:33:09 +0000 UTC [ - ]
Looking at those two images, it's plain to see why they're treated as identical.
And if someone is sending you CSA material, isn't that the point of this process? Apple identifies it as CSA, can give you a warning that it's sensitive, and notify the authorities that people are sending CSA.
Again - this seems like a win. If Apple can automatically identify CSA material en masse, that's good.
edit: It looks like they are different URLs, but Twitter only allows people to see replies if they are logged in, so I can't see that example.
edit 2: On further thought, if someone can use CSA material to produce an innocuous image with a similar hash, then if they send that image to you, it's still proof that the sender had CSA material. Again, it's still good.
__blockcipher__ 2021-08-18 21:18:22 +0000 UTC [ - ]
bawolff 2021-08-18 20:32:57 +0000 UTC [ - ]
kbelder 2021-08-18 23:32:15 +0000 UTC [ - ]
smoldesu 2021-08-18 20:02:25 +0000 UTC [ - ]
floatingatoll 2021-08-18 20:28:11 +0000 UTC [ - ]
treesprite82 2021-08-18 22:21:52 +0000 UTC [ - ]
Consider if the honeypot images (manipulated to match CSAM hashes) are terrorist recruitment material for example.
laverya 2021-08-18 21:19:06 +0000 UTC [ - ]
Bonus points if you match poses, coloration, background, etc.
floatingatoll 2021-08-18 21:21:57 +0000 UTC [ - ]
I think that's a perfectly acceptable outcome, since anyone with the hubris to both possess CSAM content and create replicas of it especially deserves to be arrested. Do you see a more problematic outcome here?
ultrarunner 2021-08-18 23:36:47 +0000 UTC [ - ]
What if the visual check gets accidentally signed off on, or even gets fraudulently marked as positive by a burned out/lazy/competent-but-distracted employee? There are just so many failure modes for this dragnet that all end with innocent people suffering a very difficult process, even if they don't eventually land in prison. I don't think it's unreasonable to not want to be volunteered to participate.
laverya 2021-08-18 22:18:05 +0000 UTC [ - ]
And in retrospect, you don't need to match that - you just need it to appear obviously pornographic after the blur is applied in order to get past Apple's reviewers. After that, the lucky individual's life is in the hands of the police/prosecutors. (I have to imagine that both real and faked cases will look pretty much like "Your honor/members of the jury, this person's device contained numerous photos matching known CSAM. No, we won't be showing you the pictures. No, the defence can't see them either." Can you imagine a "tough on crime" prosecutor taking the faked case to trial too? Would the police and prosecutors even know it was faked?)
floatingatoll 2021-08-18 22:42:18 +0000 UTC [ - ]
ultrarunner 2021-08-18 23:38:15 +0000 UTC [ - ]
https://twitter.com/SarahJamieLewis/status/14280837442802565...
floatingatoll 2021-08-19 16:52:35 +0000 UTC [ - ]
SXX 2021-08-19 08:13:46 +0000 UTC [ - ]
Unless you're looking to build a porn dataset and you don't care about copyright. Porn is an industry where exabytes of material are produced and published on the internet almost every week.
Who will the agency come after? Some OnlyFans creators?
int_19h 2021-08-19 00:07:30 +0000 UTC [ - ]
floatingatoll 2021-08-19 15:53:48 +0000 UTC [ - ]
Is this just as unacceptable as the CSAM scanning? Should all public photography be banned, in order to reduce the risk of false positive identifications of innocent people as criminals to zero? Or is that an acceptable degree of privacy impingement for the good of society?
Is Apple’s implementation an acceptable trade of impingement and risk, for good for society? We do live in a society, and so zero impingement upon privacies is never going to be acceptable (sorry, free/tech/libs). But instead of discussing whether Apple’s approach violates privacy minimally or not in order to get the job done, these discussions here just keep circling the drain of “putting my privacy at risk by any degree is never acceptable”, when that drain is cemented shut by the existence of society and will never lead to a valid outcome.
whoknowswhat11 2021-08-18 19:58:33 +0000 UTC [ - ]
1) These hashes indicate shared visual features; they are not crypto hashes.
2) HN posters have been claiming that Apple reviewing flagged photos is a felony -> because HN commentators are claiming flagged photos are somehow "known" CSAM - this is also likely totally false. The images may not be CSAM, and the idea that a moderation queue results in felony charges is near ridiculous.
3) This illustrates why Apple's approach here (manual review after 30 or so images flagged) is not unreasonable. The push to say that this review is unnecessary is totally misguided.
4) They use words like "hash collision" for something that is not a hash. In fact, different devices will calculate DIFFERENT hashes for the SAME image at times.
One request I have - before folks cite legal opinions - those opinions should have the name of a lawyer on them. Not this "I talked to a lawyer" because we have no idea if you described things accurately.
sandworm101 2021-08-18 20:03:36 +0000 UTC [ - ]
Not going to happen. Lawyers in the US have issues with offering unsolicited advice, and other problems with issuing advice into states where they are not admitted. So likely none of the US lawyers (and the great many more law students) here will ever put their real name to a comment.
agbell 2021-08-18 20:31:37 +0000 UTC [ - ]
Lawyers I know would politely decline that.
whoknowswhat11 2021-08-18 20:47:52 +0000 UTC [ - ]
So your own firm may cover some costs if you have something to say. If you found someone to pay for you to do an analysis or offer your thoughts - you'd be in heaven!
torstenvl 2021-08-18 20:41:33 +0000 UTC [ - ]
zie 2021-08-18 20:35:43 +0000 UTC [ - ]
But it's the law, it's fuzzy at best, much like your HR department. It's only after a court decision has been reached on your particular issue that it's anywhere near "settled" case law, and even that's up for possible change tomorrow.
dylan604 2021-08-18 21:27:44 +0000 UTC [ - ]
sandworm101 2021-08-18 21:36:50 +0000 UTC [ - ]
whoknowswhat11 2021-08-18 20:45:58 +0000 UTC [ - ]
At least HN should flag these and get them taken down. Over and over, the legal analysis is either trash or it's clear the article author didn't understand something (so how can a lawyer give good advice?).
These conversations become so uninteresting when people take these extreme positions. Apple's brand is destroyed - Apple is committing child porn felonies.
I would have rather just had a link to the apple technical paper and a discussion personally vs the over the top random article feed with all sorts of misunderstandings.
And in contract law there are LOTS of legal articles online - with folks' names on them! They are useful! I read them and enjoy them. Can we ask for that here, where it matters maybe more?
sandworm101 2021-08-18 21:39:58 +0000 UTC [ - ]
Articles are not legal advice. They are opinions on the law applicable generally, rather than fact-based advice to specific clients. Saying whether apple is doing something illegal or not in this case, with a lawyer's name stamped on that opinion, is very different.
rootusrootus 2021-08-18 20:16:43 +0000 UTC [ - ]
Agree, and I think this is backed up by real world experience. Has Facebook or anyone working on their behalf ever been charged for possession of CSAM? I guarantee they've seen some. Probably a lot, in fact. That's why we have recurring discussions about the workers and the compensation they get (or not) for the really horrid work they are tasked with.
Xamayon 2021-08-18 20:39:03 +0000 UTC [ - ]
whoknowswhat11 2021-08-18 20:49:16 +0000 UTC [ - ]
tetha 2021-08-18 22:45:51 +0000 UTC [ - ]
Overall, we're advised to take roughly these steps there. First off, report it. Second, remove all access for the customer, terminate the accounts, lock them out asap. Third, prevent access to the content without touching it. For example, if it sits on a file system and a web server could serve it, blacklist the URLs on a loadbalancer. Fourth, if necessary, begin archiving and securing evidence. But if possible in any way, disable content deletion mechanisms and wait for legal advice, or for law enforcement to tell you how to gather the data.
But overall, you're not immediately guilty for someone abusing your service, and no one is instantly guilty for detecting someone is abusing your service.
judge2020 2021-08-18 20:47:50 +0000 UTC [ - ]
> In 2020, FotoForensics received 931,466 pictures and submitted 523 reports to NCMEC; that's 0.056%. During the same year, Facebook submitted 20,307,216 reports to NCMEC
https://www.hackerfactor.com/blog/index.php?/archives/929-On....
hpoe 2021-08-18 20:27:07 +0000 UTC [ - ]
whoknowswhat11 2021-08-18 20:50:21 +0000 UTC [ - ]
Do you really think Apple's brand has been "destroyed" over this?
rootusrootus 2021-08-18 22:58:02 +0000 UTC [ - ]
Someone 2021-08-18 20:04:17 +0000 UTC [ - ]
They use “private set intersection” (https://en.wikipedia.org/wiki/Private_set_intersection) to compute a value that itself doesn’t say whether an image is in the forbidden list, yet when combined with sufficiently many other such values can be used to do that.
They also encrypt the “NeuralHash and a visual derivative” on iCloud in such a way that Apple can only decrypt that if they got sufficiently many matching images (using https://en.wikipedia.org/wiki/Secret_sharing)
(For details and, possibly, corrections on my interpretation, see Apple’s technical summary at https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni... and https://www.apple.com/child-safety/pdf/Apple_PSI_System_Secu...)
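To illustrate just the threshold idea behind the secret-sharing step (a generic textbook Shamir scheme, not Apple's actual construction): a secret is split into shares so that any t of them reconstruct it, while fewer reveal nothing useful.

    # Toy Shamir secret sharing over a prime field - a generic textbook
    # scheme illustrating the threshold property, not Apple's construction.
    import random

    PRIME = 2**61 - 1

    def make_shares(secret, threshold, count):
        coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
        poly = lambda x: sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        return [(x, poly(x)) for x in range(1, count + 1)]

    def reconstruct(shares):
        # Lagrange interpolation at x = 0 recovers the secret.
        secret = 0
        for xi, yi in shares:
            num = den = 1
            for xj, _ in shares:
                if xj != xi:
                    num = num * (-xj) % PRIME
                    den = den * (xi - xj) % PRIME
            secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
        return secret

    shares = make_shares(123456789, threshold=3, count=5)
    print(reconstruct(shares[:3]))  # 123456789: three shares suffice
    print(reconstruct(shares[:2]))  # garbage: two shares say nothing

As I read it, the voucher decryption works in the same threshold spirit - nothing becomes readable until enough matches accumulate - though the real protocol is considerably more involved.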
3np 2021-08-19 05:22:03 +0000 UTC [ - ]
People who understand tech well enough to recognize hashes like MD5 and SHA don't dive deep enough to understand that this is something completely different.
I even suspect this is deliberate from Apple's side when announcing and talking about these changes - making people wrongly believe that only exact matches will trigger, except possibly in extremely rare cases and under conscious attacks.
They could have called it "fingerprint" or something but deliberately went with a technical term that even confuses technical people who know well enough what a hash usually means.
Vice is falling victim to this misunderstanding stemming from the conflation of "hash".
atonse 2021-08-18 20:28:25 +0000 UTC [ - ]
It's a huge, huge, huge distinction.
robertoandred 2021-08-18 20:04:30 +0000 UTC [ - ]
jdavis703 2021-08-18 20:48:48 +0000 UTC [ - ]
Edit: this is apparently not true as demonstrated by researchers.
judge2020 2021-08-18 20:53:16 +0000 UTC [ - ]
> Microsoft says that the "PhotoDNA hash is not reversible". That's not true. PhotoDNA hashes can be projected into a 26x26 grayscale image that is only a little blurry. 26x26 is larger than most desktop icons; it's enough detail to recognize people and objects. Reversing a PhotoDNA hash is no more complicated than solving a 26x26 Sudoku puzzle; a task well-suited for computers.
https://www.hackerfactor.com/blog/index.php?/archives/929-On...
robertoandred 2021-08-18 21:13:58 +0000 UTC [ - ]
judge2020 2021-08-18 21:45:24 +0000 UTC [ - ]
https://twitter.com/fayfiftynine/status/1427899951120490497?...
Given NeuralHash is a hash of a hash, I imagine they're running PhotoDNA and not some custom solution, which would require Apple ingesting and hashing all of the images themselves using another custom perceptual hash system.
> . Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child-safety organizations. Apple further transforms this database into an unreadable set of hashes, which is securely stored on users’ devices.
https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...
robertoandred 2021-08-18 22:03:42 +0000 UTC [ - ]
andrewmcwatters 2021-08-18 19:36:51 +0000 UTC [ - ]
Apple isn't using a "similar image, similar hash" system. They're using a "similar image, same hash" system.
[1]: https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...
heavyset_go 2021-08-18 19:47:14 +0000 UTC [ - ]
Perceptual hashes are not cryptographic hashes. Perceptual hashing systems do compare hashes using a distance metric like the Hamming distance.
If two images have similar hashes, then they look kind of similar to one another. That's the point of perceptual hashing.
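As a concrete (generic, non-Apple) example of that kind of comparison:

    # Generic perceptual-hash matching by Hamming distance (not Apple's code).
    def hamming(a: int, b: int) -> int:
        return bin(a ^ b).count("1")  # number of differing bits

    h1 = 0xF2D10A339C5E77B0  # hash of an image
    h2 = 0xF2D10A339C5E76B0  # hash of a slightly edited copy
    print(hamming(h1, h2))       # 1 bit apart
    print(hamming(h1, h2) <= 5)  # within threshold, so treated as a match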
andrewmcwatters 2021-08-18 22:10:31 +0000 UTC [ - ]
salawat 2021-08-18 19:51:44 +0000 UTC [ - ]
You know, let me put it this way. You know that one really weird family member I'm pretty sure everyone either has or is?
Guess what? They're a neural net too.
This is what Apple is asking you to trust.
heavyset_go 2021-08-18 19:52:45 +0000 UTC [ - ]
function_seven 2021-08-18 19:49:30 +0000 UTC [ - ]
IGNORE THIS: I think that's the parent comment's point. These are definitely not cryptographic hashes, since they—by design and necessity—need to mirror hash similarity to the perceptual similarity of the input images.
whoknowswhat11 2021-08-18 20:51:58 +0000 UTC [ - ]
yeldarb 2021-08-18 19:30:05 +0000 UTC [ - ]
In order for a collision to get through to the human checkers, the same image would have to fool both networks independently:
mrits 2021-08-18 19:41:06 +0000 UTC [ - ]
yeldarb 2021-08-18 19:46:47 +0000 UTC [ - ]
Unclear how hard this would actually be in practice (if I were going to attempt it, the first thing I'd try is to evolve a colliding image with something like CLIP+VQGAN) but certainly harder than finding a collision alone.
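Purely as a sketch of why it's harder (the two hash functions below are hypothetical stand-ins - neither the on-device model nor the server-side one is public): a forged image only reaches human review if it collides under both.

    # Hypothetical sketch: an adversarial image must collide under two
    # independent perceptual hash functions before any human review happens.
    # hash_on_device and hash_server_side are stand-ins, not Apple's models.
    def survives_both_checks(image, target_a, target_b,
                             hash_on_device, hash_server_side):
        return (hash_on_device(image) == target_a
                and hash_server_side(image) == target_b)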
salawat 2021-08-18 20:01:09 +0000 UTC [ - ]
Swing and a miss. Not in the CSAM dataset. Take two images. Encode alternating pixels. Decode to get the original image back. Convert to different encodings, print to PDF or PostScript. Encode as base64 representations of the image file...
Who are we trying to fool here? This is kiddie stuff.
This is more about trying to implant scanning capabilities on client devices.
yeldarb 2021-08-18 20:19:16 +0000 UTC [ - ]
rootusrootus 2021-08-18 20:22:48 +0000 UTC [ - ]
Did we think they didn't already have that ability?
salawat 2021-08-18 21:19:43 +0000 UTC [ - ]
Barrin92 2021-08-18 20:11:16 +0000 UTC [ - ]
It's a direct trade-off, and the error tolerance of any such filter is the only thing that makes it useful, so we can basically stop arguing about the depths of implementation details or how high the collision rate of the hashing algorithm is, etc. If this thing is supposed to catch anyone, it needs to be orders of magnitude more lenient than any of those minor faults.
Ashanmaril 2021-08-18 20:54:45 +0000 UTC [ - ]
Or better yet, they can just not store their stuff on an iPhone. Meanwhile, millions of innocent people are having their photos scanned and risk being reported as pedophiles.
734129837261 2021-08-18 20:32:39 +0000 UTC [ - ]
1. Either it requires a perfect 1:1 match (their documentation says this is not the case); 2. or it has some freedom in detecting a match, probably anything above a certain similarity threshold.
If it's the former, it's completely useless. A watermark or a randomly chosen pixel with a slightly different hue and the hash would be completely different.
So, it's not #1. It's going to be #2. And that's where it becomes dangerous. The government of the USA is going to look for child predators. The government of Saudi Arabia is going to track down known memes shared by atheists, and they will be put to death; heresy is a capital offence over there. And China will probably do their best to track down Uyghurs so they can make the process of elimination even easier.
It's not like Apple hasn't given in to dictatorships in the past. This tech is absolutely going to kill people.
endisneigh 2021-08-18 20:44:21 +0000 UTC [ - ]
visarga 2021-08-18 21:18:01 +0000 UTC [ - ]
kevin_thibedeau 2021-08-18 19:27:31 +0000 UTC [ - ]
istingray 2021-08-18 19:42:57 +0000 UTC [ - ]
eurasiantiger 2021-08-18 20:50:01 +0000 UTC [ - ]
mzs 2021-08-18 20:03:53 +0000 UTC [ - ]
>"This independent hash is chosen to reject the unlikely possibility that the match threshold was exceeded due to non-CSAM images that were adversarially perturbed to cause false NeuralHash matches against the on-device encrypted CSAM database," …
stevenalowe 2021-08-18 21:13:04 +0000 UTC [ - ]
tlogan 2021-08-19 02:41:30 +0000 UTC [ - ]
We are just one stupid terrorist attack from full surveillance of everybody.
balozi 2021-08-18 21:19:57 +0000 UTC [ - ]
Neural hash this: It's about Trust. It's about Privacy. It's about Boundaries between me and corporations/governments/etc.
ahD5zae7 2021-08-19 11:39:11 +0000 UTC [ - ]
What I'm basically getting at is: are the files scanned after the user has expressed the intention of uploading them? That's what I understood. Am I wrong? Are the files scanned the moment they appear on your device, regardless of your iCloud status (even if you have disabled iCloud somehow)?
Edit: typo
1vuio0pswjnm7 2021-08-18 20:36:46 +0000 UTC [ - ]
Does Apple's solution only stop people from uploading illegal files to Apple's servers, or does it stop them from uploading the files to any server?
If Apple intends to control the operation of a computer purchased from Apple after the owner begins using it, does Apple have a duty to report illegal files found on that computer and stop them from being shared (anywhere, not just through Apple's datacenters)?
To me, this is why there is a serious distinction between a company detecting and policing what files are stored on their computers (i.e., how other companies approach this problem) and a company detecting and policing what files someone else's computer is storing and can transfer over the internet (in this case, unless I am mistaken, only to Apple's computers).
Mind you, I am not familiar with the details of exactly how Apple's solution works nor the applicable criminal laws, so these questions might be irrelevant. However, I was thinking that if Apple really wanted to prevent the trafficking of ostensibly illegal files, then wouldn't Apple seek to prevent their transfer not only to Apple's computers but to any computer (and also report them to the proper authorities)? What duty does Apple have if they can "see into the owner's computer" and they detect illegal activity? If Apple is in remote control of the computer, e.g., they can detect the presence/absence of files remotely and allow or disallow full user control through the OS, then does Apple have a duty to take action?
judge2020 2021-08-18 20:40:19 +0000 UTC [ - ]
Only applies to iCloud Photos uploads, but the photos are still uploaded: when there's a match, the photo and a 'ticket' are uploaded and Apple's servers (after the servers themselves verify the match[0]) send the image to human reviewers to verify the CSAM before submitting it to police as evidence.
0: https://twitter.com/fayfiftynine/status/1427899951120490497 and https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...
tejohnso 2021-08-18 20:43:35 +0000 UTC [ - ]
Perhaps they're trying to prevent being in "possession" of illegal images by having them on their own servers, rather than preventing copying to arbitrary destinations.
ysavir 2021-08-18 20:39:23 +0000 UTC [ - ]
Duty? No, that's the secondary question. The primary question is whether they have the right.
CubsFan1060 2021-08-18 20:42:43 +0000 UTC [ - ]
endisneigh 2021-08-18 20:25:30 +0000 UTC [ - ]
All of the slippery slope arguments have already been possible for nearly a decade now with the cloud.
Can someone illustrate something wrong with this that's not already possible today?
Fundamentally, unless you audited the client and the server yourself, either client-side or backend scanning is possible, and therefore so is this “problem”.
Where’s the issue?
fritzw 2021-08-18 21:24:48 +0000 UTC [ - ]
That doesn't make it ok. The fact that you and my mom have normalized bad behavior doesn't make it any less offensive to my civil liberties.
> Am I the only one who finds no issue with this?
If you only think about the first order effects, this is great: we catch a bunch of kid sex pedos. The second order effects are much more dire. "Slippery slope", sure... but false arrests, lives ruined based on mere investigations revealing nothing, expanding the role of government peering into our personal lives, resulting suicides, corporations launching these programs - then defending them - then expanding them due to government pressure is *guaranteed*, plus additional PR and propaganda from corporations and government for further invasion into our personal lives due to the marketed/claimed success of these programs. The FBI has been putting massive pressure on Apple for years, due to the dominance and security measures of iOS.
Say what you want about the death penalty, but many, many innocent people have died in truly horrific ways, with some actually being tortured while being executed. That is a perfect example of second order effects on something most of us, without any further information, would agree on (ending murderous villains is a good thing). So many death row inmates have been exonerated and vindicated.
edit: https://en.wikipedia.org/wiki/List_of_exonerated_death_row_i...
xadhominemx 2021-08-18 20:45:47 +0000 UTC [ - ]
Not only has it been possible for a decade, it’s been happening for a decade. Every major social media company already scans for and reports child pornography to the feds. Facebook submits millions of reports per year.
headShrinker 2021-08-19 10:08:28 +0000 UTC [ - ]
xadhominemx 2021-08-19 16:40:32 +0000 UTC [ - ]
endisneigh 2021-08-18 20:53:46 +0000 UTC [ - ]
mrkstu 2021-08-18 21:06:45 +0000 UTC [ - ]
Apple is introducing a reverse 'Little Snitch' where instead of the app warning you what apps are doing on the network, the OS is scanning your photos. Introducing a 5th columnist into a device that you've bought and paid for is a huge philosophical jump from Apple's previous stances on privacy, where they'd gone as far as fighting the Feds about trying to break into terrorist's iPhones.
endisneigh 2021-08-18 21:15:21 +0000 UTC [ - ]
mrkstu 2021-08-18 21:34:06 +0000 UTC [ - ]
endisneigh 2021-08-18 21:36:42 +0000 UTC [ - ]
vorpalhex 2021-08-19 02:35:09 +0000 UTC [ - ]
"Well you can turn it off" to "Well, but then you just don't trust Apple".
Clearly these are users who did trust Apple. Apple betrayed their trust. Given that Apple bulk handed over iCloud data to China, I don't really believe their pinky promise that they are, by policy only, going to resist government use of this tech. They can cave to government cases _and_ the government can certainly force them to.
endisneigh 2021-08-19 03:24:33 +0000 UTC [ - ]
Therefore you must trust Apple. So if you still have issues then you don’t trust Apple.
mrkstu 2021-08-18 22:03:03 +0000 UTC [ - ]
Trust is fragile and Apple has taken what in the past I believed it understood to be a strategic advantage and stomped it into little pieces.
creddit 2021-08-18 20:49:04 +0000 UTC [ - ]
The reason people don't like this, as opposed to, for example, Dropbox scanning your synced files on their servers, is that a compute tool you ostensibly own is now turned completely against you. Today, that is for CSAM, tomorrow, what else?
endisneigh 2021-08-18 20:52:22 +0000 UTC [ - ]
creddit 2021-08-18 21:28:48 +0000 UTC [ - ]
headShrinker 2021-08-19 10:24:00 +0000 UTC [ - ]
> people don’t care about that because you’re literally, voluntarily, giving Dropbox your files.
Correct and I agree. I don’t upload my most personal photos to Dropbox for this very specific reason. In fact I stopped using Dropbox when Condi Rice joined the board because she lacks good sense and doesn’t respect civil rights. See ‘The Patriot Act’ and ‘the Invasion of Afghanistan’. It was easy to stop using Dropbox because the alternatives were vast. Apple has me very purposely locked in to this scheme to where the alternatives are a huge transition and compromise on privacy no matter where I turn.
stevenalowe 2021-08-18 21:16:53 +0000 UTC [ - ]
mavhc 2021-08-18 21:19:52 +0000 UTC [ - ]
creddit 2021-08-18 21:36:32 +0000 UTC [ - ]
Not to mention that the meaning of ownership is completely unrelated to the ability to view the designs of a given object. I own the fan currently blowing air at me just fine without ever having seen a schematic for its control circuitry, and everyone for all of history has pretty much felt the same.
mavhc 2021-08-18 21:50:57 +0000 UTC [ - ]
The point is it's hard to hide your evil plans in daylight. Either you trust Apple or you don't. Same with Microsoft's telemetry: they wrote the whole OS; if they were evil they'd have 10,000 easier ways to do it.
All Apple has to do is lie, you'll never know.
petersellers 2021-08-18 20:39:51 +0000 UTC [ - ]
The issue is that Apple previously was not intruding into their user's privacy (at least publicly), but now they are.
It sounds like your argument is that Apple could have been doing this all along and just not telling us. I find that unlikely mainly because they've marketed themselves as a privacy-focused company up until now.
shapefrog 2021-08-18 21:20:10 +0000 UTC [ - ]
Apple reserves the right at all times to determine whether Content is appropriate and in compliance with this Agreement, and may screen, move, refuse, modify and/or remove Content at any time, without prior notice and in its sole discretion, if such Content is found to be in violation of this Agreement or is otherwise objectionable.
You must not have been paying attention for the last 20 years.
endisneigh 2021-08-18 20:42:20 +0000 UTC [ - ]
petersellers 2021-08-18 20:50:18 +0000 UTC [ - ]
CubsFan1060 2021-08-18 20:57:20 +0000 UTC [ - ]
petersellers 2021-08-18 21:02:07 +0000 UTC [ - ]
CubsFan1060 2021-08-18 21:06:42 +0000 UTC [ - ]
petersellers 2021-08-18 21:22:13 +0000 UTC [ - ]
I don't think they have been doing server-side scanning until now, hence the publicity. Do you have any evidence that shows they've been doing this before?
CubsFan1060 2021-08-18 21:41:09 +0000 UTC [ - ]
I wasn’t able to find the whole video though.
No time to watch, but. https://www.ces.tech/Videos/2020/Chief-Privacy-Officer-Round...
petersellers 2021-08-18 22:31:29 +0000 UTC [ - ]
> Currently, although Apple holds the keys to view Photos stored in iCloud Photos, it does not scan these images
It also seems weird that the EFF wouldn't have complained about this before if Apple was known to be doing server-side scanning for some time now.
I also am not going to watch through an hour video, but I scanned the transcript and I didn't see anything that said that Apple currently (at that time) scanned content.
CubsFan1060 2021-08-18 22:53:25 +0000 UTC [ - ]
endisneigh 2021-08-18 20:59:34 +0000 UTC [ - ]
petersellers 2021-08-18 21:03:35 +0000 UTC [ - ]
endisneigh 2021-08-18 21:16:05 +0000 UTC [ - ]
If you don’t trust Apple why would you use iCloud anyway? Makes no sense.
petersellers 2021-08-18 22:27:30 +0000 UTC [ - ]
I just have to point out the ridiculousness of this statement. With your logic any feature in any product can be "turned off" by not using the entire product at all. For example, the radio in my car sounds like shit, I guess I should just stop driving completely to avoid having to hear it.
In reality, this new "feature" will be a requirement of using iCloud Photos. The feature itself cannot be turned off. If your answer is to stop using iCloud Photos, that is no help for the millions of people who currently use iCloud Photos.
> If you don’t trust Apple why would you use iCloud anyway? Makes no sense.
I've trusted Apple for a long time because I felt like they were one of the only companies that cared about consumer privacy. After these actions I am less convinced of that. I'm not sure why that stance is so surprising.
endisneigh 2021-08-18 22:40:47 +0000 UTC [ - ]
iCloud Photos can be used on Windows, for example. This scanning only applies to iOS devices. You could use iCloud Photos and not be subject to the scanning.
> I've trusted Apple for a long time because I felt like they were one of the only companies that cared about consumer privacy. After these actions I am less convinced of that. I'm not sure why that stance is so surprising.
Nothing about what they're doing is contradictory with privacy beyond what they're already doing. The only reason they're even implementing it this way is because they do care about privacy. They could just not encrypt and scan on the server like Google, Microsoft, Dropbox, Box.com and more.
petersellers 2021-08-18 23:12:19 +0000 UTC [ - ]
That's great for the <0.001% of people who use iCloud Photos without an iPhone. Everyone else is SOL.
>Nothing about what they're doing is contradictory with privacy beyond what they're already doing. The only reason they're even implementing it this way is because they do care about privacy. They could just not encrypt and scan on the server like Google, Microsoft, Dropbox, Box.com and more.
False dichotomy. Apple doesn't have to do either of these things.
endisneigh 2021-08-18 23:20:51 +0000 UTC [ - ]
In any case, I've given you the solution on how to use iCloud and not be scanned. Take your photos, sync your iDevice to your computer, and manually upload to iCloud Photos. There you go.
petersellers 2021-08-18 23:39:07 +0000 UTC [ - ]
A subpoena is a lot different than scanning every single photo on every user's device automatically.
> Why do you think all of these companies even do this? lol
Because most companies don't give a shit about privacy? And they will cave at even the slightest government pressure to do so. Honestly it's probably easier for them that way (but worse for the consumer).
> in any case I've given you the solution on how to use iCloud and not be scanned. Take your photos, sync your iDevice to your computer and manually upload to iCloud photos. there you go.
Gee thanks, your "solution" is 1000x harder to use than just using iCloud Photos on your iPhone. One of the biggest selling points of it now is for convenience, and this is pretty obvious so I'm starting to think you're just trolling at this point.
endisneigh 2021-08-19 00:11:43 +0000 UTC [ - ]
> Because most companies don't give a shit about privacy? And they will cave at even the slightest government pressure to do so. Honestly it's probably easier for them that way (but worse for the consumer).
Most people don’t care. I’m giving you solutions and you’re talking about convenience and how hard it is. Plugging in your phone is trivial. I assume you’re going to stick with iCloud even despite this “privacy” issue? If not, what are you switching to?
petersellers 2021-08-19 01:12:56 +0000 UTC [ - ]
It's not black or white. Clearly Apple does care more about privacy than most other tech companies. That doesn't mean we shouldn't be critical of them when they make a mistake.
Also, Apple doesn't use e2ee now. AFAIK it's possible for them to decrypt your iCloud content, and they will do so and forward your data if legally required to.
> Most people don’t care. I’m giving you solutions and you’re taking about convenience how hard it is. Plugging in your phone is trivial.
I don't see what is so hard for you to understand about this - because of Apple's decision, I gain nothing and lose privacy. I shouldn't have to jump through extra hoops to avoid that, and it's reasonable to dislike Apple's position on this. In order to replicate the same functionality I would have to remember to sync my phone every night, and if I ever forgot and my phone died, I just lost data. Yours is an indefensible position when a MUCH easier solution exists today.
> I assume you’re going to stick with iCloud even despite this “privacy” issue? If not, what are you switching to?
That's a bold assumption. I haven't decided yet, for a couple reasons. One is that it's a pain in the ass to switch providers, so I'm going to wait and see if Apple actually follows through with it. Two is that it's possible that Apple is only implementing this so that they can in fact implement full e2ee and then tell law enforcement to kick rocks when they ask for user data (only allowing them to see the CSAM results before they are uploaded to the cloud). I might be willing to accept that compromise, but it's not clear that that is what their plan is.
basisword 2021-08-19 07:30:49 +0000 UTC [ - ]
the8472 2021-08-18 21:31:16 +0000 UTC [ - ]
What is also possible: No scanning on your device and encrypted cloud storage. E.g. borg + rsync.net, mega, proton drive.
fuzzer37 2021-08-18 20:33:46 +0000 UTC [ - ]
Why is that the alternative? How about everything is encrypted and nothing is scanned.
endisneigh 2021-08-18 20:38:46 +0000 UTC [ - ]
If you don’t want to be scanned you can turn it off. I honestly don’t see the issue. It seems the only thing people can say are hypothetical situations here.
saynay 2021-08-18 20:48:02 +0000 UTC [ - ]
robertoandred 2021-08-18 20:29:33 +0000 UTC [ - ]
LatteLazy 2021-08-18 21:17:00 +0000 UTC [ - ]
How about neither? Just let people have their privacy. Some will misuse it. Thats life.
aborsy 2021-08-18 20:17:58 +0000 UTC [ - ]
ALittleLight 2021-08-18 20:49:56 +0000 UTC [ - ]
1. Get a pornographic picture involving young though legal actors and actresses.
2. Encode a nonce into the image. Hash it checking for CSAM collisions. If you've found a collision go on to the next step, if not update the nonce and try again.
3. You now have an image that, on visual inspection, will appear plausibly like CSAM, and to automated detection will appear like CSAM. Though, presumably, it is not illegal for you to have this image as it is, in fact, legal pornography. You can now text this to anyone with an iPhone, who will then be referred by Apple to law enforcement. (Rough sketch of the nonce search below.)
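The sketch assumes a generic perceptual hash as a stand-in (not NeuralHash); note that least-significant-bit tweaks barely move a perceptual hash by design, so a real attack would need larger, adversarially crafted perturbations like the published collision demos.

    # Sketch of the nonce search described above. perceptual_hash is a
    # stand-in for whatever hash the attacker targets; in practice LSB
    # tweaks barely move a perceptual hash, so real collisions come from
    # adversarial optimization rather than this blind loop.
    def embed_nonce(pixels, nonce, bits=32):
        out = list(pixels)
        for i in range(bits):
            out[i] = (out[i] & ~1) | ((nonce >> i) & 1)  # hide nonce bit in an LSB
        return out

    def find_collision(pixels, target_hash, perceptual_hash, max_tries=10**6):
        for nonce in range(max_tries):
            candidate = embed_nonce(pixels, nonce)
            if perceptual_hash(candidate) == target_hash:
                return candidate  # visually near-identical, hash-identical
        return None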
shapefrog 2021-08-18 21:33:07 +0000 UTC [ - ]
So at this point we have an image that computers think is CSAM and people think is CSAM, and when held up next to the original verified horrific image everyone agrees is the same image. At this point, someone is going to ask, rightly so, where that came from.
In order to generate this attack, you have had to go out and procure, deliberately, known CSAM. Ignoring that it would be easier just to send that to the target, rather than hiring talent to recreate the pose of a specific piece of child porn (or 30 pieces to trigger the reporting levels), the most likely person by orders of magnitude to be prosecuted in this scenario is the attacker.
rlpb 2021-08-18 22:17:45 +0000 UTC [ - ]
Define "get anywhere". Why won't you get raided by the police and have all your devices seized first?
shapefrog 2021-08-18 22:23:07 +0000 UTC [ - ]
If your 30 or so hash matching images matched their corresponding known CSAM then that goes on to the police and then they knock on your door.
rlpb 2021-08-18 23:21:24 +0000 UTC [ - ]
I am under the impression that Apple's scheme allows them only to verify the output of the matching algorithm (the "safety vouchers"), and not the image content itself. So in the hypothetical situation described in the thread, it won't be possible for Apple to detect the false positive, and they could pass on the report to NCMEC.
I fear that this will lead to a law enforcement raid without any actual human verification of the offending image itself.
If I'm wrong, I welcome citations which demonstrate the opposite.
shapefrog 2021-08-19 09:41:34 +0000 UTC [ - ]
You should also point out that the NCMEC themselves are not law enforcement.
[1] https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...
rlpb 2021-08-19 12:12:42 +0000 UTC [ - ]
That suggests that Apple can't access the actual matched CSAM images at all.
> You should also point out that the NCMEC themselves are not law enforcment.
I don't think the distinction is relevant. The point is that they will get passed on for enforcement purposes, and at that point, innocent parties will find themselves raided and their devices seized without any human actually manually verifying that it is CSAM first.
Ashanmaril 2021-08-18 21:18:24 +0000 UTC [ - ]
You're already attempting to frame someone for a crime, might as well commit another crime while you're at it
arn 2021-08-18 20:58:29 +0000 UTC [ - ]
ALittleLight 2021-08-18 21:12:22 +0000 UTC [ - ]
FabHK 2021-08-18 21:13:48 +0000 UTC [ - ]
arsome 2021-08-18 21:09:39 +0000 UTC [ - ]
robertoandred 2021-08-18 21:19:06 +0000 UTC [ - ]
YeBanKo 2021-08-19 05:00:10 +0000 UTC [ - ]
When did they make NeuralHash public?
almostdigital 2021-08-18 21:41:29 +0000 UTC [ - ]
jl6 2021-08-18 19:54:47 +0000 UTC [ - ]
Room for improvement in the headline.
bruce343434 2021-08-18 21:23:46 +0000 UTC [ - ]
m3kw9 2021-08-18 20:42:22 +0000 UTC [ - ]
_trampeltier 2021-08-18 21:24:29 +0000 UTC [ - ]
63 2021-08-18 21:37:06 +0000 UTC [ - ]
_trampeltier 2021-08-19 08:04:01 +0000 UTC [ - ]
eurasiantiger 2021-08-18 20:48:21 +0000 UTC [ - ]
So they are already running a generic version of this system since iOS 14.3?
gjsman-1000 2021-08-18 19:31:51 +0000 UTC [ - ]
If you are Apple, even though EARN IT failed... you know where Washington's heart lies. Is CSAM scanning a "better alternative", a concession, an appeasement, a lesser evil, in the hope this prevents EARN IT from coming back?
Also, many people forgot about the Lawful Access to Encrypted Data Act of 2020, or LAED, which would have unilaterally banned E2E encryption in its entirety and required that all devices featuring encryption be unlockable by the manufacturer. That also was on the table.
samename 2021-08-18 19:38:14 +0000 UTC [ - ]
If you’re trying to frame this as “we need to prevent Congress from ever passing something like the EARN IT act”, I agree. Apple and other tech companies already lobby Congress. Why aren’t they lobbying for encryption?
gjsman-1000 2021-08-18 19:39:34 +0000 UTC [ - ]
It's clear that EARN IT could literally be revived any day if Apple didn't do something to say "we don't need it because we've already satisfied your requirements."
orangecat 2021-08-18 19:56:36 +0000 UTC [ - ]
Alternatively, "Apple has shown that it's possible to do without undue hardship, so we should make everyone else do it too".
gjsman-1000 2021-08-18 19:59:35 +0000 UTC [ - ]
They were going to legally mandate that everything be scanned through methods less private than the ones Apple has developed here, through EARN IT and potentially LAED (which would have banned E2E in all circumstances and any device that could not be unlocked by the manufacturer). While that crisis was temporarily averted, the risk of it coming back was and is very real.
Apple decided to get ahead of it with a better solution, even though that solution is still bad. It's a lesser evil to prevent the return of something worse.
cwizou 2021-08-18 20:04:02 +0000 UTC [ - ]
That tech is not being deployed on iMessage, which is the only e2ee(ish) service from Apple (along with Keychain) and is what those legislative attempts are usually targeting. One could argue it would have made sense (technically) there though, sure.
Was it a reason to release it preventively, on something unrelated, to be in the good graces of legislators? I'm not sure it's a good calculation, and it doesn't cover other platforms like Signal and Telegram that would still be seen as a problem by those legislators and require them to legislate anyway.
heavyset_go 2021-08-18 19:50:06 +0000 UTC [ - ]
This would only make sense if Apple intends to expand their CSAM detection and reporting system to detect and report those other things, as well.
gjsman-1000 2021-08-18 19:52:10 +0000 UTC [ - ]
Also, there is another reason why there is the CSAM detecting and reporting system. With Apple's CSAM scan, that big "excuse" Congress was planning to use through EARN IT to ban E2E is defused, meaning Apple now has the potential to add E2E to their iCloud service before Congress can figure out a different excuse.
falcolas 2021-08-18 19:54:31 +0000 UTC [ - ]
There would need to be end-device scanning for arbitrary objects, including full text search for strings including 'Taiwan', 'Tiananmen Square', '09 F9', and so forth to even begin looking at e2e encryption of your items in the cloud.
At which point… what's the point?
gjsman-1000 2021-08-18 20:02:44 +0000 UTC [ - ]
charcircuit 2021-08-18 20:28:16 +0000 UTC [ - ]
It did not do this. The bill was essentially asking for search warrants to become a part of the protocol. If your only solution to allowing search warrants to work is to stop encrypting data, I feel you are intentionally ignoring other options to make this seem worse than it is.
stormbrew 2021-08-18 20:44:45 +0000 UTC [ - ]
alerighi 2021-08-18 19:39:07 +0000 UTC [ - ]
vineyardmike 2021-08-18 19:59:47 +0000 UTC [ - ]
It's amazing, since this would have decimated the American tech sector in many unknown ways.
Unklejoe 2021-08-18 20:33:20 +0000 UTC [ - ]
eurasiantiger 2021-08-18 20:51:36 +0000 UTC [ - ]
In some countries even discussing the application of certain numbers is unlawful.
rolph 2021-08-18 19:44:54 +0000 UTC [ - ]
bitwize 2021-08-18 20:28:08 +0000 UTC [ - ]
gjsman-1000 2021-08-18 19:40:22 +0000 UTC [ - ]
jchw 2021-08-18 19:58:16 +0000 UTC [ - ]
But you shouldn’t get jailed for protected speech and you shouldn’t get jailed for preserving your privacy (via encryption or otherwise.) As cynical as people may get, this is one thing that we have to agree on if we want to live in a free society.
And above all, most certainly, we shouldn’t allow being jailed over encryption to become codified as law, and if it does, we certainly must fight it and not become complacent.
Apathy over politics, especially these days, is understandable with the flood of terrible news and highly divisive topics, but we shouldn’t let the fight for privacy become a victim to apathy. (And yes, I realize big tech surveillance creep is a fear, but IMO we’re starting to get into more direct worst cases now.)
jedmeyers 2021-08-18 20:17:49 +0000 UTC [ - ]
If you are IG Farben, you know where Berlin's heart lies...
bArray 2021-08-19 00:24:28 +0000 UTC [ - ]
I think the claim here is that you won't have access to the source images, and therefore generating collisions will be more difficult. But, if you do have access to the source images, this has been shown to be trivial. This of course doesn't stop nations states generating images that cause hash collisions, in fact they would be incentivized to do so.
I would also add that Apple are behind the curve, attempts to crack the hashing algorithm more efficiently are still ongoing: https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX/issue...
> [..] not the final implementation [..]
Why on earth would you invite people to come and test your algorithm and then say "sure, you broke it, but it's not the real one"? This kind of defeats the point and seems like some bait-and-switch bullshit. I suspect this is some retroactive cope from management realising they can't deploy this version and that whatever they do deploy needs to be heavily modified.
> If Apple finds they are CSAM, it will report the user to law enforcement.
One statistic I want to know is: how many people already trigger this report function in the wild? Surely right now is when they'd see the largest number of positives they will ever have - if it turns out to be 0%, Apple should just scrap it.
> Apple also said that after a user passes the 30 match threshold, a second non-public algorithm that runs on Apple's servers will check the results.
So to avoid reporting, simply block Apple's servers? Also, security by obscurity is not security - the algorithm supposedly being private just means that it's not properly tested and Apple is not held to account.
> "Apple actually designed this system so the hash function doesn't need to remain secret, as the only thing you can do with 'non-CSAM that hashes as CSAM' is annoy Apple's response team with some garbage images until they implement a filter to eliminate those garbage false positives in their analysis pipeline," Nicholas Weaver, senior researcher at the International Computer Science Institute at UC Berkeley, told Motherboard in an online chat.
No. A report could be considered probable cause for law enforcement to do a full search. Imagine trying to explain to a judge why your iPhone shouldn't be searched because of a false-positive CSAM hash collision caused by a malicious website you visited or a text message you received.
guerrilla 2021-08-18 20:55:42 +0000 UTC [ - ]
Isn't that where we already are with things like Article 17 of the EU's Copyright Directive?
shuckles 2021-08-18 21:05:05 +0000 UTC [ - ]
trident5000 2021-08-18 20:52:17 +0000 UTC [ - ]
joelbondurant 2021-08-18 20:40:17 +0000 UTC [ - ]
throwawaymanbot 2021-08-18 20:26:35 +0000 UTC [ - ]
Hashes can be created for anything on a phone. And hash collisions enable the "near match" of hashes (similar items).
Let's pretend: you use your face to log in to an iPhone, and there is a notice out for a certain person. If your face matches the hash, will you be part of the scan? You betcha.
zakember 2021-08-18 20:52:21 +0000 UTC [ - ]
If Apple is training a neural network to detect this kind of imagery, I would imagine there are thousands, if not millions, of child pornography images on Apple's servers being used by their own engineers to train this system.
arn 2021-08-18 20:53:38 +0000 UTC [ - ]
zimpenfish 2021-08-18 21:11:25 +0000 UTC [ - ]
NCMEC generate the hashes using their CSAM corpus.
firebaze 2021-08-18 19:59:05 +0000 UTC [ - ]
csilverman 2021-08-18 20:13:00 +0000 UTC [ - ]
I'm not saying this as a fan of either Cook or their anti-CSAM measures; I'm neither, and if anyone is ever wrongfully arrested because Apple's system made a mistake, Cook may well wind up in disgrace depending on how much blame he can/can't shuffle off to subordinates. I don't think we're there yet, though.
atonse 2021-08-18 20:30:09 +0000 UTC [ - ]
Just saying that money hides problems.
cirrus3 2021-08-18 23:38:29 +0000 UTC [ - ]
Sounds pretty desperate.
As if any normal user is going to upload a photo to iCloud that is a collision.
The fact that such images are possible to generate means nothing by itself.
Also, they would need to accidentally have 30 of them.
Also, a human would have to not be able to tell the difference.
dang 2021-08-18 19:47:31 +0000 UTC [ - ]
Hash collision in Apple NeuralHash model - https://news.ycombinator.com/item?id=28219068 - Aug 2021 (542 comments)
Convert Apple NeuralHash model for CSAM Detection to ONNX - https://news.ycombinator.com/item?id=28218391 - Aug 2021 (155 comments)