
Apple defends anti-child abuse imagery tech after claims of ‘hash collisions’

dang 2021-08-18 19:47:31 +0000 UTC [ - ]

Ongoing related threads:

Hash collision in Apple NeuralHash model - https://news.ycombinator.com/item?id=28219068 - Aug 2021 (542 comments)

Convert Apple NeuralHash model for CSAM Detection to ONNX - https://news.ycombinator.com/item?id=28218391 - Aug 2021 (155 comments)

epistasis 2021-08-18 20:08:41 +0000 UTC [ - ]

What's shocking to me is how little Apple management understood how their actions would look. Really stunning.

For a company that marketed itself as one of the few digital service providers that consumers could trust, I just don't understand how they acted this way at all.

Either there will be heads rolling at management, or Apple takes a permanent hit to consumer trust.

helen___keller 2021-08-18 20:48:39 +0000 UTC [ - ]

> I just don't understand how they acted this way at all.

There's a simple answer to this, right? Despite everyone's reaction, Apple genuinely believes this is a novel and unique method to catch CSAM without invading people's privacy. And if you look at it from Apple's point of view, that's correct: other major cloud providers catch CSAM content on their platform by inspecting every file uploaded, i.e. a total invasion of privacy. Apple found a way to preserve that privacy but still catch the bad people doing very bad things to children.

dabbledash 2021-08-19 02:51:10 +0000 UTC [ - ]

There is no way to scan people’s content while “respecting their privacy.” The goal should be to create a system where you couldn’t do so even if you wanted to (or the state demanded it).

chrismorgan 2021-08-19 02:59:33 +0000 UTC [ - ]

> The goal should be to create a system where you couldn’t do so even if you wanted to (or the state demanded it).

There’s this bizarre notion that using end-to-end encryption can absolve you of responsibility, that the authorities will have to accept an answer of “we literally can’t access it”.

That’s just not the case for centralised things: you’re deliberately facilitating some service, government will find you liable for some things in its operation, and if you don’t comply, they’ll fine or shut you down. E2EE doesn’t absolve you from law; law is all about saying you’re not allowed to do things that are physically possible.

(Decentralised things, now they can be banned but not truly stopped because there’s no central party to shut down.)

Zak 2021-08-19 03:57:52 +0000 UTC [ - ]

When end-to-end encryption is done correctly, the answer is "we literally can't access it" as a matter of mathematics, whether the state accepts it or not.

A state that does not accept it might retaliate against the entity giving that answer or forbid future use of end-to-end encryption without backdoors, but the truth of the answer doesn't depend on anyone's acceptance.

onethought 2021-08-19 04:52:06 +0000 UTC [ - ]

Isn't that when the state prohibits your service? So then nobody cares about your mathematical proof, because it's a crime to use it.

This is what has happened in many countries already.

3np 2021-08-19 05:09:26 +0000 UTC [ - ]

Sure, but that's a case of "we're prohibited from providing end-to-end encryption and preserving user privacy so we can scan for prohibited content as mandated by authorities", not "we are keeping children safe while still preserving user privacy"

Legal terms such as "murder", "fraud" and "rape" do change as an effect of regulatory changes. "Encryption" and "privacy" do not.

There's a limit to how much you can bend semantics in your PR before it breaks and you get backlash.

onethought 2021-08-19 05:35:06 +0000 UTC [ - ]

But if you're a company like Apple, it'd be bad business to wait until a large government bans your service/device before you respond. Much better to read the tea leaves and get ahead of it.

3np 2021-08-19 06:30:03 +0000 UTC [ - ]

Again, their doublespeak and redefinition of words don't help with the reception.

They're deliberately misrepresenting what's happening, appearing surprised when people misunderstand, and bundling together legitimate criticism with misunderstandings.

I can draw some parallels to how Google went out with FLoC.

Honestly I can't tell where Hanlon's razor should cut here.

onethought 2021-08-19 11:08:07 +0000 UTC [ - ]

What’s the doublespeak? It’s quite clear they are trying to maintain E2E encryption while also achieving CSAM scanning. Google/Facebook/Microsoft do this by scanning everything unencrypted. Apple does it while maintaining encryption and is being punished for it. It’s insane.

Zak 2021-08-19 11:15:23 +0000 UTC [ - ]

iCloud is not end-to-end encrypted, and Apple has not announced plans to add that option. Apple has the technical ability to scan images on the server.

onethought 2021-08-19 14:08:19 +0000 UTC [ - ]

Yes, they have the ability, but they found a way NOT to do that, instead maintaining user privacy. And the world hates them for it, and would prefer they just left everything unencrypted and scanned it… doesn’t make sense.

dabbledash 2021-08-19 03:04:10 +0000 UTC [ - ]

There’s nothing stopping governments from banning E2EE, but in the absence of such bans, no one is under any obligation to build systems that empower them to spy on their users.

Retric 2021-08-19 03:16:46 +0000 UTC [ - ]

I wouldn’t be sure about that; the phone companies already have a legal obligation to allow wiretapping, and the government is very happy to put gag orders on this stuff.

pengaru 2021-08-19 03:48:17 +0000 UTC [ - ]

Shouldn't the E2EE, walled-garden Apple equivalent of wiretapping be pushing an app update to the suspect's phone with a side channel added for law enforcement to snoop, warrant in hand?

supertrope 2021-08-19 03:55:18 +0000 UTC [ - ]

CALEA has a carve out for encryption. I’m sure when E2EE is about to be deployed to the masses the law will be updated to force key escrow.

heavyset_go 2021-08-19 04:02:12 +0000 UTC [ - ]

Courts can order them to collect data on users.

raxxorrax 2021-08-19 07:00:01 +0000 UTC [ - ]

The law may say that government isn't allowed to spy on people. So no, the state cannot just come and demand anything it wants.

The problem is that the law is self-contradictory, and it is up to the judicial institutions to fix it as soon as possible.

UncleMeat 2021-08-19 12:29:00 +0000 UTC [ - ]

In this case, the state does demand it. Running a major cloud photo storage and sharing platform without checking for this material isn't an option in the US.

Retric 2021-08-19 03:11:28 +0000 UTC [ - ]

> The goal should be to create a system where you couldn’t do so even of you wanted to (or the state demanded it).

You can still do secure backups of your phone without using iCloud, but there isn’t a way for Apple to do end-to-end encryption of backups transparently like you can with real-time communication. The only way end-to-end encryption of backups works is to require people to keep separate secure key(s) to avoid losing their data, which means a universal implementation has real, direct risk for users.

As long as Apple has access to these files the FBI can legally require them to do these searches. From a pure PR perspective they should have communicated what was already going on before releasing this system because people assume something significant changed.

simfree 2021-08-19 05:56:38 +0000 UTC [ - ]

Mega.io (from the same people as MegaUpload) has e2e file encryption with just usernames and passwords.

There is no reason the password can't be the encryption key, with backup keys stored with a trusted third party (eg: your credit union or bank) without notation as to what these backup keys are tied to.

Retric 2021-08-19 14:01:27 +0000 UTC [ - ]

Standard passwords don’t provide enough entropy for secure encryption.

Trusting third parties with the password in unencrypted form is either systematic, in which case the FBI now just needs to collect data from two different organizations, or on a case-by-case basis, in which case users will mess it up. Apple etc. would have no way to verify users actually did something to back up their keys.

Apple’s current approach is to let users set up their own backups if they want security, which allows for privacy just fine without providing a service with fundamental issues.
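On the entropy point, here is a minimal sketch of deriving an encryption key from a password (assuming PBKDF2 from Python's hashlib; Mega's actual client-side scheme differs). A slow KDF multiplies the cost of each guess, but it cannot add entropy that a weak password never had, which is the limitation described above.

    # Hedged sketch: password-based key derivation with PBKDF2-HMAC-SHA256.
    # The derived key could encrypt a backup, but its strength is still capped
    # by the entropy of the password itself.
    import hashlib
    import os

    def derive_key(password: str, salt: bytes = None, iterations: int = 600_000):
        # A random salt defeats precomputed-table attacks across users.
        if salt is None:
            salt = os.urandom(16)
        key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations, dklen=32)
        return key, salt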

noapologies 2021-08-18 21:54:20 +0000 UTC [ - ]

It's really interesting to see the mental gymnastics people are willing to go through to defend their favorite trillion dollar corporations.

> other major cloud providers catch CSAM content on their platform by inspecting every file uploaded, i.e. total invasion of privacy.

> Apple found a way to preserve that privacy ...

So scanning for CSAM in a third-party cloud is "total invasion of privacy", while scanning your own personal device is "preserving privacy"?

The third-party clouds can only scan what one explicitly chooses to share with the third party, while on-device scanning is a short slippery slope away from having its scope significantly expanded (to include non-shared and non-CSAM content).

spullara 2021-08-19 03:51:36 +0000 UTC [ - ]

They are scanning files that are being uploaded. So, yes.

helen___keller 2021-08-18 22:01:51 +0000 UTC [ - ]

> defend their favorite trillion dollar corporations

> is a short slippery slope away

Obviously people who trust Apple aren't concerned about slippery slopes. What's the point of your post?

stjohnswarts 2021-08-19 07:16:17 +0000 UTC [ - ]

He's just commenting on how ludicrous it is for anyone to believe that Apple is being honest about this, and about spying on its users for governments.

carnitas 2021-08-18 21:05:26 +0000 UTC [ - ]

Apple inspects every file on the local device before it's uploaded. It's just a pinky promise that files are only matched against the on-device database when an upload is intended.

squarefoot 2021-08-19 02:53:10 +0000 UTC [ - ]

That is the problem. CSAM is just smoke and mirrors paired with an appeal to emotion to win approval more easily. I don't want anyone, whether Apple, Microsoft, Google or anyone else, to sneak into my files. Did anyone realize that in the 21st century our cellphone is essentially our wallet?

spullara 2021-08-19 03:52:25 +0000 UTC [ - ]

Microsoft and Google already scan all files uploaded to them regardless of your preferences.

stjohnswarts 2021-08-19 07:17:32 +0000 UTC [ - ]

No they don't; they scan them on the servers. If you have a source that says otherwise, I'd love to see it. I'm 99.99% sure you won't find any.

onethought 2021-08-19 04:53:41 +0000 UTC [ - ]

Exactly, and Google and Microsoft 100% have on-device scanning as well. We are saying "Apple is bad for admitting it"; what about the others that are not admitting it?

stjohnswarts 2021-08-19 07:18:19 +0000 UTC [ - ]

You have 0 proof of that other than conjecture. They are not scanning stuff and uploading it to government organizations to come and arrest you.

squarefoot 2021-08-19 12:39:37 +0000 UTC [ - ]

The point is that until devices and all their software/firmware become fully auditable, there's no way to be 100% safe, and we must resort to trust.

That wouldn't be a problem in an ideal world, but the one in which we live is far from even resembling one. Mining data is already a huge business, and governments everywhere would love tools that give them an advantage over people they don't like. There's huge motivation and demand for those tools at all levels, and at least governments have the resources to buy them and the power to force whoever implements them to stay silent. I'm not implying that spyware tools exist in every phone, PC, smart TV, car, etc. just because we can't prove they don't; that's the argument used for UFOs, witches and unicorns, no thanks. But we had better think as if they do, because the technology, the resources and the demand for their adoption are real, and the rest is probability.

onethought 2021-08-19 11:05:39 +0000 UTC [ - ]

A literal feature of Google Photos is to find photos on your device.

Apple is not uploading anything to government organisations; in fact they are uploading less than Google, by their own (and Google's) admission.

You have 0 proof that they are uploading to gov orgs… right?


slg 2021-08-18 21:06:52 +0000 UTC [ - ]

Apple controls the hardware, software, and cloud service. It was always a pinky promise that they wouldn't look at your files. I don't know why we should doubt that pinky promise less today than we did a month ago.

codeecan 2021-08-18 21:20:15 +0000 UTC [ - ]

They don't control the database used; any country can, through legal means, attach additional hashes for search and reporting.

Apple has already proven it will concede to China's demands.

They are building the world's most pervasive surveillance system, and when the world's governments come knocking to use it ... they will throw their hands up and feed you the "Apple complies with all local laws, etc." line.

tshaddox 2021-08-19 03:15:09 +0000 UTC [ - ]

They can upload any software to iPhones that they want. They may not create the database of hashes, but they can choose whether their software uses that database.

bsql 2021-08-18 21:49:15 +0000 UTC [ - ]

Do any of the other companies control the database of hashes they use? A government could have already done what you’re suggesting but I can’t find a source where this has been the case.

GeekyBear 2021-08-18 21:44:14 +0000 UTC [ - ]

> They don't control the database used

They control what goes into the on-device database that is used.

>The on-device encrypted child abuse database contains only data independently submitted by two or more child safety organizations, located in separate jurisdictions, and thus not under the control of the same government

https://www.techwarrant.com/apple-will-only-scan-abuse-image...

Invictus0 2021-08-19 02:33:43 +0000 UTC [ - ]

For now. Wait until the FBI comes knocking with a warrant.

threeseed 2021-08-19 02:27:25 +0000 UTC [ - ]

Then simply disable iCloud Photos sync.

It's bizarre to me that people are freaking out about governments adding client side hashes but no concern that they could be doing server side checks.

babesh 2021-08-19 03:01:13 +0000 UTC [ - ]

Have you tried disabling various iCloud features? Disabling iCloud is incredibly buggy.

I tried disabling iCloud keychain and it just flips back on. Sometimes it asks for a login first. Sometimes it shows a cancel/continue modal. Either way, it magically flips back on. No error message.

I tried backing up my device to my hard drive (with Photos already on iCloud) and it kept complaining that there wasn’t enough space. It throws error message after warning message that your content will be deleted. It created additional copies of my photos each time my phone synced.

To properly back up, I had to copy the photos directory to an external hard drive, delete the original, mark the external hard drive copy as the system one, and then finally free up enough space to back up. iCloud and the device backup weren’t smart enough to free up space for my backup. In fact, I backed up all my photos to iCloud first because they said that it would free up space on my hard drive as necessary. LOL.

BTW, iCloud keychain is still on for me. Fuck Apple.

babesh 2021-08-19 03:13:42 +0000 UTC [ - ]

I just tried disabling iCloud for Game Center. It flips itself back on. No error message. Fuck Apple.

CoolGuySteve 2021-08-18 21:17:16 +0000 UTC [ - ]

Should they even be doing that though? It seems like a matter of time before it's possible to SWAT somebody by sending them a series of hash-colliding image files, given that the hash algorithm is not cryptographically secure.

I think I'm not the only one who'd rather not have my devices call the cops on me in a country where the cops are already way too violent.
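For what it's worth, the collisions demonstrated against the extracted NeuralHash model were reportedly produced in roughly this way: treat the hash network as a differentiable function and optimize a small perturbation of an image until its hash bits match a target. A hedged PyTorch sketch, where hash_model is a hypothetical differentiable stand-in that returns real-valued hash logits (not Apple's actual pipeline):

    # Sketch of a second-preimage search against a differentiable perceptual hash.
    # Assumes: source_img is a [0,1] float tensor, target_bits a 0/1 tensor,
    # hash_model a hypothetical differentiable model returning hash logits.
    import torch

    def craft_collision(source_img, target_bits, hash_model, steps=1000, lr=0.01):
        delta = torch.zeros_like(source_img, requires_grad=True)
        opt = torch.optim.Adam([delta], lr=lr)
        target = target_bits.float() * 2 - 1  # map bits {0,1} to signs {-1,+1}
        for _ in range(steps):
            opt.zero_grad()
            logits = hash_model(torch.clamp(source_img + delta, 0, 1))
            # Hinge term pushes each logit past the desired sign; the small L1
            # penalty keeps the perturbation visually subtle.
            loss = torch.relu(0.1 - logits * target).sum() + 1e-3 * delta.abs().sum()
            loss.backward()
            opt.step()
        return torch.clamp(source_img + delta, 0, 1).detach()

This only works because the hash is perceptual rather than cryptographic; a cryptographic hash gives no gradient to follow.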

UncleMeat 2021-08-19 12:31:31 +0000 UTC [ - ]

You could already just send them CSAM. That is a lot easier than finding a hash collision that also appears to be illegal content when downscaled and viewed by a human.

slg 2021-08-18 21:28:48 +0000 UTC [ - ]

But this doesn't change any of that. It only changes whether the scanning happens on your device as part of the upload process instead of on the server after the upload.

CoolGuySteve 2021-08-18 22:24:59 +0000 UTC [ - ]

Yeah that’s right and I don’t think either method is ethical.

simondotau 2021-08-19 02:24:46 +0000 UTC [ - ]

Ethics aside, on-device scanning has the benefit of Constitutional protection, at least in the USA. Any attempt by the Government to compel Apple to expand the on-device searching of privately owned devices to find other things would be a clear-cut 4th Amendment violation.

lupire 2021-08-18 23:53:31 +0000 UTC [ - ]

Then don't ask other people to hold your unencrypted files for you.

croutonwagon 2021-08-19 02:31:48 +0000 UTC [ - ]

The human review phase is supposed to explicitly prevent that. I'm not sure I would put my faith there, especially if there's a flood of collisions and reviewers are rated/paid on case clearance rate.

Further, this is step 1 of a process they have explicitly said they are looking to expand on [1], even going as far as to state it in bold font with a standout color.

So there's no telling that they won't expand it by simply scanning everything, regardless of iCloud usage, or pivot it to combat "domestic terrorism" or "gun violence epidemics" or whatever else they feel like.

It's an erosion of trust, even if not a complete one; it's something they intend to expand upon and won't be taking back.

[1] https://www.apple.com/child-safety/pdf/Expanded_Protections_...

fsflover 2021-08-18 21:12:27 +0000 UTC [ - ]

Because now Apple confirmed themselves that this promise is not kept.

slg 2021-08-18 21:18:21 +0000 UTC [ - ]

What is "this promise"? Because I would consider it "we will only scan files that you upload to iCloud". That was true a month ago and that would be true under this new system. The only part that is changing is that the scanning happens on your device before upload rather than on an Apple server after upload. I don't view that as a material difference when Apple already controls the hardware and software on both ends. If we can't trust Apple to follow their promise, their products should already have been considered compromised before this change was announced.

lokedhs 2021-08-19 04:39:09 +0000 UTC [ - ]

Why is the scanning done on the device in the first place? Photos uploaded to iCloud are not end-to-end encrypted (because the US government was apparently opposed to that idea), so why not do what Google and Facebook do and scan once the image reaches the servers? What is the benefit of running the scan on the device?

fsflover 2021-08-18 21:21:16 +0000 UTC [ - ]

> The only part that is changing is that the scanning happens on your device before upload

This is the key point.

1. What if I change my mind and decide not to upload the picture?

2. This is a new mechanism for scanning private pictures on the device. What could go wrong?

> If we can't trust Apple to follow their promise, their products should already have been considered compromised before this change was announced.

Many people did trust Apple to keep their files private until now.

zepto 2021-08-18 21:44:09 +0000 UTC [ - ]

> This is a new mechanism for scanning private pictures on the device.

No it isn’t. It’s a mechanism for scanning pictures as they are uploaded to iCloud Photo Library.

Private pictures on the device are not scanned.

fsflover 2021-08-18 21:59:01 +0000 UTC [ - ]

Pictures not uploaded yet are private.

zepto 2021-08-18 22:23:49 +0000 UTC [ - ]

Not if you have opted to have them uploaded.

int_19h 2021-08-19 02:21:41 +0000 UTC [ - ]

Cloud backups are the default on iOS, so you rather have to opt out. And that doesn't even account for apps that can do the same.

zepto 2021-08-19 04:45:29 +0000 UTC [ - ]

iCloud backups are not scanned.

Also, what apps are you talking about?

int_19h 2021-08-19 06:09:29 +0000 UTC [ - ]

Photos that are automatically uploaded to iCloud are not scanned?

As for apps - WhatsApp, for example, saves everything to Camera Roll by default. Which then gets auto-uploaded to iCloud.

simondotau 2021-08-19 02:30:07 +0000 UTC [ - ]

If your device generates a safety token and it's never uploaded anywhere, that's a no-op.

slg 2021-08-18 21:32:28 +0000 UTC [ - ]

How are those two examples different than before? You can't unupload a photo under either the old or new system. I don't know why we would expect that the scanning feature will be more prone to accidentally scan too many photos compared to the uploading feature accidentally uploading too many photos.

>Many people did trust Apple to keep their files private until now.

And that was my original point. If a pinky promise from Apple is not enough to trust them, then Apple should have never been trusted.

fsflover 2021-08-18 22:00:36 +0000 UTC [ - ]

> You can't unupload a photo under either the old or new system.

You can choose to upload many pictures. They will start uploading. Then, you change your mind. Some pictures were not uploaded yet. But they were scanned by the new algorithm.

kelnos 2021-08-19 02:41:43 +0000 UTC [ - ]

I do wonder what happens in that case to the scan results for the photos that weren't yet uploaded. From earlier articles it sounded like the "voucher" is attached to the image upon upload, so it stands to reason that if you cancel an upload, results don't get uploaded for photos that didn't get uploaded. Who knows, though...

ravenstine 2021-08-18 22:01:25 +0000 UTC [ - ]

Why do people expect anything different? Every corporate promise is subject to change. When you hand your belongings to someone else, those things are liable to be tampered with.

mox1 2021-08-18 21:00:54 +0000 UTC [ - ]

But who was complaining about Google and Microsoft doing the cloud scanning?

I don’t mind my OneDrive being scanned for “bad stuff”; I very much mind my personally owned data stores being scanned, with no opt-out.

GeekyBear 2021-08-18 21:07:35 +0000 UTC [ - ]

The only thing Apple scans are files you upload to iCloud Photos.

If you turn off iCloud Photos, nothing is scanned.

Microsoft scans everything.

>The system that scans cloud drives for illegal images was created by Microsoft and Dartmouth College and donated to NCMEC. The organization creates signatures of the worst known images of child pornography, approximately 16,000 files at present. These file signatures are given to service providers who then try to match them to user files in order to prevent further distribution of the images themselves, a Microsoft spokesperson told NBC News. (Microsoft implemented image-matching technology in its own services, such as Bing and SkyDrive.)

https://www.nbcnews.com/technolog/your-cloud-drive-really-pr...

erklik 2021-08-19 02:44:21 +0000 UTC [ - ]

> If you turn off iCloud Photos, nothing is scanned.

According to Apple, and for now. The Patriot Act was only for terrorists. Apple makes concessions for China. Creating this technology makes it very easy for China to go, "Look at all photos, always". If they only want to scan stuff in iCloud Photos, no worries, just implement it on their end. This tech does not need to exist in that case.

> Microsoft scans everything.

Everything uploaded to the Cloud. The Cloud bit is fairly important here. That's me, willingly, putting information on their property. Perfectly acceptable and fine for them to ensure that it's nothing unethical.

However, someone else snooping through your drawers looking for something to pin on you is not private, nor the same as checking their own property.

tshaddox 2021-08-19 03:17:29 +0000 UTC [ - ]

“And for now” is not a particularly strong argument because it applies to literally anything that software could ever conceivably do on an iPhone, because Apple has the ability to release any software updates they want.

kelnos 2021-08-19 02:43:47 +0000 UTC [ - ]

If Microsoft's scan is a server-side scan, how do they scan "everything" if "everything" includes things that you do not upload?

By your definition of "everything", both Microsoft and Apple scan "everything". (Or Apple will, after this new system is rolled out.)

Waterluvian 2021-08-19 02:12:50 +0000 UTC [ - ]

I have no idea what I expected but 16,000 photos feels… disgustingly high and ridiculously low at the same time.

ffhhj 2021-08-19 03:58:23 +0000 UTC [ - ]

It's well known that there are also videos. So these criminals won't be caught watching them.

nonbirithm 2021-08-19 00:15:48 +0000 UTC [ - ]

In my view, Apple is not going to take the risk of changing their mind and not implementing this new scanning feature in order to test the hypothesis that you can satisfy both the people who want to stop CSAM and the people who don't want their files scanned. And also, Apple or any other company announcing they will not scan or give up any user data means that criminals, foolish or not, will take notice and start uploading CSAM to their servers. If they get away with it, they've found their new home. That's why regulation against this kind of data exists. Possessing CSAM is still a felony.

Nobody is arguing about the legality of CSAM itself. It's an issue that is not popular to discuss, and of course being on the wrong side of it results in near-universal, justifiable derision. Stopping its spread is absolutely the right thing to do.

So at the point that companies will always be held liable for storing it, they will have to put up countermeasures of some kind or find themselves sued out of existence. Server-side scanning is one method, and Apple's on-device scanning is another.

There are certainly ways that Apple can go too far with whatever it happens to come up with as its solution to stopping CSAM, but there still seems to have been a line behind which nobody particularly cares how the detection is implemented and life continues as usual. If Apple had chosen not to cross that line, maybe many of the arguments being made here would never have been brought up at all.

nine_k 2021-08-18 21:43:15 +0000 UTC [ - ]

That's the point: catching the bad guy foolish enough to keep known CSAM images on their phone, while not technically invading the privacy of any good guys. Anyway, if Apple wanted to covertly invade their users' privacy, they'd have no technical problem doing so.

What it takes is accepting the "nothing to hide" mentality: your files are safe to scan (locally) because they can't be known CSAM files. You have to trust the scanner. You allow the scanner to touch your sensitive files because you're not the bad guy, and you want the bad guy to be caught (or at least forced off the platform).

And this is, to my mind, the part Apple wasn't very successful at communicating. The whole thing should have started with an educational campaign well ahead of time. The privacy advantage should have been explained again and again: "unlike every other vendor, we won't siphon your files unencrypted for checking; we do everything locally and are unable to compromise your sensitive bits". Getting one's files scanned should have become a badge of honor among the users.

But, for some reason, they tried to do it in a low-key and somehow hasty manner.

int_19h 2021-08-19 00:01:07 +0000 UTC [ - ]

The "nothing to hide" mentality is exactly what's wrong about all this.

chii 2021-08-19 03:38:02 +0000 UTC [ - ]

Yes. You don't leave the toilet door open, not because you have anything to hide, but because privacy is a right.

Everybody knows what goes on behind that door. And yet, everyone would much prefer to close it. Apple's method is equivalent to the toilet door being removed, so that you cannot do anything nefarious behind that door.

Your phone also (on average) has nothing to hide, but that's also why you need privacy on your devices.

ravenstine 2021-08-18 21:58:13 +0000 UTC [ - ]

I mostly agree, though I will argue for the other side; child predators don't exactly have a track record for intelligence. Pedophiles still to this day get caught trading CP on Facebook of all places. I think engaging in that kind of activity requires a certain number of brain cells misfiring. Watch some old episodes of To Catch A Predator. Most of those guys were borderline retarded or had obvious personality disorders.

jjjensen90 2021-08-19 02:32:13 +0000 UTC [ - ]

This is not true and a harmful generalization. There are dumb people of every type. There are probably smart people you respect that engage in CSAM and are minor-attracted. This kind of privacy invasion will only net a tiny section of that population, as most know that lack of caution is life or death.

ravenstine 2021-08-19 04:20:22 +0000 UTC [ - ]

> This is not true and a harmful generalization. There are dumb people of every type.

That's purely specious, and you say it as if I ever asserted there aren't "dumb people of every type", which I didn't.

Yes, there are dumb people of every "type". Are there as many intelligent people as those of low intelligence who struggle to think abstractly, struggle to read and write, have poor motor control, rely heavily on dogma for a moral compass, or have trouble with impulse control?

If you think there are as many people with a high IQ as a low IQ with those traits, I don't think you're being honest.

Here's some data for you:

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4478390/

https://pubmed.ncbi.nlm.nih.gov/14744183/

https://pubmed.ncbi.nlm.nih.gov/961457/

https://www.dw.com/en/scientists-find-brain-differences-in-p...

https://sci-hubtw.hkvisa.net/10.1023/a:1018754004962

https://www.livescience.com/4671-study-pedophiles-tend-short...

https://pubmed.ncbi.nlm.nih.gov/29625377/

Are most pedophiles of low intelligence overall? It's doubtful, but there's room for more study.

Are the pedophiles that are offenders of low intelligence? Significantly more offending pedophiles are of low intelligence than the general population. Yes, the factors at play may also apply to other forms of crime. But when you factor out non-offending pedophiles, those we aren't as concerned with in regards to CSAM, the ones that are caught aren't exactly the cream of the crop.

Furthermore, I've met a handful of pedophiles, both convicted and not. Every one of them either was clearly mentally retarded or bordering on retardedness, or they had a severe personality disorder. Yes, it's a small sample size, but the ones that were caught did so doing the dumbest of shit (putting aside the horrific nature of child exploitation in the first place). This even includes a female pedophile, who was both stupid and psychopathic, and I find it funny that one of those studies questions the very existence of female pedophiles.

There are many pedophiles that are of above-average intelligence. Most may not even be considered "dumb". But to imply that there's no difference in the distribution of, frankly, dumb people versus smart people across different behaviors is absurd, in that it ignores any relation intelligence has to said behaviors.

As a group, the data does not suggest that pedophiles are exceptionally intelligent, but rather the opposite.

So yes, while I believe there is a privacy concern (I said earlier I would not want such software or hardware running on my own computer), I stand by my conclusion that CSAM detection would catch a significant number of pedophiles and probably continue to do so for some time. The point of such systems isn't to be a superweapon to catch every pedophile. The ones that it might catch happen to be the ones who are most likely to offend. If you play the stupid game of uploading CSAM to Apple iCloud, you win the stupid prize of getting convicted, and it just so happens that a ton of pedophiles are likely to do something that boneheaded.

kelnos 2021-08-19 02:48:04 +0000 UTC [ - ]

That sounds like survivorship bias. Of course the people who get caught for doing things the stupid way are going to be branded as stupid. How about the people who don't get caught, or the people who get caught despite doing a decent job of covering their tracks? What evidence do you have that most pedophiles are of the stupid sort?

dragonwriter 2021-08-19 02:56:28 +0000 UTC [ - ]

> child predators don't exactly have a track record for intelligence

Maybe that's true of the ones that get caught, but if so there's a good chance that at least part of that is inverse survivorship bias. (Similar to “Why are most of the CIA covert missions you hear about failures?”)

simondotau 2021-08-19 02:51:06 +0000 UTC [ - ]

You don't have to be intelligent to know that CSAM is super illegal and you probably don't want them to be co-mingled with pictures of your mum and last night's dinner.

ravenstine 2021-08-19 04:28:39 +0000 UTC [ - ]

Except low intelligence is clustered with other pathologies such as poor impulse control. Knowledge of the law is not the be-all and end-all of human behavior. Both the intelligent and unintelligent are capable of poor impulse control, but the unintelligent are statistically much more likely to exhibit a lack of impulse control, among other things such as difficulty with long-term reasoning and connecting one's actions with adverse effects.

simondotau 2021-08-19 05:42:39 +0000 UTC [ - ]

I've never lacked the impulse control to import regular porn into my photo library. And regular porn only comes with the risk of embarrassment, not prison.

Spivak 2021-08-18 21:07:49 +0000 UTC [ - ]

Where's the line though? Would you be upset if the Dropbox client scanned things before uploading them? What about the Google Drive client JS?

speleding 2021-08-18 22:11:06 +0000 UTC [ - ]

> catch the bad people doing very bad things to children

You may catch a few perverts looking at the stuff, but I'm not convinced this will lead to catching the producers. How would that happen?

kelnos 2021-08-19 02:50:23 +0000 UTC [ - ]

The theory is that if you make it too risky to possess CSAM, demand for it will drop, leading to less exploitation. I don't know if I buy that theory, but... there it is.

Compare this to the war on drugs. There is huge demand for drugs. Many drugs are highly illegal to possess, and a lot of people get jailed for possessing them. And yet most of us consider the war on drugs to be an abject failure that hasn't done much more than create inequity.

Why should something like CSAM possession be any different? I wish it was different, and these sorts of tactics would reduce this awful problem, but I'm just not convinced it does.

shuckles 2021-08-18 22:23:55 +0000 UTC [ - ]

Known CSAM detection was one of three features they announced under the banner of child safety. Certainly they understand it is by itself an incomplete intervention for a much larger problem.

mirker 2021-08-18 22:23:06 +0000 UTC [ - ]

> Apple genuinely believe this is a novel and unique method to catch CSAM without invading people's privacy

What’s novel about this? The technique and issues with it are fairly obvious to anyone with experience in computer vision. It seems not too different from research from 1993 https://proceedings.neurips.cc/paper/1993/file/288cc0ff02287...

The issues are also well known and encompass the whole subfield of adversarial examples.

shuckles 2021-08-18 22:56:02 +0000 UTC [ - ]

Presumably the system is novel enough that a very similar proposal which lacked one important insight (using perceptual hashes to turn image similarity problems into set intersection problems) was accepted to USENIX this year: https://www.usenix.org/conference/usenixsecurity21/presentat...
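A rough sketch of that insight, with the cryptography left out: a perceptual hash collapses near-duplicate images onto the same value, so "is this image similar to a known one" becomes exact-match set intersection, which is the shape a private set intersection protocol can then hide from both sides. perceptual_hash_fn below is a placeholder, not NeuralHash:

    # Hedged sketch: reducing image similarity to set intersection.
    from typing import Callable, Iterable, Set

    def bucketize(perceptual_hash_fn: Callable[[str], int], image_paths: Iterable[str]) -> Set[int]:
        # Near-duplicate images map to the same hash value, i.e. the same bucket.
        return {perceptual_hash_fn(p) for p in image_paths}

    def matching_buckets(user_hashes: Set[int], known_hashes: Set[int]) -> Set[int]:
        # Plain set intersection now stands in for fuzzy image matching; the
        # deployed systems perform this step under a PSI protocol instead.
        return user_hashes & known_hashes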

mirker 2021-08-19 01:04:46 +0000 UTC [ - ]

The privacy aspect does seem novel, and that paper evaluates the various aspects of privacy and the quality of perceptual hashes. However, anything concerning privacy (of client or server data) is orthogonal to the vision methods underpinning the system, which are well understood and are the primary cause of the post. For that matter, I don’t see how the paper’s proofs of privacy and/or formalization of the problem help with hash collisions. And that particular paper doesn’t use deep learning methods which are probably easier to attack with off-the-shelf methods.

unyttigfjelltol 2021-08-18 21:32:03 +0000 UTC [ - ]

Maybe Apple was focused on keeping CSAM off a cloud resource that really is Apple's in both a practical and technical sense. Merely scanning after upload maybe doesn't accomplish that, because the material has already arrived in an Apple resource, and someone might define that as a failure. Viewed from a perspective of iCloud compliance it sorta makes sense. No one ought to feel a lot of (legal) security deploying a system that could be used to host CSAM, from the blockchain to whatever; it sounds like a don't-whizz-on-the-electric-fence kind of risk.

Saddened by the privacy-adverse functionality on handsets, but Apple seems to be hitting the nail on the head that poor communication is principally what is driving the outrage.

Syonyk 2021-08-18 21:51:23 +0000 UTC [ - ]

With the new system, it still arrives on the server, and they can't even know about it until some threshold has been passed - and then the servers still have to scan it.

So it's not even accomplishing "keeping it off the servers."

simondotau 2021-08-19 02:56:48 +0000 UTC [ - ]

Actually it'll do an excellent job of keeping it off their servers.

Apple's actions will be effective at making sure every CSAM aficionado will not trust their device. Or, if they're tech savvy, they will at least know not to co-mingle their deepest darkest secrets alongside photos of their mum and last night's dinner.

If you think Apple's very big, very public blow-up is making waves in the hacker/security/libertarian crowd, just imagine how big it's blowing up in the CSAM community right now. I dare say it's probably all they've been talking about for the past two weeks. Unlike places like Hacker News, where there are plenty of people desperate to rake Apple over the coals, the CSAM community will be highly motivated to have a precise understanding of what Apple is doing. I dare say that they'll be extremely well informed about exactly what Apple's plans entail and how to work around them.

lixtra 2021-08-18 20:58:31 +0000 UTC [ - ]

> other major cloud providers catch CSAM content on their platform by inspecting every file uploaded

That is very unlikely. Most likely they compare some hash against a database, just like Apple.
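For anyone wondering what "compare some hash against a database" looks like concretely, here is a toy sketch: a simple difference hash (dHash) plus a Hamming-distance check. It is nothing like PhotoDNA or NeuralHash in robustness, and known_hashes is a hypothetical set of 64-bit values:

    # Hedged sketch: perceptual-hash matching against a database of known hashes.
    from PIL import Image

    def dhash(path: str, size: int = 8) -> int:
        # 64-bit difference hash: compare adjacent pixels of a small grayscale copy.
        img = Image.open(path).convert("L").resize((size + 1, size))
        px = list(img.getdata())
        bits = 0
        for row in range(size):
            for col in range(size):
                left = px[row * (size + 1) + col]
                right = px[row * (size + 1) + col + 1]
                bits = (bits << 1) | int(left > right)
        return bits

    def matches_known(path: str, known_hashes: set, max_hamming: int = 4) -> bool:
        # A small Hamming distance tolerates re-encoding and light edits.
        h = dhash(path)
        return any(bin(h ^ k).count("1") <= max_hamming for k in known_hashes)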

jowsie 2021-08-18 21:01:12 +0000 UTC [ - ]

That's what they're saying; the key difference is Apple's happens on your device, not theirs.

GiorgioG 2021-08-18 21:05:04 +0000 UTC [ - ]

That's not really better or different in any meaningful way.

nine_k 2021-08-18 21:51:35 +0000 UTC [ - ]

It is indeed meaningfully different: your sensitive data never leaves your device in order to be scanned. There can't be a crack in Apple servers that would expose files of millions of users uploaded for scanning.

Eventually it could lead to the Apple platform not having any CSAM content, or any known legally objectionable content, because any sane perpetrator would migrate to other platforms, and the less sane would be caught.

This, of course, requires the user base to overwhelmingly agree that keeping legally objectionable content is a bad thing that should be exposed to the police, and to trust whatever body defines the hashes of objectionable content. I'm afraid this order is not as tall as we might imagine.

heavyset_go 2021-08-18 22:31:17 +0000 UTC [ - ]

> your sensitive data never leaves your device in order to be scanned. There can't be a crack in Apple servers that would expose files of millions of users uploaded for scanning.

And yet photos that get scanned are still uploaded to iCloud Photos, so they do end up on Apple's servers.

nine_k 2021-08-19 10:47:08 +0000 UTC [ - ]

You can disable iCloud backup.

Doing so right while activating a new iDevice is the way to prevent its private keys from ending up in iCloud, and so preventing Apple, or law enforcement, or some malicious hackers from cracking into your device.

GeekyBear 2021-08-18 22:35:33 +0000 UTC [ - ]

Photos that users choose to upload to iCloud Photos do indeed get uploaded to iCloud Photos?

I'm failing to see the issue.

heavyset_go 2021-08-18 22:50:21 +0000 UTC [ - ]

I think you might be tilting at windmills here. The issue is that the post I'm replying to claims that data never ends up on Apple's servers even though it does. Thanks for confirming that it does, though.

GeekyBear 2021-08-19 01:59:42 +0000 UTC [ - ]

Am I supposed to be shocked that photos the user uploads to iCloud are on iCloud?

The results of the scan looking for kiddie porn cannot be read by Apple until the device finds 30 examples of photos that match known kiddie porn, whereupon Apple gets the decryption key so they can see which images on their server need review, and a human review is triggered to make sure there haven't been 30 false positives.

heavyset_go 2021-08-19 02:35:17 +0000 UTC [ - ]

> Am I supposed to be shocked that photos the user uploads to iCloud are on iCloud?

I implore you to read the comment I originally replied in order to understand the context of my replies. Personally, I don't care what you're shocked about or not, as my OP wasn't directed at you at all.

Spivak 2021-08-18 21:09:03 +0000 UTC [ - ]

Then why the outrage if they're the same?

JimBlackwood 2021-08-18 21:21:06 +0000 UTC [ - ]

Because this is Apple and people are very passionate about both hating and liking Apple.

GeekyBear 2021-08-18 22:34:04 +0000 UTC [ - ]

Apple keeps the scan results encrypted with a key they don't have until the device informs them that the threshold of 30 images that match known kiddie porn has been reached.

After that, they get the decryption key and trigger a human review to make sure there haven't been 30 false positives at once.

That is better, and more private, in a very meaningful way.

False positive scan data sitting on the server is open to malicious misuse by anyone who can get a warrant.

>Innocent man, 23, sues Arizona police for $1.5million after being arrested for murder and jailed for six days when Google's GPS tracker wrongly placed him at the scene of the 2018 crime

https://www.dailymail.co.uk/news/article-7897319/Police-arre...

Also, I sincerely doubt that Google has ended its practice of refusing to hire a human being to check for false positives when a poorly performing algorithm is cheaper.

kelnos 2021-08-19 02:59:50 +0000 UTC [ - ]

I do agree that it's more private, but I'm not sure it's better.

I'm fine with the idea that if I upload stuff to someone else's server, they may take a look at it and maybe even punish me for what I've uploaded. Certainly if I encrypt the data before I upload it, they can't do that. But if I don't, then it's fine with me if they do.

But my device should not be snitching on me. Yes, this device-side scanning is supposedly gated on enabling uploads to iCloud, but that doesn't really make for much of a distinction to me. And since Apple certainly has the capability, they are likely one secret court order away from being required to scan even photos that aren't being uploaded, at least on some targeted subset of devices.

tshaddox 2021-08-19 03:22:17 +0000 UTC [ - ]

> And since Apple certainly has the capability, they are likely one secret court order away from being required to scan even photos that aren't being uploaded, at least on some targeted subset of devices.

Apple has the ability to upload literally any software to iPhones, so this argument applies equally to literally any conceivable bad thing that software could do on iPhones.

GeekyBear 2021-08-19 14:03:44 +0000 UTC [ - ]

As well as any conceivable bad thing that Google could be ordered to add to Android.

orangeoxidation 2021-08-18 21:05:14 +0000 UTC [ - ]

Making it obviously and unquestionably more invasive.

Spivak 2021-08-18 21:14:30 +0000 UTC [ - ]

I just don't get this. Say you're given two options when going through the TSA.

1. The TSA agent opens your luggage and searches everything for banned items.

2. The TSA agent hands you a scanner for you to wave over your luggage in private, it prints out a receipt of banned items it saw, and you present that receipt to the agent.

Which one is more invasive?

IncRnd 2021-08-19 03:07:06 +0000 UTC [ - ]

The scenario is far more like this

3. The home builder installs a TSA scanner in all newly built homes. The scanners will scan all items as they are put away into drawers and cupboards, attempting to detect the presence of banned items.

Then the TSA Agents say they won't report you until at least 30 banned items are detected, even though you haven't flown in 10 years and don't have banned materials. As the TSA Agents walk back to their car, you overhear words like warrant, trouble, and tap amidst their chuckles. What could go wrong with that?

Aaargh20318 2021-08-18 21:28:24 +0000 UTC [ - ]

Apple’s solution is more like the TSA agent showing up at your house and inspecting your luggage before you even leave for the airport, while pinky-promising not to report on anything else he might see while in your home.

shuckles 2021-08-18 22:25:45 +0000 UTC [ - ]

This version is incorrect because Apple’s solution by design does not reveal the non-suspect contents of your luggage.

rfd4sgmk8u 2021-08-18 21:47:24 +0000 UTC [ - ]

1. Cops on the street

2. Cops in your house

Which one is more invasive?

I don't think there is much not to understand. Apple is proposing putting a cop in your house. On-device scanning is vastly, vastly different from server-side scanning. One is on Apple's property, the other is on YOUR property. It's a big deal, and a huge change in policy. And as I and others have indicated from years of observation, it will grow and grow...

smnrchrds 2021-08-18 21:28:47 +0000 UTC [ - ]

3. The TSA installs scanners in all homes, promising to only use them to scan your luggage as you head to the airport and nothing else.


cma 2021-08-18 21:26:27 +0000 UTC [ - ]

The TSA installs this scanner in your house so it can detect before you even get to the airport. They give a pinky promise they will only run it when you book your ticket and intend on going to the airport where you would be expected to be scanned anyway.

The scanner technology can also be used for detecting dissident journalism, but they say it won't be, it is just for preventing terrorism, but still they say, they really need to install it in your house even though it is only for scanning when you intend on going to the airport.

bsql 2021-08-18 21:57:03 +0000 UTC [ - ]

> The scanner technology can also be used for detecting dissident journalism, but they say it won't be, it is just for preventing terrorism

This has always been possible with server side scanning. Yeah we have to trust Apple to not enable it for users who don’t use iCloud photos but we’ve always had to trust Apple given their closed source system.

varispeed 2021-08-19 09:12:17 +0000 UTC [ - ]

> Apple found a way to preserve that privacy

This is Orwellian doublespeak.

You don't preserve someone's privacy by snooping on their devices.

mschuster91 2021-08-18 21:04:46 +0000 UTC [ - ]

> Apple found a way to preserve that privacy but still catch the bad people doing very bad things to children.

They're not going to find predators; all they are going to find is people incompetent enough to have years-old CSAM (otherwise how would it end up at NCMEC?) on their iPhones and dumb enough to have iCloud backup turned on. To catch actual predators they would need to run AI scanning on all photos on phones and risk a lot of false positives from parents taking photos of their children on a beach.

Plus a load of people who will inadvertently have stuff that "looks" like CSAM on their devices because some 4chan trolls will inevitably create colliding material and spread it in ads etc. so it ends up downloaded in browser caches.

All of this "let's use technology to counter pedos" is utter, utter crap that is not going to save one single child and will only serve to tear down our freedoms, one law at a time. Want to fight against pedos? Take photos of your Airbnb, hotel and motel rooms to help identify abuse recording locations and timeframes, and teach children from an early age about their bodies, sexuality and consent so that they actually have the words to tell you that they are being molested.

zepto 2021-08-18 21:45:59 +0000 UTC [ - ]

> people incompetent enough to have years old (otherwise how would it end up at NCMEC?) CSAM on their iPhones

Who do you think has 30 or more CSAM images on their phone?

atq2119 2021-08-18 22:27:01 +0000 UTC [ - ]

Let's be real, of course many of those people will be child abusers.

But let's also be real about something else. Think of the Venn diagram of child abusers and people who share CSAM online. Those circles are not identical. Not everybody with CSAM is necessarily a child abuser. Worse, how many child abusers don't share CSAM online? Those can't ever be found by Apple-style invasions of privacy, so one has to wonder if we're being asked to give up significant privacy for a crime fighting strategy that may not even be all that effective.

The cynic in me is also wondering what fraction of people working at places like NCMEC are pedophiles. It'd be a rather convenient job for them. And after all, there's a long history of ostensibly child-serving institutions harboring the worst kind of offenders.

zepto 2021-08-18 22:32:48 +0000 UTC [ - ]

All CSAM is produced by child abuse. Everyone who shares 30 images online is a participant in child abuse, even if only by creating a demand for images.

> a crime fighting strategy

Is it a crime fighting strategy? I thought it was just about not making their devices and services into aids for these crimes.

> The cynic in me is also wondering what fraction of people working at places like NCMEC are pedophiles.

Good question. What has that to do with this?

atq2119 2021-08-18 22:45:48 +0000 UTC [ - ]

That's a tedious debate that has been rehashed to death. Finding somebody who has CSAM without being a child abuser doesn't save even a single child. Finding everybody who has CSAM without being a child abuser would likely save some children (drying out commercial operations, perhaps?), but is also basically impossible.

And how many children can really be helped in this way? Surely the majority of child abuse happens independently of the online sharing of CSAM.

And is that really worth the suspension of important civil rights? That is really what this discussion is about. Nobody is arguing against fighting child abuse, the question is about how, and about what means can be justified.

zepto 2021-08-19 00:05:26 +0000 UTC [ - ]

> Surely the majority of child abuse happens independently of the online sharing of CSAM.

Does it? I see no reason to assume that.

> And is that really worth the suspension of important civil rights?

No civil rights are being suspended. It’s not about that at all.

mschuster91 2021-08-18 23:21:38 +0000 UTC [ - ]

Pedophiles. But these people are consumers of extremely old content, which means the CSAM scan won't even make a tiny dent against active, current abusers!

CSAM scanning is sold as beneficial, while in reality it won't do shit and it opens dangerous precedent-setting backdoors!

zepto 2021-08-19 00:08:58 +0000 UTC [ - ]

> extremely old content

Why do you assume it’s extremely old?

mapgrep 2021-08-19 02:29:03 +0000 UTC [ - ]

> other major cloud providers catch CSAM content on their platform by inspecting every file uploaded

So does Apple.

EDIT: some people don’t like that answer, but “inspecting” in this context clearly means “digitally inspecting” (Google does not physically look at every file), and Apple does this with files that are uploaded. They do it on device, but it’s still inspected. That’s the whole point of this controversy: there’s not much difference to people WHERE Apple inspects, and on-device is actually arguably worse. Your sentence does not in any way distinguish what Apple does from what others do.

jiocrag 2021-08-19 05:38:39 +0000 UTC [ - ]

Repeating myself from another thread but… China.

Conjecture, admittedly:

1. Apple cannot lose the Chinese market. Huge and more important fastest growing geo for the company.

2. China is deprecating elements of its own tech sector (aggressive crackdowns on both established companies like Tencent and Alibaba as well as individual websites). They are clearly cleaning house from a surveillance and control perspective. Apple is not immune, but it's an American behemoth, so open-door crackdowns are impossible.

I don’t think Apple’s CSAM push and China’s crackdown are purely coincidental.

Who can argue with stemming child abuse? It’s the type of hot button issue that affords broad acceptance for intrusive tech.

The leap from scanning for abuse to scanning for anti-regime content is more like a tiny step.

It seems obvious from afar why the company adamant about refusing to unlock a potential terrorist’s iPhone on privacy principles (with the attendant marketing benefits) would so suddenly force-push such a boldly invasive feature addition (and thereby ensure the collection of massive training data, with or without opt-in, for China’s v2.0).

Turns out vertical integration is both gift and curse (dependent on the whims of the integrator) for on-device privacy and autonomy.

strogonoff 2021-08-19 06:01:15 +0000 UTC [ - ]

If you use Apple devices, you are agreeing to their ToS which since 2019 give them the right to pre-screen your uploads for any potentially illegal (sic) content, not specifically CSAM imagery.

If Apple happens to use similar ToS in China (no idea if true), you can bet the CCP would be all over this clause.

A worst-case wild guess from the pessimist in me: they had to add this clause in 2019 to appease the CCP, and then they got to thinking whether they could make use of it for the greater good (tm), too. Hopefully that's very incorrect.

stjohnswarts 2021-08-19 07:14:31 +0000 UTC [ - ]

Of course the CCP loves stuff like this. The big question is whether Apple will acquiesce to keeping mum about scanning everyone's stuff at the directive of the government. Also, will that just be a "China" build, or will it be in the USA code base but "flipped off"?

GeekyBear 2021-08-18 20:46:29 +0000 UTC [ - ]

The thing that's shocking to me is that Google, Microsoft and all the big names in tech have been scanning everything in your account (email, cloud drive, photos, etc) for the past decade, without any noticeable uproar.

Apple announces that it is going to start scanning iCloud Photos only, and that their system is set to ignore anything below a threshold of ~30 positives before triggering a human review, and people lose their minds.

mightybyte 2021-08-18 21:10:11 +0000 UTC [ - ]

I think the difference here is that it's ON YOUR DEVICE. I think there's a pretty clear understanding that if you upload stuff to a cloud provider they can do whatever they want with it. This is different. This is reaching into what has up until now mostly been considered a private place. Law enforcement often has to get warrants to search this kind of thing.

This is the difference between putting CSAM on a sign in your front yard (maybe not quite front yard but I can't come up with quite the same physical equivalent to a cloud provider) and keeping it in a password protected vault in your basement. One of those things is protected in the U.S. by laws against unlawful search and seizure. Cloud and on your device are two very different things and consumers are right to be alarmed.

I'll say it again, if you are concerned with this privacy violation, sell your Apple stock and categorically refuse to purchase Apple devices. Also go to https://www.nospyphone.com/ and make your voice heard there.

onethought 2021-08-19 04:57:34 +0000 UTC [ - ]

Only the safety tokens are generated on your device; the scan that triggers action happens in the cloud (just like all the others), and then it goes to human review, so if it's a hash collision it'd be caught there when they review the images, and they can only review the images that matched CSAM.

I honestly can't see the uproar here. Google devices can face-match photos offline... so they are applying a neural net (scanning) ON THE DEVICE! How is that not worse than what Apple does?

mightybyte 2021-08-19 11:08:29 +0000 UTC [ - ]

One is matching a face--yes, concerning, and I don't like it but IMO Google has had a much worse privacy reputation for quite some time--and the other is reporting private data to law enforcement and potentially abusive parents. It's quite a bit different.

The difference can also be seen from a customer service perspective. One is a feature that lets you sort according to which friends you were with. The other is a feature that puts you in jail. No thanks. Not gonna pay money for that.

onethought 2021-08-19 14:05:40 +0000 UTC [ - ]

No, it reports you to Apple, and Apple reports you to the police. Exactly like Google, who will also report you to the police; they just search your library unencrypted.

Literally no difference.

If you have illegal stuff only on your phone, neither Google nor Apple will be notified or notify anyone else.

GeekyBear 2021-08-18 21:14:38 +0000 UTC [ - ]

ON YOUR DEVICE (where it's encrypted in a way that Apple can't read until the 30-image threshold is crossed) is more private than doing the same scan on a server, where a single false positive can be misused by anyone who can get a subpoena.

kelnos 2021-08-19 03:16:22 +0000 UTC [ - ]

A poster upthread made an analogy that I really like. Sure, like all analogies, it's imperfect, but I think it strikes at why many people are uneasy about this.

Let's say the TSA were to install air-travel-contraband scanners in everyone's homes, but promise only to scan things that are being put into your luggage as you prepare to go to the airport. And let's say that this became a requirement if you want to board a plane.

That's what this feels like. I'm fine with Google scanning through everything in my GMail account, or everything I've uploaded to GDrive, or created in GDocs. That stuff is on their servers, unencrypted, and I explicitly put it there.

But I'm sure as hell not going to let Google install something on my laptop (or phone!) that lets them look at my stuff, even if they pinky-promise that they'll only scan stuff that I intend to upload.

mightybyte 2021-08-18 22:40:03 +0000 UTC [ - ]

> ON YOUR DEVICE (where it's encrypted in a way that Apple can't read until the 30 image threshold is crossed)

First of all, you have to be able to read it to do the comparison that can increment the counter to 30. So regardless of whether it is or is not encrypted there, they're accessing the unencrypted plaintext to calculate the hash.

And yes, on my device is definitively more private than on someone else's server--just like in my bedside drawer is more private than in an office I rent in a co-working space.

heavyset_go 2021-08-18 22:20:41 +0000 UTC [ - ]

> where it's encrypted in a way that Apple can't read

This doesn't matter because Apple can read iCloud data, including iCloud Photos. They hold the encryption keys, and they hand over customers' data for about 150,000 users/accounts a year in response to requests from the government[1].

[1] https://www.apple.com/legal/transparency/us.html

GeekyBear 2021-08-19 02:03:37 +0000 UTC [ - ]

Of course Apple can read iCloud data.

How do you think Google and Microsoft scan everything in your account? They all have the capability to read your cloud data.

What Apple cannot read are the results of your device scanning your iCloud Photos. Those results are encrypted and stay that way until your device finds 30 matches for known kiddie porn.

Once you pass the threshold, Apple gets the decryption key and a human review is triggered to make sure there weren't just 30 false positives.
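
If the "can't read until 30 matches" part sounds like magic, the underlying idea is threshold secret sharing: the key is split so that any 30 shares reconstruct it and 29 reveal nothing. Apple's actual construction is a threshold private set intersection scheme with "safety vouchers", so treat the toy Shamir sketch below (all function names are mine) as an illustration of the math, not their code:

    import random

    PRIME = 2**127 - 1  # prime field for the arithmetic

    def split_secret(secret, threshold, shares):
        """Shamir: any `threshold` shares recover `secret`; fewer reveal nothing."""
        coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
        def f(x):
            return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        return [(x, f(x)) for x in range(1, shares + 1)]

    def recover_secret(points):
        """Lagrange interpolation at x = 0 over the prime field."""
        secret = 0
        for i, (xi, yi) in enumerate(points):
            num, den = 1, 1
            for j, (xj, _) in enumerate(points):
                if i != j:
                    num = (num * -xj) % PRIME
                    den = (den * (xi - xj)) % PRIME
            secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
        return secret

    key = 123456789                                        # stand-in for a decryption key
    shares = split_secret(key, threshold=30, shares=1000)  # think: one share per matching voucher
    assert recover_secret(shares[:30]) == key              # 30 matches: key recoverable
    assert recover_secret(shares[:29]) != key              # 29 matches: result is just noise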

pasquinelli 2021-08-18 21:18:58 +0000 UTC [ - ]

what kind of encryption can't be decrypted until some threshold has been crossed?

maybe you mean to say that apple says they won't read it until that threshold has been crossed.

zepto 2021-08-18 21:46:46 +0000 UTC [ - ]

> what kind of encryption can't be decrypted until some threshold has been crossed?

The kind Apple has built. You should read the docs. This is literally how it works.

mightybyte 2021-08-18 22:37:08 +0000 UTC [ - ]

This is the same company that saved the disk encryption password as the password hint. You really trust them to not screw this up when the stakes are that it could ruin your life and/or land you in jail? I'm simply not ok with that.

zepto 2021-08-19 00:04:48 +0000 UTC [ - ]

How exactly do you imagine a bug in this will land you in jail?

mightybyte 2021-08-19 11:12:17 +0000 UTC [ - ]

Ever heard of planted evidence?

GeekyBear 2021-08-18 21:25:39 +0000 UTC [ - ]

No, I'm saying they designed the system so that they don't have the key to decrypt the scan result "vouchers" until after the device tells them that the 30 image threshold is crossed.

>Apple is unable to process individual vouchers; instead, all the properties of our system mean that it’s only once an account has accumulated a collection of vouchers associated with illegal, known CSAM images that we are able to learn anything about the user’s account.

Now, the reason to do it this way is that, as you said, it provides that detection capability while preserving user privacy.

https://techcrunch.com/2021/08/10/interview-apples-head-of-p...

Meanwhile, a single false positive from an on-server scan is open to malicious use by anyone who can get a subpoena.

telside 2021-08-18 21:49:08 +0000 UTC [ - ]

The lady doth protest too much, methinks

Just going to respond to every post on here with these absurd points? K apple guy.

brandon272 2021-08-18 21:08:11 +0000 UTC [ - ]

Here's the reason for the sudden uproar: The scanning of things on third party servers was just barely tolerated by a lot of people, whether it's for advertising purposes or government intrusion. People accept it because it's considered reasonable for these third parties to scan data in exchange for providing a service for free (e.g. Google), or because they need to have some degree of accountability for what is on their servers.

When that scanning gets moved from the cloud to being on your device, a boundary is violated.

When that boundary is violated by a company who makes extreme privacy claims like saying that privacy is a "fundamental human right"[1], yes, people will "lose their minds" over it. This shouldn't be shocking at all.

[1] https://apple.com/privacy

GeekyBear 2021-08-18 21:34:30 +0000 UTC [ - ]

When the scanning gets moved from the cloud to being on device, Apple itself cannot see the results of the scan until the risk that the result is only a false positive is greatly reduced.

You would have to have 30 false positives before Apple can see anything, which is unlikely, but the next step is still a human review, since it's not impossible.

OnlineGladiator 2021-08-18 21:56:40 +0000 UTC [ - ]

I don't care. It's my device (at least that's how Apple used to advertise it) and I disagree with Apple's policy, full stop. I don't care about "think of the children" (especially since they will be scanning pictures of my own children), and furthermore I don't trust Apple not to change the policy in the future. They've created a backdoor and now the device is compromised and privacy is broken. If you want to trust Apple go ahead. They have eroded my trust and I doubt they could regain it within a decade.

GeekyBear 2021-08-18 22:24:31 +0000 UTC [ - ]

>I don't care

If anything, you should be outraged that Google and Microsoft have been scanning much more of your data, and doing so in a much more intrusive way.

Apple only scans iCloud Photos and they do so in a way that they can't see the results until they can be reasonably sure it's not just a false positive.

OnlineGladiator 2021-08-18 23:13:15 +0000 UTC [ - ]

You have clearly misunderstood my position as you have replied to 2 of my comments already. You seem to have an "Apple vs the world" mentality when my point is entirely about holding Apple accountable to their own marketing. You sound like a bizarre Apple apologist and I'm tired of trying to explain myself to you when all you want to do is explain to me why Apple is better than everybody when I do not care and also fundamentally disagree.

If you think Apple's approach is the best you're allowed to think that. I disagree.

brandon272 2021-08-18 22:02:08 +0000 UTC [ - ]

Whether or not Apple can see the results is completely irrelevant, to me at least. Automated surveillance is still surveillance.

OnlineGladiator 2021-08-18 20:54:46 +0000 UTC [ - ]

The other companies didn't advertise and pride themselves on being privacy focused. Part of the appeal of Apple was that you could avoid that issue and they touted it regularly. Now they're telling their customers to go fuck themselves (so long as they're 18 or older).

GeekyBear 2021-08-18 21:02:43 +0000 UTC [ - ]

Conducting the scan on the user's device instead of on the company's server is more private.

Apple can't decrypt the results of the scan until the ~30 image threshold is crossed and a human review is triggered.

Given Google's reluctance to hire humans when a poorly performing algorithm is cheaper, are they turning over every single false positive without a human review?

lifty 2021-08-18 22:06:31 +0000 UTC [ - ]

I guess people had a sense of ownership of these devices, and they feel that it’s doing something they don’t want. If ownership means control, maybe in the digital age full ownership/control is not really possible on a managed device like the iPhone. There are other examples, like the John Deere tractor issues.

GeekyBear 2021-08-18 22:26:27 +0000 UTC [ - ]

I have a sense of ownership over my non-publicly-accessible data when it's backed up to a cloud drive.

Apple isn't scanning that.

Google and Microsoft are.

lifty 2021-08-18 22:36:11 +0000 UTC [ - ]

Sure, you own the data, but you don’t have any control on what’s happening to it. Just like iPhone users, we own the physical device but we only control certain aspects of what it’s doing, and the boundaries are not set by us, but by Apple.

GeekyBear 2021-08-18 23:07:45 +0000 UTC [ - ]

If I back up my files to Apple's cloud server, I don't have to worry about them scanning my private files at all.

They don't cross that line like Google and Microsoft do.

With Apple, nothing but files you upload to iCloud Photos get scanned.

GuB-42 2021-08-18 22:19:41 +0000 UTC [ - ]

Google is open about the fact they scan and track everything, they even make it a point of convenience "hey, we looked at your email and found a plane ticket, based on where you are, you should leave in 2 hours if you want to be at the airport on time".

Facebook, even more so, they are explicitly anti-privacy to the point of being insulting.

Microsoft will happily show you everything they may send when you install Windows, you can sometimes refuse, but not always. They are a bit less explicit than Google, but privacy is rarely on the menu.

As for Amazon, their cloud offerings are mostly for businesses - a different market - but still, for consumers, they don't really insist on privacy either.

So if any of these companies scan your pictures for child porn, it won't shock anyone, because we know that's what they do.

But Apple claims privacy as a core value; half of their ads are along the lines of "we are not like the others, we respect your privacy, everything on your device stays on your device, etc...", and they announce every (often legitimate) privacy feature with great fanfare. So much so that people start to believe it. But with this, people realize that Apple is not so different from the others after all, and if they bought an overpriced device based on that promise, I understand why they are pissed off.

heavyset_go 2021-08-18 20:55:44 +0000 UTC [ - ]

This seems like a common deflection, but get back to me when either company puts programs in my pocket that scan my data for crimes and snitch on me to authorities.

GeekyBear 2021-08-18 21:05:19 +0000 UTC [ - ]

>a man [was] arrested on child pornography charges, after Google tipped off authorities about illegal images found in the Houston suspect's Gmail account

https://techcrunch.com/2014/08/06/why-the-gmail-scan-that-le...

You don't consider the contents of your email account or the files you mirror to a cloud drive to be your own private data?

WA 2021-08-18 21:21:42 +0000 UTC [ - ]

No OP, but yes. I expect that my emails can be read and that my files on a cloud service can be accessed. And I don’t even use Gmail.

I expect that my ISP tracks and stores my DNS resolutions (if I use their DNS) and has a good understanding of the websites I visit.

I expect that an app that I grant access to my contacts uploads as much data as it can to their servers.

I expect WhatsApp and similar apps to collect and upload meta data of my entire photo library such as GPS info the second I give them access.

Hence, I don’t give access. And hence, it’s a problem if there is no opt-out of local file scanning in the future.

_trampeltier 2021-08-18 21:17:52 +0000 UTC [ - ]

There are worlds between them. One case is about pictures on your device. The other case is about pictures on Google's "devices" or network. You had to upload them to Google. Email is also nothing like a letter anyway; it's a postcard. Everybody can read it.

GeekyBear 2021-08-18 21:30:47 +0000 UTC [ - ]

No, one case is about photos you upload to iCloud Photos only, and the other case is about every single thing in your Google account.

>So if iCloud Photos is disabled, the system does not work, which is the public language in the FAQ. I just wanted to ask specifically, when you disable iCloud Photos, does this system continue to create hashes of your photos on device, or is it completely inactive at that point?

If users are not using iCloud Photos, NeuralHash will not run

https://techcrunch.com/2021/08/10/interview-apples-head-of-p...

heavyset_go 2021-08-18 22:12:34 +0000 UTC [ - ]

No, that happens on Google's servers and is not an agent in my pocket that scans and invades my personal property for evidence of crimes and snitches on me to authorities.

bobthepanda 2021-08-18 20:59:50 +0000 UTC [ - ]

At least Google does.

https://protectingchildren.google/intl/en/

> CSAI Match is our proprietary technology, developed by the YouTube team, for combating child sexual abuse imagery (CSAI) in video content online. It was the first technology to use hash-matching to identify known violative videos and allows us to identify this type of violative content amid a high volume of non-violative video content. When a match of violative content is found, it is then flagged to partners to responsibly report in accordance to local laws and regulations. Through YouTube, we make CSAI Match available for free to NGOs and industry partners like Adobe, Reddit, and Tumblr, who use it to counter the spread of online child exploitation videos on their platforms as well.

> We devote significant resources—technology, people, and time—to detecting, deterring, removing, and reporting child sexual exploitation content and behavior. Since 2008, we’ve used “hashing” technology, which creates a unique digital ID for each known child sexual abuse image, to identify copies of images on our services that may exist elsewhere.

arsome 2021-08-18 21:02:44 +0000 UTC [ - ]

Note that they're not pushing it to devices that consumers have purchased, just running it on their cloud services. I think it's an important distinction here with what Apple is doing.

heavyset_go 2021-08-18 21:03:01 +0000 UTC [ - ]

No, that happens on Google's servers and isn't an agent in my pocket that runs on and invades my personal property. It's also only for videos.

bobthepanda 2021-08-18 21:06:59 +0000 UTC [ - ]

The images being scanned are the ones that go to iCloud.

The Google page has a section later down that also says they use hashing of images.

GeekyBear 2021-08-18 21:10:55 +0000 UTC [ - ]

When the scan is carried out on device, Apple can't read the results of the scan until the 30 image threshold is reached.

When Google scans on server, a single false positive result can be abused by anyone who can get a warrant.

>Innocent man, 23, sues Arizona police for $1.5million after being arrested for murder and jailed for six days when Google's GPS tracker wrongly placed him at the scene of the 2018 crime

https://www.dailymail.co.uk/news/article-7897319/Police-arre...

Apple's method is more private.

websites2023 2021-08-18 20:51:18 +0000 UTC [ - ]

BigCos, take note: you’re better off doing nefarious shit without telling anyone. Because, if you come clean, you’ll only invite an endless parade of bloggers who will misconstrue your technology to make you look bad.

__blockcipher__ 2021-08-18 21:08:17 +0000 UTC [ - ]

No misconstrual needed. This technology is genuinely bad. It scans images against an arbitrary government-owned black-box database. There’s no guarantee that it’s only CSAM.

OrvalWintermute 2021-08-18 21:24:21 +0000 UTC [ - ]

I have a serious problem imagining that Apple will not be willing to redeploy this technology in novel and nefarious ways in exchange for continued market access when billions are on the line.

Mainland China will probably be the first domino to fall. Can't imagine the Ministry of State Security not actively licking their lips, waiting for this functionality to arrive.

websites2023 2021-08-18 21:09:51 +0000 UTC [ - ]

NCMEC isn’t owned by the government, and the database of hashes is available from Apple.

commoner 2021-08-18 21:30:41 +0000 UTC [ - ]

> NCMEC isn’t owned by the government

Even though NCMEC describes itself as "private", it was established by and has been heavily funded by the U.S. government.

From an archive of NCMEC's own history page, cited on Wikipedia (https://web.archive.org/web/20121029010231/http://www.missin...):

> In 1984, the U.S. Congress passed the Missing Children’s Assistance Act which established a National Resource Center and Clearinghouse on Missing and Exploited Children. The National Center for Missing & Exploited Children was designated to fulfill this role.

> On June 13, 1984, the National Center for Missing & Exploited Children was opened by President Ronald Reagan in a White House Ceremony. The national 24-hour toll-free missing children’s hotline 1-800-THE-LOST opened as well.

$40 million/year of U.S. government funding from a 2013 bill (https://en.wikipedia.org/wiki/Missing_Children%27s_Assistanc...):

> The Missing Children's Assistance Reauthorization Act of 2013 (H.R. 3092) is a bill that was introduced into the United States House of Representatives during the 113th United States Congress. The Missing Children's Assistance Reauthorization Act of 2013 reauthorizes the Missing Children's Assistance Act and authorizes $40 million a year to fund the National Center for Missing and Exploited Children.

tjfl 2021-08-18 21:28:56 +0000 UTC [ - ]

They're not owned by the federal government, but they do get a lot of federal government money.

> The National Center for Missing & Exploited Children® was established in 1984 as a private, nonprofit 501(c)(3) organization. Today, NCMEC performs the following 15 specific programs of work, funded in part by federal grants (34 U.S.C. § 11293): Source: https://www.missingkids.org/footer/about

US DOJ OJJDP lists recent grants totaling $84,446,366 in FY19 and FY20. Source: https://ojjdp.ojp.gov/funding/awards/list?awardee=NATIONAL%2...

__blockcipher__ 2021-08-18 21:42:16 +0000 UTC [ - ]

And don’t forget, it’s way more than just money:

https://www.law.cornell.edu/uscode/text/18/2258A

You must report to them and only them.

For the GP to claim they’re not government “owned” is a rhetorical trick at best and outright ignorant absurdity at worst.

__blockcipher__ 2021-08-18 21:40:11 +0000 UTC [ - ]

That’s like saying the Federal Reserve is “private”. No, NCMEC is not a private entity. Not only was it heavily funded and created by the government, but more importantly it is granted special legal status. You and I can’t just spin up our own CSAM database. Nor do we have any laws that say that any company aware of CSAM must send it to us and only us.

justin_oaks 2021-08-18 20:56:38 +0000 UTC [ - ]

It's important to keep nefarious stuff on the server side because eventually someone will reverse engineer what's on the client side.

Imagine if Apple had done this on the client side without telling anyone, and later it was discovered. I think things would be a whole lot worse for Apple in that case.

squarefoot 2021-08-19 02:42:36 +0000 UTC [ - ]

Devices with proprietary OSes spend more and more time phoning home, exchanging data officially for "updates". They probably tell the truth, but should one of them decide to hide user data exfiltration or other monitoring practices behind those updates, it would be quite hard to catch them. In other words, we have no way to tell that they're not already doing this.

websites2023 2021-08-18 22:57:57 +0000 UTC [ - ]

That being the case, why do it client side at all, when presumably every claim they make is verifiable?

rblatz 2021-08-18 21:47:39 +0000 UTC [ - ]

Because it isn’t about the CSAM scanning; Apple keeps reframing the argument back to that. Most people expect Google and Microsoft to scan files on their servers for CSAM, but people object to Apple turning your phone against its owner, and once the capability is built out, only policy prevents it from being abused.

Copernicron 2021-08-18 21:17:59 +0000 UTC [ - ]

Most people don't pay attention to whether or not their stuff is being scanned. Those of us who do pay attention have known for a long time that it's being scanned. Especially by Google and Facebook. Basically anyone whose business model is based off of advertising. My default assumption is anything I upload is scanned.

xdennis 2021-08-19 01:39:51 +0000 UTC [ - ]

> all the big names in tech have been scanning everything in your account [...] without any noticeable uproar

In part because people didn't know.

And if you were one of the innocent people caught by them, you wouldn't want people to know.

floatingatoll 2021-08-18 21:17:38 +0000 UTC [ - ]

Right now, it seems like there are two specific groups of people that are upset with Apple: Freedom evangelists (e.g. EFF) and tech futurists (e.g. HN). They're saying, essentially:

"Apple does not have my permission to use my device to scan my iCloud uploads for CSAM"

and

"This is a slippery slope that could result in Apple enforcing thoughtcrimes"

Neither of these viewpoints is particularly agreeable to the general public in the US, as far as I can determine from my non-tech farming city. Once the fuss in tech dies down, I expect Apple will see a net increase in iCloud adoption — all the fuss we're generating is free advertising for their efforts to stop child porn, and the objections raised are too domain-specific to matter.

It's impossible to say for certain which of your outcomes will occur, but there are definitely two missing from your list. Corrected, it reads:

"Either there will be heads rolling at management, or Apple takes a permanent hit to consumer trust, or Apple sees no effect whatsoever on consumer trust, or Apple sees a permanent boost in consumer trust."

I expect it'll be "no effect", but if I had to pick a second guess, it would be "permanent boost", well offsetting any losses among the tech/free/lib crowd.

drvdevd 2021-08-19 00:04:31 +0000 UTC [ - ]

You forgot the most important point of the last 24 hours:

"Apple has created a system for detecting CSAM on local devices which has already proven vulnerable to cheap perceptual hash collision attacks. It's now highly inconceivable Apple will be able to deploy this technology as-is without having their users exploited."

In other words it's not just about privacy or thoughtcrimes anymore but should be viewed as actually dangerous to use their devices. I feel a bit dramatic even typing that out but I.. think it's true?

floatingatoll 2021-08-19 15:25:03 +0000 UTC [ - ]

I would also warn you against owning any device with a radio. Carriers control the radio towers and can be compelled by government agencies and selfish corporate interests to exploit remote execution vulnerabilities in radio chips in order to plant CSAM content onto devices.

How dramatic is too dramatic? When does something that hasn’t happened to you or anyone you know become a risk you’re willing to sacrifice personal convenience to mitigate? Will you be divesting yourself of all wireless radio hardware? If not, then why would you be worried about users being exploited through a more clumsy and less effective process such as CSAM signature hacking?

The piece of information you’re taking for granted, that few in free/tech/lib are confronting, is the assumption that this process can be exploited at scale to harm millions of people.

So far as I can tell, there will probably be zero or one false positive CSAM matches that pass the known algo, the unknown algo, the human blurred comparison, and the human unblurred comparison — all steps that must occur before law enforcement is invoked to collect digital evidence - in the first year.

How many false positives (to the nearest 10^X) do you think the system will generate in the first year that result in law enforcement actions? Your words suggest that everyone is vulnerable, and there are 10^9 users, so do you believe there will be 10^9 false positives in the first year? Do you think only a thousand people will be affected, so 10^3? How do you judge which is more likely correct?

It is unlikely that this system will generate 10^9 false positives, or else it never would have passed QA. I encourage you to consider how you would personally quantify this risk, and then also look up the quantified risks for killing someone while driving a car or getting struck by lightning while indoors. I don’t know what the actual reality will be, but I don’t think it's a very large X.
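
For anyone who wants to put their own number on X, the arithmetic is just a binomial tail. The per-image false positive rate below is an assumed, purely illustrative figure (Apple has only published a per-account claim of roughly one in a trillion per year), and the helper function is mine:

    from math import comb

    def p_account_flagged(n_photos, per_image_fpr, threshold=30, terms=20):
        """P(at least `threshold` accidental matches among n_photos photos).
        Only the first `terms` tail terms are summed; they decay so fast
        that the remainder is negligible."""
        p = per_image_fpr
        return sum(comb(n_photos, k) * p**k * (1 - p)**(n_photos - k)
                   for k in range(threshold, threshold + terms))

    # A 20,000-photo library with an assumed 1-in-a-million per-image rate:
    p = p_account_flagged(20_000, 1e-6)
    print(p)                   # on the order of 1e-83 under these assumptions
    print(1_000_000_000 * p)   # expected falsely flagged accounts across ~10^9 users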

duxup 2021-08-18 20:54:40 +0000 UTC [ - ]

I'm kinda amazed.

I mentioned I was thinking of moving from an Android phone to Apple soon, somewhat privacy related.

My friends lectured me on "they're scanning your photos" ... meanwhile they share their google photos albums with me and marvel about how easy they are to search ...

Maybe we (humans) only get outraged based on more specific narratives and not so much the general topics / issues?

I don't know but they didn't seem to notice the conflict.

kzrdude 2021-08-18 21:01:42 +0000 UTC [ - ]

Isn't there a small difference between these? A) They scan everything I have released to google photos B) They scan everything that exists on my device

Psychologically, you'll feel a difference in what you accept between the two, I think

whoknowswhat11 2021-08-18 21:07:19 +0000 UTC [ - ]

As has been repeated over and over, Apple only scans photos that are part of iCloud Photos (i.e., uploaded).

Don't want your photos scanned? Don't sync them to iCloud. Seriously! Please include the actual system when discussing this system, not your bogeyman system.

"To help address this, new technology in iOS and iPadOS* will allow Apple to detect known CSAM images stored in iCloud Photos."

To increase privacy - they perform the scan on device prior to upload.

Johnny555 2021-08-18 21:23:00 +0000 UTC [ - ]

As has been repeated over and over, Apple only scans photos that are part of iCloud Photos (i.e., uploaded).

"for now"

Which is the part most people have a problem with -- they say that they are only scanning iCloud uploads now, but it's a simple extension of the scanner to scan all files.

I don't care if Apple scans my iCloud uploads on iCloud servers, I don't want them scanning photos on my device.

smnrchrds 2021-08-18 21:37:17 +0000 UTC [ - ]

I will believe them if they put their money where their mouth is: a clause in the iOS user agreement saying that if they ever use this functionality for anything other than CSAM, or on anything other than iCloud photos, every person who was subjected to this scan will be paid 100 million dollars by Apple. I will believe them if they put this clause in, and I will know they have changed their plans when they remove the clause. No more pinky swears; let's add some stakes for breaking the promise.

kcb 2021-08-18 21:44:39 +0000 UTC [ - ]

Until one day a box pops up. It says "We've updated our terms. See this 10,000 line document here. Please accept to continue using your device." Then your clause is gone.

smnrchrds 2021-08-18 22:19:57 +0000 UTC [ - ]

I'm fine with that. Apple is a big company and changing its TOS will be instantly reported on. It will act as a canary of sorts to know when they turn evil, and we will know to stop using their products.

TimTheTinker 2021-08-19 03:16:27 +0000 UTC [ - ]

Isn't the design of this system enough of a canary?

Johnny555 2021-08-18 21:38:38 +0000 UTC [ - ]

I'd be satisfied with a money back guarantee -- if they change the policy then I can return the phone for a full refund.

whoknowswhat11 2021-08-19 03:25:18 +0000 UTC [ - ]

People are pretending it scans everything on your device. This conversation is already bad enough without adding additional confusion over what this does.

I don't have a problem if it scans everything, but it's not. Let's stick to what it is doing. Android could do this as well, so talking about what companies like Google could do is not so interesting - they could do almost anything.

duxup 2021-08-18 21:10:02 +0000 UTC [ - ]

Google does everything they can to back up your photos ... and they do it automatically.

I'm not sure there's a real difference unless you want to watch your settings all the time. In Google land they tend to reset ... and really that happens in a lot of places.

I think for most people, if you use Google, you're in their cloud.

gmueckl 2021-08-18 21:07:03 +0000 UTC [ - ]

Isn't the default on Android these days that all images get uploaded ("backed up") to Photos? And how many users are actually altering defaults?

shapefrog 2021-08-18 21:06:48 +0000 UTC [ - ]

> B) They scan everything that exists on my device

No ... They scan everything that I have released to apple photos that exists on my device.

Same scan - different place.

__blockcipher__ 2021-08-18 21:10:00 +0000 UTC [ - ]

The issue is that as soon as you set that precedent, it’s only a matter of time before it extends beyond iCloud. That’s the problem with doing any device-level scanning. This is dystopian and scary. And yes I understand the technology in its current iteration. The current form has problems (weaponizing collisions etc) but the real issue comes with future developments.

shapefrog 2021-08-18 21:48:30 +0000 UTC [ - ]

> They scan everything that exists on my device

I get to select the issue and it was in response to the previous claim that 'they' are scanning everything that exists on the device right now.

A year ago this was a possible 'future development'. 10 years from now I could be living on Mars. This is all hypothetical.

heavyset_go 2021-08-18 22:54:45 +0000 UTC [ - ]

> This is all hypothetical.

Exactly. However, people don't seem to have an issue when hypothesizing that CSAM detection is good actually, because Apple might implement E2EE, despite no evidence of such intentions.

For some reason, though, people take issue when others hypothesize that CSAM detection is bad actually, because Apple might expand it further and violate users' privacy even more. And there's actually precedent for this hypothesis, given Apple's actions here and their own words[1]:

> This program is ambitious, and protecting children is an important responsibility. These efforts will evolve and expand over time.

[1] https://www.apple.com/child-safety/

robertoandred 2021-08-18 21:12:41 +0000 UTC [ - ]

Was scanning server side not a precedent?

shapefrog 2021-08-18 21:52:29 +0000 UTC [ - ]

Furthermore, was uploading (backing up) the entire contents of your phone to a server to be freely scanned and distributed (with a warrant of course) server side not also a precedent?

Must have been absolutely nuts around here back in 2011 when they announced the particular slippery slope that is iCloud backup.

__blockcipher__ 2021-08-18 21:37:53 +0000 UTC [ - ]

It set the precedent for scanning server side. Which I don’t like but there’s really no way to avoid it anyway.

Now we have client-side scanning, and it’s actually being led by the [note the scare quotes] “pro-privacy” two-ton gorilla. It’s a whole different ballgame.

Multicomp 2021-08-18 21:28:24 +0000 UTC [ - ]

It was but it was one we were not in a position to resist at the time. Surrendered to almighty convenience.

But this appears to be beyond the pale, such that people are saying "this far, no farther!"

antpls 2021-08-19 14:25:49 +0000 UTC [ - ]

HN's Point of View doesn't represent the PoV of the 1 billion active iPhone owners.

Most people won't have any idea about the meaning of "hashes" and "databases". Not everyone is trying to actively fight the system and shit on everything; most people just want to live happily with their friends and family, and they won't care that Apple scans their devices.

> "Either there will be heads rolling at management, or Apple takes a permanent hit to consumer trust."

Oh god! How was all of this not obvious to top Apple management, but so obvious to epistasis! Damn, thanks man for correcting Apple and leading it onto the right track!

daxuak 2021-08-18 20:17:12 +0000 UTC [ - ]

They are very likely aware of the backlash, but since this is an easily defendable hill with a very slippery slope down the way, it is in their interest to push for it IMHO.

floatingatoll 2021-08-18 20:46:53 +0000 UTC [ - ]

Apple has declared in interviews that the slope shall not be slipped, but you're indicating that they chose this specifically because they can slip that slope.

How did you determine that their intentions contradict their words? Please share the framework for your belief, so that we're able to understand how you arrived at that belief and to evaluate your evidence with an open mind.

(Or, if your claim is unsupported conjecture, please don't misrepresent your opinions and beliefs as facts here at HN.)

monadgonad 2021-08-18 21:21:40 +0000 UTC [ - ]

“IMHO” means “in my humble opinion”

floatingatoll 2021-08-18 21:28:25 +0000 UTC [ - ]

I'm glad to see that there now, thanks!

camillomiller 2021-08-18 20:31:11 +0000 UTC [ - ]

Apple's management can also be prone to hubris. This is also very much a case where the engineers were left unbridled without any proper check from marketing and comms, I suspect because of the extreme complexity of the problem and the sheer impossibility of putting it into layman's terms effectively.

bhawks 2021-08-18 20:37:23 +0000 UTC [ - ]

Was this a pure engineering driven exercise? I hadn't heard that before and it doesn't match up with what I've heard about Apple's internal culture. The level of defensive pushback really makes me feel that they are very bought into the system.

Definitely would appreciate a link to anything substantial indicating that this was a bunch of eng in over their heads.

emptysongglass 2021-08-18 20:47:53 +0000 UTC [ - ]

Also have to agree: I don't see how this could originate from engineers. Every engineer I've spoken with at the company I work for has been mortified by this insanity.

sorrytim 2021-08-18 20:59:38 +0000 UTC [ - ]

It’s really unpopular inside the fruit company. I’ve not spoken with a single engineer in Cupertino who thinks this is a good path. There has been radio silence from management on how to address this. It almost feels like they want this to fail. In the past management has given us resources and talking points we can use with our friends and family. Not this time.

orangeoxidation 2021-08-18 21:53:44 +0000 UTC [ - ]

> In the past management has given us resources and talking points we can use with our friends and family.

Wow. That feels like an overreach. 'We don't just buy your labor, but also your private opinion and tell you how to talk to your family'.

sorrytim 2021-08-19 02:57:25 +0000 UTC [ - ]

Certainly people toe the official line, but it does help us to articulate the position of the company so that we aren’t so misinformed that we spread FUD.

dylan604 2021-08-18 21:23:27 +0000 UTC [ - ]

This doesn't hold much water. As a counterexample, we have all of the engineers that designed Facebook, Google, etc. We have people working in ad tech. We have people working in lots of places that you and the people you talk with directly would be mortified by, but there are countless others who need a paycheck and do whatever the pointy-haired bosses tell them to do.

stingraycharles 2021-08-18 20:48:58 +0000 UTC [ - ]

Maybe the implementation is difficult to put into layman’s terms, but the high-level goal - scanning the devices for child porn - most definitely is not.

I’m not buying the “engineers were left unbridled” argument. I think there must just have been a level of obliviousness in a much wider part of the organization for something like this to happen.

FabHK 2021-08-18 20:52:56 +0000 UTC [ - ]

> What's shocking to me is how little Apple management understood of what their actions looked like. Really stunning.

Maybe because they underestimated people's ignorance.

I think they saw (and still do see) it as a better, more privacy preserving technique than what everyone else is doing.

fraa-orolo 2021-08-18 21:05:05 +0000 UTC [ - ]

The thing is that this is a better and more privacy-preserving technique.

Their hubris is in not seeing the abuses that the mere existence of this system will invite, or in thinking that they will be able to stand up to all of them; their hubris is also in thinking that they will be able to manage and oversee, perfectly and without mistakes, a system making accusations so heinous that even the mere act of accusing destroys people's lives and livelihoods.

shuckles 2021-08-18 21:29:01 +0000 UTC [ - ]

What abuses apply to CSAM scanning in this hybrid pipeline which don’t apply to iCloud Backup? If they scanned on the server, why couldn’t governments “slippery slope” abuse the system by requiring that all files on iOS end up in iCloud Backup, where they can be scanned by the system?

Since the announcement, I can think of a dozen ways Apple could be easily forced into scanning all the contents of your device by assembling features they’ve already shipped. Yet they haven’t. At some point, people need to produce evidence that Apple cannot hold the line they’ve said they will.

int_19h 2021-08-19 02:37:55 +0000 UTC [ - ]

The governments always could do that, but then they'd be the target of the pushback, and there's already significant mistrust of governments wrt surveillance because of how it can be abused.

What we have now is Apple, with its "strong privacy" record, normalizing this. If it succeeds, it will be that much easier for governments to tack other stuff onto it. Or, say, lower the threshold needed to submit images for review. I can easily picture some senator ranting about how unacceptable it is that somebody with only 20 CSAM photos won't be flagged, and won't somebody please think of the children?

And yes, if it comes to that, Apple definitely cannot hold the line. After all, they already didn't hold it on encrypted cloud storage - and that wasn't even legally forced on them, merely "not recommended".

candiodari 2021-08-18 22:13:14 +0000 UTC [ - ]

Exactly. Also people are strongly objecting that their own device is being used to report them to law enforcement. Surely someone at Apple noticed this beforehand ...

justapassenger 2021-08-18 20:13:44 +0000 UTC [ - ]

> For a company that marketed itself as one of the few digital service providers that consumers could trust, I just don't understand how they acted this way at all.

Because the privacy stance is mostly PR to differentiate from Google. And while there are invalid reasons to get users' data, there are also valid ones (at least from a legal-requirement point of view - let's not get into the weeds here about personal freedom and whether the laws and their implementations need to be changed).

Their PR was just writing checks they cannot cash without going to war with governments.

n8cpdx 2021-08-18 20:56:22 +0000 UTC [ - ]

It was obvious from the beginning that the privacy nonsense was a convenient excuse to cover for their poor (especially at the time) cloud offerings compared to competitors like Google.

I assumed they pivoted to focus on privacy, but clearly it was just a marketing department innovation rather than a core value (as their marketing department claimed).

istingray 2021-08-18 20:33:32 +0000 UTC [ - ]

In retrospect this makes sense to me. I don't like it, but I get it now. When Apple said "privacy" what they meant was "we don't like tracking cookies or hackers but everything else is fine".

websites2023 2021-08-18 21:00:01 +0000 UTC [ - ]

Privacy means “you pay for the device so we don’t sell your attention to advertisers.” That’s it. There’s no protection against state level actors (Pegasus, CSAM scanning, etc.).

If your threat model includes being the target of someone who will plant child pornography on your phone, you are already fucked. And no, Apple isn’t suddenly going to scan Chinese iPhones for Winnie the Pooh memes. They don’t have to. China already has the 50 cent party to do that for them, on WeChat.

Basically everything everyone seems to think is just around the corner has already been possible for years.

shuckles 2021-08-18 21:30:46 +0000 UTC [ - ]

Is there a consumer computing device that’s more secure than iPhone against state actors? A Chromebook, maybe?

websites2023 2021-08-18 22:48:38 +0000 UTC [ - ]

There is no consumer grade device that is secure against state actors.

whoknowswhat11 2021-08-18 21:03:04 +0000 UTC [ - ]

We will see. I've heard lots of these predictions and I don't buy them AT ALL.

What I have seen is a selling point for apple products.

I'd encourage folks to get out of the HN bubble on this - talk to a spouse, or to a family, especially one with kids.

__blockcipher__ 2021-08-18 21:14:57 +0000 UTC [ - ]

Yeah, talk to people that know nothing on the issue besides that they want to “protect the kids”.

Why stop there? Get out of the HN bubble on the patriot act, instead ask your neighbor’s wife her thoughts on it. Get out of the HN bubble on immigration, go ask a stereotypical boomer conservative about it.

I think my sarcasm already made it overtly obvious, but this is horrible advice you are giving, and the fact that you don’t seem to be aware that pedophilia and terrorism are the two most classic “this gives us an excuse to exert totalitarian control” topics betrays your own ignorance (or, worse, you are aware and just don’t care).

ravenstine 2021-08-18 21:53:13 +0000 UTC [ - ]

Nah, their stock is up which means they'll continue on this trajectory.

zepto 2021-08-18 21:41:11 +0000 UTC [ - ]

Consumers are probably in favor of this, or don’t care.

The only people who are bothered are people claiming this is going to be misused by authoritarian governments.

teclordphrack2 2021-08-19 02:27:45 +0000 UTC [ - ]

I have a feeling they already did a test of this with some small sample of people who did not know they were test subjects. I think that in the future it will come out that what they found was disturbing enough that they thought doing it network-wide was a worthwhile endeavor.

This is not me agreeing or disagreeing.

nojito 2021-08-18 20:33:17 +0000 UTC [ - ]

Why are hash collisions relevant?

There are at least 2-3 further checks to account for this.

fraa-orolo 2021-08-18 21:13:22 +0000 UTC [ - ]

Because it's not a cryptographic hash, where a one-bit difference results in a completely different hash. It's a perceptual hash that operates on a smaller bitmap derived from the image, so it's plausible that some innocuous images might result in similar derivations; and there might be intentionally crafted innocent-looking images that result in an offensive derivative.

Salvador Dalí did something similar by hand in 1973 in Gala Contemplating the Mediterranean Sea [1]

[1] https://en.wikipedia.org/wiki/Lincoln_in_Dalivision
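
As a rough illustration of the perceptual-hash family (not NeuralHash itself, which uses a neural network embedding plus locality-sensitive hashing), a minimal "average hash" looks something like this, assuming Pillow is available:

    from PIL import Image

    def average_hash(path, size=8):
        img = Image.open(path).convert("L").resize((size, size))  # tiny grayscale bitmap
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = [1 if px > mean else 0 for px in pixels]           # 64-bit fingerprint
        return int("".join(map(str, bits)), 2)

    def hamming(a, b):
        return bin(a ^ b).count("1")

    # Crops, re-encodes and mild edits move the hash only a few bits, whereas
    # a cryptographic hash of the same two files would differ completely:
    # hamming(average_hash("cat.jpg"), average_hash("cat_cropped.jpg"))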

__blockcipher__ 2021-08-18 21:17:16 +0000 UTC [ - ]

This is a great answer but that’s not actually the GP’s contention. Their argument is essentially “so what if there’s a collision, the human review will catch it”. And to that I’d say that the same is supposed to occur for the no-fly list and we all know how that works in practice.

The mere accusation of possessing CSAM can be life-ruining if it gets to that stage. More importantly, a collision will effectively allow warrantless searches, at least of the collided images.

fraa-orolo 2021-08-18 21:19:19 +0000 UTC [ - ]

Indeed, I touched on that in another comment: https://news.ycombinator.com/item?id=28227141

nojito 2021-08-18 23:39:18 +0000 UTC [ - ]

That’s one check. There are other system checks to make the client side hash collision meaningless.

SXX 2021-08-19 07:53:02 +0000 UTC [ - ]

Do you understand that anyone can take absolutely legal porn and make it match a CSAM hash? And no one except NCMEC can know the difference, because everyone else only compares hashes, not the actual images.

And whoever is going to check images for Apple will see that, yeah, there is porn in the picture. Flag it. Then you'll have an unlimited amount of time to explain to the FBI why some porn on your device matches a CSAM hash.

foobiekr 2021-08-18 20:47:05 +0000 UTC [ - ]

your trust in the system is charming

geoah 2021-08-18 19:32:33 +0000 UTC [ - ]

> The system relies on a database of hashes—cryptographic representations of images—of known CSAM photos provided by National Center for Missing & Exploited Children (NCMEC) and other child protection organizations.

“Cryptographic representations of images”. That’s not the case though, right? These are “NeuralHashes” afaik, which are nowhere close to cryptographic hashes but rather locality-sensitive hashes, which is fancy speak for “the more alike two images look, the more similar the hash”.

Vice and others keep calling them cryptographic. Am I missing something here?
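
For anyone unsure what the distinction buys you, here's a throwaway contrast (the simhash below is my own toy, nothing like NeuralHash internally): a cryptographic hash flips roughly half its bits on any change, while a locality-sensitive hash barely moves for similar inputs.

    import hashlib

    def sha256_bits(data: bytes) -> int:
        return int(hashlib.sha256(data).hexdigest(), 16)

    def simhash(text: str, bits: int = 64) -> int:
        votes = [0] * bits
        for i in range(len(text) - 2):                        # character trigrams
            h = int(hashlib.md5(text[i:i + 3].encode()).hexdigest(), 16)
            for b in range(bits):
                votes[b] += 1 if (h >> b) & 1 else -1
        return sum(1 << b for b in range(bits) if votes[b] > 0)

    def hamming(a: int, b: int) -> int:
        return bin(a ^ b).count("1")

    a = "the quick brown fox jumps over the lazy dog"
    b = "the quick brown fox jumps over the lazy cat"
    print(hamming(sha256_bits(a.encode()), sha256_bits(b.encode())))  # ~128 of 256 bits differ
    print(hamming(simhash(a), simhash(b)))                            # far fewer of 64 bits differ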

crooked-v 2021-08-18 19:45:21 +0000 UTC [ - ]

Something to note here is that in the hash collision that was discovered, the two images look nothing alike. One is a picture of a dog, the other is blobby grey static.

0x5f3759df-i 2021-08-18 21:32:57 +0000 UTC [ - ]

legostormtroopr 2021-08-19 03:46:53 +0000 UTC [ - ]

Those are two almost identical images based off each other.

I know that that "think of the children" is a meme, but I think this illustrates the point for apple. If you have a croped or modified image of CSA the system will identify it. As long as your image is different enough from CSA, you are safe.

The point here is that Apple is specifically looking for matches against known CSA material.

If someone can demonstrate that a legal NSFW image (e.g. regular old-fashioned pornography) can be collided with a legal, completely 100% SFW image, then I'll be concerned.

But until then, this looks like a reasonable and supportable way for finding CSAM in real time.

hartator 2021-08-19 04:34:03 +0000 UTC [ - ]

> If someone can demonstrate that a legal NSFW image (eg. regular old-fashioned pornography), can be collided with a legal, completely 100% SFW image then I'll be concerned.

Look at this other collision: https://twitter.com/SarahJamieLewis/status/14282060881181491... An attacker can send you an innocent-looking picture that embeds some CSA material, and you get swatted the next day.

legostormtroopr 2021-08-19 05:33:09 +0000 UTC [ - ]

I think you are getting downvoted because that's the same image I replied to.

Looking at those two images, it's plain to see why they are identical.

And if someone is sending you CSA material, isn't that the point of this process? Apple identifies it as CSA, can give you a warning that it's sensitive, and can notify the authorities that people are sending CSA.

Again - this seems like a win. If Apple can automatically identify CSA material en masse, that's good.

edit: It looks like they are different URLs, but Twitter only allows people to see replies if they are logged in, so I can't see that example.

edit 2: On further thought, if someone can use CSA material to produce an innocuous image with a similar hash, and they send that image to you, it's still proof that the sender had CSA material. Again, it's still good.

__blockcipher__ 2021-08-18 21:18:22 +0000 UTC [ - ]

They actually do look alike to my eye, but in a “the way the algorithm sees it” kind of way. I can see the obvious similarity. But to your point it’s not like it’s two very slightly different photos of dogs.

TechBro8615 2021-08-19 03:17:57 +0000 UTC [ - ]

You must be a champion CAPTCHA solver.

bawolff 2021-08-18 20:32:57 +0000 UTC [ - ]

I don't really think it'd be a valid second pre-image for this type of hash if they did look similar.

kbelder 2021-08-18 23:32:15 +0000 UTC [ - ]

I'd be more interested in whether you could take an image of, say, an underage dog, and tweak a vaguely similar image of a grown dog so that it was a match. That's what will get innocent people thrown in jail.

smoldesu 2021-08-18 20:02:25 +0000 UTC [ - ]

Furthermore, it may well be possible to combine that blobby grey static with another image, manipulating its visual hash to create a "sleeper" positive. If this was possible within a week, then it's going to be very interesting to watch the technology change/evolve over the next few years.
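
For what it's worth, the published collisions were reportedly produced by gradient-based optimization against the extracted model. A hypothetical sketch of that style of attack (the toy network and every name here are stand-ins I made up, not NeuralHash or any real attack code):

    import torch
    import torch.nn as nn

    toy_hash = nn.Sequential(                 # stand-in for a neural perceptual hash
        nn.Conv2d(3, 8, 5, stride=4), nn.ReLU(),
        nn.Flatten(), nn.LazyLinear(96), nn.Tanh(),
    )

    def craft_collision(base_img, target_hash, steps=500, lam=0.01):
        """Nudge base_img until its hash approaches target_hash while staying
        visually close to base_img (the "sleeper" image idea)."""
        x = base_img.clone().requires_grad_(True)
        opt = torch.optim.Adam([x], lr=1e-2)
        for _ in range(steps):
            opt.zero_grad()
            loss = (((toy_hash(x) - target_hash) ** 2).mean()
                    + lam * ((x - base_img) ** 2).mean())
            loss.backward()
            opt.step()
            with torch.no_grad():
                x.clamp_(0, 1)                # keep a valid image
        return x.detach()

    base = torch.rand(1, 3, 360, 360)         # innocuous source image
    target = torch.tanh(torch.randn(1, 96))   # embedding we want to collide with
    adv = craft_collision(base, target)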

floatingatoll 2021-08-18 20:28:11 +0000 UTC [ - ]

Doing so would create a positive that still doesn't pass Apple's human visual check against the (blurred) CSAM content associated with that checksum. And if it somehow did, it would still have to occur in quantities of 30 or more, and the images would have to pass a human visual check against the (unblurred) CSAM content by one of the agencies in possession of it. It's not possible to spoof that final test unless you possess real CSAM content, at which point you don't need to spoof that final test and you'll end up arrested for possession.

treesprite82 2021-08-18 22:21:52 +0000 UTC [ - ]

There are concerns without the images needing to make it all the way through their CSAM process. Apple is a US company with obligations to report certain crimes they become aware of, and NCMEC is a US government-tied organisation that dismissed privacy concerns as "the screeching voices of the minority".

Consider if the honeypot images (manipulated to match CSAM hashes) are terrorist recruitment material for example.

laverya 2021-08-18 21:19:06 +0000 UTC [ - ]

And if you make the second image be actual, 18+ porn? Will Apple be able to tell the difference between that and CSAM after the blur is applied?

Bonus points if you match poses, coloration, background, etc.

floatingatoll 2021-08-18 21:21:57 +0000 UTC [ - ]

The only way to match poses, coloration, background, etc. is to possess illegal CSAM content. If you do so successfully, your crafted image will pass the blur check and reach the agency that possesses the original image for final verification, where it will immediately fail because it is obviously a replica. That agency will then lead law enforcement to the creator of the image, so that they can arrest whoever possessed CSAM content in order to model replicas after it.

I think that's a perfectly acceptable outcome, since anyone with the hubris to both possess CSAM content and create replicas of it especially deserves to be arrested. Do you see a more problematic outcome here?

ultrarunner 2021-08-18 23:36:47 +0000 UTC [ - ]

Cops no-knock raid the wrong houses all the time; what makes you think that the procedure you outlined above will be followed without error? Especially against suspected child abusers, from an anti-child-abuse agency?

What if the visual check gets accidentally signed off on, or even gets fraudulently marked as positive by a burned out/lazy/competent-but-distracted employee? There are just so many failure modes for this dragnet that all end with innocent people suffering a very difficult process, even if they don't eventually land in prison. I don't think it's unreasonable to not want to be volunteered to participate.

laverya 2021-08-18 22:18:05 +0000 UTC [ - ]

This assumes that it's impossible to reverse the perceptual hashes used here in such a way that you could determine poses and coloration, for one.

And in retrospect, you don't need to match that - you just need it to appear obviously pornographic after the blur is applied in order to get past Apple's reviewers. After that, the lucky individual's life is in the hands of the police/prosecutors. (I have to imagine that both real and faked cases will look pretty much like "Your honor/members of the jury, this person's device contained numerous photos matching known CSAM. No, we won't be showing you the pictures. No, the defence can't see them either." Can you imagine a "tough on crime" prosecutor taking the faked case to trial too? Would the police and prosecutors even know it was faked?)

floatingatoll 2021-08-18 22:42:18 +0000 UTC [ - ]

Apple's reviewers are comparing blurred originals to blurred matches. It needs to look like the blurred original associated with the checksum that matched. It is irrelevant whether the blurred match looks pornographic or not.

ultrarunner 2021-08-18 23:38:15 +0000 UTC [ - ]

floatingatoll 2021-08-19 16:52:35 +0000 UTC [ - ]

At which point the image will be handed to the relevant CSAM group unblurred, who will do a visual comparison and find out immediately that it’s not a match, and then reject it without invoking law enforcement.

SXX 2021-08-19 08:13:46 +0000 UTC [ - ]

Why would an attacker need to create anything from scratch? If you want to build a dataset of images of people with a specific skin and hair color, body type, etc., in very specific poses, it can be quite hard and expensive to do so, because even photo stocks have a limited number of such photos. Unless...

Unless you're looking to build a porn dataset and you don't care about copyright. Porn is an industry where exabytes of material are produced and published on the internet almost every week.

Who will the agency come after? Some OnlyFans creators?

int_19h 2021-08-19 00:07:30 +0000 UTC [ - ]

At the point where it is submitted for a human visual check, your privacy has already been violated.

floatingatoll 2021-08-19 15:53:48 +0000 UTC [ - ]

My privacy is violated every time I leave my home. Anyone can take a photo of me and look up the FBI Most Wanted and see if I’m there. If they think someone is me and they’re wrong, they can still summon law enforcement, and I’ll still be mistreated for their poor judgement.

Is this just as unacceptable as the CSAM scanning? Should all public photography be banned, in order to reduce the risk of false positive identifications of innocent people as criminals to zero? Or is that an acceptable degree of privacy impingement for the good of society?

Is Apple’s implementation an acceptable trade of impingement and risk, for good for society? We do live in a society, and so zero impingement upon privacies is never going to be acceptable (sorry, free/tech/libs). But instead of discussing whether Apple’s approach violates privacy minimally or not in order to get the job done, these discussions here just keep circling the drain of “putting my privacy at risk by any degree is never acceptable”, when that drain is cemented shut by the existence of society and will never lead to a valid outcome.

whoknowswhat11 2021-08-18 19:58:33 +0000 UTC [ - ]

No - the reporting is absolutely terrible here.

1) These are more "shares similar visual features" hashes than crypto hashes.

2) HN posters have been claiming that Apple reviewing flagged photos is a felony -> because HN commentators are claiming flagged photos are somehow "known" CSAM - this is also likely totally false. The images may not be CSAM, and the idea that a moderation queue results in felony charges is near ridiculous.

3) This illustrates why Apple's approach here (manual review after 30 or so images are flagged) is not unreasonable. The push to say that this review is unnecessary is totally misguided.

4) They use words like "hash collision" for something that is not a hash. In fact, different devices will calculate DIFFERENT hashes for the SAME image at times (a short illustration of why follows below).
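
That last point comes down to floating-point arithmetic not being associative: the hash bits are effectively thresholds on values computed from many float operations, so a different kernel or summation order on different hardware can flip a borderline bit. A two-line illustration:

    a = (0.1 + 0.2) + 0.3
    b = 0.1 + (0.2 + 0.3)
    print(a == b, a, b)   # False 0.6000000000000001 0.6
    # If the value feeding a hash bit sits right at its threshold, that kind of
    # difference is enough to flip the bit on one device but not another.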

One request I have - before folks cite legal opinions - those opinions should have the name of a lawyer on them. Not this "I talked to a lawyer" because we have no idea if you described things accurately.

sandworm101 2021-08-18 20:03:36 +0000 UTC [ - ]

>> those opinions should have the name of a lawyer on them.

Not going to happen. Lawyers in the US have issues with offering unsolicited advice, and other problems with issuing advice into states where they are not admitted. So likely none of the US lawyers (and the great many more law students) here will ever put their real name to a comment.

agbell 2021-08-18 20:31:37 +0000 UTC [ - ]

This. Also try contacting a lawyer who knows this area and asking to pay for a legal opinion brief so that you can post it online to be debated by legions of software developers.

Lawyers I know would politely decline that.

whoknowswhat11 2021-08-18 20:47:52 +0000 UTC [ - ]

No - this is actually done, supposedly as part of business development.

So your own firm may cover some of the costs if you have something to say. If you found someone to pay you to do an analysis or offer your thoughts - you'd be in heaven!

torstenvl 2021-08-18 20:41:33 +0000 UTC [ - ]

This is incorrectly applied. Offering a legal opinion is fundamentally different from offering legal advice. We publish legal opinions in academic and professional publications all the time. That doesn't mean we have an attorney-client relationship with anyone who reads those opinions, or that we would advise that someone act in accordance with such an opinion, particularly if no court has adopted our position yet.

zie 2021-08-18 20:35:43 +0000 UTC [ - ]

But there are legal opinions that law firms and various orgs (political or not) write on occasion - actual lawyers sharing a legal opinion, publicly (or not).

But it's the law, it's fuzzy at best, much like your HR department. It's only after a court decision has been reached on your particular issue that it's anywhere near "settled" case law, and even that's up for possible change tomorrow.

dylan604 2021-08-18 21:27:44 +0000 UTC [ - ]

Right. That's why there's never been an amicus brief or a friend-of-the-court type of document drafted and signed by lawyers.

sandworm101 2021-08-18 21:36:50 +0000 UTC [ - ]

Those are addressed to a court or government agency in a specific place. The advice offered isn't for a bunch of rando people on the internet across all number of jurisdictions. And it is only general advice about an area of law, normally at the appellate level, not specific fact-dependent advice to a real flesh-and-blood person. Amicus is also normally an opinion to sway the court on broad policy, not a determination of facts in a specific case.

whoknowswhat11 2021-08-18 20:45:58 +0000 UTC [ - ]

Sure - but if you are going to write blog posts / articles, and all you can say is "based on the lawyers I talked to, Apple is committing child porn felonies," that is just unacceptable.

At least HN should flag these and get them taken down. Over and over, the legal analysis is either trash or it's clear the article author didn't understand something (so how can a lawyer give good advice?).

These conversations become so uninteresting when people take these extreme positions: Apple's brand is destroyed, Apple is committing child porn felonies.

I would have rather just had a link to the Apple technical paper and a discussion, personally, vs the over-the-top random article feed with all sorts of misunderstandings.

And in contract law there are LOTS of legal articles online - with folks' names on them! They are useful! I read them and enjoy them. Can we ask for that here, where it matters maybe more?

sandworm101 2021-08-18 21:39:58 +0000 UTC [ - ]

>> there are LOTS of legal articles online - with folks name on them

Articles are not legal advice. They are opinions on the law applicable generally, rather than fact-based advice to specific clients. Saying whether apple is doing something illegal or not in this case, with a lawyer's name stamped on that opinion, is very different.

rootusrootus 2021-08-18 20:16:43 +0000 UTC [ - ]

> The images may not be CSAM, and the idea that a moderation queue results in felony charges is near ridiculous.

Agree, and I think this is backed up by real world experience. Has Facebook or anyone working on their behalf ever been charged for possession of CSAM? I guarantee they've seen some. Probably a lot, in fact. That's why we have recurring discussions about the workers and the compensation they get (or not) for the really horrid work they are tasked with.

Xamayon 2021-08-18 20:39:03 +0000 UTC [ - ]

From what I understand it's even more cut and dry than that. If you submit a report to the NCMEC you are generally required to (securely) keep a copy of the image(s) being reported and any related info for a period of time. That's part of the rules. You are also only compelled to report things that you know are bad, so verification before reporting makes sense. The idea that they would be charged with a crime for doing what is essentially required by law is flat out wrong. Unless they are storing the material insecurely once confirmed bad or otherwise mishandling the process, they seem to be following the law as I understand it. IANAL but I have an account with the NCMEC for a service I run, so I have looked through their documentation and relevant laws to try to understand the requirements placed on me as a service provider.

whoknowswhat11 2021-08-18 20:49:16 +0000 UTC [ - ]

Thank you for a first hand report. All that makes sense - you clearly can't and shouldn't "pass around images" even in the office - so fair that it is super strict once something is confirmed as X.

tetha 2021-08-18 22:45:51 +0000 UTC [ - ]

More of an anecdote, but our software allows users to upload files, documents, images and such. This in turn means, technically speaking, our platform could be used to distribute malware or other illegal content. I've asked: well, what happens if that occurs and we as a company become aware of it?

Overall, we're advised to take roughly these steps. First off, report it. Second, remove all access for the customer, terminate the accounts, lock them out asap. Third, prevent access to the content without touching it. For example, if it sits on a file system and a web server could serve it, blacklist the URLs on a load balancer. Fourth, if necessary, begin archiving and securing evidence. But if possible in any way, disable content deletion mechanisms and wait for legal advice, or for law enforcement to tell you how to gather the data.

But overall, you're not immediately guilty for someone abusing your service, and no one is instantly guilty for detecting someone is abusing your service.

judge2020 2021-08-18 20:47:50 +0000 UTC [ - ]

> Probably a lot, in fact.

> In 2020, FotoForensics received 931,466 pictures and submitted 523 reports to NCMEC; that's 0.056%. During the same year, Facebook submitted 20,307,216 reports to NCMEC

https://www.hackerfactor.com/blog/index.php?/archives/929-On....

hpoe 2021-08-18 20:27:07 +0000 UTC [ - ]

The problem isn't the idea of scanning for CSAM, it's what happens when they start scanning for "misinformation", or whatever else they want to come up with at that point.

whoknowswhat11 2021-08-18 20:50:21 +0000 UTC [ - ]

Then why not wait until that actual issue comes up? The look is so bad / awkward, with this big protest over something that many people are not going to find at all objectionable.

Do you really think Apple's brand has been "destroyed" over this?

rootusrootus 2021-08-18 22:58:02 +0000 UTC [ - ]

IMO the risk is that the reaction will be viewed amongst the general population as an overreaction, leading to it being ignored on the next go around. Many people are going to listen to what Apple says, look at their piles of technical documentation, see the word CSAM, and conclude that the opposing side is crazy.

fuzzer37 2021-08-18 20:31:48 +0000 UTC [ - ]

Or how about they just don't scan my phone at all.

shapefrog 2021-08-18 20:46:34 +0000 UTC [ - ]

feel free to go ahead and opt out.

Someone 2021-08-18 20:04:17 +0000 UTC [ - ]

The neural hash itself isn’t cryptographic, but there’s cryptography involved in the process.

They use “private set intersection” (https://en.wikipedia.org/wiki/Private_set_intersection) to compute a value that itself doesn’t say whether an image is in the forbidden list, yet when combined with sufficiently many other such values can be used to do that.

They also encrypt the “NeuralHash and a visual derivative” on iCloud in such a way that Apple can only decrypt that if they got sufficiently many matching images (using https://en.wikipedia.org/wiki/Secret_sharing)

(For details and, possibly, corrections on my interpretation, see Apple’s technical summary at https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni... and https://www.apple.com/child-safety/pdf/Apple_PSI_System_Secu...)
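
For intuition on the threshold part, here is a toy sketch of Shamir-style secret sharing over a prime field - not Apple's actual construction, just an illustration that a secret (standing in for a decryption key) only becomes recoverable once at least t shares (one per matching image) are available.

    # Toy Shamir secret sharing: the "secret" stands in for a per-account
    # decryption key; each "share" stands in for the voucher attached to one
    # matching image. Illustration only, not Apple's protocol.
    import random

    P = 2**61 - 1  # prime modulus for the toy field

    def make_shares(secret, threshold, num_shares):
        # Random polynomial of degree threshold-1 with the secret as constant term.
        coeffs = [secret] + [random.randrange(P) for _ in range(threshold - 1)]
        def f(x):
            return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
        return [(x, f(x)) for x in range(1, num_shares + 1)]

    def recover(shares):
        # Lagrange interpolation at x = 0 recovers the constant term (the secret).
        secret = 0
        for xi, yi in shares:
            num, den = 1, 1
            for xj, _ in shares:
                if xj != xi:
                    num = num * (-xj) % P
                    den = den * (xi - xj) % P
            secret = (secret + yi * num * pow(den, P - 2, P)) % P
        return secret

    key = 123456789
    shares = make_shares(key, threshold=30, num_shares=100)
    print(recover(shares[:30]) == key)  # True: 30 shares suffice
    print(recover(shares[:29]) == key)  # False (overwhelmingly likely): 29 reveal nothing

With fewer than the threshold number of shares, the interpolation yields garbage, which is the property the comment above describes: the server can only decrypt after sufficiently many matches.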

3np 2021-08-19 05:22:03 +0000 UTC [ - ]

This is one of the most common misunderstandings, both when people are arguing for or against.

People who understand tech well enough to recognize hashes like MD5 and SHA don't dive deep enough to understand that this is something completely different.

I even suspect this is deliberate from Apple's side when announcing and talking about these changes - making people wrongly believe that only exact matches will trigger it, except possibly in extremely rare cases or under deliberate attack.

They could have called it "fingerprint" or something but deliberately went with a technical term that even confuses technical people who know well enough what a hash usually means.

Vice is falling victim to this misunderstanding stemming from the conflation of "hash".

atonse 2021-08-18 20:28:25 +0000 UTC [ - ]

Yeah, by reading about this whole scandal I learned about perceptual hashes, because the whole time I kept thinking "can't they just alter one pixel and entirely change the hash?", since I was only familiar with cryptographic hashes.

It's a huge, huge, huge distinction.
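
To make the distinction concrete, here is a toy comparison (assuming Pillow and NumPy are installed; the perceptual hash below is a simple average hash, a stand-in for NeuralHash, which is far more elaborate): one changed pixel flips a cryptographic hash completely but leaves the perceptual hash untouched.

    # One changed pixel: the cryptographic hash changes completely, the toy
    # perceptual "average hash" (a stand-in for NeuralHash) does not.
    import hashlib
    import numpy as np
    from PIL import Image

    def average_hash(img: Image.Image, size: int = 8) -> int:
        # Shrink to size x size, grayscale, set each bit by comparing to the mean.
        small = np.asarray(img.convert("L").resize((size, size)), dtype=np.float32)
        bits = (small > small.mean()).flatten()
        return int("".join("1" if b else "0" for b in bits), 2)

    img = Image.fromarray(np.random.randint(0, 256, (256, 256, 3), dtype=np.uint8))
    tweaked = img.copy()
    tweaked.putpixel((0, 0), (0, 0, 0))  # alter a single pixel

    print(hashlib.sha256(img.tobytes()).hexdigest()[:16])      # two totally
    print(hashlib.sha256(tweaked.tobytes()).hexdigest()[:16])  # different digests
    print(average_hash(img) == average_hash(tweaked))          # almost certainly True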

robertoandred 2021-08-18 20:04:30 +0000 UTC [ - ]

The CSAM hashes are encrypted and blinded; possibly Vice and others conflated and/or combined steps.

jdavis703 2021-08-18 20:48:48 +0000 UTC [ - ]

In a certain sense it is cryptographic, because the original CSAM images remain secret. While the hash is not useful for maintaining integrity, it still provides confidentiality.

Edit: this is apparently not true as demonstrated by researchers.

judge2020 2021-08-18 20:53:16 +0000 UTC [ - ]

This has technically been proven false:

> Microsoft says that the "PhotoDNA hash is not reversible". That's not true. PhotoDNA hashes can be projected into a 26x26 grayscale image that is only a little blurry. 26x26 is larger than most desktop icons; it's enough detail to recognize people and objects. Reversing a PhotoDNA hash is no more complicated than solving a 26x26 Sudoku puzzle; a task well-suited for computers.

https://www.hackerfactor.com/blog/index.php?/archives/929-On...

robertoandred 2021-08-18 21:13:58 +0000 UTC [ - ]

PhotoDNA and NeuralHash are not the same thing.

judge2020 2021-08-18 21:45:24 +0000 UTC [ - ]

Apple is using a private perceptual hash on the backend, likely to further filter out bad/spam tickets.

https://twitter.com/fayfiftynine/status/1427899951120490497?...

Given NeuralHash is a hash of a hash, I imagine they're running PhotoDNA and not some custom solution, which would require Apple ingesting and hashing all of the images themselves using another custom perceptual hash system.

> Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child-safety organizations. Apple further transforms this database into an unreadable set of hashes, which is securely stored on users' devices.

https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...

robertoandred 2021-08-18 22:03:42 +0000 UTC [ - ]

My reading of that is NCMEC provides NeuralHash hashes of their CSAM library to Apple, and Apple encrypts and blinds that database before storing it on devices.
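
As a rough illustration of what blinding buys (Apple's actual construction is an elliptic-curve-based PSI scheme described in their technical summary; this keyed-HMAC toy only shows the general shape of the idea): the distributed table contains keyed transforms of the hashes, so whoever holds the table can't learn or even test the underlying hashes, while the key holder still can.

    # Toy "blinded" hash table: entries are keyed transforms (HMAC) of the
    # underlying perceptual hashes, so holding the table alone reveals nothing
    # about the hashes. Apple's real scheme is an elliptic-curve PSI
    # construction; this is only to illustrate the general idea.
    import hmac, hashlib, os

    server_key = os.urandom(32)                         # held server-side only

    def blind(h: bytes) -> bytes:
        return hmac.new(server_key, h, hashlib.sha256).digest()

    known_hashes = [os.urandom(16) for _ in range(5)]   # placeholder hash values
    blinded_table = {blind(h) for h in known_hashes}    # what gets distributed

    print(known_hashes[0] in blinded_table)         # False: raw hashes aren't in the table
    print(blind(known_hashes[0]) in blinded_table)  # True: the key holder can still match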

andrewmcwatters 2021-08-18 19:36:51 +0000 UTC [ - ]

Edit: "The main purpose of the hash is to ensure that identical and visually similar images result in the same hash, and images that are different from one another result in different hashes."[1]

Apple isn't using a "similar image, similar hash" system. They're using a "similar image, same hash" system.

[1]: https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...

heavyset_go 2021-08-18 19:47:14 +0000 UTC [ - ]

> There really is no sound concept of "the more similar the hash."

Perceptual hashes are not cryptographic hashes. Perceptual hashing systems do compare hashes using a distance metric like the Hamming distance.

If two images have similar hashes, then they look kind of similar to one another. That's the point of perceptual hashing.
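
In practice that comparison step usually boils down to counting differing bits, something like this sketch (the 64-bit hash values and the threshold are made up for illustration):

    # Hamming distance between two 64-bit perceptual hashes: the fewer bits
    # differ, the more similar the images are considered.
    def hamming_distance(h1: int, h2: int) -> int:
        return bin(h1 ^ h2).count("1")

    def is_match(h1: int, h2: int, threshold: int = 10) -> bool:
        return hamming_distance(h1, h2) <= threshold

    a = 0x9C6B9C6B9C6B9C6B       # hash of an original image (made up)
    b = a ^ 0b101                # slightly altered image: two bits flipped
    print(hamming_distance(a, b), is_match(a, b))   # 2 True

(As other comments in this thread point out, Apple's described pipeline does an exact lookup against the blinded database rather than a distance comparison; the distance metric is how perceptual hashes are compared in general.)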

andrewmcwatters 2021-08-18 22:10:31 +0000 UTC [ - ]

This isn’t what Apple is doing, regardless of the fact they are using the same terminology. See my edit above.

salawat 2021-08-18 19:51:44 +0000 UTC [ - ]

Except when you're not measuring the similarity you think you are, because NeuralNets are f&#@ing weird.

You know, let me put it this way. You know that one really weird family member I'm pretty sure everyone either has or is?

Guess what? They're a neural net too.

This is what Apple is asking you to trust.

heavyset_go 2021-08-18 19:52:45 +0000 UTC [ - ]

I agree, I'm just talking about the difference between cryptographic hashes and other types of hashes.

function_seven 2021-08-18 19:49:30 +0000 UTC [ - ]

EDIT: Parent edited his comment to clarify. I understand the point now. I'm wrong about similar images needing to have "similar" hashes. Those hashes either need to match exactly, or else not be considered at all.

IGNORE THIS: I think that's the parent comment's point. These are definitely not cryptographic hashes, since they—by design and necessity—need to mirror hash similarity to the perceptual similarity of the input images.

whoknowswhat11 2021-08-18 20:51:58 +0000 UTC [ - ]

They are almost the opposite. The programs create different hashes (though very similar ones) depending on the platform you run on, even with the SAME input. No crypto hash works anything like this.

yeldarb 2021-08-18 19:30:05 +0000 UTC [ - ]

I created a proof of concept showing how OpenAI's CLIP model can function as a "sanity check" similar to how Apple says their server-side model works.

In order for a collision to get through to the human checkers, the same image would have to fool both networks independently:

https://blog.roboflow.com/apples-csam-neuralhash-collision/
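
Roughly, such a second check could look like the sketch below (not the blog post's actual code; it assumes PyTorch and the openai/CLIP package are installed, and the file names are placeholders): a collision that isn't genuinely visually similar to the target should get a low embedding similarity.

    # Sketch of using CLIP image embeddings as an independent visual-similarity
    # check. Not the linked post's exact code; file paths are placeholders.
    import torch
    import clip  # pip install git+https://github.com/openai/CLIP.git
    from PIL import Image

    device = "cuda" if torch.cuda.is_available() else "cpu"
    model, preprocess = clip.load("ViT-B/32", device=device)

    def embed(path: str) -> torch.Tensor:
        image = preprocess(Image.open(path)).unsqueeze(0).to(device)
        with torch.no_grad():
            feats = model.encode_image(image)
        return feats / feats.norm(dim=-1, keepdim=True)  # unit-normalize

    # A NeuralHash collision that isn't actually visually similar should score low.
    sim = (embed("original.png") @ embed("colliding_image.png").T).item()
    print("cosine similarity:", sim)  # near 1.0 only for genuinely similar images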

mrits 2021-08-18 19:41:06 +0000 UTC [ - ]

Cool project. Wouldn't padding all generated images with a few items that match CLIP get around this though?

yeldarb 2021-08-18 19:46:47 +0000 UTC [ - ]

Doing that would mean the NeuralHash would change though. And you'd have to not only get CLIP to identify CSAM in the generated image but also negate the parts of the generated image that are causing CLIP to label it as "generated" (while still colliding with the target NeuralHash).

Unclear how hard this would actually be in practice (if I were going to attempt it, the first thing I'd try is to evolve a colliding image with something like CLIP+VQGAN) but certainly harder than finding a collision alone.

salawat 2021-08-18 20:01:09 +0000 UTC [ - ]

Take 2 CSAM pictures, known. Combine into one image.

Swing and a miss. Not in the CSAM dataset. Take two images. Encode alternating pixels. Decode to get the original image back. Convert to different encodings, print to PDF or Postscript. Encode as base64 representations of the image file...

Who are we trying to fool here? This is kiddie stuff.

This is more about trying to implant scanning capabilities on client devices.

yeldarb 2021-08-18 20:19:16 +0000 UTC [ - ]

Those are examples of triggering false negatives (CSAM that gets missed by the system), this collision attack is about triggering false positives (non-CSAM that triggers human intervention).

rootusrootus 2021-08-18 20:22:48 +0000 UTC [ - ]

> This is more about trying to implant scanning capabilities on client devices.

Did we think they didn't already have that ability?

salawat 2021-08-18 21:19:43 +0000 UTC [ - ]

No. The brazen part of this is Apple trying to sell the public on this being okay, effective to the point of mitigating the abuse potential. False negative AND false positive have to counterweight the existence of the vector for abuse. That's what you're measuring against. You can't even guarantee that. This isn't even close to a safe or effective move. The risk management on this is hosed.

Barrin92 2021-08-18 20:11:16 +0000 UTC [ - ]

It's kind of silly how this is treated like some deeply technical issue. Apple's 'one in a trillion' claim is either true and the software is useless, because I'm pretty sure pedophiles can figure out what a watermark or a Gaussian blur in Paint.NET is, or it's imprecise and actually detects things, which is the very thing that makes it dangerous to the public.

It's a direct trade-off, and the error tolerance of any such filter is the only thing that makes it useful, so we can basically stop arguing about the depths of implementation details or how high the collision rate of the hashing algorithm is, etc. If this thing is supposed to catch anyone, it needs to be orders of magnitude more lenient than any of those minor faults.

Ashanmaril 2021-08-18 20:54:45 +0000 UTC [ - ]

> I'm pretty sure pedophiles can figure out what a watermark or a gaussian blur in paint net is, or it's imprecise and actually detects things which is the very thing that makes it dangerous to the public

Or better yet, they can just not store their stuff on an iPhone. Meanwhile, millions of innocent people have their photos scanned and risk being reported as pedophiles.

734129837261 2021-08-18 20:32:39 +0000 UTC [ - ]

Exactly right. The tech Apple uses can be one of two things:

1. It requires a perfect 1:1 match (their documentation says this is not the case);

2. Or it has some freedom in detecting a match, probably matching above a certain similarity threshold.

If it's the former, it's completely useless. A watermark or a randomly chosen pixel with a slightly different hue and the hash would be completely different.

So, it's not #1. It's going to be #2. And that's where it becomes dangerous. The government of the USA is going to look for child predators. The government of Saudi Arabia is going to track down known memes shared by atheists, and they will be put to death; heresy is a capital offence over there. And China will probably do their best to track down Uyghurs so they can make the process of elimination even easier.

It's not like Apple hasn't given in to dictatorships in the past. This tech is absolutely going to kill people.

endisneigh 2021-08-18 20:44:21 +0000 UTC [ - ]

What? Why would memes be on the CSAM list?

visarga 2021-08-18 21:18:01 +0000 UTC [ - ]

They got to scan for whatever the local law requires them to scan.

kevin_thibedeau 2021-08-18 19:27:31 +0000 UTC [ - ]

How long until thishashcollisionisnotporn.com is a thing?

mono-bob 2021-08-18 19:50:36 +0000 UTC [ - ]

Sounds like a fun weekend project.

istingray 2021-08-18 19:42:57 +0000 UTC [ - ]

sounds like a future NFT market

eurasiantiger 2021-08-18 20:50:01 +0000 UTC [ - ]

Sounds like hipster T-shirts with hash collision images.

mzs 2021-08-18 20:03:53 +0000 UTC [ - ]

>… Apple also said that after a user passes the 30 match threshold, a second non-public algorithm that runs on Apple's servers will check the results.

>"This independent hash is chosen to reject the unlikely possibility that the match threshold was exceeded due to non-CSAM images that were adversarially perturbed to cause false NeuralHash matches against the on-device encrypted CSAM database," …

stevenalowe 2021-08-18 21:13:04 +0000 UTC [ - ]

Apple wants to enforce one of their corporate policies at my expense: battery, CPU, and unknown risks if/when they're wrong. What could possibly go wrong?

tlogan 2021-08-19 02:41:30 +0000 UTC [ - ]

I have to say that I'm just sad about this. I was really convinced that Apple really believed in privacy and would never do this (have a framework which can scan your device's content). But maybe they already had it and this was like: "there's the Delta variant and Afghanistan in the news, let's make the announcement now."

We are just one stupid terrorist attack from full surveillance of everybody.

balozi 2021-08-18 21:19:57 +0000 UTC [ - ]

They are slowly turning what is a philosophical argument into a technical question. Next step is for them to conjure up a technical answer before claiming victory. Most people are already bored by all the technical talk.

Neural hash this: It's about Trust. It's about Privacy. It's about Boundaries between me and corporations/governments/etc.

ahD5zae7 2021-08-19 11:39:11 +0000 UTC [ - ]

Honest question - as far as I could find out, the files will be scanned on device but before upload to iCloud. Before, they were scanned in the cloud after upload. That's the change as far as I understand it. But if it's scanned before upload, then what is the difference? A few seconds? It would be scanned either way, before or after upload. Is that it?

What I'm basically getting at is: are the files scanned after the user has expressed the intention of uploading them? That's what I understood. Am I wrong? Or are the files scanned the moment they appear on your device, regardless of your iCloud status (even if you have disabled iCloud somehow)?

Edit: typo

1vuio0pswjnm7 2021-08-18 20:36:46 +0000 UTC [ - ]

Perhaps I have missed this in all the discussions of Apple's latest move, but has anyone considered the following questions?

Does Apple's solution only stop people from uploading illegal files to Apple's servers, or does it stop them from uploading the files to any server?

If Apple intends to control the operation of a computer purchased from Apple after the owner begins using it, does Apple have a duty to report illegal files found on that computer and stop them from being shared (anywhere, not just through Apple's datacenters)?

To me, this is why there is a serious distinction between a company detecting and policing what files are stored on their computers (i.e., how other companies approach this problem) and a company detecting and policing what files someone else's computer is storing and can transfer over the internet (in this case, unless I am mistaken, only to Apple's computers).

Mind you, I am not familiar with the details of exactly how Apple's solution works nor the applicable criminal laws, so these questions might be irrelevant. However, I was thinking that if Apple really wanted to prevent the trafficking of ostensibly illegal files, then wouldn't Apple seek to prevent their transfer not only to Apple's computers but to any computer (and also report them to the proper authorities)? What duty does Apple have if they can "see into the owner's computer" and they detect illegal activity? If Apple is in remote control of the computer, e.g., they can detect the presence/absence of files remotely and allow or disallow full user control through the OS, then does Apple have a duty to take action?

judge2020 2021-08-18 20:40:19 +0000 UTC [ - ]

> Does Apple's solution only stop people from uploading illegal images to Apple's servers or does it stop them from uploading the images to any server.

Only applies to iCloud Photos uploads, but the photos are still uploaded: when there's a match, the photo and a 'ticket' are uploaded and Apple's servers (after the servers themselves verify the match[0]) send the image to human reviewers to verify the CSAM before submitting it to police as evidence.

0: https://twitter.com/fayfiftynine/status/1427899951120490497 and https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...

tejohnso 2021-08-18 20:43:35 +0000 UTC [ - ]

> Does Apple's solution only stop people from uploading illegal files to Apple's servers or does it stop them from uploading the files to any server.

Perhaps they're trying to prevent being in "possession" of illegal images by having them on their own servers, rather than preventing copying to arbitrary destinations.

ysavir 2021-08-18 20:39:23 +0000 UTC [ - ]

> does Apple have a duty to report illegal images found on that computer and stop them from being shared

Duty? No, that's the secondary question. The primary question is whether they have the right.

CubsFan1060 2021-08-18 20:42:43 +0000 UTC [ - ]

What has been stated is that this all _only_ happens as photos are about to be uploaded to iCloud Photos (not to be confused with iCloud Backup, a totally separate thing). This is currently done server-side on iCloud Photos. This appears to move it client-side.

endisneigh 2021-08-18 20:25:30 +0000 UTC [ - ]

Am I the only one who finds no issue with this? Personally I'd prefer this to no encryption and scanning in the cloud, which is already possible.

All of the slippery slope arguments have already been possible for nearly a decade now with the cloud.

Can someone illustrate something wrong with this that's not already possible today?

Fundamentally, unless you audited the client and the server yourself, either client-side or backend scanning is possible, and therefore so is this "problem".

Where’s the issue?

fritzw 2021-08-18 21:24:48 +0000 UTC [ - ]

> All of the slippery slope arguments have already been possible for nearly a decade now with the cloud.

That doesn't make it ok. The fact that you and my mom have normalized bad behavior doesn't make it any less offensive to my civil liberties.

> Am I the only one who finds no issue with this?

If you only think about the first order effects, this is great: we catch a bunch of kid sex pedos. The second order effects are much more dire. "Slippery slope," sure... but false arrests, lives ruined based on mere investigations revealing nothing, expanding the role of government peering into our personal lives, resulting suicides, corporations launching these programs - then defending them - then expanding them due to government pressure is *guaranteed*, plus additional PR and propaganda from corporations and government for further invasion of our personal lives due to the marketed/claimed success of these programs. The FBI has been putting massive pressure on Apple for years, due to the dominance and security measures of iOS.

Say what you want about the death penalty, but many, many innocent people have died in a truly horrific way, with some actually being tortured while being executed. That is a perfect example of second order effects on something most of us, without any further information (ending murderous villains is a good thing), would agree on. So many Death Row inmates have been exonerated and vindicated.

edit: https://en.wikipedia.org/wiki/List_of_exonerated_death_row_i...

xadhominemx 2021-08-18 20:45:47 +0000 UTC [ - ]

> All of the slippery slope arguments have already been possible for nearly a decade now with the cloud.

Not only has it been possible for a decade, it’s been happening for a decade. Every major social media company already scans for and reports child pornography to the feds. Facebook submits millions of reports per year.

headShrinker 2021-08-19 10:08:28 +0000 UTC [ - ]

I really don't understand how you can, with a sound mind, compare a public-facing 'social media' platform with my very personal and very private phone photos. These are two completely different things.

xadhominemx 2021-08-19 16:40:32 +0000 UTC [ - ]

They are not scanning photos stored on your phone, they are scanning photos stored on iCloud.

endisneigh 2021-08-18 20:53:46 +0000 UTC [ - ]

Yes, exactly. This is honestly nothing new.

mrkstu 2021-08-18 21:06:45 +0000 UTC [ - ]

This is new for Apple's customers. And it's new in that the device you bought and paid for is nosing through your files.

Apple is introducing a reverse 'Little Snitch': instead of the app warning you what apps are doing on the network, the OS is scanning your photos. Introducing a fifth columnist into a device that you've bought and paid for is a huge philosophical jump from Apple's previous stances on privacy, where they'd gone as far as fighting the Feds over trying to break into a terrorist's iPhone.

endisneigh 2021-08-18 21:15:21 +0000 UTC [ - ]

Luckily for their customers it can be turned off. So, what’s the issue?

mrkstu 2021-08-18 21:34:06 +0000 UTC [ - ]

A huge part of Apple's value proposition is iCloud. If I have to turn that off to keep the spy out of my OS, its value to me is dramatically diminished.

endisneigh 2021-08-18 21:36:42 +0000 UTC [ - ]

Since you presumably don’t trust Apple to scan your photos, it sounds like Apple might not be for you, then. Who will you move to?

vorpalhex 2021-08-19 02:35:09 +0000 UTC [ - ]

Motte and Bailey much?

"Well you can turn it off" to "Well, but then you just don't trust Apple".

Clearly these are users who did trust Apple. Apple betrayed their trust. Given that Apple bulk handed over iCloud data to China, I don't really believe their pinky promise that they are, by policy only, going to resist government use of this tech. They can cave to government cases _and_ the government can certainly force them to.

endisneigh 2021-08-19 03:24:33 +0000 UTC [ - ]

The point is that turning it off resolves the issue, but if someone refuses to turn it off because they want to use iCloud there’s clearly a contradiction.

Therefore you must trust Apple. So if you still have issues then you don’t trust Apple.

mrkstu 2021-08-18 22:03:03 +0000 UTC [ - ]

In the past I trusted Apple to resist government efforts to spy on me to the best the law allowed. Now they are innovating ways to help a government spy literally live in my pocket.

Trust is fragile and Apple has taken what in the past I believed it understood to be a strategic advantage and stomped it into little pieces.

creddit 2021-08-18 20:49:04 +0000 UTC [ - ]

Why prefer your owned device tattling on you to Apple looking at data you give them on their servers?

The reason people don't like this, as opposed to, for example, Dropbox scanning your synced files on their servers, is that a compute tool you ostensibly own is now turned completely against you. Today, that is for CSAM, tomorrow, what else?

endisneigh 2021-08-18 20:52:22 +0000 UTC [ - ]

There's no difference - all cloud services that aren't encrypting your data are subject to the same thing. Dropbox could do the same thing tomorrow. If we're talking about hypotheticals, any vendor that handles your unencrypted files can do this now.

creddit 2021-08-18 21:28:48 +0000 UTC [ - ]

You're not understanding. Dropbox already does scan your files, and I'm comparing Apple's actions explicitly to that fact. I'm pointing out that people don't care about that because you're literally, voluntarily, giving Dropbox your files. Here, Apple is controlling your phone to tattle on you at an OS level (yes, I know, Apple says you need to turn on iCloud Photos for this to run, but the precedent is the problem).

headShrinker 2021-08-19 10:24:00 +0000 UTC [ - ]

> Dropbox already does scan your files

> people don’t care about that because you’re literally, voluntarily, giving Dropbox your files.

Correct and I agree. I don’t upload my most personal photos to Dropbox for this very specific reason. In fact I stopped using Dropbox when Condi Rice joined the board because she lacks good sense and doesn’t respect civil rights. See ‘The Patriot Act’ and ‘the Invasion of Afghanistan’. It was easy to stop using Dropbox because the alternatives were vast. Apple has me very purposely locked in to this scheme to where the alternatives are a huge transition and compromise on privacy no matter where I turn.

stevenalowe 2021-08-18 21:16:53 +0000 UTC [ - ]

big difference: you're now paying for the compute load, instead of the cloud providers, for something that offers no direct benefit to you and will most likely betray you in unforeseen ways in the future (especially in less 'free' countries)

mavhc 2021-08-18 21:19:52 +0000 UTC [ - ]

Why did you think you owned a closed source device?

creddit 2021-08-18 21:36:32 +0000 UTC [ - ]

Open Source fanatics are the worst. Even if all the software on your phone was Open Source, you wouldn’t have the mental bandwidth to ever verify you trust it. You’re just farming out your trust to a different set of people and hoping they’re more trustworthy than those with closed source. At best this MAY be reasonable because of incentive alignment but that’s super weak.

Not to mention the meaning of ownership is completely unrelated to ability to view the designs of a given object. I think I own the fan currently blowing air at me without ever having seen a schematic for its controls circuitry just fine and everyone for all of history has pretty much felt the same.

mavhc 2021-08-18 21:50:57 +0000 UTC [ - ]

That's because your fan is hardware and your phone is software. The hardware in your phone is only 1% of what your phone is.

The point is it's hard to hide your evil plans in daylight. Either you trust Apple or you don't. Same with Microsoft's telemetry: they wrote the whole OS, and if they were evil they'd have 10000 easier ways to do it.

All Apple has to do is lie, you'll never know.

petersellers 2021-08-18 20:39:51 +0000 UTC [ - ]

Personally, I'd prefer no privacy intrusions at all.

The issue is that Apple previously was not intruding into their users' privacy (at least publicly), but now they are.

It sounds like your argument is that Apple could have been doing this all along and just not telling us. I find that unlikely mainly because they've marketed themselves as a privacy-focused company up until now.

shapefrog 2021-08-18 21:20:10 +0000 UTC [ - ]

> The issue is that Apple previously was not intruding into their user's privacy

Apple reserves the right at all times to determine whether Content is appropriate and in compliance with this Agreement, and may screen, move, refuse, modify and/or remove Content at any time, without prior notice and in its sole discretion, if such Content is found to be in violation of this Agreement or is otherwise objectionable.

You must not have been paying attention for the last 20 years.

endisneigh 2021-08-18 20:42:20 +0000 UTC [ - ]

In your situation, can't you just turn off the scanning? What's the issue?

petersellers 2021-08-18 20:50:18 +0000 UTC [ - ]

There are two different "features" being implemented - Messages scanning for minors on a family plan (which can be turned off) and iCloud Photo scanning (which can't be turned off, as far as I know). So no, you can't just turn it off.

CubsFan1060 2021-08-18 20:57:20 +0000 UTC [ - ]

petersellers 2021-08-18 21:02:07 +0000 UTC [ - ]

Only if you disable iCloud photos completely. So no, you cannot turn off the feature unless you stop using iCloud photos.

CubsFan1060 2021-08-18 21:06:42 +0000 UTC [ - ]

Correct. I'm not sure how that's a distinction from "you can't turn it off" though. I believe they've been doing this scan server-side for quite some time already.

petersellers 2021-08-18 21:22:13 +0000 UTC [ - ]

Here's the distinction: I used to be able to use iCloud photos without having my photos scanned, and now I can't. So I have to make a choice of either dropping iCloud photos completely or submit to having all of my photos scanned.

I don't think they have been doing server-side scanning until now, hence the publicity. Do you have any evidence that shows they've been doing this before?

CubsFan1060 2021-08-18 21:41:09 +0000 UTC [ - ]

petersellers 2021-08-18 22:31:29 +0000 UTC [ - ]

The article you linked makes a surprising claim, and it contradicts what the EFF said recently about this here - https://www.eff.org/deeplinks/2021/08/apples-plan-think-diff...

> Currently, although Apple holds the keys to view Photos stored in iCloud Photos, it does not scan these images

It also seems weird that the EFF wouldn't have complained about this before if Apple was known to be doing server-side scanning for some time now.

I also am not going to watch through an hour video, but I scanned the transcript and I didn't see anything that said that Apple currently (at that time) scanned content.

CubsFan1060 2021-08-18 22:53:25 +0000 UTC [ - ]

I can't find any articles about the EFF complaining about Gmail, OneDrive, or Discord doing similar scanning either. All of those services (and more) do similar scans.

endisneigh 2021-08-18 20:59:34 +0000 UTC [ - ]

But you can tho…

petersellers 2021-08-18 21:03:35 +0000 UTC [ - ]

Please show evidence where the feature can be turned off (without having to completely disable iCloud photos).

endisneigh 2021-08-18 21:16:05 +0000 UTC [ - ]

You turn it off by turning off iCloud photos, I never claimed otherwise.

If you don’t trust Apple why would you use iCloud anyway? Makes no sense.

petersellers 2021-08-18 22:27:30 +0000 UTC [ - ]

> You turn it off by turning off iCloud photos, I never claimed otherwise.

I just have to point out the ridiculousness of this statement. With your logic any feature in any product can be "turned off" by not using the entire product at all. For example, the radio in my car sounds like shit, I guess I should just stop driving completely to avoid having to hear it.

In reality, this new "feature" will be a requirement of using iCloud Photos. The feature itself cannot be turned off. If your answer is to stop using iCloud Photos, that is no help for the millions of people who currently use iCloud Photos.

> If you don’t trust Apple why would you use iCloud anyway? Makes no sense.

I've trusted Apple for a long time because I felt like they were one of the only companies that cared about consumer privacy. After these actions I am less convinced of that. I'm not sure why that stance is so surprising.

endisneigh 2021-08-18 22:40:47 +0000 UTC [ - ]

> In reality, this new "feature" will be a requirement of using iCloud Photos. The feature itself cannot be turned off. If your answer is to stop using iCloud Photos, that is no help for the millions of people who currently use iCloud Photos.

iCloud Photos can be used on Windows, for example. This scanning only applies to iOS devices. You could use iCloud Photos and not be subject to the scanning.

> I've trusted Apple for a long time because I felt like they were one of the only companies that cared about consumer privacy. After these actions I am less convinced of that. I'm not sure why that stance is so surprising.

Nothing about what they're doing is contradictory with privacy beyond what they're already doing. The only reason they're even implementing it this way is because they do care about privacy. They could just not encrypt and scan on the server like Google, Microsoft, Dropbox, Box.com and more.

petersellers 2021-08-18 23:12:19 +0000 UTC [ - ]

> iCloud Photos can be used on Windows, for example. This scanning only applies to iOS devices. You could use iCloud Photos and not be subject to the scanning.

That's great for the <0.001% of people who use iCloud Photos without an iPhone. Everyone else is SOL.

>Nothing about what they're doing is contradictory with privacy beyond what they're already doing. The only reason they're even implementing it this way is because they do care about privacy. They could just not encrypt and scan on the server like Google, Microsoft, Dropbox, Box.com and more.

False dichotomy. Apple doesn't have to do either of these things.

endisneigh 2021-08-18 23:20:51 +0000 UTC [ - ]

NCMEC is an arm of the government. Either they share voluntarily or they get subpoenaed endlessly. Why do you think all of these companies even do this? lol

In any case, I've given you a solution for how to use iCloud and not be scanned: take your photos, sync your iDevice to your computer, and manually upload to iCloud Photos. There you go.

petersellers 2021-08-18 23:39:07 +0000 UTC [ - ]

> NCMEC is an arm of the government. Either they share voluntarily or they get subpoenaed endlessly.

A subpoena is a lot different than scanning every single photo on every user's device automatically.

> Why do you think all of these companies even do this? lol

Because most companies don't give a shit about privacy? And they will cave at even the slightest government pressure to do so. Honestly it's probably easier for them that way (but worse for the consumer).

> in any case I've given you the solution on how to use iCloud and not be scanned. Take your photos, sync your iDevice to your computer and manually upload to iCloud photos. there you go.

Gee thanks, your "solution" is 1000x harder to use than just using iCloud Photos on your iPhone. One of the biggest selling points of it now is for convenience, and this is pretty obvious so I'm starting to think you're just trolling at this point.

endisneigh 2021-08-19 00:11:43 +0000 UTC [ - ]

If Apple really didn’t care about privacy they’d end e2ee and scan on the server side like Google and Microsoft.

> Because most companies don't give a shit about privacy? And they will cave at even the slightest government pressure to do so. Honestly it's probably easier for them that way (but worse for the consumer).

Most people don't care. I'm giving you solutions and you're talking about convenience and how hard it is. Plugging in your phone is trivial. I assume you're going to stick with iCloud even despite this "privacy" issue? If not, what are you switching to?

petersellers 2021-08-19 01:12:56 +0000 UTC [ - ]

> If Apple really didn’t care about privacy they’d end e2ee and scan on the server side like Google and Microsoft.

It's not black or white. Clearly Apple does care more about privacy than most other tech companies. That doesn't mean we shouldn't be critical of them when they make a mistake.

Also, Apple doesn't use e2ee now. AFAIK it's possible for them to decrypt the contents of your iCloud content, and they will do so and forward your data if legally required to.

> Most people don’t care. I’m giving you solutions and you’re taking about convenience how hard it is. Plugging in your phone is trivial.

I don't see what is so hard for you to understand about this - because of Apple's decision, I gain nothing and lose privacy. I shouldn't have to jump through extra hoops to avoid that, and it's reasonable to dislike Apple's position on this. In order to replicate the same functionality I would have to remember to sync my phone every night, and if I ever forgot and my phone died, I just lost data. Yours is an indefensible position when a MUCH easier solution exists today.

> I assume you’re going to stick with iCloud even despite this “privacy” issue? If not, what are you switching to?

That's a bold assumption. I haven't decided yet, for a couple reasons. One is that it's a pain in the ass to switch providers, so I'm going to wait and see if Apple actually follows through with it. Two is that it's possible that Apple is only implementing this so that they can in fact implement full e2ee and then tell law enforcement to kick rocks when they ask for user data (only allowing them to see the CSAM results before they are uploaded to the cloud). I might be willing to accept that compromise, but it's not clear that that is what their plan is.

basisword 2021-08-19 07:30:49 +0000 UTC [ - ]

If it's in the cloud you can choose not to use the cloud service. If it's on device - a device you've spent over $1000 on and already own - you don't have a choice (unless you want to forgo all updates, including security patches).

the8472 2021-08-18 21:31:16 +0000 UTC [ - ]

> Personally I’d prefer this than no encryption and scanning in the cloud, which is already possible.

What is also possible: No scanning on your device and encrypted cloud storage. E.g. borg + rsync.net, mega, proton drive.

fuzzer37 2021-08-18 20:33:46 +0000 UTC [ - ]

> Personally I’d prefer this than no encryption and scanning in the cloud, which is already possible.

Why is that the alternative? How about everything is encrypted and nothing is scanned.

endisneigh 2021-08-18 20:38:46 +0000 UTC [ - ]

This is already possible if you self host your stuff. I’m talking about iCloud specifically here - backups are not encrypted so it’s either scan in the backend or scan on the client.

If you don’t want to be scanned you can turn it off. I honestly don’t see the issue. It seems the only thing people can say are hypothetical situations here.

saynay 2021-08-18 20:48:02 +0000 UTC [ - ]

Realistically, because if they don't have some way to scan for it, countries / the EU are going to require them to provide a backdoor to scan it. There are already proposals in the EU to that effect.

robertoandred 2021-08-18 20:29:33 +0000 UTC [ - ]

Never let a good Apple controversy go to waste. Clickbait blogs are thrilled to have something like this.

LatteLazy 2021-08-18 21:17:00 +0000 UTC [ - ]

>Personally I’d prefer this than no encryption and scanning in the cloud

How about neither? Just let people have their privacy. Some will misuse it. That's life.

aborsy 2021-08-18 20:17:58 +0000 UTC [ - ]

With the rise of end-to-end encrypted data and messaging, such as Signal and WhatsApp, it makes sense for governments to shift the search to users' devices and operating systems rather than the cloud, before the content gets encrypted.

ALittleLight 2021-08-18 20:49:56 +0000 UTC [ - ]

Would this work as an attack?

1. Get a pornographic picture involving young though legal actors and actresses.

2. Encode a nonce into the image. Hash it, checking for collisions against the CSAM hash set. If you've found a collision, go on to the next step; if not, update the nonce and try again. (A rough sketch of this loop follows below.)

3. You now have an image that, on visual inspection, will appear plausibly like CSAM, and to automated detection will appear to be CSAM. Though, presumably, it is not illegal for you to have this image, as it is, in fact, legal pornography. You can now text it to anyone with an iPhone, who will be referred by Apple to law enforcement.
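
A rough sketch of the loop in step 2, under heavy assumptions: perceptual_hash is a toy average hash standing in for NeuralHash, target_hashes is assumed to be known to the attacker, and encoding the nonce as small pixel perturbations is just one arbitrary choice. Nothing here is known to work against Apple's actual system; it only illustrates the shape of the proposed attack.

    # Illustrative brute force for step 2: perturb an image with a nonce until
    # its perceptual hash lands in a target set. All names here are assumptions.
    import numpy as np
    from PIL import Image

    def perceptual_hash(img: Image.Image, size: int = 8) -> int:
        small = np.asarray(img.convert("L").resize((size, size)), dtype=np.float32)
        return int("".join("1" if b else "0" for b in (small > small.mean()).flatten()), 2)

    def embed_nonce(img: Image.Image, nonce: int) -> Image.Image:
        # One arbitrary encoding: use the nonce to nudge a corner block of pixels.
        arr = np.asarray(img).copy()
        rng = np.random.default_rng(nonce)
        block = arr[:16, :16].astype(np.int16)
        arr[:16, :16] = np.clip(block + rng.integers(-8, 9, block.shape), 0, 255).astype(np.uint8)
        return Image.fromarray(arr)

    def search_collision(img, target_hashes, max_tries=100_000):
        for nonce in range(max_tries):
            candidate = embed_nonce(img, nonce)
            if perceptual_hash(candidate) in target_hashes:
                return nonce, candidate
        return None

Whether such a search would terminate in any reasonable time against the real system is exactly what the replies below question.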

shapefrog 2021-08-18 21:33:07 +0000 UTC [ - ]

Your hash will match, say, image 105 of the dataset. Upon visual inspection, your 'legal porn' is going to have to at least look passably like image 105 of the dataset to get anywhere.

So at this point we have an image that computers think is CSAM and people think is CSAM, and when held up next to the original verified horrific image everyone agrees is the same image. At this point, someone is going to ask, rightly so, where that came from.

In order to generate this attack, you have had to go out and procure, deliberately, known CSAM. Ignoring that it would be easier just to send that to the target, rather than hiring talent to recreate the pose of a specific piece of child porn (or 30 pieces to trigger the reporting levels), the most likely person by orders of magnitude to be prosecuted in this scenario is the attacker.

rlpb 2021-08-18 22:17:45 +0000 UTC [ - ]

> Upon visual inspection, your 'legal porn' is going to have to at least look passably like image 105 of the dataset to get anywhere.

Define "get anywhere". Why won't you get raided by the police and have all your devices seized first?

shapefrog 2021-08-18 22:23:07 +0000 UTC [ - ]

No, it probably wouldn't even get to the desk of the police. The end of the road would be NCMEC, to whom Apple would refer hash-matching images that pass human verification as being close enough to porn.

If your 30 or so hash-matching images matched their corresponding known CSAM, then that goes on to the police and then they knock on your door.

rlpb 2021-08-18 23:21:24 +0000 UTC [ - ]

> to whom apple would refer hash matching images that pass human verification

I am under the impression that Apple's scheme allows them only to verify the output of the matching algorithm (the "safety vouchers"), and not the image content itself. So in the hypothetical situation described in the thread, it won't be possible for Apple to detect the false positive, and they could pass on the report to NCMEC.

I fear that this will lead to a law enforcement raid without any actual human verification of the offending image itself.

If I'm wrong, I welcome citations which demonstrate the opposite.

shapefrog 2021-08-19 09:41:34 +0000 UTC [ - ]

Happy to provide a citation for how the Apple scheme works [1].

You should also point out that NCMEC themselves are not law enforcement.

[1] https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...

rlpb 2021-08-19 12:12:42 +0000 UTC [ - ]

Right, and what in there suggests to you that Apple can view an image after a sufficient number of images match? I see the opposite: "Apple can’t access metadata or visual derivatives for matched CSAM images until a threshold of matches is exceeded for an iCloud Photos account."

That suggests that Apple can't access the actual matched CSAM images at all.

> You should also point out that the NCMEC themselves are not law enforcment.

I don't think the distinction is relevant. The point is that they will get passed on for enforcement purposes, and at that point, innocent parties will find themselves raided and their devices seized without any human actually manually verifying that it is CSAM first.

Ashanmaril 2021-08-18 21:18:24 +0000 UTC [ - ]

If you're gonna go through that much effort, you might as well just text someone an actually illegal image from something other than an iPhone

You're already attempting to frame someone for a crime, might as well commit another crime while you're at it

arn 2021-08-18 20:58:29 +0000 UTC [ - ]

How would you know if it's a CSAM collision? I don't believe that database is publicly available anywhere, for obvious reasons.

ALittleLight 2021-08-18 21:12:22 +0000 UTC [ - ]

Good point. I can offer three possibilities to how you might know. First, a data leak of the material. Second, law enforcement presumably gives hashes of known CSAM to service operators so they can detect and report it. You could pose as or be the operator of such a service and get the hashes that way. Third, if you were a government operator you may have access to the hashes that way. (Although I guess corrupt government agents would have other easier ways of getting to you)

FabHK 2021-08-18 21:13:48 +0000 UTC [ - ]

Yes. That, plus it's unclear whether you can create a collision with a nonce; it's a perceptual hash, not a cryptographic one. Lastly, computing that collision might be so expensive that you could instead spend the compute on SHA-256 hashes, i.e. mine BTC, and use the money to pursue your sinister goals via other means.

arsome 2021-08-18 21:09:39 +0000 UTC [ - ]

Apple's client clearly has some revealing information on that...

robertoandred 2021-08-18 21:19:06 +0000 UTC [ - ]

Why would you save to your personal photo library some porn that a creepy rando sent to you in a text?

YeBanKo 2021-08-19 05:00:10 +0000 UTC [ - ]

> Apple however told Motherboard in an email that that version analyzed by users on GitHub is a generic version, and not the one final version that will be used for iCloud Photos CSAM detection. Apple said that it also made the algorithm public.

When did they make NeuralHash public?

almostdigital 2021-08-18 21:41:29 +0000 UTC [ - ]

It's clear now that Apple is never going to back down. It's bittersweet, I'm sad to see them go but also excited for what I think will be a FOSS renaissance.

jl6 2021-08-18 19:54:47 +0000 UTC [ - ]

> Anti-Child Abuse Imagery Tech

Room for improvement in the headline.

bruce343434 2021-08-18 21:23:46 +0000 UTC [ - ]

Oh man. They really want to die on this hill don't they.

m3kw9 2021-08-18 20:42:22 +0000 UTC [ - ]

Hashes can always collide, but how likely is it for NeuralHash?

_trampeltier 2021-08-18 21:24:29 +0000 UTC [ - ]

Apple does the check just before you upload to iCloud. How much money do they save if it's checked client-side? How much would it cost for Apple to do it on their own servers, like everybody else does?

63 2021-08-18 21:37:06 +0000 UTC [ - ]

IIRC the rationale for doing it client-side isn't saving money, but maintaining encryption. If it's checked client-side, Apple should never get unencrypted uploads unless they're suspected to be CSAM.

_trampeltier 2021-08-19 08:04:01 +0000 UTC [ - ]

Yes, that's the official version. I still wonder what the cost difference would be. At this scale it is not free.

eurasiantiger 2021-08-18 20:48:21 +0000 UTC [ - ]

”the analyzed code is not the final implementation that will be used with the CSAM system itself and is instead a generic version”

So they are already running a generic version of this system since iOS 14.3?

gjsman-1000 2021-08-18 19:31:51 +0000 UTC [ - ]

For everyone upset about Apple's CSAM scanning, I think we all forgot about the EARN IT Act. It nearly passed last year, but Congress's session ended before it could be voted on. It had bipartisan support and would've virtually banned E2E of any kind, and it would have required scanning everywhere according to the recommendations of a 19-member board of NGOs and unelected experts. The stated reason for this mandatory backdoor was child safety... even though AG Barr admitted that reducing the use of encryption had useful side effects.

If you are Apple, even though EARN IT failed... you know where Washington's heart lies. Is CSAM scanning a "better alternative", a concession, an appeasement, a lesser evil, in the hope this prevents EARN IT from coming back?

Also, many people forgot about the Lawful Access to Encrypted Data Act of 2020, or LAED, which would have unilaterally banned E2E encryption in its entirety and required that all devices featuring encryption be unlockable by the manufacturer. That was also on the table.

samename 2021-08-18 19:38:14 +0000 UTC [ - ]

Why can't we have the alternative we already have, where our phones aren't scanned and E2EE exists without compromise? Why do we have to accept it any other way?

If you’re trying to frame this as “we need to prevent Congress from ever passing something like the EARN IT act”, I agree. Apple and other tech companies already lobby Congress. Why aren’t they lobbying for encryption?

gjsman-1000 2021-08-18 19:39:34 +0000 UTC [ - ]

They did, they lobbied heavily against EARN IT, and so did groups like the ACLU and EFF. However, it didn't stop Congress members from moving it along through the process - they were only stopped because Congress just ran out of time. It was clear that Congress was not interested in listening at all to the lobbyists on the issue.

It's clear that EARN IT could literally be revived any day if Apple didn't do something to say "we don't need it because we've already satisfied your requirements."

orangecat 2021-08-18 19:56:36 +0000 UTC [ - ]

It's clear that EARN IT could literally be revived any day if Apple didn't do something to say "we don't need it because we've already satisfied your requirements."

Alternatively, "Apple has shown that it's possible to do without undue hardship, so we should make everyone else do it too".

gjsman-1000 2021-08-18 19:59:35 +0000 UTC [ - ]

Congress doesn't give a rip about "undue hardship." Otherwise why would we pay income tax?

They were going to legally mandate that everything be scanned through methods less private than the ones Apple has developed here, through EARN IT and potentially LAED (which would have banned E2E in all circumstances and any device that could not be unlocked by the manufacturer). While that crisis was temporarily averted, the risk of it coming back was and is very real.

Apple decided to get ahead of it with a better solution, even though that solution is still bad. It's a lesser evil to prevent the return of something worse.

cwizou 2021-08-18 20:04:02 +0000 UTC [ - ]

While I broadly agree that this (and other attempts in Australia, and rumblings about it in the UK) may have been the impetus for developing the technology, I'm not sure it explains why they released it on Photos, where Apple already holds the keys - which doesn't go very far, since you can look at your entire photo roll on icloud.com with just your username and password.

That tech is not being deployed on iMessage, which is the only e2ee(ish) service from Apple (along with Keychain) and is what those legislative attempts are usually targeting. One could argue it would have made sense (technically) there though, sure.

Was it a reason to release it preventively, on something unrelated, to be in the good graces of legislators? I'm not sure it's a good calculation, and it doesn't cover other platforms like Signal and Telegram, which would still be seen as a problem by those legislators and require them to legislate anyway.

heavyset_go 2021-08-18 19:50:06 +0000 UTC [ - ]

This seems like speculation with no evidence. The government cares about more than just CSAM, they care about terrorism, human and drug trafficking, organized crime, gangs, fraud, drug manufacturing etc.

This would only make sense if Apple intends to expand their CSAM detection and reporting system to detect and report those other things, as well.

gjsman-1000 2021-08-18 19:52:10 +0000 UTC [ - ]

The EARN IT Act would have basically legally mandated a backdoor in all services, with shifting recommendations, in the name of preventing the spread of CSAM. Sound familiar?

Also, there is another reason for the CSAM detection and reporting system. With Apple's CSAM scan in place, the big "excuse" Congress was planning to use through EARN IT to ban E2E is defused, meaning Apple now has the potential to add E2E to its iCloud service before Congress can find a different excuse.

falcolas 2021-08-18 19:54:31 +0000 UTC [ - ]

CSAM scanning is only one type of scanning which is performed at the behest of governments and corporations around the world. This alone will not allow for end-to-end encryption of items in the cloud.

There would need to be end-device scanning for arbitrary objects, including full text search for strings including 'Taiwan', 'Tiananmen Square', '09 F9', and so forth to even begin looking at e2e encryption of your items in the cloud.

At which point… what's the point?

gjsman-1000 2021-08-18 20:02:44 +0000 UTC [ - ]

Not really. E2E iMessage is still available in China if you disable the iCloud Backup. It's also the only messenger in China with an E2E option.

charcircuit 2021-08-18 20:28:16 +0000 UTC [ - ]

> which would have unilaterally banned E2E encryption entirely

It did not do this. The bill essentially asked for search warrants to become part of the protocol. If your only solution for making search warrants work is to stop encrypting data, I feel you are intentionally ignoring other options to make this seem worse than it is.

stormbrew 2021-08-18 20:44:45 +0000 UTC [ - ]

I am very intrigued about what you think the other options are that preserve E2E encryption. How do you make search warrants part of the process without some kind of escrow or third party involvement, at which point it is no longer "end to end"?

alerighi 2021-08-18 19:39:07 +0000 UTC [ - ]

You can't ban encryption; it's practically impossible, like banning math.

vineyardmike 2021-08-18 19:59:47 +0000 UTC [ - ]

You can ban its use; you can't ban its existence. You can't mandate that math work differently, you can only mandate that people not use it.

It's amazing, since this would have decimated the American tech sector in countless unforeseen ways.

Unklejoe 2021-08-18 20:33:20 +0000 UTC [ - ]

All they have to do is force Apple and Google to ban it and any apps that use it from their app stores, and they've effectively banned encryption for like 99% of people.

eurasiantiger 2021-08-18 20:51:36 +0000 UTC [ - ]

Well, we are already banning numbers (see: DeCSS), what’s the big difference?

In some countries even discussing the application of certain numbers is unlawful.

rolph 2021-08-18 19:44:54 +0000 UTC [ - ]

In a loose sense, mathematics and its associated notations are a form of encryption.

bitwize 2021-08-18 20:28:08 +0000 UTC [ - ]

The ban won't make encryption disappear. You can still have it. But if you're found using it, it's a felony.

gjsman-1000 2021-08-18 19:40:22 +0000 UTC [ - ]

If you can get jailed for speech, you can get jailed for encryption.

jchw 2021-08-18 19:58:16 +0000 UTC [ - ]

Technically in the real world you can get jailed for anything as long as corruption exists and humans remain imperfect.

But you shouldn’t get jailed for protected speech and you shouldn’t get jailed for preserving your privacy (via encryption or otherwise.) As cynical as people may get, this is one thing that we have to agree on if we want to live in a free society.

And above all, most certainly, we shouldn’t allow being jailed over encryption to become codified as law, and if it does, we certainly must fight it and not become complacent.

Apathy over politics, especially these days, is understandable with the flood of terrible news and highly divisive topics, but we shouldn’t let the fight for privacy become a victim to apathy. (And yes, I realize big tech surveillance creep is a fear, but IMO we’re starting to get into more direct worst cases now.)

jedmeyers 2021-08-18 20:17:49 +0000 UTC [ - ]

> If you are Apple, even though EARN IT failed... you know where Washington's heart lies.

If you are IG Farben, you know where Berlin's heart lies...

bArray 2021-08-19 00:24:28 +0000 UTC [ - ]

> [..] the overall system is designed to account for this to happen in general, and that the analyzed code is not the final implementation that will be used with the CSAM system itself and is instead a generic version.

I think the claim here is that you won't have access to the source images, and therefore generating collisions will be more difficult. But if you do have access to the source images, this has been shown to be trivial. That of course doesn't stop nation states from generating images that cause hash collisions; in fact, they would be incentivized to do so.

I would also add that Apple is behind the curve; attempts to crack the hashing algorithm more efficiently are still ongoing: https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX/issue...
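As a rough sketch of why model access matters (the hash below is a made-up differentiable stand-in, not Apple's NeuralHash, and every name and parameter is illustrative): once you hold the model, you can simply gradient-descend an arbitrary image until its fingerprint matches a chosen target.

    # Toy demo only: NOT Apple's NeuralHash, just a made-up differentiable
    # perceptual hash, to show why having the model makes collisions easy.
    import torch

    torch.manual_seed(0)

    def hash_logits(img, proj):
        # Downsample to 16x16, project the 256 pooled values to 96 bit scores.
        feat = torch.nn.functional.avg_pool2d(img, 8).flatten(1)
        return feat @ proj

    def fingerprint(img, proj):
        return torch.sign(hash_logits(img, proj))  # the 96-"bit" fingerprint

    proj = torch.randn(256, 96)           # stand-in for the learned model
    target = torch.rand(1, 1, 128, 128)   # image whose fingerprint we want to hit
    target_bits = fingerprint(target, proj)

    attack = torch.rand(1, 1, 128, 128, requires_grad=True)  # start from noise
    opt = torch.optim.Adam([attack], lr=0.05)
    for _ in range(2000):
        opt.zero_grad()
        # Hinge loss: push every bit score past the target bit's sign.
        loss = torch.relu(1.0 - hash_logits(attack, proj) * target_bits).mean()
        loss.backward()
        opt.step()
        attack.data.clamp_(0, 1)          # keep it a valid image

    match = (fingerprint(attack, proj) == target_bits).float().mean().item()
    print(f"fingerprint bits matching the target: {match:.0%}")  # typically ~100%

In the real case the optimisation target is the extracted model's output for a chosen hash, which is roughly what the public PoCs in the linked issue do.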

> [..] not the final implementation [..]

Why on earth would you invite people to come and test your algorithm and then say "sure, you broke it, but it's not the real one"? That rather defeats the point and seems like some bait-and-switch bullshit. I suspect this is retroactive cope from management realising they can't deploy this version, and that whatever they do deploy will need to be heavily modified.

> If Apple finds they are CSAM, it will report the user to law enforcement.

One statistic I want to know: how many people already trigger this report function in the wild? Right now is surely when they would see the largest number of positives they will ever get; if it turns out to be 0%, Apple should just scrap it.

> Apple also said that after a user passes the 30 match threshold, a second non-public algorithm that runs on Apple's servers will check the results.

So to avoid reporting, simply block Apple's servers? Also, security by obscurity is not security: keeping the algorithm private just means it's not properly tested and Apple is not held to account.

> "Apple actually designed this system so the hash function doesn't need to remain secret, as the only thing you can do with 'non-CSAM that hashes as CSAM' is annoy Apple's response team with some garbage images until they implement a filter to eliminate those garbage false positives in their analysis pipeline," Nicholas Weaver, senior researcher at the International Computer Science Institute at UC Berkeley, told Motherboard in an online chat.

No. A report could be treated as grounds for law enforcement to do a full search. Imagine trying to explain to a judge why your iPhone shouldn't be searched over a false-positive CSAM hash collision caused by a malicious website you visited or a text message you received.

guerrilla 2021-08-18 20:55:42 +0000 UTC [ - ]

Something just occurred to me. If this goes forward, and then Microsoft and/or Google copy it, then we might see lawmakers expecting it. If that becomes the case, then it'd be yet another barrier to entry (in addition to a cop on every phone.)

Isn't that where we already are with things like Article 17 of the EU's Copyright Directive?

shuckles 2021-08-18 21:05:05 +0000 UTC [ - ]

This is already the world we live in, with vendors like Thorn building SaaS solutions for people who accept UGC but don’t have the clout to integrate with NCMEC and Microsoft PhotoDNA directly.

trident5000 2021-08-18 20:52:17 +0000 UTC [ - ]

Erosion of privacy and freedom always starts with "but think of the children"

joelbondurant 2021-08-18 20:40:17 +0000 UTC [ - ]

ALL the dozens of people I know at Apple would do anything to create a job looking at kiddie porn.

throwawaymanbot 2021-08-18 20:26:35 +0000 UTC [ - ]

This was never about protecting children, it seems. It was about the ability to create and install the infrastructure for mass surveillance of people via their smart devices.

Hashes can be created for anything on a phone, and hash collisions enable "near matches" of hashes (similar items).

Let's pretend you use your face to log in to an iPhone, and there is a notice out for a certain person. If your face matches the hash, will you be part of the scan? You betcha.

zakember 2021-08-18 20:52:21 +0000 UTC [ - ]

Is no one going to talk about how Apple is implementing this?

If Apple is training a neural network to detect this kind of imagery, I would imagine there are thousands, if not millions, of child pornography images on Apple's servers being used by their own engineers to train this system.

arn 2021-08-18 20:53:38 +0000 UTC [ - ]

It's not an AI classifier trained on that imagery, and it's not trying to interpret image content. It's trying to identify specific known images, using a fuzzy fingerprint match.
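If it helps, here is a rough sketch of what a fuzzy fingerprint looks like, using a classic "difference hash" rather than Apple's actual NeuralHash (the file names are hypothetical):

    # Illustration only: a classic "difference hash" (dHash), not Apple's
    # NeuralHash. Small edits (recompression, resizing) leave most bits alone,
    # unlike a cryptographic hash, so matching is "distance below a threshold".
    from PIL import Image

    def dhash(path, size=8):
        img = Image.open(path).convert("L").resize((size + 1, size))
        px = list(img.getdata())
        bits = 0
        for row in range(size):
            for col in range(size):
                left = px[row * (size + 1) + col]
                right = px[row * (size + 1) + col + 1]
                bits = (bits << 1) | (left > right)  # 1 bit per local gradient
        return bits

    def hamming(a, b):
        return bin(a ^ b).count("1")

    # Hypothetical files: two visually identical copies differ by a few bits.
    # print(hamming(dhash("original.jpg"), dhash("recompressed.jpg")))

The point is that matching is a distance comparison against fingerprints of already-known images, not a classifier guessing what a new photo depicts.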

zimpenfish 2021-08-18 21:11:25 +0000 UTC [ - ]

> If Apple is training a neural network to detect this kind of imagery

NCMEC generate the hashes using their CSAM corpus.

firebaze 2021-08-18 19:59:05 +0000 UTC [ - ]

I think we're up for a new Apple CEO in a few days.

csilverman 2021-08-18 20:13:00 +0000 UTC [ - ]

The turmoil of Apple abruptly booting the CEO who's overseen the most obscenely prosperous stretch in its history would cost the company vastly more than the current controversy about CSAM.

I'm not saying this as a fan of either Cook or their anti-CSAM measures; I'm neither, and if anyone is ever wrongfully arrested because Apple's system made a mistake, Cook may well wind up in disgrace depending on how much blame he can/can't shuffle off to subordinates. I don't think we're there yet, though.

atonse 2021-08-18 20:30:09 +0000 UTC [ - ]

He's not Ballmer, but Ballmer also presided over record profits even though MS barely did anything remotely interesting on his watch.

Just saying that money hides problems.

cirrus3 2021-08-18 23:38:29 +0000 UTC [ - ]

They are down to hash collision arguments against this now?

Sounds pretty desperate.

As if any normal user is going to upload a photo to iCloud that is a collision.

The fact that such images are possible to generate means nothing by itself.

Also, they would need to accidentally have 30 of them.

Also, a human reviewer would have to be unable to tell the difference.
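And that 30-match threshold isn't just a promise: Apple's technical summary describes a threshold secret sharing scheme, so the server can't open any of the matching vouchers until enough matches accumulate. A minimal sketch of the idea (Shamir's scheme; the numbers below are illustrative, not Apple's actual construction):

    # Minimal sketch of 30-out-of-n threshold secret sharing (Shamir's scheme).
    # Illustrative only; not Apple's actual construction or parameters.
    import random

    PRIME = 2**127 - 1  # all arithmetic is modulo a large prime

    def make_shares(secret, threshold, count):
        # Random polynomial of degree threshold-1 whose constant term is the secret.
        coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
        def f(x):
            return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        return [(x, f(x)) for x in range(1, count + 1)]

    def reconstruct(shares):
        # Lagrange interpolation at x = 0 recovers the constant term.
        secret = 0
        for xi, yi in shares:
            num, den = 1, 1
            for xj, _ in shares:
                if xj != xi:
                    num = num * (-xj) % PRIME
                    den = den * (xi - xj) % PRIME
            secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
        return secret

    key = 123456789  # stands in for the key that would unlock the vouchers
    shares = make_shares(key, threshold=30, count=40)
    print(reconstruct(shares[:30]) == key)  # True: 30 matches recover the key
    print(reconstruct(shares[:29]) == key)  # False: 29 matches reveal nothing

With 29 or fewer shares the secret is information-theoretically hidden, which is why the collision argument only matters if someone can plant dozens of colliding images that also survive the server-side check and human review.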