Hugo Hacker News

Tell Apple: Don’t Scan Our Phones

nbzso 2021-08-18 08:43:57 +0000 UTC [ - ]

I have migrated my company from macOS to Arch Linux in one week. Never buying an Apple product again.

People who find it "normal" to host a hashed CSAM database on their devices and computers, provided by a private organization (which has received $30+ million from the DOJ), are making the wrong choice.

People who are technically educated and yet refuse to see that this implementation is not required by any law, and that the "encryption" and "system design" are made to fool the masses and absolve Apple of responsibility, are stupid.

Period. If someone comes to work for me and demands Apple hardware, I will show them the door. I don't want to work with people who prioritize "emotional perception" over "rational analysis".

Apple has singlehandedly created a new interpretation of "privacy" in which it is normal to be "observed" on your own property.

I stick my neck out for the original definition, which is:

A state in which one is not observed or disturbed by other people.

vezycash 2021-08-17 21:43:14 +0000 UTC [ - ]

Once Apple releases this 'feature', a law will eventually be passed to force Google to add the same to Android.

2OEH8eoCRo0 2021-08-18 00:46:30 +0000 UTC [ - ]

We can't know this so why is this always brought up? To try to deflect from Apple?

DrBenCarson 2021-08-17 21:46:11 +0000 UTC [ - ]

Oh don't worry, Google is way ahead of Apple when it comes to scanning users' data on devices. I would be surprised if Google weren't doing this but remotely (vs. on-device).

aaomidi 2021-08-17 22:00:41 +0000 UTC [ - ]

They are doing this remotely, and the thing is, it's BETTER to do it remotely than on-device.

The issue here with Apple is that they want to move this type of scanning on-device. No one has really complained about them scanning for CSAM on iCloud.

The data is NOT E2EE in iCloud; there is literally no reason for them to move this scanning on-device.

skygazer 2021-08-17 23:18:56 +0000 UTC [ - ]

It does seem overwhelmingly likely to me that Apple intended to clear a path to E2EE of iCloud Photos with this, however, I'm utterly stumped why they haven't made any announcement nor even suggested it as a motivation. It is clearly their best possible rationale for moving the perceptual hashing from cloud to device, and many people would say, "oh, well, I guess I understand that trade off, then." But without that announcement, this looks pointless on top of invasive. And after more than a week of punishing press, it seems even more unbelievable.
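For readers unfamiliar with the term: perceptual hashing differs from cryptographic hashing in that visually similar images produce similar hashes, so matching is done by distance threshold rather than exact equality. A toy average-hash sketch of the idea (nothing like Apple's actual NeuralHash, which is a trained neural network; names here are illustrative only):

```python
def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if the pixel is
    brighter than the image's mean brightness. Real systems (pHash,
    NeuralHash) are far more robust, but the idea is the same."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(a, b):
    """Number of bit positions where two hashes differ."""
    return sum(x != y for x, y in zip(a, b))

def matches(img, known_hashes, threshold=2):
    """An image 'matches' the database if its hash is within a small
    Hamming distance of any known hash; equality is not required,
    so mild edits (crops, recompression) can still match."""
    h = average_hash(img)
    return any(hamming(h, k) <= threshold for k in known_hashes)
```

A slightly brightened or recompressed copy of a known image would still fall within the threshold, which is the whole point of using a perceptual rather than cryptographic hash.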

heavyset_go 2021-08-17 23:26:44 +0000 UTC [ - ]

> It does seem overwhelmingly likely to me that Apple intended to clear a path to E2EE

Some people say this, but the government doesn't just care about CSAM. They care about terrorism, human and drug trafficking, organized crime, gangs, fraud etc.

Unless Apple plans on expanding their system to detect those other things as well, I don't think CSAM detection alone is indicative of a clear path to E2EE. In fact, I see no evidence at all that indicates their intention of this project is to implement E2EE.

falcolas 2021-08-17 23:33:09 +0000 UTC [ - ]

> Apple intended to clear a path to E2EE of iCloud Photos with this

Here’s one problem with that idea: They would have to support scanning on the device for everything they are able to scan in the cloud today, in every country. Without changes, their current scanner would not do the trick (unless, of course, they’re lying about how it works).

SevenSigs 2021-08-18 05:48:30 +0000 UTC [ - ]

I don't think he is talking about scanning in the cloud, which we know Google is already doing for profit... he is talking about scanning on your phone, driven from a remote server, which is WORSE.

new_realist 2021-08-17 21:55:53 +0000 UTC [ - ]

It can be more transparent to scan on users' devices, which are in the possession of the public, than to scan user data in the cloud, which is hosted in private, controlled "black box" data centers. This could be a step forward for user privacy and auditability, if Apple plays it that way.

politelemon 2021-08-17 22:30:00 +0000 UTC [ - ]

This is not a useful petition. Regardless of the outcome, Apple's nature has been shown once again, this time in a more egregious and nefarious manner, and a lot more people are taking notice.

The reason it's not useful is that it temporarily hides away a side, or an image, that users aren't comfortable with. It will not change intentions and facts, it only exists so that people with brand loyalty and a brand identity can feel better about being tied to an ecosystem.

The problem here is the brand identity; in a truly privacy-friendly ecosystem, no such thing should exist. I encourage people not to sign it, and instead to reflect on what privacy options exist without a marketing message telling you what they should be.

istingray 2021-08-18 00:03:11 +0000 UTC [ - ]

Good, put this in an email and send to tcook@apple.com. If you don't like EFF's stance on this that's fine, I wrote my own email to Tim instead.

tzs 2021-08-18 02:30:35 +0000 UTC [ - ]

It's also not a useful petition because it lumps together the scanning of all iCloud uploads and the new parental-control feature that scans incoming images on children's phones.

The parental-control feature requires explicit opt-in when the parents set up the child's phone. When it finds a questionable image, it warns the child and asks whether they want to reject or accept it.

If the child is at least 13 and accepts the image, that is the end of it. Their parents are not notified.

If the child is under 13 they are asked again if they really want to accept it, and told that if they do their parents will be notified.

Only if the child then goes ahead and accepts the image do the parents get notified.

Most objections to these features I've seen, other than from the EFF, concern just the upload scanning. I suspect there are plenty of people who would sign a petition on that but are OK with the parental controls.

It would be much more useful to have separate petitions for the two features.

DrBenCarson 2021-08-17 21:48:38 +0000 UTC [ - ]

> Under pressure from U.S. law enforcement, Apple has put a backdoor into their encryption system.

I'm not on board with what Apple's doing here, but is there any evidence to support this statement? From what I know, this is at best misleading and at worst downright false. For example, the scanning is done on user devices so that image data remains encrypted from the time it leaves a user's phone to the time it is retrieved by a user (from any client).

aborsy 2021-08-17 21:57:27 +0000 UTC [ - ]

Like, what sort of evidence do you want exactly?

Apple can search for arbitrary information on users' property. If you can search ciphertext, it's not end-to-end encrypted anymore. End-to-end means no knowledge of the plaintext should be discernible (sometimes not even the metadata).

Further, the dataset is set by them, is opaque and can be anything.

That’s obviously a back door in encryption (for government).
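The structural point here can be made concrete. In the design as Apple has described it, the match is computed on the plaintext before encryption, and a "safety voucher" travels alongside the ciphertext, so the server learns something content-derived without ever decrypting the photo. A grossly simplified sketch with hypothetical names (the real system uses a perceptual hash, private set intersection, and a match threshold, and the toy `encrypt` below is a stand-in, not real cryptography):

```python
import hashlib

def encrypt(data: bytes, key: bytes) -> bytes:
    """Stand-in for real encryption: XOR against a SHA-256-derived
    keystream. Illustrative only; do not use for anything real."""
    stream = hashlib.sha256(key).digest() * (len(data) // 32 + 1)
    return bytes(d ^ s for d, s in zip(data, stream))

def upload(photo: bytes, key: bytes, blocklist: set) -> dict:
    # The scan runs on the DEVICE, over the plaintext, before encryption.
    digest = hashlib.sha256(photo).hexdigest()
    return {
        "ciphertext": encrypt(photo, key),       # server cannot read this...
        "voucher_matched": digest in blocklist,  # ...but learns this bit
    }
```

Even with the photo bytes encrypted, the voucher tells the server whether the content was on the list, which is the point above: knowledge derived from the plaintext is discernible. (In Apple's actual scheme the device cannot tell whether a match occurred and the server only learns about matches past a threshold, but it still ultimately learns plaintext-derived information.)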

zepto 2021-08-17 22:00:42 +0000 UTC [ - ]

> Apple can search for arbitrary information on user’s property.

This is simply false, if you are referring to the CSAM mechanism.

aborsy 2021-08-17 22:06:04 +0000 UTC [ - ]

Which part is false?

They state that they begin with image data (with a data set they control). In the future, they "can" expand the scope to anything.

Read the EFF articles. They are well written.

zepto 2021-08-18 00:11:02 +0000 UTC [ - ]

> In the future, they “can” evolve the scope to anything.

So, they can’t now then.

robertoandred 2021-08-18 00:03:40 +0000 UTC [ - ]

The EFF’s articles were full of lies and false assumptions.

josephcsible 2021-08-17 22:05:33 +0000 UTC [ - ]

What do you think is stopping Apple from including a hash of Tank Man along with all of the real CSAM hashes?

zepto 2021-08-18 00:12:46 +0000 UTC [ - ]

The fact you can only use ‘tank man’ as the example proves that they can’t search for arbitrary information.

It’s a meaningless example in any case given that Chinese authorities have no problem forcing people to install spyware directly on their devices.

sharken 2021-08-17 22:35:51 +0000 UTC [ - ]

I guess it is a rhetorical question, but this is what will happen next if Apple doesn't halt this scanning initiative.

Too big to listen seems to be what Apple thinks of this matter.

zepto 2021-08-17 21:59:26 +0000 UTC [ - ]

It’s both false and unsubstantiated. As described, it is not a back door, and there is no evidence that it was done under pressure.

Either of them could turn out to be true, given some evidence.

99mans 2021-08-17 21:54:29 +0000 UTC [ - ]

Exactly right: from what you know, which on a proprietary, closed-source system is limited to the point of uselessness. Such a system is almost guaranteed to be backdoored.

zepto 2021-08-17 22:01:19 +0000 UTC [ - ]

> It's almost guaranteed to be backdoored being as such.

I.e. it’s a guess.

finger 2021-08-17 22:07:13 +0000 UTC [ - ]

Don’t forget iPad OS.

99mans 2021-08-17 21:43:08 +0000 UTC [ - ]

This petition, or whatever it is supposed to be, completely mischaracterizes the problem. It isn't that Apple can scan photos; it's that Apple can install any software of their choosing at any time, whether they choose to tell you or not. Therefore, there is no expectation of privacy on such closed-source, proprietary platforms.

iso1210 2021-08-17 21:33:44 +0000 UTC [ - ]

Disappointing number of signatures, 471 so far (2132 GMT)

I imagine the Taliban would love to use this type of technology.

echelon 2021-08-17 21:40:26 +0000 UTC [ - ]

Don't just sign this.

Call or email your representatives and ask them to support the Open App Markets Act and the Digital Fair Repair Act.

https://www.house.gov/representatives/find-your-representati...

https://www.senate.gov/senators/senators-contact.htm

edit: Wow, two popular Apple stories removed from the HN front page today. How many more go down like this?

heavyset_go 2021-08-17 23:20:06 +0000 UTC [ - ]

Message dang and see if they were erroneously flagged by users. I think he said something about stories getting removed from the front page based on how many users flag them.

r00fus 2021-08-18 03:40:18 +0000 UTC [ - ]

Those do nothing to help this issue.