Tell Apple: Don’t Scan Our Phones
vezycash 2021-08-17 21:43:14 +0000 UTC [ - ]
2OEH8eoCRo0 2021-08-18 00:46:30 +0000 UTC [ - ]
DrBenCarson 2021-08-17 21:46:11 +0000 UTC [ - ]
aaomidi 2021-08-17 22:00:41 +0000 UTC [ - ]
The issue here with Apple is that they want to move this type of scanning on-device. No one has really complained about them scanning for CSAM on iCloud.
The data is NOT E2EE in iCloud, so there is literally no reason for them to move this scanning on-device.
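For context, scanning non-E2EE cloud storage needs no on-device component at all. A minimal sketch, assuming a simple exact-hash lookup (the function names and the SHA-256 shortcut are illustrative, not Apple's actual design):

```python
import hashlib

# Hypothetical database of known-bad image hashes. Real systems use
# perceptual hashes (PhotoDNA, NeuralHash) rather than SHA-256, so
# that resized or re-encoded copies still match.
KNOWN_HASHES = {"<hex digest of a known image>"}

def server_side_scan(uploaded_image: bytes) -> bool:
    # This is possible only because the server can read the plaintext:
    # iCloud Photos is encrypted in transit and at rest, but Apple
    # holds the keys.
    return hashlib.sha256(uploaded_image).hexdigest() in KNOWN_HASHES
```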
skygazer 2021-08-17 23:18:56 +0000 UTC [ - ]
heavyset_go 2021-08-17 23:26:44 +0000 UTC [ - ]
Some people say this, but the government doesn't just care about CSAM. They care about terrorism, human and drug trafficking, organized crime, gangs, fraud, etc.
Unless Apple plans on expanding their system to detect those other things as well, I don't think CSAM detection alone is indicative of a clear path to E2EE. In fact, I see no evidence at all that the intention of this project is to implement E2EE.
falcolas 2021-08-17 23:33:09 +0000 UTC [ - ]
Here’s one problem with that idea: They would have to support scanning on the device for everything they are able to scan in the cloud today, in every country. Without changes, their current scanner would not do the trick (unless, of course, they’re lying about how it works).
SevenSigs 2021-08-18 05:48:30 +0000 UTC [ - ]
new_realist 2021-08-17 21:55:53 +0000 UTC [ - ]
politelemon 2021-08-17 22:30:00 +0000 UTC [ - ]
The reason it's not useful is that it temporarily hides away a side of the company, or an image, that users aren't comfortable with. It will not change intentions or facts; it only exists so that people with brand loyalty and a brand identity can feel better about being tied to an ecosystem.
The problem here is the brand identity: in a truly privacy-friendly ecosystem, no such thing would exist. I encourage people not to sign it, and instead to reflect on what privacy options actually exist, without a marketing message telling you what they should be.
istingray 2021-08-18 00:03:11 +0000 UTC [ - ]
tzs 2021-08-18 02:30:35 +0000 UTC [ - ]
The parental control feature requires explicit opt-in when the parents set up the child's phone. When it finds a questionable image, it warns the child and asks if they want to reject it or accept it.
If the child is at least 13 and accepts the image, that is the end of it. Their parents are not notified.
If the child is under 13 they are asked again if they really want to accept it, and told that if they do their parents will be notified.
Only if the child then goes ahead and accepts the image do the parents get notified.
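That decision flow is easy to pin down. A minimal sketch (hypothetical function and names, since Apple published this as prose, not code):

```python
def handle_flagged_image(child_age: int, accepted: bool,
                         accepted_second_prompt: bool = False) -> str:
    """Sketch of the opt-in Messages parental-control flow described above."""
    if not accepted:
        return "image hidden; no one is notified"
    if child_age >= 13:
        return "image shown; parents are NOT notified"
    # Under 13: a second prompt warns that accepting notifies parents.
    if accepted_second_prompt:
        return "image shown; parents ARE notified"
    return "image hidden at second prompt; parents are NOT notified"
```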
Most objections to these features I've seen, other than the EFF's, are to the upload scanning alone. I suspect that there are plenty of people who would sign a petition on that but are OK with the parental controls.
It would be much more useful to have separate petitions for the two features.
DrBenCarson 2021-08-17 21:48:38 +0000 UTC [ - ]
I'm not on board with what Apple's doing here, but is there any evidence to support this statement? From what I know, it is at best misleading and at worst downright false. For example, the scanning is done on user devices so that image data remains encrypted from the time it leaves a user's phone to the time it is retrieved by a user (from any client).
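To make that claim concrete: if the match metadata is computed on-device before upload, the photo itself can stay encrypted past the phone. A heavily simplified sketch using the `cryptography` package (Apple's real design uses NeuralHash, blinded hashes, and threshold secret sharing, none of which appear here):

```python
import hashlib
from cryptography.fernet import Fernet

def upload_from_device(image: bytes, device_key: bytes) -> dict:
    # The "safety voucher" is derived on-device from the plaintext,
    # so the server can check for matches without ever decrypting
    # the payload itself. Hypothetical simplification.
    voucher = hashlib.sha256(image).hexdigest()
    ciphertext = Fernet(device_key).encrypt(image)
    return {"payload": ciphertext, "voucher": voucher}

# Usage (hypothetical): the key never leaves the device.
# key = Fernet.generate_key()
# envelope = upload_from_device(photo_bytes, key)
```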
aborsy 2021-08-17 21:57:27 +0000 UTC [ - ]
Apple can search for arbitrary information on users' property. If you can search ciphertext, it's not end-to-end encrypted anymore. End-to-end means no knowledge of the plaintext should be discernible (sometimes not even metadata).
Further, the dataset is chosen by them, is opaque, and can be anything.
That's obviously a back door in encryption (for governments).
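That definition can be made concrete. A minimal sketch, again with the `cryptography` package, of what "no knowledge of plaintext" means for whoever sits in the middle:

```python
from cryptography.fernet import Fernet

# The key exists only on the two endpoints; the relay never sees it.
endpoint_key = Fernet.generate_key()
ciphertext = Fernet(endpoint_key).encrypt(b"private photo bytes")

# A relay holding only `ciphertext` can store and forward it, but
# cannot match it against any dataset, hashed or otherwise. A system
# that CAN answer "does this user hold image X?" has, by this
# definition, a way around the encryption -- i.e., a back door.
```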
zepto 2021-08-17 22:00:42 +0000 UTC [ - ]
This is simply false, if you are referring to the CSAM mechanism.
aborsy 2021-08-17 22:06:04 +0000 UTC [ - ]
They state they begin with image data (with a data set they control). In the future, they “can” evolve the scope to anything.
Read the EFF articles. They are well written.
zepto 2021-08-18 00:11:02 +0000 UTC [ - ]
So, they can’t now then.
robertoandred 2021-08-18 00:03:40 +0000 UTC [ - ]
josephcsible 2021-08-17 22:05:33 +0000 UTC [ - ]
zepto 2021-08-18 00:12:46 +0000 UTC [ - ]
It’s a meaningless example in any case given that Chinese authorities have no problem forcing people to install spyware directly on their devices.
sharken 2021-08-17 22:35:51 +0000 UTC [ - ]
"Too big to listen" seems to be what Apple thinks of this matter.
zepto 2021-08-17 21:59:26 +0000 UTC [ - ]
Either of them could turn out to be true, given some evidence.
99mans 2021-08-17 21:54:29 +0000 UTC [ - ]
zepto 2021-08-17 22:01:19 +0000 UTC [ - ]
I.e. it’s a guess.
99mans 2021-08-17 21:43:08 +0000 UTC [ - ]
iso1210 2021-08-17 21:33:44 +0000 UTC [ - ]
I wonder how much the Taliban would love to use this type of technology.
echelon 2021-08-17 21:40:26 +0000 UTC [ - ]
Call or email your representatives and ask them to support the Open App Markets Act and the Digital Fair Repair Act.
https://www.house.gov/representatives/find-your-representati...
https://www.senate.gov/senators/senators-contact.htm
edit: Wow, two popular Apple stories removed from the HN front page today. How many more go down like this?
heavyset_go 2021-08-17 23:20:06 +0000 UTC [ - ]
nbzso 2021-08-18 08:43:57 +0000 UTC [ - ]
People who find it "normal" to host a hashed CSAM database, provided by a private organization (one that has received $30+ million from the DOJ), on their devices and computers are making the wrong choice.
People who are technically educated yet refuse to see that this implementation is not required by any law, and that the "encryption" and "system design" are made to fool the masses and shift responsibility away from Apple, are stupid.
Period. If someone comes to work for me and demands Apple hardware, I will show him/her the door. I don't want to work with people who prioritize "emotional perception" over "rational analysis".
Apple singlehandedly created a new interpretation of "privacy" in which it is normal to be "observed" on your own property.
I stick my neck out for the original definition, which is:
A state in which one is not observed or disturbed by other people.