Apple’s device surveillance plan is a threat to user privacy – and press freedom
sparker72678 2021-08-18 22:56:44 +0000 UTC [ - ]
So either we take this approach, or end up with a worse one, like pure on-cloud scanning with no transparency whatsoever.
I read the technical paper today, and this solution is super clever, well considered, and checks just about every box a crypto-solution-phile would want. I don't know how it gets better than this. (Technical note: Apple's solution here requires client and server communication to flag an image, which is part of the reason it's totally off if you turn off iCloud Photos.)
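(Rough sketch of the threshold idea, purely my own simplified illustration using plain Shamir secret sharing, not Apple's actual PSI/threshold-secret-sharing construction: the server holds one cryptographic share per matched photo and can reconstruct nothing until it has at least the threshold number of shares.)

    # Toy illustration of the threshold property. Assumption: this mirrors the
    # idea only, not Apple's real protocol. The server gets one share per
    # matched photo and learns nothing below the threshold.
    import random

    PRIME = 2**127 - 1

    def make_shares(secret, threshold, count):
        coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
        def poly(x):
            return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        return [(x, poly(x)) for x in range(1, count + 1)]

    def recover(shares):
        # Lagrange interpolation at x = 0
        secret = 0
        for xi, yi in shares:
            num = den = 1
            for xj, _ in shares:
                if xj != xi:
                    num = num * (-xj) % PRIME
                    den = den * (xi - xj) % PRIME
            secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
        return secret

    key = 123456789  # stands in for the account's voucher-decryption key
    shares = make_shares(key, threshold=30, count=1000)
    print(recover(shares[:29]) == key)  # False: below threshold, key stays hidden
    print(recover(shares[:30]) == key)  # True: threshold reached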
If this opens the door to Apple offering E2EE on all photos in iCloud (which it could), then this ends up being a big win over the status quo, in my opinion. And even if not, there are far, far worse alternatives.
[Edited for spelling].
joe_the_user 2021-08-19 00:12:06 +0000 UTC [ - ]
With cloud computing, I can choose my cloud provider and I can upload encrypted files. On-device scanning is evil because it's a move to create a computing architecture entirely outside the user's control.
almostdigital 2021-08-18 23:13:16 +0000 UTC [ - ]
Well of course it does, would be a very pointless surveillance system if it scanned for stuff and then threw away the result.
I've also read the technical papers and all other info put out by Apple so far and my takeaway is that the system is generic, it's not designed specifically for images but can be used with any old fingerprint and arbitrary data can be uploaded in the "security vouchers".
It does not take much imagination to see how it could be used to mass-surveil E2EE communications: hook it into Siri's intent system, add a bunch of banned intents, and throw the message into the voucher. If a threshold of bad intents is reached, a reviewer will check your messages to verify that you actually had some bad intent before dispatching a patrol to your location.
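(To make the "generic fingerprint" point concrete, here's a toy sketch of my own, not Apple's code: the matcher only ever sees opaque hashes plus an arbitrary payload, so nothing in it is image-specific.)

    # Hypothetical sketch: a matcher built on opaque fingerprints doesn't care
    # whether the fingerprint came from a photo, a message, or a Siri intent.
    # Plain SHA-256 stands in for the blinded perceptual hashes here.
    import hashlib

    BANNED_FINGERPRINTS = {
        hashlib.sha256(b"some banned payload").hexdigest(),
    }

    def make_voucher(payload: bytes) -> dict:
        fingerprint = hashlib.sha256(payload).hexdigest()
        return {
            "matched": fingerprint in BANNED_FINGERPRINTS,
            "fingerprint": fingerprint,
            "payload": payload,  # arbitrary data riding along in the "voucher"
        }

    print(make_voucher(b"some banned payload")["matched"])   # True
    print(make_voucher(b"an innocuous message")["matched"])  # False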
sparker72678 2021-08-18 23:17:14 +0000 UTC [ - ]
But right now all that data is available to Apple.
If they wanted to they could just generate a set of keys, replicate one of your iCloud devices, and send all your iMessages to a 3rd party, etc.
If CSAM scanning is coming, show me a better solution. I don't think there's a world ahead of us that doesn't include it for all platform-hosted photos.
almostdigital 2021-08-18 23:34:22 +0000 UTC [ - ]
My understanding from following this debacle is that law enforcement doesn't have the resources to investigate everyone who merely possesses CSAM; they focus on people who match and also have photos of novel abuse they can track down. Here they will only get an account name and a list of 30 CSAM photos the user had.
All this system will do is drive up Apple's numbers on a "total child abuse found" scoreboard and do very little to help any actual children in need.
sparker72678 2021-08-18 23:59:39 +0000 UTC [ - ]
Still seems to be the direction the political winds are blowing.
joe_the_user 2021-08-19 00:19:14 +0000 UTC [ - ]
Part of the reason for these voluntary scans is that the state would have a difficult time implementing universal spyware on its own; it needs Apple for this. This stuff is by no means certain, and how "we" respond helps determine whether it happens. Also, the state has secretly spied and attempted to legitimize universal spying, but we've pushed back on many occasions. We should keep that up.
fmajid 2021-08-18 21:54:39 +0000 UTC [ - ]
simondotau 2021-08-19 01:36:14 +0000 UTC [ - ]
hda2 2021-08-19 03:04:03 +0000 UTC [ - ]
simondotau 2021-08-19 03:25:21 +0000 UTC [ - ]
Australia demands that Apple augment CSAM detection so that every iPhone in the USA starts scanning for (and reporting back to Australia about) some other set of images. There's absolutely no scenario where Apple wouldn't walk out of the room laughing.
End of hypothetical.
sparker72678 2021-08-18 23:27:45 +0000 UTC [ - ]
I'd love to be convinced that this mechanism could be easily extended to any content at any time without it being totally obvious to the rest of us that it's being used that way.
jjcon 2021-08-18 23:57:10 +0000 UTC [ - ]
If, alongside this, Apple rolled out the ability to initiate a scan on any phone for anything (not a tool for mass surveillance but targeted surveillance), how would we know, given Apple's closed ecosystem?
This doesn't necessarily speak to the issue here so much as to the Apple ethos in general, but I am curious.
zepto 2021-08-19 14:58:03 +0000 UTC [ - ]
simondotau 2021-08-19 01:38:36 +0000 UTC [ - ]
https://twitter.com/pwnallthethings/status/14248736290037022...
techdragon 2021-08-19 02:36:07 +0000 UTC [ - ]
simondotau 2021-08-19 03:40:23 +0000 UTC [ - ]
To begin with, there's no prospect of this being done in secret. So we're talking about Apple complying with demands made in public.
In the case of the EU, it would have to be passed into law in a way that makes equal demands upon all of Apple's competitors. And in that case, everyone must comply with the law irrespective of whether any existing CSAM detection apparatus was in place. (Thus it's a sheer cliff, not a slippery slope.)
If Saudi Arabia or Sudan tried to turn the screws, the business case for Apple is absolutely clear-cut: they would leave. It wouldn't even be up for debate. There would be far too much at risk globally compared to anything they'd hope to gain or retain in that domestic market by complying. Not only would they refuse to risk serious damage to their global reputation (something they'll be extremely sensitive to, as the last two weeks have taught them), but refusing would represent a massive opportunity for Apple to earn weeks of free media coverage that aligns with their security narrative.
puddingforears 2021-08-19 02:42:39 +0000 UTC [ - ]
HNPoliticalBias 2021-08-18 21:57:44 +0000 UTC [ - ]
jeremygaither 2021-08-18 22:23:04 +0000 UTC [ - ]
Apple’s new image scanning technology is almost as much of a backdoor to individuals' privacy as handing out their device decryption keys on demand. Except with this new tech, finding criminal activity is even easier for law enforcement, as Apple will attempt to locate criminals for them.
That said, there are better privacy tools available for those that need as much privacy as possible, but they are nowhere near as user-friendly as Apple's devices. For instance, if this new search tool only searches images uploaded to iCloud, as claimed, what stops a pedophile or a journalist from disabling iCloud photo storage and synchronizing photos manually? It is extra work, but not difficult. Keeping average everyday things and activities, such as web browsing history, mostly private requires more effort and expertise than manually copying photos between a phone and a computer.
jasamer 2021-08-18 23:20:47 +0000 UTC [ - ]
How would you use this to find criminals in general? I have a hard time thinking of a very useful example. Maybe photos of bomb plans downloaded somewhere from the internet that are known to be used by criminal groups?
As to what stops a pedophile or a journalist from disabling iCloud photo storage: absolutely nothing. This solution is supposed to catch people storing CSAM photos in iCloud, that's it. A careful pedophile will not be caught by this.
JohnJamesRambo 2021-08-18 23:51:05 +0000 UTC [ - ]
I dumped mine and I don’t think I’m alone.
99mans 2021-08-19 01:29:35 +0000 UTC [ - ]
shadowfacts 2021-08-18 22:36:24 +0000 UTC [ - ]
> an adversary could trick Apple’s algorithm into erroneously matching an existing image
In which case the malicious, adversary-controlled images are sent to Apple. After which—the implication is—they can be re-obtained by... the adversary that created them. So what?
An adversary could conceivably lower the reporting threshold by getting the victim to save a bunch of false-positive images. Again, so what? Surely if the adversary has reason to believe there are some number of CSAM images on a user's phone, there are more direct ways of going after them.
> These kinds of false positives could happen if the matching database has been tampered with or expanded to include images that do not depict child abuse
An adversary would have to:
A) carry out a supply-chain attack on Apple, B) ship different iOS images in different countries, or C) insert entries into the database on a specific user's phone.
Options A and C are irrelevant: if the phone is compromised, the CSAM database being modified is the least of your concerns. And option B is independently verifiable (granted, Apple does not do enough to make third-party auditing of iOS easy, but it is possible).
kemayo 2021-08-18 22:42:47 +0000 UTC [ - ]
More importantly, they'd need to suborn Apple's human-review process, because the people doing the review would need to know what not-CSAM they're looking for.
Could Apple be coerced into doing this (or do it willingly)? I have no idea. But it's a very different threat model than the article suggests, one that boils down to the "do you trust your OS vendor?" question.
intricatedetail 2021-08-18 22:57:46 +0000 UTC [ - ]
kemayo 2021-08-18 23:05:10 +0000 UTC [ - ]
(I mean, assuming that you need ~30 matches to trigger the review phase of the process, I'd think it'd seem weird to a reviewer looking for child porn if they got 30 pictures of apparently random political documents or subversive memes.)
meowster 2021-08-19 11:14:07 +0000 UTC [ - ]
intricatedetail 2021-08-19 00:15:54 +0000 UTC [ - ]
kemayo 2021-08-19 00:25:56 +0000 UTC [ - ]
That said, I do think it'd be nice to have a better demonstration of exactly what this "derivative" the reviewers would be looking at is. There are a lot of variations there, balancing false-positive privacy concerns, the mental health of the reviewers, potential downsampling issues, etc.
simondotau 2021-08-19 01:34:31 +0000 UTC [ - ]
Personally I would also be placing a hard watermark in the middle of the image, or maybe some hard slashes randomly through the image, so that "clean" images cannot leak out of human review.
Let's imagine that the derivative is a 0.5 megapixel, grayscale, watermarked, HEIC-compressed copy of the original image. This would be plenty to determine with zero ambiguity that the image is actually "A1" classified, i.e. depicts a prepubescent minor ("A") engaged in a sex act ("1").
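Something like the following would be enough (illustrative sketch only; the resolution, watermark, and JPEG output here are my own assumptions, since the real parameters aren't public):

    # Illustrative only: one way to produce the low-fidelity "visual derivative"
    # described above (downscaled, grayscale, watermarked). Parameter choices
    # are assumptions; Apple hasn't published theirs.
    from PIL import Image, ImageDraw

    def make_review_derivative(src_path, dst_path, target_pixels=500_000):
        img = Image.open(src_path).convert("L")  # grayscale
        scale = (target_pixels / (img.width * img.height)) ** 0.5
        if scale < 1:
            img = img.resize((int(img.width * scale), int(img.height * scale)))
        # Hard watermark through the middle so a "clean" image can't leak
        # out of human review.
        draw = ImageDraw.Draw(img)
        draw.line([(0, img.height // 2), (img.width, img.height // 2)],
                  fill=255, width=max(4, img.height // 40))
        draw.text((10, 10), "REVIEW COPY", fill=255)
        img.save(dst_path, quality=60)  # format inferred from the extension

    # make_review_derivative("original.jpg", "derivative.jpg")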
bitwise-evan 2021-08-19 00:38:32 +0000 UTC [ - ]
This is a very real, possible attack. Apple ships its CSAM model on device so any attacker can have a copy of the model. Then the attacker creates an image that triggers CSAM but looks like a panda [1]. Now the attacker sends tons of triggering photos to the unsuspecting victim, who now gets questioned by the FBI.
1: https://medium.com/@ml.at.berkeley/tricking-neural-networks-...
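For context on the mechanics: perceptual hashes are matched by Hamming distance rather than exact equality, which is exactly what makes this kind of collision-crafting possible. A toy average hash, far simpler than NeuralHash and purely illustrative:

    # Toy perceptual hash (average hash), much simpler than NeuralHash, but it
    # shows why collisions can be crafted: similar-looking images map to nearby
    # bit strings, and matching is done by Hamming distance, not equality.
    from PIL import Image

    def average_hash(path, hash_size=8):
        img = Image.open(path).convert("L").resize((hash_size, hash_size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (p > mean)
        return bits

    def hamming(a, b):
        return bin(a ^ b).count("1")

    # A matcher would treat two images as "the same" once the distance is small:
    # hamming(average_hash("target.jpg"), average_hash("crafted.jpg")) <= 5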
shadowfacts 2021-08-19 01:53:50 +0000 UTC [ - ]
That's glossing over the middle part where a human from Apple (before it even gets to law enforcement) actually looks at the images, goes "oh, these are actually pandas," and realizes they were erroneously detected.
carom 2021-08-19 00:52:02 +0000 UTC [ - ]
simondotau 2021-08-19 01:05:30 +0000 UTC [ - ]
And if this trick ever works, it could only be done once before Apple has the opportunity to plug holes in their NeuralHash algorithm and fix any deficiencies in the manual review process.
gerash 2021-08-19 06:12:37 +0000 UTC [ - ]
Basically, if you have an issue with this, go change the law rather than asking large corporations to act illegally for some greater good.
mrharrison 2021-08-18 22:13:34 +0000 UTC [ - ]
amelius 2021-08-18 22:30:36 +0000 UTC [ - ]
hypothesis 2021-08-19 00:54:08 +0000 UTC [ - ]
hypothesis 2021-08-18 22:19:36 +0000 UTC [ - ]
OneLeggedCat 2021-08-19 00:41:35 +0000 UTC [ - ]
hypothesis 2021-08-19 00:52:13 +0000 UTC [ - ]
Otherwise it sounds like a nuclear option by Apple, with dire effects.
greyface- 2021-08-19 00:57:54 +0000 UTC [ - ]
> By using the Apple Software, you agree that Apple may download and install automatic Apple Software Updates onto your Device and your peripheral devices.
> Apple and its licensors reserve the right to change, suspend, remove, or disable access to any Services at any time without notice.
hypothesis 2021-08-19 01:21:01 +0000 UTC [ - ]
greyface- 2021-08-19 01:29:15 +0000 UTC [ - ]
hypothesis 2021-08-19 03:57:21 +0000 UTC [ - ]
There are a couple of things that are nagging me though: 1) Apple has loads of legalese for each and every feature, and 2) it sounds like they could just cut all that down to the two points you cited and be golden.
I definitely see your point and I guess we can leave it at that until a lawyer can explain this...
SSLy 2021-08-18 23:03:58 +0000 UTC [ - ]
hypothesis 2021-08-19 00:44:35 +0000 UTC [ - ]
Security updates to iOS 14 will let people (who might not be ready or able to switch from iOS) actually do what GP said, without unnecessarily exposing themselves to security bugs in an outdated OS.
jug 2021-08-18 23:07:55 +0000 UTC [ - ]
So why is Apple suddenly extremely adamant about this? Would it even cost them anything to not go here?
They’re a hardware and services company making mobile and computing devices mostly geared towards entertainment and creative work. How did they end up here?
sparker72678 2021-08-18 23:13:18 +0000 UTC [ - ]
heavyset_go 2021-08-18 23:27:22 +0000 UTC [ - ]
Also, governments care about much more than just CSAM. They care about terrorism, drug manufacturing, drug and human trafficking, organized crime, gangs, fraud etc.
This speculation only makes sense if Apple intends to use their CSAM detection and reporting system to detect and report on those other things, as well.
Governments won't like that either, because during investigations and discovery, they'll want all of the customer's data that could be evidence of crimes. These detection and reporting systems, as implemented and by themselves, are only useful for flagging suspicious people. They're no good for complying with warrants or subpoenas.
shuckles 2021-08-19 00:03:39 +0000 UTC [ - ]
jjcon 2021-08-18 23:45:46 +0000 UTC [ - ]
threeseed 2021-08-18 22:08:58 +0000 UTC [ - ]
a) People have extracted a pre-release version from an older iOS phone. It is not the one that will be released and not the same one Apple uses server-side to verify the process.
b) Adversaries cannot reasonably break this process by flooding it with bad data across a range of compromised phones. The client-side check exists to prevent this, as does the fact that it would require jailbroken devices with iCloud logins.
c) If you can't trust Apple not to include non-CSAM hashes in the database, then how can you trust them not to compromise the operating system itself? It's illogical. Apple verifies hashes against two sources, and two manual checks are required, so an attack would need multiple points in the process to fail.
mattnewton 2021-08-18 22:13:20 +0000 UTC [ - ]
For c), the problem isn't just trusting Apple, to whom the hashes are somewhat opaque; my understanding is that they are provided by a government-regulated third party.
kemayo 2021-08-18 22:38:48 +0000 UTC [ - ]
JohnFen 2021-08-18 23:14:39 +0000 UTC [ - ]
Which is very easy to imagine. All I have to do is imagine a national security letter.
kemayo 2021-08-18 23:24:34 +0000 UTC [ - ]
Apple already has all these photos uploaded to their servers and knows the keys as iCloud Photos isn't end-to-end encrypted, so if a government can coerce them into cooperating with running scans like that, this new system doesn't seem to make any difference.
(If the argument is that they can be coerced into changing the new system to scan non-uploaded photos and reporting them outside of the iCloud Photos upload process... that's a threat that applies just as much to every phone. It's an argument for not trusting any OS that you didn't compile from source yourself.)
That is to say, it takes this away from the threat the article is presenting as "governments could sneakily use Apple's system to find dissidents despite Apple's best intentions" and into "governments could use the law to make Apple do things".
simondotau 2021-08-19 01:41:22 +0000 UTC [ - ]
(As distinct from Apple voluntarily searching for CSAM, which will be part of the terms of service. And distinct from being compelled to search cloud servers, which is exempt from 4A under the "third party doctrine".)
greyface- 2021-08-19 02:03:03 +0000 UTC [ - ]
simondotau 2021-08-19 02:15:54 +0000 UTC [ - ]
So the letter goes to Apple. They have ample time and resources to push back indefinitely. Demanding that Apple implement a Government dragnet across tens of millions of private devices is so far beyond unconstitutional that complying wouldn't even be fleetingly contemplated as an option. In fact, Apple is the sort of company that would move heaven and earth to ensure that this unconstitutional NSL becomes public. If nothing else, their defiance of it would be fantastic PR.
Therefore it would be a massively stupid-ass move for the Government to try. They would have zero prospect of a positive outcome and they know it.
mattnewton 2021-08-18 23:10:49 +0000 UTC [ - ]
simondotau 2021-08-19 01:49:43 +0000 UTC [ - ]
What does it take to have 30 or more images in your photo library that have all been manipulated to be a perceptual-hash match to 30 or more A1-classified images, while also being able to pass human review ("erring on the side of caution") with respect to plausibly fitting the A1 classification?
There's no ambiguity in prepubescent minors engaged in sex acts.
meowster 2021-08-19 11:20:31 +0000 UTC [ - ]
simondotau 2021-08-19 14:31:21 +0000 UTC [ - ]
intricatedetail 2021-08-18 23:02:28 +0000 UTC [ - ]
kemayo 2021-08-18 23:07:05 +0000 UTC [ - ]
intricatedetail 2021-08-19 00:11:08 +0000 UTC [ - ]
simondotau 2021-08-19 02:00:34 +0000 UTC [ - ]
Here's what I don't get: who are these people importing CSAM into their camera roll? I for one have never felt the urge to import regular, legal porn into my camera roll. So why would anyone do that with stuff they know could land them in prison? Who the hell co-mingles their deepest darkest dirtiest secret amongst pictures of their family and last night’s dinner?
If someone wants to conceal their CSAM library, I'm sure there's probably dozens of apps in the App Store that can store photos securely behind an additional layer of encryption.
jasamer 2021-08-18 23:31:55 +0000 UTC [ - ]
I'd assume FB, Google & co have some solution to b), so Apple should be able to figure out something. For c), at least Apple takes extra precautions by requiring the photos to be in two separate databases provided by different governments. (Maybe other cloud providers do that, too, I don't know.)
mattnewton 2021-08-19 00:52:27 +0000 UTC [ - ]
threeseed 2021-08-19 02:05:30 +0000 UTC [ - ]
But the second you enable iCloud Photos sync, Apple has the right to not allow CSAM on THEIR servers.
kevingadd 2021-08-18 22:22:12 +0000 UTC [ - ]
encryptluks2 2021-08-18 21:38:59 +0000 UTC [ - ]
npteljes 2021-08-19 08:55:48 +0000 UTC [ - ]
99mans 2021-08-18 22:11:11 +0000 UTC [ - ]
sandworm101 2021-08-18 22:38:53 +0000 UTC [ - ]
simondotau 2021-08-19 01:15:01 +0000 UTC [ - ]
Saying that you can inspect the source code is true in theory. But unless you've done the full audit yourself, you're personally as blind as you are with closed source code. You're choosing to trust whichever security researchers deeply understand the security implications of all the gobbledegook in all the USB drivers, and you're trusting that Canonical is shipping you the same version that security researchers have validated. For 99.9% of users, it all comes down to blind trust.
As Linus himself once said: "If you have ever done any security work and it didn't involve the concept of a network of trust, it was not a security work. It was masturbation."
BTCOG 2021-08-18 23:24:13 +0000 UTC [ - ]
542458 2021-08-18 23:26:22 +0000 UTC [ - ]
One of my big disappointments is that Canonical decided to go super-premium with their attempt at phones. I wish Firefox OS had done better (the f0x phone in particular was incredibly cool) but they never managed to get the sort of community traction they needed.
techdragon 2021-08-19 02:42:03 +0000 UTC [ - ]
sparker72678 2021-08-18 23:26:05 +0000 UTC [ - ]
BTCOG 2021-08-18 23:32:12 +0000 UTC [ - ]
35803288 2021-08-18 22:51:01 +0000 UTC [ - ]
greyface- 2021-08-19 00:27:55 +0000 UTC [ - ]
2021-08-19 00:54:03 +0000 UTC [ - ]
35803288 2021-08-18 22:58:19 +0000 UTC [ - ]
sparker72678 2021-08-18 23:04:09 +0000 UTC [ - ]
Does that mean nothing bad can happen? No. But it does mean that when something changes, we at least know something changed.
jtbayly 2021-08-18 23:51:05 +0000 UTC [ - ]
Lamad123 2021-08-19 08:37:27 +0000 UTC [ - ]
ohazi 2021-08-18 23:02:18 +0000 UTC [ - ]
These databases are unauditable by design -- all they'd need to do is hand Apple their own database of "CSAM fingerprints collected by our own local law enforcement that are more relevant in this region" (filled with political images of course), and ask Apple to apply their standard CSAM reporting rules.
That's it... Tyranny complete.
tzs 2021-08-19 00:01:00 +0000 UTC [ - ]
They wouldn't necessarily be able to tell if it was a false positive matching real CSAM material or a true positive matching illegitimate material in the databases put there by a government trying to misuse it, but they don't need to know whether it is or not. They just need to see that it isn't CSAM and so does not need to be reported.
gambiting 2021-08-19 00:14:59 +0000 UTC [ - ]
So yeah, the entire system is fucked and shouldn't exist. Apple is not law enforcement and them saying "we'll just prescreen every submission" is actually worse, not better.
tzs 2021-08-19 00:45:11 +0000 UTC [ - ]
Which is what they would be doing.
Some government gives Apple a purported CSAM hash database, which Apple only accepts because it is a CSAM database. An image gets a match. Apple looks at it and it is not CSAM. Therefore, unless the government lied to them about the database, it must be a false positive and gets rejected as an incorrect match.
The rejection is not because Apple judged the content per se. They just determined that it must be a false positive given the government's claims about the database.
gambiting 2021-08-19 06:50:55 +0000 UTC [ - ]
alwillis 2021-08-19 07:15:35 +0000 UTC [ - ]
The only CSAM Apple will flag has to come from multiple organizations in different jurisdictions; otherwise, those hashes are ignored.
And since no credible child welfare organization is going to have CSAM that matches stuff from the worst places, there's no simple or obvious way to get them to match.
gambiting 2021-08-19 08:11:56 +0000 UTC [ - ]
Have they actually said they would do that? I was under the impression that they just use the database of hashes provided by the American authority on prevention of child abuse.
>>And since no credible child welfare organization is going to have CSAM that matches stuff from the worst places
I'm not sure I understand what you mean, can you expand?
zimpenfish 2021-08-19 09:29:31 +0000 UTC [ - ]
In [1], "That includes a rule to only flag images found in multiple child safety databases with different government affiliations — theoretically stopping one country from adding non-CSAM content to the system."
[1] https://www.theverge.com/2021/8/13/22623859/apple-icloud-pho...
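A sketch of that rule as I read it from the linked article (hypothetical code of my own, not Apple's): a hash only becomes eligible for on-device matching if it appears in databases from at least two distinct jurisdictions.

    # Hypothetical illustration of the multi-jurisdiction rule described above:
    # only hashes present in child-safety databases from at least two different
    # government affiliations make it into the on-device set.
    from collections import defaultdict

    def eligible_hashes(databases, min_jurisdictions=2):
        # `databases` maps a jurisdiction name to its set of image hashes.
        seen_in = defaultdict(set)
        for jurisdiction, hashes in databases.items():
            for h in hashes:
                seen_in[h].add(jurisdiction)
        return {h for h, js in seen_in.items() if len(js) >= min_jurisdictions}

    dbs = {
        "JurisdictionA": {"aaa", "bbb"},
        "JurisdictionB": {"bbb", "ccc"},
    }
    print(eligible_hashes(dbs))  # {'bbb'}: a hash only one country submitted never ships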
judge2020 2021-08-18 23:07:39 +0000 UTC [ - ]
Yes, they could change it at any time. But this was always the case with all software that implements OTA updates.
sparker72678 2021-08-18 23:11:40 +0000 UTC [ - ]
2) Apple has trillions of dollars to lose by selling out to a shitty change like that.
Do those things mean nothing bad can come of it? Hell no. But, right now we have FISA courts and silent warrants sucking in data without anyone knowing or being able to talk about it. It's not like the status quo is a panacea at the moment.
Apple's approach creates a possibility of slowing down the politics already trying to move against E2EE data.
This is a political fight, and if we all act like ideologues we're going to lose it all in the end.
ScoobleDoodle 2021-08-18 23:33:30 +0000 UTC [ - ]
Apple's move brings FISA courts and silent warrants one step closer to having access not just to what we send over the network but to what resides on our phones.
Apple is giving mass surveillance a foothold to live on and monitor data on our personal phones.
Apple has created the backdoor for increased surveillance: https://www.eff.org/deeplinks/2021/08/if-you-build-it-they-w...
Tell Apple don't scan our phones: https://act.eff.org/action/tell-apple-don-t-scan-our-phones
xkcd-sucks 2021-08-18 23:27:41 +0000 UTC [ - ]
This is "appeasement" or "Danegeld", and we all know how that works out in the end
TimTheTinker 2021-08-19 00:44:51 +0000 UTC [ - ]
Apple has trillions to lose by building this system in the first place. All it takes is one court order to do non-CP scanning with the existing system.
zepto 2021-08-19 14:52:51 +0000 UTC [ - ]
TimTheTinker 2021-08-19 16:24:18 +0000 UTC [ - ]
All it takes is a wiretap warrant and Apple would have to scan on-device pictures and iMessages for whatever the wiretap says. This is true even if the phone has iCloud switched off (most of us already do), since all Apple has to do is change a Boolean variable's value, or something similar requiring no creative effort (and hence legally coercible).
zepto 2021-08-19 17:07:16 +0000 UTC [ - ]
This is total and utter bullshit. It is a complete misunderstanding of how the system works.
If you and your family think this is true, then of course you are alarmed.
jtbayly 2021-08-18 23:42:45 +0000 UTC [ - ]
That is why they said that the content has to be in two separate nations' databases. Of course, there is no information that I've seen about which other nation's DB they would use. And without another nation, will there be no content in the database? I doubt it.
Regardless, it's a moot issue, since we already know that the Five Eyes all conspire and would gladly add content for each other, the same as several Middle East nations.
sparker72678 2021-08-18 23:49:25 +0000 UTC [ - ]
jtbayly 2021-08-19 00:52:30 +0000 UTC [ - ]
Right now they have no ability to scan every photo on every iOS device for "objectionable" content (as defined by them on that day, based on their mood). But soon they will. All they have to do is add photos to the NCMEC database and an equivalent DB in another country.
salawat 2021-08-18 23:50:11 +0000 UTC [ - ]
Yes... Because it is impossible for different servers to be configured as backends depending on where handsets are destined to be sold. It's not like it's possible to quickly whip up a geolocation-aware API that can swap things out on the fly. C'mon. This isn't even hard. These are all trivially surmountable problems. The one thing standing in the way of already having done this was that there was no way in hell anyone would have been daft enough to even try doing something like this with a straight face. For heaven's sake, even Hollywood lampshaded it with that "using cell phones as broadband sensors" shtick in The Dark Knight or whatever it was.
>This is a political fight, and if we all act like ideologues we're going to lose it all in the end.
The fight was lost the moment someone caved to "think of the children". Every last warning sign left all over the intellectual landscape was ignored: history, risk, human nature, any semblance of good sense, all ignored.
Honestly, I'm sitting here scratching my head wondering if I took a wrong turn or something 20 years ago. This isn't even close to the place I lived anymore.
ScoobleDoodle 2021-08-18 23:34:38 +0000 UTC [ - ]
raxxorrax 2021-08-19 10:47:30 +0000 UTC [ - ]
Here governments could have opted for a contrast but they neglected these opportunities.
simondotau 2021-08-19 00:35:48 +0000 UTC [ - ]
(After a photo is uploaded to a cloud service, a search of photos stored on servers doesn't enjoy the same 4A protection as this falls under the so-called "third party doctrine".)
(Apple searching for CSAM is also not a 4A violation because it was Apple's free choice as a private company to do so, and you will have agreed to it as part of the Terms of Service of the next version of iOS.)
cyanite 2021-08-18 22:59:42 +0000 UTC [ - ]
jjcon 2021-08-18 23:51:51 +0000 UTC [ - ]
By the time we know that it is happening it will be far more difficult to do something about it (see Snowden and the Patriot Act).
zepto 2021-08-19 14:55:33 +0000 UTC [ - ]
The only protection against future abuse, whether or not this mechanism is deployed, is a legal system that cares.
JohnFen 2021-08-18 23:08:49 +0000 UTC [ - ]
Your devices, however, are your own house.
sparker72678 2021-08-18 23:20:59 +0000 UTC [ - ]
99mans 2021-08-18 23:43:39 +0000 UTC [ - ]
eternalban 2021-08-18 23:19:21 +0000 UTC [ - ]
So by your logic, if I use a post office box, that means the postal service has the right to open all my packages?
Jtsummers 2021-08-18 23:31:12 +0000 UTC [ - ]
From: https://www.uspis.gov/wp-content/uploads/2019/05/USPIS-FAQs....
> 4. Can Postal Inspectors open mail if they feel it may contain something illegal? First-Class letters and parcels are protected against search and seizure under the Fourth Amendment to the Constitution, and, as such, cannot be opened without a search warrant. If there is probable cause to believe the contents of a First-Class letter or parcel violate federal law, Postal Inspectors can obtain a search warrant to open the mail piece. Other classes of mail do not contain private correspondence and therefore may be opened without a warrant.
Companies hosting your data aren't similarly restricted (though if the US government wants access then they'd be restricted by the Constitution and legislation again). A company's ability to look at whatever you give them is only restricted by your contract with them and the technical limitations created by how you share it (upload encrypted files where they don't have the key? they can't really do much). They may have some legal restrictions on some kinds of data, but it's not going to be uniform across the globe so you'll have to take care with which companies you choose to host your unencrypted data.
LdSGSgvupDV 2021-08-19 01:41:46 +0000 UTC [ - ]
naravara 2021-08-18 23:09:29 +0000 UTC [ - ]
I'm having trouble seeing how Apple's actions make this any more or less likely. It's not like matching photos is some esoteric concept that no repressive government has ever thought of before. It's not even like it's particularly hard. Apple's implementation is the most privacy sensitive way of doing it, but if the rules were going to come down they were going to come down, and they'd be implemented in less privacy sensitive ways.
simondotau 2021-08-19 00:55:20 +0000 UTC [ - ]
If Saudi Arabia or Sudan tried to turn the screws, the business case for Apple is absolutely clear-cut: they leave. This isn't even up for debate. There's far too much at risk globally compared to what there is to gain domestically from compliance.
Not only do they avoid serious damage to their global reputation (something they'll be extremely sensitive to, as the last two weeks have taught them), but leaving would represent a massive opportunity for Apple to earn weeks of free media coverage that aligns with their security narrative.
robertoandred 2021-08-19 00:48:59 +0000 UTC [ - ]