Policy groups ask Apple to drop plans to inspect messages, scan for abuse images
savant_penguin 2021-08-19 04:59:58 +0000 UTC [ - ]
They'd better monitor every aspect of their customers' lives to protect the children. Think of the children!
RpFLCL 2021-08-19 05:47:14 +0000 UTC [ - ]
Plumbing installed in people's homes should scan for illegally consumed narcotics or prescription medications and check them against pharmacy records. Your home's electrical wiring should make sure it's not being used to illegally grow drugs. Cellphone cameras should make sure they never record a naked body without a signed consent waiver. Guitar amplifiers should verify that you're not playing a copyrighted piece of music without a license.
How could we ever survive in a world where we relied upon targeted investigations and probable cause? How could we ever live in a world where pipes just moved things from A to B without inspecting them?
/s
maze-le 2021-08-19 07:42:58 +0000 UTC [ - ]
abraae 2021-08-19 05:08:01 +0000 UTC [ - ]
It's fair IMO to use fit-for-purpose technology to target specific crimes.
Though I agree this is bad technology, not fit for purpose, and they should yank it.
pjmlp 2021-08-19 05:51:53 +0000 UTC [ - ]
It is also the only country, of all those I have visited in almost 50 years, where great care is taken never to park illegally.
Sure, you will see it happen, but only away from city centers where no cameras are around.
intricatedetail 2021-08-19 12:30:37 +0000 UTC [ - ]
pjmlp 2021-08-19 12:32:21 +0000 UTC [ - ]
short_sells_poo 2021-08-19 08:28:40 +0000 UTC [ - ]
This is only true for the masses. Rich people just leave their cars parked anywhere they want in Mayfair in London because they don't care about a 50 quid fine.
zimpenfish 2021-08-19 08:41:22 +0000 UTC [ - ]
Although, to their credit, apparently[1] they do pay their parking fines.
[1] https://www.independent.co.uk/news/uk/home-news/us-diplomats...
alfiedotwtf 2021-08-19 05:37:00 +0000 UTC [ - ]
EGreg 2021-08-19 05:08:34 +0000 UTC [ - ]
Everything you said is already being collected about you. Cellphone towers can log where you are. Cameras around your city can recognize you. All they have to do is link all the feeds into one database that can be searched. The only place you can escape the watchful eye is your end to end encrypted stuff… which ain’t your SMS. And even there, you are just taking their word for it since it’s a closed source product.
nbzso 2021-08-19 08:39:13 +0000 UTC [ - ]
Can you imagine this "narrative" applied everywhere? From the moment you open your eyes, everything around you is extracting data, monitoring and policing you.
We have to redefine our relationship with software and hardware (electronics). The infrastructure (5G) and IoT advancements are here. Data is the fuel of the coming "consumer revolution". Apple just positioned themselves perfectly as a data broker for high-level customers: special groups, oppressive governments, you name it.
They will not back down from this. This is more profitable than selling shiny toys and "services". This gives political elites a reason not to break up Apple's monopolistic business practices.
I will go even further: this gives Apple a favorable position to become a trusted partner of the governments of the world.
Deconstructing this only from a technical point of view is inefficient. Apple is making an entrance onto the political stage by embracing the role of World Digital Police.
Everyone is free to draw their own conclusions and analysis; mine seem proportionate to me. I have removed macOS from my production workflow. I will never buy another Apple product. And I am willing to throw away "conveniences" and polished UX to preserve my privacy.
This privacy, not Apple's corporate redefinition:
The quality or condition of being secluded from the presence or view of others. The state of being free from public attention or unsanctioned intrusion. A state of being private, or in retirement from the company or from the knowledge or observation of others; seclusion.
pulse7 2021-08-19 11:19:13 +0000 UTC [ - ]
testfoobar 2021-08-19 05:02:08 +0000 UTC [ - ]
I would personally quit if asked to work on any of these teams.
If you were the QA team lead for this, how would you test it? I can't imagine anyone would do day-to-day testing with actual illegal images.
It would not surprise me if the dev and QA teams at Apple already have hash-colliding images to run their tests.
This is all awful. Steve Jobs must be rolling over in his grave.
unityByFreedom 2021-08-19 07:34:58 +0000 UTC [ - ]
June, 2010 - Jobs: "Privacy means people know what they're signing up for."
https://youtu.be/39iKLwlUqBo?t=154
December, 2015 - Cook: "If you put a back door in, that back door is for everybody, for good guys and bad guys."
https://youtu.be/rQebmygKq7A?t=66
August, 2021
Interviewer: "Isn't this in a way a back door?"
Federighi: "I really don't understand that characterization."
arvinsim 2021-08-19 10:28:41 +0000 UTC [ - ]
sumedh 2021-08-19 05:06:17 +0000 UTC [ - ]
You select a random image as your test image, mark it as CSAM in the backend, and then try to upload it from your test phone.
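To make that concrete, here is a minimal sketch of such a test, assuming a hypothetical backend match list; every name here (MatchList, the test function) is invented for illustration, and a cryptographic hash stands in for the perceptual hash the real system uses:

```python
# Hypothetical end-to-end test sketch: seed the backend's match list with the
# hash of a benign test image, then verify the upload path flags it.
# Uses SHA-256 as a stand-in; the real system uses perceptual hashing.
import hashlib

class MatchList:
    """Stand-in for the backend's database of flagged image hashes."""
    def __init__(self):
        self.hashes = set()

    def add(self, image_bytes: bytes):
        self.hashes.add(hashlib.sha256(image_bytes).hexdigest())

    def contains(self, image_bytes: bytes) -> bool:
        return hashlib.sha256(image_bytes).hexdigest() in self.hashes

def test_benign_image_marked_in_backend_is_flagged_on_upload():
    match_list = MatchList()
    test_image = b"not actually an image, just test bytes"
    match_list.add(test_image)                 # mark the benign image as a "match"
    assert match_list.contains(test_image)     # upload path should now flag it
    assert not match_list.contains(b"some other photo")
```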
testfoobar 2021-08-19 05:09:40 +0000 UTC [ - ]
Ambroos 2021-08-19 05:34:09 +0000 UTC [ - ]
johnny53169 2021-08-19 06:18:33 +0000 UTC [ - ]
sampling 2021-08-19 07:44:34 +0000 UTC [ - ]
1: Run on-device code to perform perceptual hash comparison of each photo against an on-device encrypted database of known CSAM hashes.
2: On iCloud Photos servers, send out the relevant notifications when a user’s iCloud Photos account exceeds a threshold of positive matches.
So as a high level testing strategy, I would want to:
- Verify the on-device lookup of CSAM hashes. This could be tested by provisioning a test device with an on-device database containing hashes of images that aren't actually illegal. As a bystander, I'd want confidence in this area because the on-device database Apple ships could conceivably be changed over time to expand the set of images it flags as CSAM.
- Do some exploratory testing to discover the threshold of how much image manipulation can be done on a flagged image before the perceptual hash comparison fails to return a match.
- Verify that the notification system notifies the correct parties once a user account exceeds the defined threshold of positive CSAM matches.
- Ensure the flagged account can still be investigated if the user deletes the offending material from iCloud, or deletes their account, before a real person gets around to investigating.
- Ensure that the logging is informative and adequate (contains device name, timestamp, etc.).
- Test behaviour on same iCloud account logged in to multiple devices.
- Figure out any additional business logic - are positive matches a permanent count on the account or are they reset after a certain amount of time?
source: https://www.apple.com/child-safety/pdf/Security_Threat_Model...
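As a rough illustration of the kind of harness this strategy implies, here is a toy sketch in Python: a simple average hash over an 8x8 grayscale grid standing in for NeuralHash (which works very differently), a Hamming-distance match against a seeded database, and a per-account threshold counter. All names are invented; only the threshold of 30 comes from Apple's published threat model.

```python
# Toy perceptual-hash harness: average hash, distance-based matching,
# and a threshold counter. A stand-in for NeuralHash, not a model of it.

def average_hash(pixels):
    """pixels: 8x8 grid of grayscale values 0-255. Returns a 64-bit int."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches(h, database, max_distance=4):
    """max_distance is the exploratory knob: how much image manipulation
    a flagged image survives before the comparison stops matching."""
    return any(hamming(h, known) <= max_distance for known in database)

def count_positive_matches(photos, database):
    return sum(1 for p in photos if matches(average_hash(p), database))

THRESHOLD = 30  # the match threshold from Apple's threat-model document

def should_notify(photos, database):
    return count_positive_matches(photos, database) >= THRESHOLD
```

Shrinking max_distance trades robustness to manipulation for fewer false positives, which is exactly the trade-off the exploratory-testing bullet above would probe.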
ByteWelder 2021-08-19 08:50:04 +0000 UTC [ - ]
nebula8804 2021-08-19 06:47:52 +0000 UTC [ - ]
They didn't even last 10 years since his death. :/
testfoobar 2021-08-19 05:24:29 +0000 UTC [ - ]
pulse7 2021-08-19 05:56:31 +0000 UTC [ - ]
boublepop 2021-08-19 15:21:11 +0000 UTC [ - ]
dkdbejwi383 2021-08-19 06:19:52 +0000 UTC [ - ]
zimpenfish 2021-08-19 08:43:07 +0000 UTC [ - ]
As could PhotoDNA, though, which everyone has been using for a decade to scan cloud photos - but it doesn't seem to have happened.
intricatedetail 2021-08-19 12:34:14 +0000 UTC [ - ]
kemayo 2021-08-19 14:27:53 +0000 UTC [ - ]
> More broadly, they said the change will break end-to-end encryption for iMessage, which Apple has staunchly defended in other contexts.
But... it wouldn't. The iMessage feature doesn't expose the contents of your message to anyone else under any circumstance.
If you're a child under 13 and your parents have opted in to this feature, you get a choice between seeing nude pictures sent to you (with your parents notified that you chose to) and not seeing them (with no notifications of anything). (But once you're 13+, no notifications occur either way.)
There are potential issues with this, mostly relating to abusive families being controlling. They'd have to do weird things like forcing their teenaged children to keep making new under-13 accounts to actually take advantage of it like that, though. And none of these issues impact the e2e status of iMessage in any way.
Apple really screwed up PR by launching the iMessage feature alongside the scanning-and-reporting iCloud Photos feature. There's so much confusion out there about this.
(The breaking-e2e aspect does exist with the iCloud Photos scanning... not that it's currently e2e, of course.)
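For clarity, here is the decision logic of the iMessage feature as this comment describes it, written out as a small Python function. This is just the parent's reading made explicit, not Apple's specification; note that nothing leaves the device in any branch, which is why e2e is unaffected.

```python
# Decision rules for the iMessage child-safety feature, as described above.
# Purely illustrative; the authoritative behavior is Apple's documentation.

def handle_flagged_image(age: int, parents_opted_in: bool,
                         user_chooses_to_view: bool):
    """Returns (image_shown, parents_notified)."""
    if age >= 13:
        # Teens may still see the warning, but parents are never notified.
        return (user_chooses_to_view, False)
    if not parents_opted_in:
        # Feature is off unless parents enabled it for an under-13 account.
        return (user_chooses_to_view, False)
    if user_chooses_to_view:
        return (True, True)    # child sees it; parents are told they chose to
    return (False, False)      # child declines; nobody is notified
```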
EricE 2021-08-19 14:44:35 +0000 UTC [ - ]
Content is no longer private if there is anything inspecting it other than you.
I expect cloud providers to fiddle with my data - because they can and do. I don't expect vendors, especially one touting themselves as being more privacy focused than everyone else, to be sifting through my data on device.
Yes, today it's a carefully constructed algorithm designed to be one-way. The algorithm isn't the problem. The problem is that the mechanism used to deliver this algorithm can also be used to deliver *other* algorithms. And no, especially since 9/11, I have zero faith that this is as far as this mechanism will ever be taken.
kemayo 2021-08-19 14:56:48 +0000 UTC [ - ]
The iMessage feature is "it looks like someone has sent you a picture of their dick, are you sure you want to see that?" I know adults who'd like to turn that feature on.
Could it hypothetically be extended in the future? Sure. But arguing against this feature as-is is about the same as arguing against the feature that lets you search for "dog" in your photos.
TechBro8615 2021-08-19 14:41:16 +0000 UTC [ - ]
kemayo 2021-08-19 14:46:02 +0000 UTC [ - ]
There's no e2e encryption for Photos currently, so it can't be breaking it there. If you use iCloud Photos, your photos are stored encrypted, but Apple has the key and could be doing all the cloud-scanning it felt like.
TechBro8615 2021-08-19 15:14:20 +0000 UTC [ - ]
kemayo 2021-08-19 15:26:03 +0000 UTC [ - ]
I'm sure in such a future there'd be quite the argument happening about whether you could call it e2e.
TechBro8615 2021-08-19 16:29:32 +0000 UTC [ - ]
kemayo 2021-08-19 16:52:43 +0000 UTC [ - ]
Let's not pretend that this sort of thing is happening in isolation. There really is a big legal movement towards requiring encryption backdoors, and programs like Apple's here are effectively part of a big negotiation with law enforcement -- note that the reason Apple's currently not e2e is apparently that the FBI pushed them on it[1]. "We'll give you what you claim is important, CSAM scans, and in return we'll lock everything else up securely". Is this great? No. Would an absolutist "law enforcement can have nothing" position backfire into stupid legal decisions? Plausibly.
[0]: https://en.wikipedia.org/wiki/PhotoDNA
[1]: https://www.popularmechanics.com/technology/security/a306318...
mtbnut 2021-08-19 16:05:03 +0000 UTC [ - ]
JumpCrisscross 2021-08-19 04:49:25 +0000 UTC [ - ]
jamil7 2021-08-19 07:19:57 +0000 UTC [ - ]
Salgat 2021-08-19 05:09:58 +0000 UTC [ - ]
cobookman 2021-08-19 05:23:28 +0000 UTC [ - ]
Google [1], Dropbox [2], Microsoft [3], and many, many other cloud photo storage providers have implemented similar child porn detection.
[1] https://www.columbiatribune.com/story/news/crime/2021/03/22/...
[2] https://gizmodo.com/dropbox-refuses-to-explain-its-mysteriou...
[3] https://www.local10.com/news/local/2020/02/21/microsoft-aler...
ipv6ipv4 2021-08-19 05:40:13 +0000 UTC [ - ]
This is the text of the current law [1]. Alerting authorities about CP is required if discovered. Actively searching for CP is not required. Look for the aptly named "protection of privacy" paragraph.
JumpCrisscross 2021-08-19 05:32:47 +0000 UTC [ - ]
This doesn't explain the insistence on scanning all images on device, whether they're hosted on iCloud or not.
The moment for leadership from Cook has come and almost passed. It's surprising watching Apple squander its brand at this moment. A good amount of its previous support with respect to federal antitrust has permanently dissipated.
paldepind2 2021-08-19 05:44:45 +0000 UTC [ - ]
minton 2021-08-19 11:36:40 +0000 UTC [ - ]
tpush 2021-08-19 05:39:55 +0000 UTC [ - ]
They're only obligated to report if found, but they don't have to proactively scan (yet?). In the EU until recently they weren't even allowed to voluntarily scan.
istingray 2021-08-19 04:50:13 +0000 UTC [ - ]
PopePompus 2021-08-19 05:00:17 +0000 UTC [ - ]
zionic 2021-08-19 14:56:39 +0000 UTC [ - ]
The engineers who built it, the project managers who coordinated it, and the executives who signed off on this smoothbrain idea need to be drummed out without ceremony.
No severance, no golden parachutes, no cushy board jobs etc.
People need to understand they risk their livelihoods building and deploying a surveillance system like this. In a just world they would face criminal charges.
cmelbye 2021-08-19 05:37:46 +0000 UTC [ - ]
minton 2021-08-19 11:30:51 +0000 UTC [ - ]
Lamad123 2021-08-19 09:24:01 +0000 UTC [ - ]
EGreg 2021-08-19 05:12:35 +0000 UTC [ - ]
Now Apple’s just a follower…
nebula8804 2021-08-19 06:50:02 +0000 UTC [ - ]
Take all this with a grain of salt. I wonder if Jobs was just good at keeping these brash personalities working together.
stackbutterflow 2021-08-19 05:43:07 +0000 UTC [ - ]
Jobs loved the end of skeuomorphic design, or Jobs loved skeuomorphic design?
dkdbejwi383 2021-08-19 06:25:37 +0000 UTC [ - ]
https://www.businessinsider.com/steve-jobss-signature-design...?
echelon 2021-08-19 05:40:19 +0000 UTC [ - ]
2. FBI/CIA, frustrated at previous failures, pressure senators/congresspeople to turn up the heat.
3. FBI/CIA then gives Apple the option to implement surveillance with the promise that they can make the antitrust troubles go away if Apple does what they want.
4. Apple does the thing. (We're here now.)
5. Antitrust case goes away.
The FSB, CCP, and other intelligence orgs might be trying the same strategy in their countries.
I expect this is how everything works in the big leagues.
istingray 2021-08-19 04:44:32 +0000 UTC [ - ]
DanWritesCode 2021-08-19 07:32:57 +0000 UTC [ - ]
After advocating to all my friends that Apple is a beacon of how privacy should be done, I just can't understand how they've made such a hash of this.
fsflover 2021-08-19 09:56:01 +0000 UTC [ - ]
LinuxBender 2021-08-19 13:10:16 +0000 UTC [ - ]
fsflover 2021-08-19 13:13:00 +0000 UTC [ - ]
LinuxBender 2021-08-19 14:22:27 +0000 UTC [ - ]
fsflover 2021-08-19 14:33:18 +0000 UTC [ - ]
Upd: I am not an expert, but maybe the schematics could help you here: https://source.puri.sm/Librem5/l5-schematic and https://wiki.pine64.org/index.php/PinePhone#PinePhone_board_....
LinuxBender 2021-08-19 14:48:21 +0000 UTC [ - ]
- Who writes and maintains the firmware the OS is running on.
- Who writes and maintains the modem firmware and who can update it.
- Who can update that firmware for the board the OS is running on. This could be a different answer than who initially creates it for the retail distribution of the phone.
- What level of trust has been inserted into the OS by kernel modules and who maintains those kernel modules.
- What control is given to the end user to see what those modules are doing and limit what they can do.
I suspect more questions could arise as kernel hackers audited the phone. The dilemma I see is that such kernel hackers won't be interested until those phones are wildly popular. The only other way to get their interest is with money.
fsflover 2021-08-19 14:57:02 +0000 UTC [ - ]
Both the Librem 5 and the Pinephone run FLOSS operating systems with software maintained by Purism and the community. The latter has two firmware blobs in the kernel for WiFi/Bluetooth, AFAIK: https://www.pine64.org/2020/01/24/setting-the-record-straigh....
> Who writes and maintains the modem firmware and who can update it.
See my link above concerning the Pinephone modem. Its software can be updated by the OS. AFAIK it's the same for the Librem 5.
> Who can update that firmware for the board the OS is running on.
See the first answer. Librem 5 is going to get FSF certification "Respects Your Freedom". Proprietary software only runs on the modem and WiFi card. See also: https://puri.sm/posts/librem5-solving-the-first-fsf-ryf-hurd....
> What level of trust has been inserted into the OS by kernel modules and who maintains those kernel modules.
I am not knowledgeable enough to answer that. Maybe the schematics linked above could help you.
> What control is given to the end user to see what those modules are doing and limit what they can do.
User has full control. This is a selling point of Purism the company: https://source.puri.sm/Librem5/community-wiki/-/wikis/Freque...
norov 2021-08-19 13:04:52 +0000 UTC [ - ]
cassalian 2021-08-19 11:46:46 +0000 UTC [ - ]
Here's an article from a year ago on EARN IT:
> Theoretically, a system that uses client-side scanning could still send messages encrypted end to end, and so the Leahy amendment would not offer any protection, but many of the same confidentiality concerns with backdoored “e2ee” systems would continue to apply.
Source: https://cdt.org/insights/the-new-earn-it-act-still-threatens...
EricE 2021-08-19 14:47:40 +0000 UTC [ - ]
It's all about conditioning. First it's about saving the children. Then it will be about stopping the terrorists. Finally it will be about preventing "hate speech", which these days is pretty much speech from anyone you disagree with :p
Like the novel 1984, Minority Report wasn't an instruction manual!
jitl 2021-08-19 04:53:24 +0000 UTC [ - ]
(This comment is intended as a joke.)
vegetablepotpie 2021-08-19 05:06:12 +0000 UTC [ - ]
EGreg 2021-08-19 05:10:09 +0000 UTC [ - ]
vegetablepotpie 2021-08-19 05:14:00 +0000 UTC [ - ]
EGreg 2021-08-19 05:16:14 +0000 UTC [ - ]
Thinking two steps ahead can be done even when you make mistakes
istingray 2021-08-19 06:05:35 +0000 UTC [ - ]
sharperguy 2021-08-19 05:48:41 +0000 UTC [ - ]
pulse7 2021-08-19 05:58:22 +0000 UTC [ - ]
pulse7 2021-08-19 05:50:14 +0000 UTC [ - ]
arbirk 2021-08-19 06:31:58 +0000 UTC [ - ]
Producing CSAM is one of the most extreme crimes against a person, and the number of people sharing and profiting from it is growing. This is something every tech company will have to deal with.
EricE 2021-08-19 14:49:18 +0000 UTC [ - ]
Really? Based on what? Multiple stories of hash collisions (false positives) are *already* popping up.
If only all the people who think this isn't a big deal were the first to get hit by false positives. Now that would be poetic justice :p
minton 2021-08-19 11:47:58 +0000 UTC [ - ]
amanaplanacanal 2021-08-19 12:01:42 +0000 UTC [ - ]
pulse7 2021-08-19 09:56:56 +0000 UTC [ - ]
umanwizard 2021-08-19 11:18:30 +0000 UTC [ - ]
jl2718 2021-08-19 11:44:36 +0000 UTC [ - ]
tlogan 2021-08-19 06:17:10 +0000 UTC [ - ]
EricE 2021-08-19 14:49:55 +0000 UTC [ - ]
robertwt7 2021-08-19 06:07:00 +0000 UTC [ - ]
somenewaccount1 2021-08-19 05:07:11 +0000 UTC [ - ]
ipv6ipv4 2021-08-19 05:41:51 +0000 UTC [ - ]
nonbirithm 2021-08-19 07:34:17 +0000 UTC [ - ]
The intent of the laws making the possession of CSAM illegal is ultimately to stop the spread of CSAM. If those laws fail in their stated purpose just because a company chooses not to be proactive in searching for the material, that would be missing the point. Media and governmental pressure would wipe any company foolish enough not to implement child-safety measures out of existence, but no major tech company is that foolish (except Kik, perhaps). And hypothetically, if some company did refuse to scan proactively, the government could just pass a law mandating proactive scanning; otherwise it would be failing at its own goal of preventing the spread of CSAM, and thus CSA.
But in practice, such a law isn't needed. No company wants to be derided as a service that allows child abusers to get away with their crimes.
BoHerfIIIJrEsq 2021-08-19 05:39:33 +0000 UTC [ - ]
endisneigh 2021-08-19 04:49:57 +0000 UTC [ - ]
https://transparencyreport.google.com/child-sexual-abuse-mat...
https://www.facebook.com/safety/onlinechildprotection
https://www.microsoft.com/en-us/photodna
https://blog.flickr.net/en/2021/04/16/keeping-flickr-and-chi...
Most companies dealing with photos have had this in place for almost a decade now. The FUD and slippery slope arguments are getting old.
ralph84 2021-08-19 05:06:23 +0000 UTC [ - ]
EGreg 2021-08-19 05:14:41 +0000 UTC [ - ]
https://m.youtube.com/watch?v=Af0gtsjfy7E
What’s your point? Apple said RISC was better and faster than Intel, until the day they adopted Intel, and then they said the complete opposite. They acted like they invented the Omnibox for Safari and so forth. It’s for marketing.
brendoelfrendo 2021-08-19 05:08:36 +0000 UTC [ - ]
endisneigh 2021-08-19 05:09:34 +0000 UTC [ - ]
commoner 2021-08-19 05:49:45 +0000 UTC [ - ]
Bilal_io 2021-08-19 04:59:29 +0000 UTC [ - ]
Just because the bar is so low does not mean we have to accept privacy abuse.
endisneigh 2021-08-19 05:00:11 +0000 UTC [ - ]
brendoelfrendo 2021-08-19 05:00:57 +0000 UTC [ - ]
No, why would I be happy with this? I actually prefer the alternative: scan photos server side, fine. That's Apple's computer and if I send my photos there, then I can't justifiably argue against that. But my device shouldn't be part of their attempt to appease law enforcement by turning the world into a dragnet.
endisneigh 2021-08-19 05:01:50 +0000 UTC [ - ]
torstenvl 2021-08-19 05:07:34 +0000 UTC [ - ]
tyingq 2021-08-19 05:10:00 +0000 UTC [ - ]
brendoelfrendo 2021-08-19 05:02:59 +0000 UTC [ - ]
EricE 2021-08-19 14:52:10 +0000 UTC [ - ]
Especially in this post 9/11 world - look at what was supposed to be "temporary" to deal with the "terrorists". Ha! You have to be an utter simpleton if you don't think use of this mechanism will expand. It's not even a leap of logic - it's been going on for 20 years all around us!
brandon272 2021-08-19 17:01:46 +0000 UTC [ - ]
I'm sure 3 weeks ago no one could have fathomed that Apple would make an announcement like this, yet here we are.
arthurcolle 2021-08-19 04:42:33 +0000 UTC [ - ]
Can't believe this dude is running software at Apple.
The headline for all of this should be "We know what it does and we don't want it"
SilverRed 2021-08-19 04:50:09 +0000 UTC [ - ]
Everyone understands well enough how it works and understands how trivially it could be changed to search for other kinds of content.
Animats 2021-08-19 06:22:21 +0000 UTC [ - ]
At some point, the back story behind this will leak out. That will be interesting. This has to be something Apple was asked/pressured/ordered to do. As a business activity it makes no sense.
Bear in mind that over the last week, people in Afghanistan have been frantically trying to erase any evidence of doing things the Taliban doesn't like, such as women playing sports.
unityByFreedom 2021-08-19 07:29:23 +0000 UTC [ - ]
I firmly believe we'll never discover who applied that pressure, or how. And it's hard for me to imagine the US government doing that without some court battle with serious risk of leaks to journalists. I mean, it makes no financial sense, so Apple would fight that. The only thing I can imagine is pressure from a private interest whose market is big enough for Apple to care. And if it is that market, then I think we're in for a tough ride until we can build our own devices.
red_admiral 2021-08-19 07:41:40 +0000 UTC [ - ]
If you read the documentation on how it's implemented, there's some fairly advanced crypto - private set intersection, threshold secret sharing - that only makes sense if Apple took the line "we have to do this, but we're willing to do it in a really expensive way so that we ourselves have as little access as possible". They went to the effort of running the NeuralHash on the client device, as far as I understand.
The standard implementation on other cloud providers is that the provider has access to your data on the server if they really need it - much cheaper, and makes it technically possible in future to easily change the T&C to "we may use this for market research and to improve our products". I view Apple's client-side implementation as drawing a very big line in the sand saying we will not do that and we are willing to put our money where our mouth is by writing this crypto protocol.
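To illustrate just the "threshold" half of that crypto, here is a toy Shamir secret-sharing sketch in Python. Apple's actual construction (private set intersection combined with threshold shares of an account key) is considerably more involved; this only demonstrates the core property that the server learns nothing until it holds at least t shares.

```python
# Toy Shamir secret sharing over GF(P): each positive match conceptually
# releases one share; below the threshold t, the shares reveal nothing.
import random

P = 2**61 - 1  # a Mersenne prime; all arithmetic is mod P

def make_shares(secret, t, n):
    """Split `secret` into n shares, any t of which reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x=0; needs at least t shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = make_shares(secret=123456789, t=30, n=100)
assert reconstruct(shares[:30]) == 123456789   # threshold met: key recovered
# Below threshold, the result is wrong with overwhelming probability:
assert reconstruct(shares[:29]) != 123456789
```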
But that certainly raises the question, why would Apple do anything like that at all, unless there's some (current or anticipated) external threat that forced them to?
As to whether we'll ever discover it - I wouldn't have predicted the Snowden revelations, maybe this will come out the next time something like that happens. If it does, I also predict we'll be shocked to discover just how big the market and distribution network for child abuse online is, and how many people are involved.
unityByFreedom 2021-08-19 07:47:59 +0000 UTC [ - ]
I don't think it is big compared to what authoritarian states would like to use this for. You're talking about, at the very least, putting everyone in non-democratic states under even more pressure to obey than they're already under. And in democratic states this invites bad actors both in and outside the government. There is no net-gain in justice here. It's wrong across the board.
Welcome back to HN after taking 5 months off, btw. You're an SSC fan?
red_admiral 2021-08-19 08:02:20 +0000 UTC [ - ]
Yes, I agree the state market is even bigger (China's potential surveillance market alone is 1.3 billion people and rising), but I've heard from colleagues in tech who seriously think child abuse involves only a very small number of bad actors. My understanding is that the "industry" is almost as out of control as the illegal drugs trade.
I am indeed a SSC/ACX fan (and subscriber).
hdjjhhvvhga 2021-08-19 11:54:01 +0000 UTC [ - ]
Snowden paid a very high price, one not many people can afford, especially when they have a family. So I highly doubt we will learn it within a reasonable timeframe.
arvinsim 2021-08-19 09:48:44 +0000 UTC [ - ]
ByteWelder 2021-08-19 08:39:40 +0000 UTC [ - ]
Anything going through a FISA court (Foreign Intelligence Surveillance Court) could come with a gag order, preventing anyone involved from talking about it.
amanaplanacanal 2021-08-19 11:43:32 +0000 UTC [ - ]
ByteWelder 2021-08-19 12:49:45 +0000 UTC [ - ]
> "Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material, we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children,"
Many people and organizations fear that Apple's plans are paving the way for governments to scan for more than just child abuse images. (e.g. https://appleprivacyletter.com/)
In that light, FISA is relevant. My statement was to show that not all dealings of the US government are public, because not all courts and legal proceedings are open to the public. So a "serious risk of leaks to journalists" is unlikely in that case.
jjcon 2021-08-19 07:22:39 +0000 UTC [ - ]
hughrr 2021-08-19 07:33:22 +0000 UTC [ - ]
This is a fairly typical corporate failure mode.
Sometimes the best thing is to do nothing, as they have found out: now they're stuck in a position where the two exits are either pissing off everyone by writing off their privacy stance, or pissing off everyone else by canning the child porn scanning.
What a complete fuck up.
michelb 2021-08-19 08:46:37 +0000 UTC [ - ]
hdjjhhvvhga 2021-08-19 11:58:38 +0000 UTC [ - ]
unityByFreedom 2021-08-19 07:31:21 +0000 UTC [ - ]
kgwxd 2021-08-19 07:05:16 +0000 UTC [ - ]
varispeed 2021-08-19 11:02:18 +0000 UTC [ - ]
I think it's not really a problem of what is being looked for, but the mere fact that it will be done without a warrant on a personal device.
ilogik 2021-08-19 07:39:19 +0000 UTC [ - ]
zimpenfish 2021-08-19 08:27:55 +0000 UTC [ - ]
(One counterpoint to this is that PhotoDNA has been used for scanning cloud images using NCMEC-provided hashes for a decade now and this slippery slope doesn't seem to have happened yet.)
ilogik 2021-08-19 08:32:41 +0000 UTC [ - ]
My understanding is that there is a threshold that needs to be met, and then an Apple employee with a very difficult job will need to confirm the image. So even if NCMEC gets pressured, I don't see what the result could be.
tjpnz 2021-08-19 09:17:57 +0000 UTC [ - ]
ilogik 2021-08-19 09:37:35 +0000 UTC [ - ]
umanwizard 2021-08-19 11:08:35 +0000 UTC [ - ]
Replace “FBI” and “Biden” with their equivalents in various other countries and yes, it’s absolutely plausible.
saba2008 2021-08-19 12:42:12 +0000 UTC [ - ]
ilogik 2021-08-19 13:06:01 +0000 UTC [ - ]
umanwizard 2021-08-19 13:09:38 +0000 UTC [ - ]
ilogik 2021-08-19 14:37:39 +0000 UTC [ - ]
scrps 2021-08-19 06:45:06 +0000 UTC [ - ]
Stop feeding them and they will starve on their ignorance. I'd rather be poor in a better world than rich in a dystopia.
varispeed 2021-08-19 11:06:51 +0000 UTC [ - ]
The problem is that we all end up poor in a dystopia. The regulatory capture of industries is ongoing. Where 20 years ago you could start your own business in your garage without substantial capital, today you are dependent on "angel investors" and bank loans, which is the equivalent of having to be referred into a "club". If you can't find an investor, it will take you a lifetime to save enough money to start a business, if you are lucky. Take notice that taxation is being shifted onto workers to limit their chances of raising capital and creating competition.
labster 2021-08-19 05:16:06 +0000 UTC [ - ]
Not sure if it’s true or not, but we shouldn’t assume companies always intend their public plans to succeed. Maybe they actually want a high profile retreat — it could certainly save them development time down the road by not implementing government misfeatures.
JumpCrisscross 2021-08-19 05:35:48 +0000 UTC [ - ]
Still incompetent. This would have made sense under another administration. The only thing holding back the antitrust dogs is public opinion. An Apple antitrust case would be politically costly in a way a Facebook case wouldn't. This type of thing corrodes the core of the pro-Apple vocal minority. It's bad GR, if that was the plan.
arthurcolle 2021-08-19 05:50:19 +0000 UTC [ - ]
JumpCrisscross 2021-08-19 06:43:53 +0000 UTC [ - ]
cryptonector 2021-08-19 05:39:37 +0000 UTC [ - ]
JumpCrisscross 2021-08-19 05:58:50 +0000 UTC [ - ]
Neither do they about privacy.
cryptonector 2021-08-19 14:51:09 +0000 UTC [ - ]
goldcd 2021-08-19 16:33:05 +0000 UTC [ - ]
Rather than fighting small battles to defend privacy, just take a hyperbolic extreme and use the backlash to bolster their actual position.
(or I genuinely think they have lost their f'in minds)
rjzzleep 2021-08-19 05:03:21 +0000 UTC [ - ]
https://www.apple.com/leadership/craig-federighi/
arthurcolle 2021-08-19 05:48:48 +0000 UTC [ - ]
There are so many more talented software development leaders out there (maybe scratch all the ones working on adtech), but he's clearly a cultural troglodyte and this is evidence that he is incapable of actually leading the future engineers and software workers of the world.
If I saw him in a social setting I would try to ask him how he lives with himself, actively enabling the potential future dystopia and being incapable of defending the actual impetus of his actions, much less the reaction to it. Truly unforgivable, especially in an age of increased social fragmentation.
rjzzleep 2021-08-19 06:32:55 +0000 UTC [ - ]
He seems like a guy who genuinely believes that what he's doing is a net positive for society and that society should just leave things in his hands.
There were plenty of people like that at Google, last I remember. I used to go to Google dev days, and there was no greater contrast than between the really talented technical people presenting interesting things and the engineering leadership who believed that everyone should just have a dumb terminal in their hand, that Google should do all the processing, and that this would be so much better for society.
zepto 2021-08-19 04:53:03 +0000 UTC [ - ]
There are good reasons not to want this. Just not making everyone into a suspect would be a start.
But whoever wrote the letter is clueless.
elisbce 2021-08-19 05:01:43 +0000 UTC [ - ]
In my opinion, this is the biggest concern, not the technology. Before, Apple could simply refuse by saying they didn't have the capability. Now that excuse is gone. Apple's promise to have humans review content and to report only CSAM is the weakest link.
testfoobar 2021-08-19 06:02:56 +0000 UTC [ - ]
zepto 2021-08-19 17:08:52 +0000 UTC [ - ]
zimpenfish 2021-08-19 08:34:33 +0000 UTC [ - ]
could, yes, but given PhotoDNA has been using similar hashes from NCMEC for a decade and doesn't appear to have had similar surveillance pressure imposed, what makes the Apple scanning different to warrant slippery slope arguments like this?
cwizou 2021-08-19 11:38:54 +0000 UTC [ - ]
zepto 2021-08-19 05:33:27 +0000 UTC [ - ]
What do they have now that they didn’t have before?
karlshea 2021-08-19 05:46:41 +0000 UTC [ - ]
So you'd have to pressure NCMEC and another org under a different government to both add the non-CSAM hash, plus Apple would need to be pressured into verifying a non-CSAM derivative image, plus you'd need other hash matches on-device to exceed the threshold before they could even do the review in the first place (they can't even tell whether there was a match unless the threshold is exceeded).
I get why people are concerned, but between this thread and the other thread yesterday it's clear that pretty much everyone discussing this has no idea how it works.
1: https://www.apple.com/child-safety/pdf/Security_Threat_Model...
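A small sketch of that dual-source safeguard as the threat-model PDF describes it: only hashes that appear in child-safety databases from at least two separate sovereign jurisdictions make it into the on-device database, so a single government pressuring its own organization achieves nothing. The function and data below are invented for illustration.

```python
# Intersection-of-jurisdictions rule: a hash ships on-device only if two or
# more independent child-safety organizations provided it.

def build_on_device_database(databases):
    """databases maps jurisdiction name -> set of image hashes it provides."""
    included = set()
    all_hashes = set().union(*databases.values())
    for h in all_hashes:
        providers = {name for name, hashes in databases.items() if h in hashes}
        if len(providers) >= 2:
            included.add(h)
    return included

db = build_on_device_database({
    "NCMEC (US)": {"aaa", "bbb", "ccc"},
    "IWF (UK)":   {"bbb", "ccc", "ddd"},
})
assert db == {"bbb", "ccc"}   # "aaa" and "ddd" came from only one source
```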
jjulius 2021-08-19 05:57:27 +0000 UTC [ - ]
tylerchr 2021-08-19 06:42:43 +0000 UTC [ - ]
Apple could have quietly implemented CSAM scanning server-side, and left the door open to it being quietly exploited in who knows what way. But they didn’t: instead they put a whole bunch of infrastructure in place that all but guarantees that they’ll be immediately caught (and publicly excoriated) if they try to use this CSAM mechanism for anything other than CSAM. (See the PDF that GP linked for technical details on why.)
Of course, they could still do it with some other mechanism. But in that case none of these CSAM changes are at all relevant to the concern, as that risk is unchanged from a month ago.
zimpenfish 2021-08-19 08:35:21 +0000 UTC [ - ]
Aren't iCloud Photos already scanned for CSAM though?
zepto 2021-08-19 17:09:17 +0000 UTC [ - ]
karlshea 2021-08-19 06:23:11 +0000 UTC [ - ]
zimpenfish 2021-08-19 08:37:07 +0000 UTC [ - ]
To be fair, if the other organisation is IWF[1] under the UK government, I don't think there'd be much pressure needed to get them to comply - just offer to bung them and their mates a few million in contracts and you'd be golden.
It's a sensible plan, it just might not be as strong as it seems.
[1] https://www.iwf.org.uk
hypothesis 2021-08-19 04:50:03 +0000 UTC [ - ]
And that interview is likely supposed to telegraph that they “don’t care”. It wasn’t a live one, was it? They could have improved parts of it if they wanted.
beckman466 2021-08-19 05:06:54 +0000 UTC [ - ]
Anybody have a link?
arthurcolle 2021-08-19 05:21:54 +0000 UTC [ - ]
Here you go
anshumankmr 2021-08-19 05:28:44 +0000 UTC [ - ]
Clubber 2021-08-19 15:35:55 +0000 UTC [ - ]
> Team Apple,

This is from someone outside the Apple ecosystem whom management brought in and gave a lot of credence to.

> I wanted to share a note of encouragement to say that everyone at NCMEC is SO PROUD of each of you and the incredible decisions you have made in the name of prioritizing child protection.

The recipients of this memo didn't make any of the decisions; those were foisted on them by management, who in turn were heavily influenced by this person/organization. As to why management was so heavily influenced, I have no idea. This is an attempt to get buy-in from Apple employees, I suspect, or at least to ease their concerns: "Don't worry, you are doing the right thing."

> It’s been invigorating for our entire team to see (and play a small role in) what you unveiled today.

The team they are referring to is the one that pushed for the child-protection implementation. The recipients did all the work because management told them to.

> I know it’s been a long day and that many of you probably haven’t slept in 24 hours. We know that the days to come will be filled with the screeching voices of the minority.

This is the big quote. The author knew what this would do, and knows the tarnish the Apple brand is about to go through, but is trying to convince the audience that only a small group of people will reject this and that it will pass over. For the author, the ends justify any means, because they are probably personally and emotionally traumatized by child porn (rightfully so, given their work). Once that happens, though, any rationality about the consequences of action goes out the window. This is an attempt to bring everyone concerned to the same ends-justify-the-means position as the author. I don't know if the author believes this is just a temporary outrage or not; I suspect not. It doesn't really matter, though: the author got what they wanted, Apple's brand be damned.

> Our voices will be louder.

This obviously isn't the case. I'm not sure if the author means their organization, or their organization plus Apple. Either way, the counter-resistance to this has been pretty minimal. They seem to have left Apple hanging out to dry.

> Our commitment to lift up kids who have lived through the most unimaginable abuse and victimizations will be stronger.

> During these long days and sleepless nights, I hope you take solace in knowing that because of you many thousands of sexually exploited victimized children will be rescued, and will get a chance at healing and the childhood they deserve.

This is the emotional sell, or "for the children." It's also a difficult bridge to reason across: treating the possession of pictures and the manufacture of pictures as the same thing. The hash comparison only finds existing, already-manufactured pictures, so I don't see how it will protect children from any abuse that hasn't already occurred. Not only is this an emotional sell, it's a lie.

> Thank you for finding a path forward for child protection while preserving privacy.

The final emotional pitch, directed at the people at Apple typically concerned with privacy. They aren't really protecting children, and they also aren't preserving privacy. It's a boondoggle.

In closing, "for the children" is almost always a trap. It was used when I was a kid to mass-incarcerate Americans with drug problems, many of them the very children being protected. It's typically done in an any-means-necessary, heavy-handed manner, as seen here. I don't condone child pornography in any way; I have never seen it and hope never to. At the same time, what they are doing is the road to hell paved with good intentions, seemingly by people so traumatized (by the nature of their work) that they can't see the collateral damage they are causing.
istingray 2021-08-19 04:49:30 +0000 UTC [ - ]