Hugo Hacker News

Policy groups ask Apple to drop plans to inspect messages, scan for abuse images

arthurcolle 2021-08-19 04:42:33 +0000 UTC [ - ]

Apple did this in such a myopic, I want to say almost comically stupid manner. Craig's interview in and of itself was the biggest indictment of technology companies one could ever hope for. Dude literally stumbles as he says "multi-step algorithm" as if the world is being graced by his bountiful excellence.

Can't believe this dude is running software at Apple.

The headline for all of this should be "We know what it does and we don't want it"

SilverRed 2021-08-19 04:50:09 +0000 UTC [ - ]

All of the PR out of Apple seems to assume that people simply do not understand how it works, and that if they did, they would feel better.

Everyone understands well enough how it works and understands how trivially it could be changed to search for other kinds of content.

Animats 2021-08-19 06:22:21 +0000 UTC [ - ]

Right.

At some point, the back story behind this will leak out. That will be interesting. This has to be something Apple was asked/pressured/ordered to do. As a business activity it makes no sense.

Bear in mind that over the last week, people in Afghanistan have been frantically trying to erase any evidence of doing things the Taliban doesn't like, such as women playing sports.

unityByFreedom 2021-08-19 07:29:23 +0000 UTC [ - ]

> This has to be something Apple was asked/pressured/ordered to do. As a business activity it makes no sense.

I firmly believe we'll never discover who applied that pressure, or how. And it's hard for me to imagine the US government doing that without some court battle with serious risk of leaks to journalists. I mean, it makes no financial sense, so Apple would fight that. The only thing I can imagine is pressure from a private interest whose market is big enough for Apple to care. And if it is that market, then I think we're in for a tough ride until we can build our own devices.

red_admiral 2021-08-19 07:41:40 +0000 UTC [ - ]

I think there are some strong hints in Apple's implementation that they were under external pressure.

If you read the documentation on how it's implemented, there's some fairly advanced crypto - private set intersection, threshold secret sharing - that only makes sense if Apple took the line "we have to do this, but we're willing to do it in a really expensive way so that we ourselves have as little access as possible". They went to the effort of running the NeuralHash on the client device, as far as I understand.
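
To make the threshold part concrete, here's a toy Shamir-style secret-sharing sketch in Python. This is my own simplification for illustration, not Apple's actual protocol; the idea is that the server holds one share per matched photo and can reconstruct the decryption key only once it has at least `threshold` of them:

    import secrets

    PRIME = 2**127 - 1  # prime field for the arithmetic; big enough for a demo key

    def make_shares(secret, threshold, count):
        # Random polynomial of degree threshold-1 whose constant term is the secret.
        coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(threshold - 1)]
        def f(x):
            acc = 0
            for c in reversed(coeffs):  # Horner evaluation mod PRIME
                acc = (acc * x + c) % PRIME
            return acc
        return [(x, f(x)) for x in range(1, count + 1)]

    def reconstruct(shares):
        # Lagrange interpolation at x = 0 recovers the constant term (the secret).
        total = 0
        for xi, yi in shares:
            term = yi
            for xj, _ in shares:
                if xj != xi:
                    term = term * (-xj) % PRIME * pow(xi - xj, -1, PRIME) % PRIME
            total = (total + term) % PRIME
        return total

    key = secrets.randbelow(PRIME)  # stands in for a per-account decryption key
    shares = make_shares(key, threshold=30, count=100)  # one share per matched photo
    assert reconstruct(shares[:30]) == key   # 30 shares: reconstruction works
    assert reconstruct(shares[:29]) != key   # 29 shares: (almost surely) garbage out

Below the threshold, any 29 shares are consistent with every possible key, which is why the server provably can't even peek at partial matches.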

The standard implementation on other cloud providers is that the provider has access to your data on the server if they really need it - much cheaper, and it makes it technically possible in the future to easily change the T&C to "we may use this for market research and to improve our products". I view Apple's client-side implementation as drawing a very big line in the sand saying we will not do that, and we are willing to put our money where our mouth is by writing this crypto protocol.

But that certainly raises the question, why would Apple do anything like that at all, unless there's some (current or anticipated) external threat that forced them to?

As to whether we'll ever discover it - I wouldn't have predicted the Snowden revelations; maybe this will come out the next time something like that happens. If it does, I also predict we'll be shocked to discover just how big the market and distribution network for child abuse online is, and how many people are involved.

unityByFreedom 2021-08-19 07:47:59 +0000 UTC [ - ]

> I also predict we'll be shocked to discover just how big the market and distribution network for child abuse online is, and how many people are involved.

I don't think it is big compared to what authoritarian states would like to use this for. You're talking about, at the very least, putting everyone in non-democratic states under even more pressure to obey than they're already under. And in democratic states this invites bad actors both in and outside the government. There is no net-gain in justice here. It's wrong across the board.

Welcome back to HN after taking 5 months off, btw. Are you an SSC fan?

red_admiral 2021-08-19 08:02:20 +0000 UTC [ - ]

Thanks!

Yes, I agree the state market is even bigger - China's potential surveillance market alone is 1.3 billion people and rising - but I've heard from colleagues in tech who seriously think online child abuse involves only a very small number of bad actors. My understanding is that the "industry" is almost as out of control as the illegal drug trade.

I am indeed a SSC/ACX fan (and subscriber).

hdjjhhvvhga 2021-08-19 11:54:01 +0000 UTC [ - ]

> As to whether we'll ever discover it - I wouldn't have predicted the Snowden revelations

Snowden paid a very high price that not many people can afford, especially when they have a family. So I highly doubt we'll learn the truth within a reasonable timeframe.

arvinsim 2021-08-19 09:48:44 +0000 UTC [ - ]

What are the ramifications if Apple outright tells the public that they are cooperating with the authorities, assuming there is no gag order?

ByteWelder 2021-08-19 08:39:40 +0000 UTC [ - ]

> And it's hard for me to imagine the US government doing that without some court battle with serious risk of leaks to journalists.

Anything going through a FISA court (Foreign Intelligence Surveillance Court) could come with a gag order, preventing anyone involved from talking about it.

amanaplanacanal 2021-08-19 11:43:32 +0000 UTC [ - ]

But FISA doesn’t have anything to do with kiddie porn.

ByteWelder 2021-08-19 12:49:45 +0000 UTC [ - ]

While that is true, the security concerns are much wider than child abuse scanning. From the article:

> "Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material, we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children,"

Many people and organizations fear that Apple's plans are paving the way for governments to scan for more than just child abuse images. (e.g. https://appleprivacyletter.com/)

In that light, FISA is relevant. My point was that not all dealings of the US government are public, because not all courts and legal proceedings are open to the public. So the "serious risk of leaks to journalists" likely wouldn't materialize in that case.

jjcon 2021-08-19 07:22:39 +0000 UTC [ - ]

I’m inclined to believe it was a quid pro quo (or a play at one). Apple inserts a little back door; in exchange, the antitrust case against them gets eased.

hughrr 2021-08-19 07:33:22 +0000 UTC [ - ]

Based on the letter from NCMEC describing opponents of the technology as “screeching voices of the minority”, I suspect this is some variant of being guilted into doing something by lobbying, marketing, or internal proponents. This resulted in a tone-deaf death march, and now they’re trying to work out how to spin it.

This is a fairly typical corporate failure mode.

Sometimes the best thing is to do nothing, as they have found out, because now they’re stuck in a position where the two exits are either pissing off everyone by writing off their privacy stance, or pissing off everyone else by canning the child porn scanning.

What a complete fuck up.

michelb 2021-08-19 08:46:37 +0000 UTC [ - ]

I'm convinced none of these conspiracy theories hold up. This is a typical Apple thing: they thought long and hard about something in a mostly isolated setting and then missed some obvious things.

hdjjhhvvhga 2021-08-19 11:58:38 +0000 UTC [ - ]

You might be right. On the other hand, all their past controversial ideas were reasonable from a financial point of view: introduce non-standard connectors, remove the minijack, make the hardware non-upgradeable, and so on. All these things were bad for consumers but good for Apple's bottom line. This one is bad for everybody. Apple's management is not stupid; they must have a good reason.

unityByFreedom 2021-08-19 07:31:21 +0000 UTC [ - ]

That sounds too corrupt to be believed. If someone came out with evidence of it, that would completely blow such a deal.

kgwxd 2021-08-19 07:05:16 +0000 UTC [ - ]

Gaslighting is the term. From the first instance anyone (or any entity) uses it, trust should be over - forever, never to be regained.

varispeed 2021-08-19 11:02:18 +0000 UTC [ - ]

> Everyone understands well enough how it works and understands how trivially it could be changed to search for other kinds of content.

I think it's not really a problem of what is being looked for, but the mere fact that it will be done without a warrant on a personal device.

ilogik 2021-08-19 07:39:19 +0000 UTC [ - ]

Can someone please explain to me the scary scenario where this gets misused? Am I the only one who isn't seeing it?

zimpenfish 2021-08-19 08:27:55 +0000 UTC [ - ]

The scanning matches against CSAM hashes provided by NCMEC - the fear is that NCMEC will be pressured into including hashes of non-CSAM material that USGOV wants to track as well (say, MAGA images for the current administration).

(One counterpoint to this is that PhotoDNA has been used for scanning cloud images using NCMEC-provided hashes for a decade now and this slippery slope doesn't seem to have happened yet.)
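
Mechanically, the match step is just set membership on hashes. A minimal sketch with an ordinary cryptographic hash - the digest below is just sha256 of a test string - keeping in mind that real systems use perceptual hashes like PhotoDNA/NeuralHash precisely so resized or re-encoded copies still match:

    import hashlib

    # Hypothetical stand-in for the NCMEC-provided hash list (hex digests).
    KNOWN_HASHES = {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def is_flagged(image_bytes: bytes) -> bool:
        # A cryptographic hash only catches byte-identical copies; perceptual
        # hashing is what makes near-duplicates match too.
        return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

    print(is_flagged(b"test"))   # True - that digest is sha256(b"test")
    print(is_flagged(b"test "))  # False - one byte different, hash is unrelated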

ilogik 2021-08-19 08:32:41 +0000 UTC [ - ]

This is a very weak argument. The MAGA crowd has been posting images all over social media.

My understanding is that there is a threshold that needs to be met, and then an Apple employee with a very difficult job will need to confirm the image. So even if NCMEC gets pressured, I don't see what the result could be.

tjpnz 2021-08-19 09:17:57 +0000 UTC [ - ]

That Apple employee (assuming the role doesn't get contracted out) has no incentive to protect you or me. But they'll absolutely err on the side of caution to protect their own livelihood.

ilogik 2021-08-19 09:37:35 +0000 UTC [ - ]

And what's the next step? The FBI breaks down my door because I said something bad about Biden?

umanwizard 2021-08-19 11:08:35 +0000 UTC [ - ]

Something like that is unlikely to happen because the US is not — for now — an authoritarian state.

Replace “FBI” and “Biden” with their equivalents in various other countries and yes, it’s absolutely plausible.

saba2008 2021-08-19 12:42:12 +0000 UTC [ - ]

There is also an international perspective: once this mechanism is in place, forced by the US, it will become normalized within several years. Then China will have much better leverage to implement its own scans. Once China does it, every powerful state does it. And after that, every third-world hole will scan for everything.

ilogik 2021-08-19 13:06:01 +0000 UTC [ - ]

What's stopping them from doing that now?

umanwizard 2021-08-19 13:09:38 +0000 UTC [ - ]

Because Apple does not currently have code for scanning local devices for illegal images, and would have to implement it, which is a much higher bar than having already implemented it and just needing to add a few hashes to a database.

ilogik 2021-08-19 14:37:39 +0000 UTC [ - ]

But they do have it. Pretty much all image hosts are already doing it as well.

scrps 2021-08-19 06:45:06 +0000 UTC [ - ]

The second the world realized those weird nerds no one paid any attention to were gold mines was the beginning of the end. Whatever values we held as hackers, engineers, and scientists are now subservient to the monied powers who could never figure any of this out on their own, but who are exceptionally skilled at convincing or threatening everyone into dancing to the beat of their own self-interest.

Stop feeding them and they will starve on their ignorance. I'd rather be poor in a better world than rich in a dystopia.

varispeed 2021-08-19 11:06:51 +0000 UTC [ - ]

> I'd rather be poor in a better world than rich in a dystopia.

The problem is that we all end up being poor in a dystopia. The regulatory capture of industries is an ongoing thing. Where 20 years ago you could start your own business in your garage without substantial capital, today you are dependent on "angel investors" and bank loans, which is the equivalent of having to be referred into a "club". If you can't find an investor, it will take you a lifetime to save the money to start a business, if you are lucky. Take notice that all taxation is being shifted to workers to limit their chances of raising capital and creating competition.

labster 2021-08-19 05:16:06 +0000 UTC [ - ]

I’m starting to wonder if this was Apple’s strategy all along. They’ve had pressure for years from law enforcement to abuse their monopoly power to invade people's privacy. So they implement it with a relatively naive approach and sell it in a ham-fisted way, so that everyone turns against them for a little while. Apple backs off, and every time after this they point law enforcement to this debacle: see, we can’t help you, our customers wouldn’t tolerate it!

Not sure if it’s true or not, but we shouldn’t assume companies always intend their public plans to succeed. Maybe they actually want a high profile retreat — it could certainly save them development time down the road by not implementing government misfeatures.

JumpCrisscross 2021-08-19 05:35:48 +0000 UTC [ - ]

> they actually want a high profile retreat

Still incompetent. This would have made sense under another administration. The only thing holding back the antitrust dogs is public opinion. An Apple antitrust case would be politically costly in a way a Facebook case wouldn't be. This type of thing corrodes the core of the pro-Apple vocal minority. It's bad GR, if that was the plan.

arthurcolle 2021-08-19 05:50:19 +0000 UTC [ - ]

GR?

JumpCrisscross 2021-08-19 06:43:53 +0000 UTC [ - ]

Government relations.

cryptonector 2021-08-19 05:39:37 +0000 UTC [ - ]

The average joe doesn't care about antitrust.

JumpCrisscross 2021-08-19 05:58:50 +0000 UTC [ - ]

> average joe doesn't care about antitrust

Neither do they care about privacy.

cryptonector 2021-08-19 14:51:09 +0000 UTC [ - ]

They almost certainly care more about privacy. Enough more? No. But when horror stories start filtering out they'll care more, even if it is too late.

goldcd 2021-08-19 16:33:05 +0000 UTC [ - ]

I genuinely think it's intentional - Apple want to be stopped.

Rather than fighting small battles to defend privacy, just take a hyperbolic extreme and use the backlash to bolster their actual position.

(or I genuinely think they have lost their f'in minds)

rjzzleep 2021-08-19 05:03:21 +0000 UTC [ - ]

My initial response was that maybe that's why Apple's software development has gone down in quality so much in recent years. But then again, he's been leading it since 2009, so maybe not. But it's definitely around the same time that they started removing programmability from the core OS.

https://www.apple.com/leadership/craig-federighi/

arthurcolle 2021-08-19 05:48:48 +0000 UTC [ - ]

I deeply hope that he gets pushed out because of this. The skeuomorphism guy was definitely horrible and obviously was the cause of many assaults upon the senses, but this CSAM spyware scandal was such an affront to all that democratic, enlightened society holds close to the chest.

There are so many more talented software development leaders out there (maybe scratch all the ones working on adtech), but he's clearly a cultural troglodyte and this is evidence that he is incapable of actually leading the future engineers and software workers of the world.

If I saw him in a social setting I would try to ask him how he lives with himself, actively enabling the potential future dystopia and being incapable of defending the actual impetus of his actions, much less the reaction to it. Truly unforgivable, especially in an age of increased social fragmentation.

rjzzleep 2021-08-19 06:32:55 +0000 UTC [ - ]

> If I saw him in a social setting I would try to ask him how he lives with himself, actively enabling the potential future dystopia and being incapable of defending the actual impetus of his actions, much less the reaction to it. Truly unforgivable, especially in an age of increased social fragmentation.

He seems like a guy who genuinely believes that what he's doing is a net positive for society and that society should just leave things in his hands.

There were plenty of people like that at Google, last I remember. I used to go to Google's dev days, and there was no greater contrast than between the really talented technical people presenting interesting things and the engineering leadership who believed everyone should just have a dumb terminal in their hand, that Google should do all the processing, and that this would be so much better for society.

zepto 2021-08-19 04:53:03 +0000 UTC [ - ]

If you read the letter, it’s clear that whoever wrote it has little understanding of what the system does.

There are good reasons not to want this. Just not making everyone into a suspect would be a start.

But whoever wrote the letter is clueless.

elisbce 2021-08-19 05:01:43 +0000 UTC [ - ]

This part is legit though: governments around the world could pressure Apple to add other forms of surveillance - be it adding hashes of non-CSAM material, pressuring NCMEC or other hash providers, or extending the system's capabilities to all messages or photos on the device.

In my opinion, this is the biggest concern, not the technology. Before, Apple could simply refuse by saying they didn't have the capability. Now that excuse is gone. Apple's promise to have humans review content and only report CSAM is the weakest link.

testfoobar 2021-08-19 06:02:56 +0000 UTC [ - ]

Apple doesn't even have to know how their scanner will be used. It could simply be a requirement for selling phones in Country X that Apple scan devices for hashes from DB-20210819-Illegal-Hashes-Country-X.zip. The hash file will be provided by Country X, as will the messaging contacts for the appropriate state security services.

zepto 2021-08-19 17:08:52 +0000 UTC [ - ]

That’s pure fantasy. The system involves Apple reviewing any detected images.

zimpenfish 2021-08-19 08:34:33 +0000 UTC [ - ]

> governments around the world could pressure Apple to add other forms of surveillance - be it adding hashes of non-CSAM material, pressuring NCMEC or other hash providers

could, yes, but given PhotoDNA has been using similar hashes from NCMEC for a decade and doesn't appear to have had similar surveillance pressure imposed, what makes the Apple scanning different to warrant slippery slope arguments like this?

cwizou 2021-08-19 11:38:54 +0000 UTC [ - ]

PhotoDNA is not exclusive to NCMEC, and has been used for about 5 years by FB, MS, and YouTube on terrorist content: https://about.fb.com/news/2016/12/partnering-to-help-curb-sp...

zepto 2021-08-19 05:33:27 +0000 UTC [ - ]

> governments around the world could pressure Apple to add other forms of surveillance.

What do they have now that they didn’t have before?

karlshea 2021-08-19 05:46:41 +0000 UTC [ - ]

If you read their Security Threat Model Review [1], they're only using the "intersection of hashes provided by at least two child safety organizations operating in separate sovereign jurisdictions".

So you'd have to pressure NCMEC and another org under a different government to both add the non-CSAM hash, plus Apple would need to be pressured into verifying a non-CSAM derivative image, plus you'd need enough other on-device hash matches to exceed the threshold before they could even do the review in the first place (they can't even tell whether there was a match unless the threshold is exceeded).
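
The intersection rule itself is trivial to state in code (hypothetical hash lists, but it shows why one pressured org isn't enough):

    # Hypothetical hash lists from two child-safety orgs in different jurisdictions.
    ncmec_hashes = {"aaa111", "bbb222", "ccc333"}
    second_org_hashes = {"bbb222", "ccc333", "ddd444"}

    # Only hashes BOTH orgs independently submitted make it into the shipped
    # database, so a single government leaning on one org achieves nothing.
    shipped_db = ncmec_hashes & second_org_hashes
    assert shipped_db == {"bbb222", "ccc333"}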

I get why people are concerned, but between this thread and the other thread yesterday it's clear that pretty much everyone discussing this has no idea how it works.

1: https://www.apple.com/child-safety/pdf/Security_Threat_Model...

jjulius 2021-08-19 05:57:27 +0000 UTC [ - ]

I think you're missing what concerns people: what guarantee do we have that Apple will, always and forever, only use the "intersection of hashes provided by at least two child safety organizations operating in separate sovereign jurisdictions"?

tylerchr 2021-08-19 06:42:43 +0000 UTC [ - ]

We have the same guarantee about that today as we did a month ago before this was announced. Apple writes the software that runs their devices, so they’ll always have the option to ship something like that.

Apple could have quietly implemented CSAM scanning server-side, and left the door open to it being quietly exploited in who knows what way. But they didn’t: instead they put a whole bunch of infrastructure in place that all but guarantees that they’ll be immediately caught (and publicly excoriated) if they try to use this CSAM mechanism for anything other than CSAM. (See the PDF that GP linked for technical details on why.)

Of course, they could still do it with some other mechanism. But in that case none of these CSAM changes are at all relevant to the concern, as that risk is unchanged from a month ago.

zimpenfish 2021-08-19 08:35:21 +0000 UTC [ - ]

> Apple could have quietly implemented CSAM scanning server-side

Aren't iCloud Photos already scanned for CSAM though?

zepto 2021-08-19 17:09:17 +0000 UTC [ - ]

No

karlshea 2021-08-19 06:23:11 +0000 UTC [ - ]

It’s almost like you didn’t read the paper or my whole comment.

zimpenfish 2021-08-19 08:37:07 +0000 UTC [ - ]

> So you'd have to pressure NCMEC and another org under a different government

To be fair, if the other organisation is IWF[1] under the UK government, I don't think there'd be much pressure needed to get them to comply - just offer to bung them and their mates a few million in contracts and you'd be golden.

It's a sensible plan, it just might not be as strong as it seems.

[1] https://www.iwf.org.uk

hypothesis 2021-08-19 04:50:03 +0000 UTC [ - ]

> "We know what it does and we don't want it"

And that interview is likely supposed to telegraph that they “don’t care”. It wasn’t a live one, was it? They could have improved parts of it if they wanted to.

beckman466 2021-08-19 05:06:54 +0000 UTC [ - ]

> Craig's interview in and of itself was the biggest indictment of technology companies one could ever hope for. Dude literally stumbles as he says "multi-step algorithm"

Anybody have a link?

Clubber 2021-08-19 15:35:55 +0000 UTC [ - ]

The "screeching minority" memo gives some insight. Let me break it down line by line. This is just my personal opinion as an external observer; I have no actual insight into what was going on.

Team Apple,

This is from someone outside the Apple ecosystem that management brought in and gave a lot of credence to.

I wanted to share a note of encouragement to say that everyone at NCMEC is SO PROUD of each of you and the incredible decisions you have made in the name of prioritizing child protection.

The recipients of this memo didn't make any of the decisions; those were foisted on them by management, who in turn were heavily influenced by this person/organization. As to why management was so heavily influenced, I have no idea. This is an attempt to get buy-in from Apple employees, I suspect, or at least to ease their concerns: "Don't worry, you are doing the right thing."

It’s been invigorating for our entire team to see (and play a small role in) what you unveiled today.

The team they are referring to is the one that pushed for the child-protection implementation. The recipients did all the work because management told them to.

I know it’s been a long day and that many of you probably haven’t slept in 24 hours. We know that the days to come will be filled with the screeching voices of the minority.

This is the big quote. The author knew what this would do, and knows the tarnishing the Apple brand is about to go through, but is trying to convince the audience that only a small group of people will reject this and it will blow over. For the author, the ends justify any means, because they are probably personally emotionally traumatized by child porn (rightfully so). Once that happens, though, any rationality about consequences of action goes out the window. This is an attempt to get everyone concerned onto the same ends-justify-the-means level as the author. I don't know if the author believes this is just a temporary outrage or not. I suspect not. It doesn't really matter, though; the author got what they wanted, Apple's brand be damned.

Our voices will be louder.

This obviously isn't the case. I'm not sure if the author means their organization or their organization plus Apple. Either way, the counter-resistance to this has been pretty minimal. They left Apple hung out to dry, it seems.

Our commitment to lift up kids who have lived through the most unimaginable abuse and victimizations will be stronger.

During these long days and sleepless nights, I hope you take solace in knowing that because of you many thousands of sexually exploited victimized children will be rescued, and will get a chance at healing and the childhood they deserve.

This is the emotional sell, or "for the children." It's also a difficult bridge to reason across: treating the possession of pictures and the manufacture of pictures as the same thing. Obviously the hash compare only finds existing, already-manufactured pictures, so I don't see how it will protect children from any abuse that hasn't already occurred. Not only is this an emotional sell, it's a lie.

Thank you for finding a path forward for child protection while preserving privacy.

The final emotional pitch, directed at the people at Apple typically concerned with privacy. They aren't really protecting children, and they also aren't preserving privacy. It's a boondoggle.

In closing, "for the children" is almost always a trap. It was used when I was a kid to mass-incarcerate Americans who had drug problems, many of them the very children being protected. It's typically done in an any-means-necessary, heavy-handed manner, as seen here. I don't condone child pornography in any way. I have never seen child pornography and hope never to see it. At the same time, what they are doing is the road to hell paved with good intentions, seemingly by people who are so traumatized (based on their work) that they can't see the collateral damage they are doing.

istingray 2021-08-19 04:49:30 +0000 UTC [ - ]

I'm waiting for betting odds of Apple rolling this back to pop up on Polymarket: https://polymarket.com/

savant_penguin 2021-08-19 04:59:58 +0000 UTC [ - ]

The real question is: how can they make sure people are not committing other crimes? They'd better scan all messages at all times. Are they meeting more people face to face than the government permits? The only way to be sure is to log all GPS data and compare.

They'd better monitor every aspect of their customers' lives to protect the children. Think of the children.

RpFLCL 2021-08-19 05:47:14 +0000 UTC [ - ]

In fact, couldn't we go even further, taking inspiration from Apple?

Plumbing installed in people's homes should scan for illegally consumed narcotics or prescription medications and check them against pharmacy records. Your home's electrical wiring should make sure it's not being used to illegally grow drugs. Cellphone cameras should make sure they never record a naked body without a signed consent waiver. Guitar amplifiers should verify that you're not playing an owned piece of music without a license.

How could we ever survive in a world where we relied upon targeted investigations and probable cause? How could we ever live in a world where pipes just moved things from A to B without inspecting them?

/s

maze-le 2021-08-19 07:42:58 +0000 UTC [ - ]

Don't give them any ideas, it's dystopian enough already...

abraae 2021-08-19 05:08:01 +0000 UTC [ - ]

According to that line of thought, we wouldn't have speed cameras because they can't be used to detect e.g. drunk driving.

It's fair IMO to use fit-for-purpose technology to target specific crimes.

Though I agree this is bad technology, not fit for purpose, and they should yank it.

pjmlp 2021-08-19 05:51:53 +0000 UTC [ - ]

When in the UK, I always feel watched, thanks to the CCTV warnings everywhere.

It is also the only country, of all those I have been to in almost 50 years, where great care is taken never to park illegally.

Sure, you will see it away from city centers, where no cameras are around.

intricatedetail 2021-08-19 12:30:37 +0000 UTC [ - ]

Funny that: when something happens to you and you request footage, the usual answer is that the camera was broken or it wasn't recording.

pjmlp 2021-08-19 12:32:21 +0000 UTC [ - ]

A friend of mine had footage of her motorbike being stolen; it didn't help much, and she never managed to get it back.

short_sells_poo 2021-08-19 08:28:40 +0000 UTC [ - ]

> It is also the only country, of all those I have been to in almost 50 years, where great care is taken never to park illegally.

This is only true for the masses. Rich people just leave their cars parked anywhere they want in Mayfair in London because they don't care about a 50 quid fine.

zimpenfish 2021-08-19 08:41:22 +0000 UTC [ - ]

Famously, the US embassy owes about $13M in congestion charges (as of 2017!) that they just don't bother to pay.

Although, to their credit, apparently[1] they do pay their parking fines.

[1] https://www.independent.co.uk/news/uk/home-news/us-diplomats...

alfiedotwtf 2021-08-19 05:37:00 +0000 UTC [ - ]

No. This is more like Apple enabling telemetry on your iPhone so that, if it looks like you're traveling on a road faster than its designated speed, you'll be auto-fined.

EGreg 2021-08-19 05:08:34 +0000 UTC [ - ]

Well, actuuuuaaallly…

Everything you said is already being collected about you. Cellphone towers can log where you are. Cameras around your city can recognize you. All they have to do is link all the feeds into one database that can be searched. The only place you can escape the watchful eye is your end-to-end encrypted stuff… which ain’t your SMS. And even there, you are just taking their word for it, since it’s a closed-source product.

nbzso 2021-08-19 08:39:13 +0000 UTC [ - ]

I am glad that Apple has shown the true colors of trillion-dollar-corporation thinking. I see an opportunity for people to understand the biggest problem with a future in which humans are only a datapoint in a framework designed to benefit the few at the expense of the many.

Can you imagine this "narrative" applied everywhere? From the moment you open your eyes, everything around you is extracting data, monitoring you, and policing you.

We have to redefine our relationship with software and hardware (electronics). The infrastructure (5G) and the IoT advancements are here. Data is the fuel of the coming "consumer revolution". Apple has just positioned itself perfectly as a data broker for high-level customers - special groups, oppressive governments, you name it.

They will not back down from this. It is more profitable than selling shiny toys and "services". It gives political elites a reason not to break up Apple's monopolistic business practices.

I will go even further: this gives Apple a favorable position to become a trusted partner of the governments of the world.

Deconstructing this from a technical point of view alone is not enough. Apple is making an entrance onto the political stage by embracing the role of World Digital Police.

Everyone is free to draw their own conclusions and analysis. Mine are proportionally adequate. I have removed macOS from my production workflow. I will never buy another Apple product. And I am willing to throw away "conveniences" and polished UX to preserve my privacy.

This is privacy, not Apple's corporate redefinition:

The quality or condition of being secluded from the presence or view of others. The state of being free from public attention or unsanctioned intrusion. A state of being private, or in retirement from the company or from the knowledge or observation of others; seclusion.

https://www.youtube.com/watch?v=z15JLtAuwVI

pulse7 2021-08-19 11:19:13 +0000 UTC [ - ]

Why don't Apple customers start a class action for misleading ads? From Apple's ads: "Privacy. That's iPhone."...

brandon272 2021-08-19 16:55:09 +0000 UTC [ - ]

"Privacy is a fundamental human right."

https://www.apple.com/privacy/

testfoobar 2021-08-19 05:02:08 +0000 UTC [ - ]

1) Some dev team at Apple is writing the on-device scanning code.

2) Some dev team at Apple is writing a "sendMessagePolice()" API.

3) Some QA team at Apple is testing both systems.

I would personally quit if asked to work on any of these teams.

If you were the QA team lead for this, how would you test it? I can't imagine anyone would do day-to-day testing with actual illegal images.

It would not surprise me if the dev and QA teams at Apple already have hash-colliding images to run their tests.

This is all awful. Steve Jobs must be rolling over in his grave.

unityByFreedom 2021-08-19 07:34:58 +0000 UTC [ - ]

> Steve Jobs must be rolling over in his grave.

June, 2010 - Jobs: "Privacy means people know what they're signing up for."

https://youtu.be/39iKLwlUqBo?t=154

December, 2015 - Cook: "If you put a back door in, that back door is for everybody, for good guys and bad guys."

https://youtu.be/rQebmygKq7A?t=66

August, 2021

Interviewer: "Isn't this in a way a back door?"

Federighi: "I really don't understand that characterization."

https://youtu.be/OQUO1DSwYN0?t=426

arvinsim 2021-08-19 10:28:41 +0000 UTC [ - ]

Apple really is talking down to people on that last one.

sumedh 2021-08-19 05:06:17 +0000 UTC [ - ]

> I can't imagine anyone would do day-to-day testing with actual illegal images.

You select a random image as your test image, mark that image as CP in the backend, and then try to upload it from your test phone.

testfoobar 2021-08-19 05:09:40 +0000 UTC [ - ]

Understood that that is how you test in the early stages. But at some point you have to run tests against the actual deployed perceptual hash database.

Ambroos 2021-08-19 05:34:09 +0000 UTC [ - ]

No reason why you can't just generate random images and add their hashes to the real database, indistinguishable from actual entries. As long as the test images aren't public it's low risk, and even if they leak, they could be removed from the database and new test images generated.
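
Something like this, presumably (a sketch with made-up names; the point is that a synthetic entry is just another hash, indistinguishable from the real ones):

    import hashlib, secrets

    def add_test_entries(hash_db: set, count: int) -> list:
        # Generate random "images", add their hashes to the production database,
        # and hand the images back to QA to upload from test devices. By hash
        # alone, nobody can tell a synthetic entry from a real one.
        test_images = [secrets.token_bytes(1024) for _ in range(count)]
        for img in test_images:
            hash_db.add(hashlib.sha256(img).hexdigest())
        return test_images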

johnny53169 2021-08-19 06:18:33 +0000 UTC [ - ]

Wouldn't that mean people had created a system that works on normal pictures? So your code would literally flag an innocent photo?

testfoobar 2021-08-19 05:40:39 +0000 UTC [ - ]

That makes sense. I would still quit the team.

sampling 2021-08-19 07:44:34 +0000 UTC [ - ]

Apple's "CSAM detection" feature is a two-part on-device/server as I understand it:

1: Run on-device code to perform perceptual hash comparison of each photo against an on-device encrypted database of known CSAM hashes.

2: On iCloud Photos servers, send out the relevant notifications when a user’s iCloud Photos account exceeds a threshold of positive matches.

So, as a high-level testing strategy, I would want to:

- Verify on-device lookup of CSAM hashes. This could be tested by provisioning a test device with an on-device database containing hashes of images that aren't illegal. As a bystander, I'd be fairly confident in this approach, because I'm guessing the on-device database that Apple ships could conceivably be changed over time to expand the definition of the images it will flag as CSAM.

- Do some exploratory testing to discover the threshold of how much image manipulation can be done on a flagged image before the perceptual hash comparison fails to return a match.

- Verify that the notification system notifies the correct parties once a user account exceeds the defined threshold of positive CSAM matches.

- Ensure the flagged account can still be investigated if the user deletes the offending material from iCloud, or deletes their account, by the time a real person gets around to investigating.

- Ensure that the logging is informative and adequate (contains device name, timestamp, etc.).

- Test behaviour with the same iCloud account logged in to multiple devices.

- Figure out any additional business logic - are positive matches a permanent count on the account, or are they reset after a certain amount of time? (See the sketch below.)

source: https://www.apple.com/child-safety/pdf/Security_Threat_Model...
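
A sketch of the threshold bookkeeping several of these cases would exercise (hypothetical names; the threshold of 30 is the initial value Apple's threat model review mentions):

    from collections import defaultdict

    MATCH_THRESHOLD = 30  # initial value per Apple's threat model review

    class MatchLedger:
        def __init__(self):
            self.counts = defaultdict(int)  # account id -> positive match count

        def record_match(self, account_id: str) -> bool:
            # Returns True when the account crosses the threshold and human
            # review should be triggered.
            self.counts[account_id] += 1
            return self.counts[account_id] >= MATCH_THRESHOLD

    # A test would assert nothing fires at 29 matches, something fires at 30,
    # and pin down whether counts ever reset (the business-logic question above).
    ledger = MatchLedger()
    results = [ledger.record_match("account-123") for _ in range(30)]
    assert not any(results[:29]) and results[29]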

ByteWelder 2021-08-19 08:50:04 +0000 UTC [ - ]

You wouldn't test this with child abuse images. You would likely test it with a set of legal images, for which you generated your own hashes.

nebula8804 2021-08-19 06:47:52 +0000 UTC [ - ]

>This is all awful. Steve Jobs must be rolling over in his grave.

They didn't even last 10 years since his death. :/

testfoobar 2021-08-19 05:24:29 +0000 UTC [ - ]

Homosexuality is a crime in some countries. Would Apple allow country-level filters that call the police in some countries but not others, based on local criminal statutes? What about cannabis use?

pulse7 2021-08-19 05:56:31 +0000 UTC [ - ]

Apple follows "local laws" because it wants to do business everywhere in the world... Example: Apple in China.

boublepop 2021-08-19 15:21:11 +0000 UTC [ - ]

What is the alternative? Not following local law? Apple is just a company; they cannot and do not make their own laws. This should be obvious and not a point of criticism. If you don’t like the laws as they are written in China, then you shouldn’t be lobbying Apple to ignore or break them; you should be lobbying politicians to change them.

dkdbejwi383 2021-08-19 06:19:52 +0000 UTC [ - ]

This system could potentially look for images that offend the CCP, such as the Hong Kong flag, Uighur iconography, pictures of Winnie the Pooh, etc.

zimpenfish 2021-08-19 08:43:07 +0000 UTC [ - ]

> This system could potentially

As could PhotoDNA, though, which everyone has been using for a decade to scan cloud photos - but it doesn't seem to have happened.

intricatedetail 2021-08-19 12:34:14 +0000 UTC [ - ]

There is a difference though. Apple is going to scan your device. It's like having a search without a warrant.

kemayo 2021-08-19 14:27:53 +0000 UTC [ - ]

I want to ignore most of this article to complain about an inaccuracy that keeps coming up.

> More broadly, they said the change will break end-to-end encryption for iMessage, which Apple has staunchly defended in other contexts.

But... it wouldn't. The iMessage feature doesn't expose the contents of your message to anyone else under any circumstance.

If you're a child under 13 and your parents have opted in to this feature, you get a choice between seeing nude pictures sent to you (with your parents notified that you chose to) and not seeing them (with no notification of anything). (Once you're 13+, no notifications occur either way.)
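
The rule reduces to something like this (a sketch, with the age cutoff and parental opt-in as described above; everything else hypothetical):

    def parent_notified(age: int, parents_opted_in: bool, chose_to_view: bool) -> bool:
        # Notification fires only for under-13 accounts whose parents opted in,
        # and only if the child chose to view the flagged image anyway.
        return age < 13 and parents_opted_in and chose_to_view

    assert parent_notified(12, True, True)          # viewed: parents notified
    assert not parent_notified(12, True, False)     # declined: no notification
    assert not parent_notified(15, True, True)      # 13+: never notified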

There are potential issues with this, mostly relating to abusive families being controlling. To actually take advantage of it like that, though, they'd have to do weird things like forcing their teenaged children to keep making new under-13 accounts. And none of these issues impact the e2e status of iMessage in any way.

Apple really screwed up PR by launching the iMessage feature alongside the scanning-and-reporting iCloud Photos feature. There's so much confusion out there about this.

(The breaking-e2e aspect does exist with the iCloud Photos scanning... not that it's currently e2e, of course.)

EricE 2021-08-19 14:44:35 +0000 UTC [ - ]

You can only inspect something if it's decrypted.

Content is no longer private if there is anything inspecting it other than you.

I expect cloud providers to fiddle with my data - because they can and do. I don't expect vendors, especially one touting themselves as being more privacy focused than everyone else, to be sifting through my data on device.

Yes, today it's a carefully designed algorithm meant to be one-way. The algorithm isn't the problem. The problem is that the method used to deliver this algorithm can also be used to deliver *other* algorithms. And no, especially since 9/11, I have zero faith that this is as far as this mechanism will ever be taken.

kemayo 2021-08-19 14:56:48 +0000 UTC [ - ]

Okay, but your device literally has to decrypt the message in order to show it to you. It then does other things that are vaguely "content analysis" like generating link previews and marking up the text to make dates clickable.

The iMessage feature is "it looks like someone has sent you a picture of their dick, are you sure you want to see that?" I know adults who'd like to turn that feature on.

Could it hypothetically be extended in the future? Sure. But arguing against this feature as-is is about the same as arguing against the feature that lets you search for "dog" in your photos.

TechBro8615 2021-08-19 14:41:16 +0000 UTC [ - ]

So one part of the proposal breaks e2e encryption for Photos.app, and the other breaks it for Messages.

kemayo 2021-08-19 14:46:02 +0000 UTC [ - ]

How does this break e2e for Messages? They're still only ever decrypted on your device, and their content is never shared with others. You have to radically redefine what "e2e encryption" means for this to qualify as breaking it.

There's no e2e encryption for Photos currently, so it can't be breaking it there. If you use iCloud Photos, your photos are stored encrypted, but Apple has the key and could be doing all the cloud-scanning it felt like.

TechBro8615 2021-08-19 15:14:20 +0000 UTC [ - ]

Not sure it needs to exist already to be broken

kemayo 2021-08-19 15:26:03 +0000 UTC [ - ]

A hypothetical future where Apple doesn't have the encryption keys for iCloud Photo Library, but this on-device scanning for uploads to iCloud Photo Library is happening -- that sounds more secure than the status quo, in which totally arbitrary searches of your photos could be done on the server in response to warrants/NSLs.

I'm sure in such a future there'd be quite the argument happening about whether you could call it e2e.

TechBro8615 2021-08-19 16:29:32 +0000 UTC [ - ]

What also sounds more secure than the status quo is not scanning my private photos anywhere – not on my device, and not in your cloud where I grant you custody of them.

kemayo 2021-08-19 16:52:43 +0000 UTC [ - ]

Although this is true, none of the major cloud services seem inclined to actually do that. Google, Microsoft, Dropbox, Facebook, they're all scanning photos[0].

Let's not pretend that this sort of thing is happening in isolation. There really is a big legal movement towards requiring encryption backdoors, and programs like Apple's here are effectively part of a big negotiation with law enforcement -- note that the reason Apple's currently not e2e is apparently that the FBI pushed them on it[1]. "We'll give you what you claim is important, CSAM scans, and in return we'll lock everything else up securely". Is this great? No. Would an absolutist "law enforcement can have nothing" position backfire into stupid legal decisions? Plausibly.

[0]: https://en.wikipedia.org/wiki/PhotoDNA [1]: https://www.popularmechanics.com/technology/security/a306318...

mtbnut 2021-08-19 16:05:03 +0000 UTC [ - ]

"It's a backdoor. We know we said backdoors are bad and that a backdoor can be used by both good guys and bad guys, but this backdoor is made with Apple-certified lumber."

JumpCrisscross 2021-08-19 04:49:25 +0000 UTC [ - ]

Do we know who within Apple is pushing for this?

jamil7 2021-08-19 07:19:57 +0000 UTC [ - ]

I don’t think anyone within Apple is pushing for this. I assume the pressure is coming from somewhere external, or they’re anticipating pressure, because the feature doesn’t really make a lot of sense right now. We’ll probably get the actual truth years from now in some leak.

Salgat 2021-08-19 05:09:58 +0000 UTC [ - ]

It's so bizarre how hard they're pushing this, even though they don't really gain much from it. This is especially weird because Apple has always pushed for user privacy. I just don't get it.

cobookman 2021-08-19 05:23:28 +0000 UTC [ - ]

As a cloud photo hosting provider (iCloud), they might be legally required to (IANAL).

Google [1], Dropbox [2], Microsoft [3], and many many other Photo Cloud storage providers have implemented similar child porn detection.

[1] https://www.columbiatribune.com/story/news/crime/2021/03/22/...

[2] https://gizmodo.com/dropbox-refuses-to-explain-its-mysteriou...

[3] https://www.local10.com/news/local/2020/02/21/microsoft-aler...

ipv6ipv4 2021-08-19 05:40:13 +0000 UTC [ - ]

Nope.

This is the text of the current law [1]. Alerting authorities about CP is required if discovered. Actively searching for CP is not required. Look for the aptly named "protection of privacy" paragraph.

[1] https://www.law.cornell.edu/uscode/text/18/2258A

JumpCrisscross 2021-08-19 05:32:47 +0000 UTC [ - ]

> As a cloud photo hosting provider (iCloud) they might legally be required to (IANAL)

This doesn't explain the insistence on scanning all images on device, whether they're hosted on iCloud or not.

The moment for leadership from Cook has come and almost passed. It's surprising watching Apple squander its brand at this moment. A good amount of its previous support with respect to federal antitrust has permanently dissipated.

paldepind2 2021-08-19 05:44:45 +0000 UTC [ - ]

They only scan images for CSAM before they’re uploaded to iCloud. If you don’t use iCloud then that check will never happen.

minton 2021-08-19 11:36:40 +0000 UTC [ - ]

That is true today and they pinky promise not to enable it down the road.

tpush 2021-08-19 05:39:55 +0000 UTC [ - ]

[...] they might legally be required to (IANAL).

They're only obligated to report if found, but they don't have to proactively scan (yet?). In the EU until recently they weren't even allowed to voluntarily scan.

istingray 2021-08-19 04:50:13 +0000 UTC [ - ]

In the crazy universe where Apple rolls this back, what does it look like? Cook resigns? Craig? Team is sacked?

PopePompus 2021-08-19 05:00:17 +0000 UTC [ - ]

Nobody needs to resign. Nobody needs to be sacked. They just need to acknowledge the mistake and not push the spyware update.

zionic 2021-08-19 14:56:39 +0000 UTC [ - ]

Hard disagree. This entire process reveals systemic rot at Apple.

The engineers who built it, the project managers who coordinated it, and the executives who signed off on this smoothbrain idea need to be drummed out without ceremony.

No severance, no golden parachutes, no cushy board jobs etc.

People need to understand they risk their livelihoods building and deploying a surveillance system like this. In a just world they would face criminal charges.

cmelbye 2021-08-19 05:37:46 +0000 UTC [ - ]

Trust takes years to build and seconds to break. I could be grossly overestimating the impact this has had on their brand. But in my eyes, I can't see a way for them to undo the damage, even if they revert the decision.

minton 2021-08-19 11:30:51 +0000 UTC [ - ]

Even as someone in tech, I am having a terrible time finding a viable replacement for my iPhone and MacBooks. People won’t inconvenience themselves too much. If there is no easy alternative, most people will choose not to think about it and carry on as before.

Lamad123 2021-08-19 09:24:01 +0000 UTC [ - ]

This device that I've cherished so long scares the hell out of me now!!!

EGreg 2021-08-19 05:12:35 +0000 UTC [ - ]

Yeah, I still remember Scott Forstall being pushed out over the Apple Maps failures. And that is how Jony Ive came to take over both hardware and software design, leading to the end of the skeuomorphic design that Jobs loved so much - and that made the iPhone so unique.

Now Apple’s just a follower…

nebula8804 2021-08-19 06:50:02 +0000 UTC [ - ]

I heard he was pushed out not due to the failure itself, but because he refused to accept and own up to the mistakes he made. It pissed off Tim Cook, who was already mad that he apparently couldn't work well with the other team members.

Take all this with a grain of salt. I wonder if Jobs was just good at managing these brash personalities together.

stackbutterflow 2021-08-19 05:43:07 +0000 UTC [ - ]

>leading to the end of the skeuomorphic design that Jobs loved so much

Did Jobs love the end of skeuomorphic design, or did Jobs love skeuomorphic design?

echelon 2021-08-19 05:40:19 +0000 UTC [ - ]

1. Apple faces looming antitrust battles.

2. FBI/CIA, frustrated at previous failures, pressure senators/congresspeople to turn up the heat.

3. FBI/CIA then gives Apple the option to implement surveillance with the promise that they can make the antitrust troubles go away if Apple does what they want.

4. Apple does the thing. (We're here now.)

5. Antitrust case goes away.

The FSB, CCP, and other intelligence orgs might be trying the same strategy in their countries.

I expect this is how everything works in the big leagues.

istingray 2021-08-19 04:44:32 +0000 UTC [ - ]

Here's the link to the open letter from the policy group (PDF): https://cdt.org/wp-content/uploads/2021/08/CDT-Coalition-ltr...

DanWritesCode 2021-08-19 07:32:57 +0000 UTC [ - ]

One of the things I really didn't expect to be Googling in 2021 is 'Best dumb phone 2021'.

After advocating to all my friends that Apple is a beacon of how privacy should be done, I just can't understand how they've made such a hash of this.

fsflover 2021-08-19 09:56:01 +0000 UTC [ - ]

Why a dumb phone, when you can have a verifiable FLOSS GNU/Linux phone in 2021 - the Librem 5 or the Pinephone?

LinuxBender 2021-08-19 13:10:16 +0000 UTC [ - ]

Have any wireless providers verified that firmware/OS/app updates can't be pushed via OTA to those devices? Asking because, to my knowledge, no such verification has been attempted. Background: I used to push firmware updates to phones in the '90s on a GSM network. The only reason we didn't do this globally was the risk of customer support issues.

fsflover 2021-08-19 13:13:00 +0000 UTC [ - ]

Both phones have their modems separated from the OS with USB/M.2 interfaces. Also, they have kill switches. Pinephone's modem even runs Linux: https://linuxsmartphones.com/hackers-develop-open-source-fir....

LinuxBender 2021-08-19 14:22:27 +0000 UTC [ - ]

I understand what you are saying, but in my opinion that doesn't really answer the question. For a phone to attach to a network, there is some level of bidirectional trust, and those kill switches aren't going to just arbitrarily activate. Has anyone at a wireless provider tested pushing OTA updates to those phones? Has the vendor explicitly stated that the phones block any OTA update attempts by default, or that the phones simply do not have the capability?

fsflover 2021-08-19 14:33:18 +0000 UTC [ - ]

I don't think any vendor has confirmed that. But is it possible to update software on my device without its OS's permission, via a well-defined external interface? These phones are desktop computers running GNU/Linux. If they can be updated this way, then my laptop could just as well be updated by a WiFi card, couldn't it? (I don't think so.)

Upd: I am not an expert, but maybe the schematics could help you here: https://source.puri.sm/Librem5/l5-schematic and https://wiki.pine64.org/index.php/PinePhone#PinePhone_board_....

LinuxBender 2021-08-19 14:48:21 +0000 UTC [ - ]

I suppose it depends on what standards each of these Linux phone implementations has followed. Android also runs Linux, but they modified it to give the wireless providers and Google full control, hence people wanting to root their phones. Just running Linux is not enough to give me confidence that the phone is truly isolated from network management. So it comes down to what kernel modules, libraries, and trust these phones give to the provider, and what standards they follow. A good start would be independent kernel hackers digging into the hardware and OS: seeing what they can make the phone do, what the phone does under normal operation when attaching to the network, how it responds to OTA update attempts, and what the firmware is doing. If I control the firmware and you control the OS, I control what your OS can read and write. So my unanswered questions are:

- Who writes and maintains the firmware the OS is running on.

- Who writes and maintains the modem firmware and who can update it.

- Who can update that firmware for the board the OS is running on. This could be a different answer than who initially creates it for the retail distribution of the phone.

- What level of trust has been inserted into the OS by kernel modules and who maintains those kernel modules.

- What control is given to the end user to see what those modules are doing and limit what they can do.

I suspect more questions will arise as kernel hackers audit the phone. The dilemma I see is that such kernel hackers won't be interested until these phones are wildly popular. The only other way to get their interest is with money.

fsflover 2021-08-19 14:57:02 +0000 UTC [ - ]

> Who writes and maintains the firmware the OS is running on

Both the Librem 5 and the Pinephone run FLOSS operating systems, with software maintained by Purism and the community. The latter has two firmware blobs in the kernel, for WiFi/Bluetooth, AFAIK: https://www.pine64.org/2020/01/24/setting-the-record-straigh...

> Who writes and maintains the modem firmware and who can update it.

See my link above concerning the Pinephone modem. Its software can be updated by the OS. AFAIK it's the same for the Librem 5.

> Who can update that firmware for the board the OS is running on.

See the first answer. The Librem 5 is going to get the FSF's "Respects Your Freedom" certification. Proprietary software only runs on the modem and the WiFi card. See also: https://puri.sm/posts/librem5-solving-the-first-fsf-ryf-hurd...

> What level of trust has been inserted into the OS by kernel modules and who maintains those kernel modules.

I am not knowledgeable enough to answer that. Maybe the schematics linked above could help you.

> What control is given to the end user to see what those modules are doing and limit what they can do.

The user has full control. This is a selling point of Purism, the company: https://source.puri.sm/Librem5/community-wiki/-/wikis/Freque...

norov 2021-08-19 13:04:52 +0000 UTC [ - ]

Starting to suspect that a trillion dollar corporation is more interested in profits than fighting for the little guy?

cassalian 2021-08-19 11:46:46 +0000 UTC [ - ]

There have been a lot of questions about the motivation for this feature. Am I the only one who thinks this could be related to something like the EARN IT Act?

Here's an article from a year ago on EARN IT:

> Theoretically, a system that uses client-side scanning could still send messages encrypted end to end, and so the Leahy amendment would not offer any protection, but many of the same confidentiality concerns with backdoored “e2ee” systems would continue to apply.

Source: https://cdt.org/insights/the-new-earn-it-act-still-threatens...

EricE 2021-08-19 14:47:40 +0000 UTC [ - ]

Sensitizing people to the idea that a benevolent dictator is going through content on their local device for their benefit.

It's all about conditioning. First it's to save the children. Then it will be about stopping the terrorists. Finally it will be about preventing "hate speech" - which these days is pretty much speech from anyone you disagree with :p

Like the novel 1984, Minority Report wasn't an instruction manual!

jitl 2021-08-19 04:53:24 +0000 UTC [ - ]

What if this is four-dimensional chess by Apple: maybe they deliberately announced this back door just to prove how politically toxic the backdoor topic is with the public? Maybe Congress will think twice about trying to force this on anyone via legislation.

(This comment is intended as a joke.)

vegetablepotpie 2021-08-19 05:06:12 +0000 UTC [ - ]

How could you say that? You have to apply Occam’s razor here, which would say that Apple comically misstepped and nothing more.

EGreg 2021-08-19 05:10:09 +0000 UTC [ - ]

Why would you assume companies are unable to think two steps ahead?

vegetablepotpie 2021-08-19 05:14:00 +0000 UTC [ - ]

Because companies are led by people, and people make mistakes.

EGreg 2021-08-19 05:16:14 +0000 UTC [ - ]

Making mistakes is one thing.

Thinking two steps ahead can be done even when you make mistakes.

istingray 2021-08-19 06:05:35 +0000 UTC [ - ]

Heheh, possible! After pages and pages of "I'm confused why people are outraged about this" trolls... I could use a joke like this :)

sharperguy 2021-08-19 05:48:41 +0000 UTC [ - ]

I find this discussion interesting, because there is no way to prove that Apple doesn't already do this, given that their operating system's source code is completely opaque and the lengths they go to to prevent you from viewing or modifying it.

pulse7 2021-08-19 05:58:22 +0000 UTC [ - ]

Maybe all this is just a PR message to everybody: "Your data is not safe! Don't store anything private on your phone!"

pulse7 2021-08-19 05:50:14 +0000 UTC [ - ]

Imagine this: Apple's software is scanning for abuse images and finds a ton of false positives, which are then manually inspected. Those false positives are... >>private photos of naked adult men and women<<... exactly what you want to hide from >>everybody<< on the planet... but no: they will be manually inspected... Good luck, Apple fans! :-)

arbirk 2021-08-19 06:31:58 +0000 UTC [ - ]

False positives will be very rare. We probably won't hear of any incidents except for activists trying to spoof the system.

CSAM is the record of one of the most extreme crimes against a person, and the number of people sharing and profiting from it is growing. This is something every tech company will have to deal with.

EricE 2021-08-19 14:49:18 +0000 UTC [ - ]

>False positives will be very rare.

Really? Based on what? Multiple stories of hash collisions (false positives) are *already* popping up.

If only all the people who think this isn't a big deal were the first to get hit by false positives. Now that would be poetic justice :p
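
For readers wondering how image hash collisions happen at all: perceptual hashes are deliberately designed so that resized or re-encoded copies of an image still match, which necessarily means many distinct images share a hash. The "average hash" below is a toy stand-in, not Apple's NeuralHash, but it shows the same failure mode in miniature:

    # Toy "average hash": one bit per pixel, set when that pixel is
    # brighter than the image's mean brightness. Real perceptual hashes
    # (NeuralHash, PhotoDNA) are far more sophisticated, but share the
    # property that many distinct images map to the same hash.

    def average_hash(pixels):
        mean = sum(pixels) / len(pixels)
        return tuple(1 if p > mean else 0 for p in pixels)

    # Two quite different "images" (4-pixel grayscale, values 0-255)...
    img_a = [10, 200, 15, 220]
    img_b = [90, 130, 100, 140]

    # ...collide, because only brightness relative to the mean survives:
    assert average_hash(img_a) == average_hash(img_b)  # both (0, 1, 0, 1)

Real perceptual hashes carry far more bits, but the principle is identical: the hash discards most of the image, so collisions are a matter of probability, not impossibility.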

minton 2021-08-19 11:47:58 +0000 UTC [ - ]

If true, it’s incredible that the numbers are growing. With today’s big data and surveillance (i.e., privacy is dead), I would think it would be impossible for there to be any sizable groups profiting from CSAM.

amanaplanacanal 2021-08-19 12:01:42 +0000 UTC [ - ]

This part I’m unclear on: is anybody actually profiting? And is child abuse actually growing? Or is it just photo sharing that’s growing?

pulse7 2021-08-19 09:56:56 +0000 UTC [ - ]

In Germany, about 40% of the cases where naked child photos are found are false positives: it's sexting among teenagers...

umanwizard 2021-08-19 11:18:30 +0000 UTC [ - ]

In the US, if I understand correctly, those would be true positives, and everyone who created, received, or distributed them would be guilty of a crime. There’s no exception in the law for pictures you took of yourself.

jl2718 2021-08-19 11:44:36 +0000 UTC [ - ]

I understand the issues, but this headline sounds absolutely awful to the uninitiated. They can never go backward on this.

tlogan 2021-08-19 06:17:10 +0000 UTC [ - ]

So Apple is not required by law to do this. So why are they doing this?

EricE 2021-08-19 14:49:55 +0000 UTC [ - ]

That's the real question. They had to know it would be a contentious issue; what's the real long term goal here?

robertwt7 2021-08-19 06:07:00 +0000 UTC [ - ]

Will they ever listen, though? This is so wrong on so many levels, but I'm not sure Apple will ever listen.

somenewaccount1 2021-08-19 05:07:11 +0000 UTC [ - ]

I imagine they are doing this to comply with some law that requires them to scan images in iCloud, and they don't want to start storing those unencrypted. Does anyone know if there is such a law?

ipv6ipv4 2021-08-19 05:41:51 +0000 UTC [ - ]

This is the text of the current law in the U.S. [1]. Alerting authorities about CP is required if it is discovered; actively searching for CP is not. Look for the aptly named "Protection of privacy" paragraph.

[1] https://www.law.cornell.edu/uscode/text/18/2258A

nonbirithm 2021-08-19 07:34:17 +0000 UTC [ - ]

There might not be a law, but in practice, letting criminals host illegal data on your servers is unlikely to be a good look.

The intent of the laws making possession of CSAM illegal is ultimately to stop its spread. If those laws fail in their stated purpose just because a company chooses not to proactively search for the material, that would be missing the point. Media and governmental pressure would wipe any company foolish enough not to implement child safety measures out of existence, but no major tech company is that foolish (except Kik, perhaps). And hypothetically, if some company did choose not to scan proactively, the government could just pass a law mandating it; otherwise it would fail at its goal of preventing the spread of CSAM, and thus CSA.

But in practice, such a law isn't needed. No company wants to be derided as a service that allows child abusers to get away with their crimes.

BoHerfIIIJrEsq 2021-08-19 05:39:33 +0000 UTC [ - ]

I don't care if Apple actually ends up going through with it. I already decided to de-Apple myself. No more iPhones, no more MacBooks.

endisneigh 2021-08-19 04:49:57 +0000 UTC [ - ]

Apple will not drop their plans. Google and Facebook are already doing this. Be happy at least Apple’s implementation happens client side and therefore can preserve end to end encryption.

https://transparencyreport.google.com/child-sexual-abuse-mat...

https://www.facebook.com/safety/onlinechildprotection

https://www.microsoft.com/en-us/photodna

https://blog.flickr.net/en/2021/04/16/keeping-flickr-and-chi...

Most companies dealing with photos have had this in place for almost a decade now. The FUD and slippery slope arguments are getting old.
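
For completeness, Apple's published design does try to bound what the server learns: according to their technical summary, per-image "safety vouchers" stay cryptographically useless until an account exceeds a match threshold, at which point a review key becomes reconstructible. Here is a toy Shamir-secret-sharing sketch of just that threshold idea - not Apple's actual protocol (which layers private set intersection on top), only an illustration of how "reveal nothing below N matches" can be enforced:

    import random

    P = 2**127 - 1  # prime modulus for the finite field

    def make_shares(secret, threshold, n):
        # Random polynomial of degree threshold-1 with f(0) = secret.
        coeffs = [secret] + [random.randrange(P) for _ in range(threshold - 1)]
        def f(x):
            return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
        return [(x, f(x)) for x in range(1, n + 1)]

    def reconstruct(shares):
        # Lagrange interpolation at x = 0 recovers f(0).
        secret = 0
        for xi, yi in shares:
            num, den = 1, 1
            for xj, _ in shares:
                if xj != xi:
                    num = num * (-xj) % P
                    den = den * (xi - xj) % P
            secret = (secret + yi * num * pow(den, -1, P)) % P
        return secret

    # Each "matching image" surrenders one share; the review key only
    # becomes recoverable once the threshold is crossed.
    key = 123456789
    shares = make_shares(key, threshold=30, n=100)
    assert reconstruct(shares[:30]) == key  # 30 matches: key recoverable
    assert reconstruct(shares[:29]) != key  # 29 matches: still hidden
                                            # (fails only w.p. ~1/P)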

ralph84 2021-08-19 05:06:23 +0000 UTC [ - ]

Nobody expects Google and Facebook to care about privacy. Apple was the one running TV ads about how they were the privacy company. If Apple was really planning E2EE they would have announced it at the same time, because it would have been the perfect time to do it. Disarm the critics of E2EE with CP scanning and disarm the critics of CP scanning with E2EE.

EGreg 2021-08-19 05:14:41 +0000 UTC [ - ]

Apple also ran ads about this:

https://m.youtube.com/watch?v=Af0gtsjfy7E

What’s your point? Apple said RISC was better and faster than Intel, until the day they adopted Intel, and then they said the complete opposite. They acted like they invented the Omnibox for Safari and so forth. It’s for marketing.

bostik 2021-08-19 05:51:19 +0000 UTC [ - ]

Propaganda 101: we have always been at war with Eastasia.

brendoelfrendo 2021-08-19 05:08:36 +0000 UTC [ - ]

I agree. The claims that E2EE is right around the corner are getting really tiring. If it was coming, Apple should have announced it alongside this "feature" instead of expecting everyone to trust them.

endisneigh 2021-08-19 05:09:34 +0000 UTC [ - ]

iCloud Photos already has E2EE.

commoner 2021-08-19 05:49:45 +0000 UTC [ - ]

No, iCloud Photos are not currently end-to-end encrypted. Apple lists all of the items in iCloud that are covered by E2EE in the "End-to-end encrypted data" section after the table, and photos aren't on the list:

https://support.apple.com/en-us/HT202303

Bilal_io 2021-08-19 04:59:29 +0000 UTC [ - ]

> Be happy

Just because the bar is so low does not mean we have to accept privacy abuse.

endisneigh 2021-08-19 05:00:11 +0000 UTC [ - ]

You don’t have to accept it. Don’t use iCloud, no problem.

brendoelfrendo 2021-08-19 05:00:57 +0000 UTC [ - ]

> Be happy at least Apple’s implementation happens client side and therefore can preserve end to end encryption potentially.

No, why would I be happy with this? I actually prefer the alternative: scan photos server side, fine. That's Apple's computer and if I send my photos there, then I can't justifiably argue against that. But my device shouldn't be part of their attempt to appease law enforcement by turning the world into a dragnet.

endisneigh 2021-08-19 05:01:50 +0000 UTC [ - ]

If you don’t send your photos this doesn’t affect you in any way.

torstenvl 2021-08-19 05:07:34 +0000 UTC [ - ]

Sure. And the police only have your house under constant surveillance in case you sell heroin out of the garage. Don't worry, comrade, there is no reason to worry if you have nothing to hide.

tyingq 2021-08-19 05:10:00 +0000 UTC [ - ]

iPhone defaults and settings have a strong history of resetting themselves to values you didn't pick.

crooked-v 2021-08-19 05:03:05 +0000 UTC [ - ]

...for now.

brendoelfrendo 2021-08-19 05:02:59 +0000 UTC [ - ]

That's how it's supposed to work, but frankly, once the capability exists client-side, I don't trust that it will continue to work that way.

EricE 2021-08-19 14:52:10 +0000 UTC [ - ]

Not sure why you are getting voted down - you are 100% correct.

Especially in this post-9/11 world - look at what was supposed to be "temporary" to deal with the "terrorists". Ha! You have to be an utter simpleton not to think the use of this mechanism will expand. It's not even a leap of logic - it's been happening for 20 years all around us!

brandon272 2021-08-19 17:01:46 +0000 UTC [ - ]

For some reason with every new privacy infringement people seem to want to believe that "this will be the last one".

I'm sure 3 weeks ago no one could have fathomed that Apple would make an announcement like this, yet here we are.
