Hacker News

Apple’s ill-thought-out on-device CSAM detection comes to mind, and thankfully, its removal from Apple’s plans after heavy pushback.

Any time a specific ability is granted or is available to law enforcement, it will expand to broaden surveillance on everything. CSAM is horrific, despicable and devastating. But the way law enforcement treats it as an easy “in” to push for more surveillance and more powers for itself is shameful.



It’s not shameful that people try to grab power to stop evil; it’s natural. What’s shameful is that we don’t have more political checks against these types of power grabs, and that citizens who could stand up and fight against abusive surveillance are increasingly apathetic.


Mostly because they can go for those power grabs again and again, ad nauseam. After a while the populace just tires of fighting, and only a few people care.


The problem is that we have to complain loudly, get the media involved, protest, write articles, etc., every goddamn time to get them to pull back.

They only have to succeed once, and the law is written and stamped.


I know, sigh. Which is exactly why we need more structural defense against this behavior. I think it’s about time for a modern day constitutional convention.


People who prefer liberty over tyranny have to win almost every time; it only takes one failure to lose liberty for a hundred (two hundred?) years.


The statement is that it is shameful to grab for powers that invade the privacy of law abiding citizens.


The quandary is that powers that "invade the privacy of law abiding citizens" are also powers "to stop evil."

Pretending they aren't is part of the problem, as it empowers those who would push them to publicly advertise the latter good in a vacuum of silence from the tech side.

Something like 'Personal privacy is more important than maximizing law enforcement efficiency, including of CSAM' is a more honest, complete position.


> to stop evil

It's not at all obvious whether stopping CSAM is really the primary goal of some of the people who are pushing these regulations. It seems just like a justification to invade personal privacy.


You are speculating about intents some other people might have.

Let's take the issue in a vacuum first. Either you hold your privacy as more important than X, or you are willing to compromise some of your privacy in the name of X; there's no third option. E.g., in the case of airport security or CCTV in public spaces, suddenly everyone is OK with compromising personal privacy in the name of personal life and safety.

Now finally let's get back to those other people whom you suspect of having an agenda to surveil everybody. If they honestly tried to combat child abuse, how do you see them going about it?


"If they honestly tried to combat child abuse, how do you see them going about it?"

Detective work, stakeouts, researching who makes this stuff, convicting the actual producers, convicting people who do direct abuse... In general, taking real steps instead of reading everyone's diary and then doing nothing.


> Detective work, stakeouts

So, physical surveillance. Idk if you are aware, but this means physical surveillance on everyone, because with Tor you can't narrow this stuff down geographically. Would you rather be physically surveilled?

> researching who makes this stuff, convicting the actual producers, convicting people who do direct abuse

First, that already happened back when they used to expose identifying details. Those days are over.

Second, more importantly, what you described does nothing about resellers, aka the people who keep the abuse economy running and make money from it.

And please. Hash matching is not diary reading.

And in the unlikely event my diary happens to have a 1:1 collision with a known CP video, I would not mind someone being able to look at it if it meant they could also look at the actual thing and identify the reseller/perpetrator. How can you think differently?


> Idk if you are aware, but this means physical surveillance on everyone, because with Tor you can't narrow this stuff down geographically. Would you rather be physically surveilled?

Presumably the content is actually produced at some specific physical location.

> Hash matching is not dairy reading.

The article is not talking about hash matching, though. Quote from one of the Europol officials:

“All data is useful and should be passed on to law enforcement, there should be no filtering by the [EU] Centre because even an innocent image might contain information that could at some point be useful to law enforcement,”


> Presumable the content is actually produced at some specific physical location

Yeah, and how do you find that location? If you are opposed to any measure that compromises your digital privacy, physical surveillance is the only way.

> The article is not talking about has matching, though

Sure, in the context of this subthread you are correct. But remember when Apple tried to do it with hash matching? They published a white paper detailing their algorithm. Remember how everyone here instantly whined about total surveillance? That was just last year. The sentiment is always the same: "my privacy may not be compromised, even when it concerns the safety of helpless victims whom I don't care about".


> in case of airport security

Not everyone is OK with it – lots of people argue it's security theatre.


Let's see if you think it's security theatre next time you fly from Jordan to Israel. Or, actually, anywhere within the US, where gun carry is allowed...


Some of it is theatre, but screening people to make sure they don't have a bomb is not theatre; people will do this, as has been shown.


It is a bit weird in that screening started only after bombings... A useful metric would be to see how many bombings/hijackings were thwarted (which I would guess is many).


No, screening started after hijackings in the 1970s.


> in case of airport security or CCTV in some public space suddenly everyone is OK compromising personal privacy in the name of personal life and safety.

These are not really comparable. Even without CCTV you can't really expect that no one will observe you in public areas (it's just that the cost of doing so would be significantly higher).

Also, it's something you have much more control over, and it's significantly less intrusive than monitoring personal communication. An equivalent would be the government opening and reading every single letter you sent or received back in the days when people still sent them (or having the option to, which to be fair is something they probably had, though it was prohibitively expensive to do at scale). That is not something most people living in free societies found acceptable.

> If they honestly tried to combat child abuse, how do you see them going about it?

By actually directly targeting it, as the other comment describes? Instead of using "think of the children!" as a veil to justify unlimited government surveillance.

> You are speculating about intents some other people might have.

Yes. Are you implying there is something fundamentally wrong with that? Do you always accept everything politicians say at face value? If so, perhaps you're in the market for a bridge?


> not comparable

Airport security literally checks the inside of your body (if they want to) through X-ray or other means. How do you consider this not comparable in privacy invasiveness?

> By actually directly targeting it as the other comment describes

Please, give your own take. That comment didn't contribute anything useful.

> Are you implying there is something fundamentally wrong with that?

I can't believe this is a question. You realize you are putting your own thoughts in another person's head?


> How do you consider this not comparable in privacy invasiveness?

How is that comparable to having access to someone's personal communication? What's so particularly private about the 'inside' of anyone's body? Physically checking the outside seems much more invasive. But yeah, overall I agree that compromises can and should be made in certain cases when the potential harm to society might outweigh certain individual rights (I don't see how that might be the case in this situation).

> Please your own take. That comment didn't contribute anything useful.

I don't agree and to be fair more or less the same can be said about your previous comment.

> You realize you are putting your own thoughts in another person's head?

No. I'm trying to infer what thoughts might exist in another person's head when they do or say certain things. I don't really understand what you are implying (that we should assume no politicians have any hidden agendas and that they are all perfectly honest?).


> What's so particularly private about the 'inside' of anyone's body

Seriously? If your body is not private to you, then what's so particularly private about your communication?

> I don't agree

That's not an answer to "how would they go about it if their goal was to actually combat child abuse, as opposed to some conspiracy to surveil that you imagine"

> I'm trying to infer what thoughts might exist in another person's head when they do or say certain things

Exactly. It is what you think they think, not what they think, and as such it says more about your mind than theirs.


It's absolutely shameful. These people are not naive children. They know exactly the ramifications of the tradeoff, which indicates that they are not doing so in good faith, but rather at the behest of vested interests.


Or they're mentally unsound of some variety.


Pedantic, but I would still say it's shameful although expected. Like how it is shameful to have your car stolen if you leave it unlocked with the keys in clear view on the driver's seat, although it is expected.


> try to grab power to stop evil, it’s natural

Those who seek this kind of power over others are themselves EVIL by definition


Also, that would have set the precedent for the police to use Apple as a proxy to put software on your phone/computer to monitor everything you do and scan for them, while they lean back in office chairs and wait for a hit. The same would have been extended to any and all government bureaucracy. No warrant, no innocent until proven guilty; guilty until proven innocent is their credo.


Apple announcing that stupid CSAM scanning started all of this; they proved it's possible.

An absolutely stupid and disastrous move.


> Apple announcing that stupid CSAM scanning started all of this

Except all of the cloud providers are already scanning uploaded photos for CSAM and have been for years. Trying to blame Apple for this is insane.


I’m particularly tired of this point. Tech companies did a lot of unsound things in the pre-mass adoption days — for example, letting admins read stored content without any access controls. We don’t claim these things are standard or desirable just because they once happened. Moreover, these things rarely entered public awareness.

In the very early days of cloud computing (AKA the 2010s) when “upload to cloud” typically involved clicking a button and sending a photo to a server, a subset of cloud companies began scanning photos for CSAM content. Many of these companies exclusively scanned shared content rather than unshared repositories, because the purported goal was to stop distribution (allegedly Dropbox did this, recognizing that “upload” was an automated feature of their system and “share” represented a user choice.) A few companies were blurry on the distinction and just scanned everything, perhaps because it was technically easier.

What’s important is that this was never widely advertised to users, nor was there ever any sort of public debate about whether it should be SOP, particularly for “uploads” produced by default-on cloud backup software like iCloud. When people say “Apple started this” what they mean is that the first real instance of widespread public debate around this feature I know of was in 2019 when Apple very publicly announced their plans, and the feedback from customers was apparently so negative that they abandoned the idea. Moreover, Apple “started this” in a second sense of the term: they developed the first system capable of scanning end-to-end encrypted photos by conducting the scanning on-device, thus providing a technology demonstrator for the ideas in the new EU regulation.


This isn't comparable.

Apple was going to scan local photos, not cloud stored ones.

No matter the source, no matter the app.

And the phone will report you to the police if the algorithm marks any local photos as matching ones reported by the police, using a "neural hash" which has a non-zero number of hash collisions.

https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX/issue...
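Collisions like the ones reported in that issue are inherent to any perceptual hash: it maps a huge space of possible images onto a short fingerprint, so distinct inputs must sometimes share one. A toy illustration (this is not NeuralHash, just a 1-bit-per-pixel "average hash" over made-up pixel values):

```python
# Toy "perceptual hash": one bit per pixel, set when that pixel is
# brighter than the image's mean. Real systems like NeuralHash are far
# more sophisticated, but share the same property: many distinct images
# map to the same short hash, so collisions are unavoidable.

def average_hash(pixels):
    mean = sum(pixels) / len(pixels)
    return tuple(p > mean for p in pixels)

img_a = [10, 20, 200, 220]   # dark, dark, bright, bright
img_b = [40, 55, 150, 180]   # different pixel values...

assert img_a != img_b
assert average_hash(img_a) == average_hash(img_b)   # ...identical hash
print("collision:", average_hash(img_a))
```

The pigeonhole argument is the same for a 96-bit neural hash, just with astronomically more pigeons per hole.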


> Apple was going to scan local photos, not cloud stored ones.

At the point of upload to the cloud service -where they would be scanned anyway-.

> if the algorithm marks any local photos

No, it required a threshold of N photos to match before they were submitted for human verification.

> which has a non-zero amount of hash collisions

Hence the threshold and human verification step.
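The threshold logic being described can be sketched roughly like this (the hash values and threshold are made up for illustration; Apple's real system enforced the threshold cryptographically rather than with plain application code):

```python
# Hypothetical sketch of threshold-based matching: individual hash hits
# are recorded, but nothing is surfaced for human review until at least
# THRESHOLD distinct photos have matched the known-hash set.

KNOWN_HASHES = {"a1b2", "c3d4", "e5f6"}   # placeholder fingerprints
THRESHOLD = 3                             # illustrative value only

def review_queue(photo_hashes, known=KNOWN_HASHES, threshold=THRESHOLD):
    matches = [h for h in photo_hashes if h in known]
    # Below the threshold nothing is flagged, so a single false positive
    # (an accidental hash collision) never reaches a human reviewer.
    return matches if len(matches) >= threshold else []

print(review_queue(["a1b2", "0000"]))                  # -> []
print(review_queue(["a1b2", "c3d4", "e5f6", "0000"]))  # -> ['a1b2', 'c3d4', 'e5f6']
```

The design intent is that an isolated collision is invisible; only an accumulation of matches triggers the verification step.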


So, imagine your new house having cameras and microphones all over the place, that you cannot turn off, recording 24/7, but "only locally". If there's screaming, could be TV, could be just an argument, could be rough sex, drama practice, or maybe even violence and murder, it will mark those recordings and after a few repeats it'll send them to a person to look at your private recordings to see if it's just some bdsm play or if you're murdering your wife. Oh, the police officer looked at the video and it was just bdsm? Ok, continue until the next threshold.


> If there's screaming

... that matches a specific fingerprint.

But also this is a flawed analogy because the scanning is not 24/7, only when you are uploading to iCloud. It's more like "letting people in for dinner and them seeing blood splatter on your walls; after a few visits with different blood splatters, they might well suggest that someone have a look and check it's not just an accident-prone haemophiliac living there."


I'm an artist with unconventional religious convictions you insensitive clod!

You're missing the point though. The broadband sensors acting on another's behalf are the problem, because all that'll happen is more and more liberties being taken with the concept of ownership/post-purchase monetization, and then god knows who is watching what. Hell, it's one security exploit away from becoming a home invader's wet dream.


> At the point of upload to the cloud service -where they would be scanned anyway-.

So scan them there? Why should the phone scan local photos? And iCloud is enabled by default; guess who's going to disable it if that had been implemented?

> No, it required a threshold of N photos to match before they were submitted for human verification.

Yay, private photos leaking to company employees because of a flawed algorithm. Makes perfect sense.


They can't scan them in the cloud because, unlike other cloud storage services, the data is encrypted before leaving the device and they don't have access to what they are storing. They still don't want to host bad stuff though so they tried to come up with a way to still scan somewhere while not making the encryption in the cloud useless for everyone.


Possibly your understanding of the motive is correct. But your understanding of iCloud security is not. Apple did not offer end-to-end encryption of photos until afterwards, and it is not the default now.


> Why ahould the phone scan local photos?

To avoid doing it in the cloud? Then you can turn on end-to-end encryption on uploaded photos.

> private photos leaking to companies employees

Where N of them have matched known CSAM hashes at the point of being uploaded to iCloud, they will be presented for human verification, yes. How is this worse than the photos being scanned in iCloud and being flagged for similar verification?


You can turn on end-to-end encryption without scanning. Apple did. And Apple's modified key escrow ruled out end-to-end encryption. End-to-end means end to end, not end to back door.

"Known CSAM hashes" is incorrect. The sources of the hashes are known to contain false positives. And true positives are not limited to depictions of sexual abuse.


Because iCloud is a cloud service, whereas this is my phone scanning my photos.


> Apple’s ill thought out on-device CSAM detection and thankfully, the removal of it from its plans after heavy pushback comes to mind.

Calling Apple's CSAM detection "ill thought out" seems to me to be letting the perfect be the enemy of the good.

Remember when everyone was up in arms about W3C standardizing DRM? Yes, the world would be better without DRM. But the DRM exists and will continue to exist. Arguing for not standardizing DRM isn't arguing for DRM to not exist, it's arguing for it to not be standardized. The encrypted media extensions let me watch DRM-protected content on Linux instead of being locked out of it all because none of the proprietary DRM everyone's using works on anything but Windows/OSX. Linux and Firefox and a DRM blob so I can watch Netflix is better than being forced to use Windows 10 with all its telemetry, Microsoft proprietary DRM, and a browser from Google or Microsoft to watch things.

Yes. It would be better if the government stayed the hell away from my files. Apple not implementing their CSAM scanning didn't get rid of the underlying issue or argument. It just means that it's no longer the tech industry setting the standard for how this will work--it's going to be the government.

Apple's implementation[0] was about as privacy preserving as we can hope for. It only scanned media that was about to be uploaded to their cloud service. It did the scanning on device and used crypto to ensure, mathematically, that they couldn't even access the _hash_ of matching images until enough images matched to cross a threshold. They had client devices feed in fake matches to obscure even the number of potential matches that occurred before reaching the threshold. At no point in any of this is it possible for them to retrieve even the hash of an image that does not match, even after you've passed the threshold.
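Apple's actual construction used threshold private set intersection, but the core "nothing is readable below N matches" idea can be illustrated with plain Shamir secret sharing, where each matching photo contributes one share of a reviewers' decryption key (a simplified stand-in, not Apple's scheme):

```python
# Toy Shamir secret sharing over a prime field: k shares reconstruct the
# secret, while fewer than k reveal nothing. In the analogy, one share is
# released per matching photo; the "secret" stands in for the key that
# unlocks the matched material for human review.
import random

P = 2**61 - 1  # a Mersenne prime, used as the field modulus

def make_shares(secret, k, n):
    # Random polynomial of degree k-1 with the secret as constant term.
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 over the field.
    total = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

review_key = 123456789                       # placeholder secret
shares = make_shares(review_key, k=3, n=10)  # one share per matching photo
assert reconstruct(shares[:3]) == review_key  # 3 matches: key recoverable
# With only 2 shares the interpolation yields an unrelated value (with
# overwhelming probability), so nothing below the threshold is readable.
```

Apple's system layered private set intersection and decoy vouchers on top of this threshold idea, but the mathematical guarantee it was selling is the one shown here.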

Would it be better if none of this happened at all? Sure. Is this a _fuck_ of a lot better than what we're seeing Europol pushing for? Absolutely.

Apple cancelling their scanning was a short term win. This isn't a new problem, and this isn't one that's going away. We can throw all the technical solutions we want at it, but it's not a technical problem. ("Sorry, can't scan user's content it's all end-to-end encrypted!". "Don't care. You wrote the encryption. Work around it.". "It's impossible!". "Okay, enjoy your new regulation that all encryption has to have a backdoor HTH HAND.")

I'd rather lose the battle and win the war. Let's put the most privacy-preserving CSAM scanning we can in place and take that card out of play. Let the regulators come out and try and explain how "Well yeah, you're scanning every image for CSAM but, uh, it's not enough. We do really need to see _all_ the images people have on their phones!". We're not making an argument, we're drawing a hard line and standing in place. The Europols of the world are not going to stop pushing. Apple's big, but not "override the EU" big.

When push comes to shove, we will lose. The EU _will_ respond with regulations. Maybe not now, but it should be obvious the way the winds are shifting. I'd bet my left testicle their vision for this is much more onerous than what Apple was proposing.

But hey, at least we can tell our children (away from our phones or any other electronics, and probably standing somewhere deep in the woods) that we were proudly defiant to the end.

[0]: https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...


The question being debated right now is not “which precise scanning technology should we use,” but rather: “should we scan private photos and messages for CSAM and other illicit content.” By proposing a scanning system that scanned user-private, unshared photos Apple announced that they felt the answer was “yes.” Everything else is a technical detail.

And to be clear, once you’ve established the capability and the principle of the thing, the technical details will not remain static. The EU regulation already requires scanning for novel CSAM content and “grooming conversations,” because the people proposing this tech think hash-based photo scanning is insufficient. Having conceded the need to scan users’ private data Apple would have found itself mired in a long-term losing argument about specific technologies, one that the public wouldn’t understand or care about. And the other side would have the force of law behind them.

What precisely was Apple's plan to maintain this "balance", then? Refuse to obey the law? Leave Europe? To paraphrase apocryphal Winston Churchill: there is one point at which you can defend your stance on principle; once you abandon that, everything else is just haggling over price.


> Remember when everyone was up in arms about W3C standardizing DRM?

I'm really surprised how well that actually turned out.

Whatever you say about DRM (and I'm no fan of it myself), at least W3C's version is as open as it can be. You're prevented from making a copy of whatever you watch (which, to be entirely honest, is a very reasonable precaution in the age of streaming), but you can still write browser extensions[1] or even custom Electron apps[2] that interact with the video element in other ways. If you want automatic skipping of intros, changing the playback rate where such functionality isn't supported, access to community subtitles or subtitle-related tools useful when learning foreign languages[3], automatic subtitle reading (with a synthetic voice), or even a completely custom multi-view interface, it's all possible. You can even do synchronized playback across multiple users[4], as long as all of them are authorized to play the relevant media. You could have achieved none of this if you had to use a Flash-based player with no programmatic access to its state whatsoever. It's the best compromise we could have hoped for.

[1] https://chrome.google.com/webstore/detail/netflix-extended/g... [2] https://multiviewer.app/ [3] https://chrome.google.com/webstore/detail/netflix-dual-subti... [4] https://www.teleparty.com/


Is it still a slippery slope fallacy when someone at the bottom of the first slope argues for further slopes?


That's exactly it. We know Apple plays by the rules. EU demands it and they make all phones USB-C, CCP demands it and they host iCloud stuff in PRC. If they are required to give out data to fight CSAM, they will comply. Hopefully Apple would try to not make it "free for all" but whatever they do it will probably be hidden behind relevant regulations, as opposed to a solution they tried to push.


It's very naive to think that Apple's solution would remain as described in that paper. All it takes is a push for 'proactive searching for images not in the database' through e.g. models predicting whether an image is CSAM or something and you have countless cases like [0]. This needs to be wholly unacceptable. Once the system is put in place, expanding it is a much easier pill for the public to swallow.

You're just advocating for frog boiling.

https://www.nytimes.com/2022/08/21/technology/google-surveil...


The CSAM & Apple debacle shows we can win some of the time. I'm not going to give up. CSAM scanning was a huge precedent for having government-sponsored software on your phone, running 100% of the time, scanning your device. That is just plain fucking awful; it was new and precedent setting. It had to be fought tooth and nail, and it was not just a case of "don't let the perfect be the enemy of the good". I'm sorry to be so blunt, but that's what it was. Yeah, I read the PDF and understood what they were doing; it was a crack in the armor toward having government surveillance software on your phone and computer 24/7, and that is huge.


I don't think they would have stopped at Apple.



