EU pending CSAM regulations

Cmaier

Site Master
Staff Member
Site Donor
Posts
5,291
Reaction score
8,447
These are *much* more invasive than what Apple tried to do (which would have protected your privacy pretty well). This should cause a stir at some point among the folks who figured that Apple's approach - doing on-device comparisons against known images, hashing them, and only sending a hashed, resolution-reduced collection that Apple can read if more than a threshold number of bad images are found - was too much of a privacy invasion. Because this EU proposal actually invades privacy.
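The threshold idea described above can be sketched in a few lines. Everything here is made up for illustration (the hash values, the database, the threshold); Apple's real design used NeuralHash plus cryptographic private set intersection, not plain set lookups, so this only shows the *shape* of the logic:

```python
# Toy sketch: nothing is flagged until the number of matches against a
# database of known-image hashes crosses a threshold. All values are
# hypothetical illustration, not Apple's actual scheme.

KNOWN_BAD_HASHES = {"a3f1", "77b2", "c9d4"}  # hypothetical known-image hashes
REPORT_THRESHOLD = 3  # matches required before anything becomes readable

def count_matches(device_hashes):
    """Count how many on-device hashes match the known database."""
    return sum(1 for h in device_hashes if h in KNOWN_BAD_HASHES)

def should_escalate(device_hashes, threshold=REPORT_THRESHOLD):
    """Only escalate once the match count reaches the threshold."""
    return count_matches(device_hashes) >= threshold

# Two matches: below threshold, nothing is reported.
print(should_escalate(["a3f1", "77b2", "0000", "1111"]))  # False
# Three matches: threshold reached.
print(should_escalate(["a3f1", "77b2", "c9d4"]))          # True
```

In the real system the "is this readable yet?" decision was enforced cryptographically rather than by a simple counter, which is what made the threshold meaningful even against Apple itself.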


 

Andropov

Site Champ
Posts
615
Reaction score
773
Location
Spain
Ah, the EU proposing dumb regulation again. And AI based image recognition, no less! I wonder what could go wrong, heh.
 

Renzatic

Egg Nog King of the Eastern Seaboard
Posts
3,904
Reaction score
6,836
Location
Dinosaurs
Given how I'm always making tongue-in-cheek threats to punch people on the internet, it makes me wonder if I'm on any FBI watchlists.
 

Runs For Fun

Masochist
Site Donor
Posts
2,057
Reaction score
3,034
Location
Ohio
These are *much* more invasive than what Apple tried to do (which would have protected your privacy pretty well). This should cause a stir at some point among the folks who figured that Apple's approach - doing on-device comparisons against known images, hashing them, and only sending a hashed, resolution-reduced collection that Apple can read if more than a threshold number of bad images are found - was too much of a privacy invasion. Because this EU proposal actually invades privacy.


I always had a feeling Apple was trying to get out ahead of regulations like this that would be much, much more invasive. But everyone shat a brick without 1) actually understanding how the technology worked and/or 2) seeing something like this coming down the line.
 

Cmaier

Site Master
Staff Member
Site Donor
Posts
5,291
Reaction score
8,447
I always had a feeling Apple was trying to get out ahead of regulations like this that would be much, much more invasive. But everyone shat a brick without 1) actually understanding how the technology worked and/or 2) seeing something like this coming down the line.
For sure. Apple wouldn't implement such a system just to satisfy their own perverse curiosity - they were clearly being pressured by governments.
 

Runs For Fun

Masochist
Site Donor
Posts
2,057
Reaction score
3,034
Location
Ohio
And this is what the blowhards at MR will have accomplished with all their uninformed bitching and moaning:

Well, gee, look at that

Woodward does note that there is a possible workaround: on-device scanning, after the message has been decrypted. But that is precisely the same approach Apple proposed to use for CSAM scanning, and which led to such a furore about the potential for abuse by repressive governments.
 

Andropov

Site Champ
Posts
615
Reaction score
773
Location
Spain
I hadn't read this part before:
can be ordered to “detect” both new and previously discovered CSAM, as well as potential instances of “grooming.”
How do they plan on checking for "potential grooming" short of running a mass surveillance system?
 

mr_roboto

Site Champ
Posts
282
Reaction score
453
I always had a feeling Apple was trying to get out ahead of regulations like this that would be much, much more invasive. But everyone shat a brick without 1) actually understanding how the technology worked and/or 2) seeing something like this coming down the line.
The problem with your (1) is that not everybody complaining about the tech was clueless. Researchers figured out how to defeat that kind of perceptual hash algorithm well before Apple's scheduled go-live timeframe. That has to be a major factor in why Apple seems to have quietly sidelined it.

 

Colstan

Site Champ
Posts
822
Reaction score
1,124
Just to toss in my two pieces of eight: I was fully aware of the technical reasons, the claimed privacy-preserving benefits, and the likely pressure from governments, yet I still opposed it. I think that on-device scanning, even if it allows end-to-end encryption, is a bad idea. I'm going to oppose any company scanning on behalf of the government, even if it is Apple, a company that I believe respects my privacy.

And this is what the blowhards at MR will have accomplished with all their uninformed bitching and moaning:
I realize that you are being tongue-in-cheek, @Cmaier, but I give the ne'er-do-wells over at the other place no credit in this. They occasionally get hits in their logs from pre-release versions of macOS, but I doubt the company's executives give two craps about the opinions of angry internet nerds who can never be pleased. I'd say the EFF had more of an impact than online forums.

I know that, as an attorney, you don't comment on legal matters online. As someone admittedly not versed in the law, I think there may be some protections within the U.S. justice system; the EU has different standards. In the U.S., there are certain 1st and 4th Amendment protections. I'd even argue that a very creative legal mind could apply the much-lampooned 3rd Amendment here. (Not likely, but it would be amusing.)
 

Cmaier

Site Master
Staff Member
Site Donor
Posts
5,291
Reaction score
8,447
Just to toss in my two pieces of eight: I was fully aware of the technical reasons, the claimed privacy-preserving benefits, and the likely pressure from governments, yet I still opposed it. I think that on-device scanning, even if it allows end-to-end encryption, is a bad idea. I'm going to oppose any company scanning on behalf of the government, even if it is Apple, a company that I believe respects my privacy.


I realize that you are being tongue-in-cheek, @Cmaier, but I give the ne'er-do-wells over at the other place no credit in this. They occasionally get hits in their logs from pre-release versions of macOS, but I doubt the company's executives give two craps about the opinions of angry internet nerds who can never be pleased. I'd say the EFF had more of an impact than online forums.

I know that, as an attorney, you don't comment on legal matters online. As someone admittedly not versed in the law, I think there may be some protections within the U.S. justice system; the EU has different standards. In the U.S., there are certain 1st and 4th Amendment protections. I'd even argue that a very creative legal mind could apply the much-lampooned 3rd Amendment here. (Not likely, but it would be amusing.)
It's questionable whether Apple is a government actor - the protections you cite apply to the government, not to third parties acting on their own. (I assume you were talking about Apple's plan, but I address the alternative below.)

The issue here is one of practicality. Governments are going to force tech to do SOMETHING, and if it’s not on-device it’s going to be that they scan every message themselves or force the tech companies to do so.

Would that violate US constitutional principles? Maybe, maybe not. (They'd be acting on behalf of the government. But is there an expectation of privacy when you send non-E2E-encrypted messages via a third party?) European law? No idea. Chinese? Good luck with that.

My point is simply that demanding purity here means we are going to lose the right to encrypted communications.
 

Colstan

Site Champ
Posts
822
Reaction score
1,124
My point is simply that demanding purity here means we are going to lose the right to encrypted communications.
That is the crux of the argument. I realize that U.S. constitutional protections apply to the government only, not to private entities like Apple. However, from my layman's perspective, it still seems murky, since Apple would be acting as a government proxy.

Thus far, we tech enthusiasts have been relatively unscathed by government interference. The only notable government prosecutions were a failed attempt by the DOJ to break up Microsoft and a toothless FTC settlement with Intel, both years ago. As limited as those actions were, what impact they did have fell on those specific corporations, not on end users.

That's changing, with governments around the globe targeting "Big Tech", which seems to mean primarily U.S.-based entities, including Amazon, Google/Alphabet, Facebook/Meta, and of course, Apple. I'm sure there are politics at play, with the EU's aggressive stance being especially notable. I would also note that, at least in the U.S., Microsoft bought its way out of anti-trust legislation. The bills working through the House Judiciary Committee originally had language that targeted all operating systems, but that was re-written to specify mobile operating systems. This was done after Microsoft Vice Chairman and President Brad Smith gave the maximum bribe...err, donation to the campaign of Representative David Cicilline, who chairs the panel overseeing such matters and co-sponsors the most prominent legislation making the rounds through the House of Representatives. (I would note that this is bipartisan, so equal responsibility goes to both major U.S. political parties.)

Politics aside, I realize there isn't going to be a purely libertarian solution to this, and at some point, sooner rather than later, governments are going to put a great deal of pressure on Apple and other large technology companies to find solutions to the CSAM issue. The old "think of the children!" appeal always wins, because nobody wants to be seen as siding with abhorrent, abusive criminals. Since blocking such regulations is unlikely, I'm curious how Apple and other technology firms will ultimately address the situation. There may be some protections provided by the U.S. constitution, but most tech companies are going to want a single solution, which is what Apple was working on. Neither Apple nor its developers and users want editions of the iPhone or Mac that depend entirely on the jurisdiction in which the customer lives.

I think Apple could have done a better job of explaining their reasoning and implementation, but I'm not certain Apple is willing to come out and say directly that they wanted to do this before governments forced them. Apple's solution, as much as privacy advocates dislike it, is going to be more privacy-focused than whatever some ham-fisted politician or tech-illiterate bureaucrat believes is in the public's best interest. Apple's secrecy probably hurt them here: they got caught flat-footed by the negative response, and should have at least attempted to consult organizations like the EFF. They botched the rollout, hurting both themselves and their customers. Perhaps nothing would sway the likes of the EFF, but some level of support from advocacy groups would have been welcome. Regardless of what ultimately arises from this mess, I trust Apple more than the likes of Google or Microsoft to protect user privacy, but there's only so much they can do about "I'm from the government, and I'm here to help" mandates.

As always, thank you for the thoughtful commentary and perspective, @Cmaier, much appreciated. I like to think of myself as a realist, like most folks here are, and that this issue is going to be forced upon us, whether we like it or not. At this point, trying to find a compromise solution that doesn't destroy "privacy is a fundamental human right", as Tim Cook often says, and satisfies government mandates is the best we can hope for. Do you have an opinion about where Apple takes this from here, or do you think they'll just wait for governments to force their hand, essentially Apple will say "we tried, but you didn't want it, so now the wise folks in government are doing it for us"?
 

Cmaier

Site Master
Staff Member
Site Donor
Posts
5,291
Reaction score
8,447
That is the crux of the argument. I realize that U.S. constitutional protections apply to the government only, not to private entities like Apple. However, from my layman's perspective, it still seems murky, since Apple would be acting as a government proxy.

Thus far, we tech enthusiasts have been relatively unscathed by government interference. The only notable government prosecutions were a failed attempt by the DOJ to break up Microsoft and a toothless FTC settlement with Intel, both years ago. As limited as those actions were, what impact they did have fell on those specific corporations, not on end users.

That's changing, with governments around the globe targeting "Big Tech", which seems to mean primarily U.S.-based entities, including Amazon, Google/Alphabet, Facebook/Meta, and of course, Apple. I'm sure there are politics at play, with the EU's aggressive stance being especially notable. I would also note that, at least in the U.S., Microsoft bought its way out of anti-trust legislation. The bills working through the House Judiciary Committee originally had language that targeted all operating systems, but that was re-written to specify mobile operating systems. This was done after Microsoft Vice Chairman and President Brad Smith gave the maximum bribe...err, donation to the campaign of Representative David Cicilline, who chairs the panel overseeing such matters and co-sponsors the most prominent legislation making the rounds through the House of Representatives. (I would note that this is bipartisan, so equal responsibility goes to both major U.S. political parties.)

Politics aside, I realize there isn't going to be a purely libertarian solution to this, and at some point, sooner rather than later, governments are going to put a great deal of pressure on Apple and other large technology companies to find solutions to the CSAM issue. The old "think of the children!" appeal always wins, because nobody wants to be seen as siding with abhorrent, abusive criminals. Since blocking such regulations is unlikely, I'm curious how Apple and other technology firms will ultimately address the situation. There may be some protections provided by the U.S. constitution, but most tech companies are going to want a single solution, which is what Apple was working on. Neither Apple nor its developers and users want editions of the iPhone or Mac that depend entirely on the jurisdiction in which the customer lives.

I think Apple could have done a better job of explaining their reasoning and implementation, but I'm not certain Apple is willing to come out and say directly that they wanted to do this before governments forced them. Apple's solution, as much as privacy advocates dislike it, is going to be more privacy-focused than whatever some ham-fisted politician or tech-illiterate bureaucrat believes is in the public's best interest. Apple's secrecy probably hurt them here: they got caught flat-footed by the negative response, and should have at least attempted to consult organizations like the EFF. They botched the rollout, hurting both themselves and their customers. Perhaps nothing would sway the likes of the EFF, but some level of support from advocacy groups would have been welcome. Regardless of what ultimately arises from this mess, I trust Apple more than the likes of Google or Microsoft to protect user privacy, but there's only so much they can do about "I'm from the government, and I'm here to help" mandates.

As always, thank you for the thoughtful commentary and perspective, @Cmaier, much appreciated. I like to think of myself as a realist, like most folks here are, and that this issue is going to be forced upon us, whether we like it or not. At this point, trying to find a compromise solution that doesn't destroy "privacy is a fundamental human right", as Tim Cook often says, and satisfies government mandates is the best we can hope for. Do you have an opinion about where Apple takes this from here, or do you think they'll just wait for governments to force their hand, essentially Apple will say "we tried, but you didn't want it, so now the wise folks in government are doing it for us"?
I assume that Apple’s plan is to implement a system along the lines of the system it already proposed, though probably with some tweaks, and that they are lobbying behind the scenes to make sure that governments accept that in lieu of a solution that breaks end to end encryption.

On-device scanning makes the most sense from a privacy perspective. The on-device comparison hash can be inspected by anyone who knows what they are doing (which is why the "scary governments will abuse this" argument never made much sense to me). Nothing gets sent off device unless there is a very high degree of certainty that there are many matching files. (The case where researchers claimed to have defeated it, if I remember correctly, involved a version of the algorithm that cannot possibly be the right one: the algorithm Apple promised allowed for variant comparisons, while the algorithm the researchers found did not, and was apparently some testing code left behind.) Apple can tweak the certainty level in various ways (less tolerant hashing algorithms, encryption algorithm improvements, hit thresholds, etc.), but in the end it's the only way to ensure Apple's systems are not being used for CSAM transmission and storage while also protecting the privacy of the rest of us, at least as far as I can tell.
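The "nothing gets sent off device unless there are many matches" property was enforced with threshold secret sharing: each match releases one share of a decryption key, and the key only becomes recoverable once the share count reaches the threshold. A minimal Shamir-style sketch (parameters and field choice are mine, purely for illustration; Apple's protocol layered this with private set intersection):

```python
# Toy Shamir secret sharing: with threshold t, any t shares recover the
# secret, while fewer than t reveal nothing about it.

import random

PRIME = 2**61 - 1  # a Mersenne prime, large enough for a toy secret

def make_shares(secret, threshold, n):
    """Split `secret` into n shares; any `threshold` of them recover it."""
    # Random polynomial of degree threshold-1 with the secret at x=0.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def recover(shares):
    """Lagrange interpolation at x=0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        # Modular inverse via Fermat's little theorem (PRIME is prime).
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

key = 123456789
shares = make_shares(key, threshold=3, n=5)
print(recover(shares[:3]) == key)  # True: any 3 shares suffice
```

This is why the threshold wasn't just a policy knob: with fewer than t matching images, the server mathematically cannot decrypt anything, regardless of what Apple's policy says.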
 

Colstan

Site Champ
Posts
822
Reaction score
1,124
I assume that Apple’s plan is to implement a system along the lines of the system it already proposed, though probably with some tweaks, and that they are lobbying behind the scenes to make sure that governments accept that in lieu of a solution that breaks end to end encryption.
I believe most of the CSAM debate, and the technological implementation, revolved around iOS, if I recall correctly. AdGuard talked about potentially blocking it through their DNS service. However, I'm curious how Apple could enforce this within macOS, since it is a much more open platform. We see Windows getting substantial hacks from third parties, such as debloater tools to disable the built-in advertisements, miscellaneous bloatware, and "telemetry" (a.k.a. privacy-invading garbage) that Microsoft installs, no government mandate required. The new TPM 2.0 restrictions don't stop any of these substantial modifications.

I'm wondering whether this is something Apple could implement in Mac firmware, since Apple Silicon and the iDevices share a similar boot process and Secure Enclave, or whether they would do it all in macOS proper, or with a combination of software and hardware. And if there were unofficial third-party hacks to disable CSAM detection within macOS, would that not be Apple's responsibility, since they followed government regulations with the shipped product?

Again, thank you for your insights, @Cmaier, always appreciated.
 

Cmaier

Site Master
Staff Member
Site Donor
Posts
5,291
Reaction score
8,447
I believe most of the CSAM debate, and the technological implementation, revolved around iOS, if I recall correctly. AdGuard talked about potentially blocking it through their DNS service. However, I'm curious how Apple could enforce this within macOS, since it is a much more open platform. We see Windows getting substantial hacks from third parties, such as debloater tools to disable the built-in advertisements, miscellaneous bloatware, and "telemetry" (a.k.a. privacy-invading garbage) that Microsoft installs, no government mandate required. The new TPM 2.0 restrictions don't stop any of these substantial modifications.

I'm wondering whether this is something Apple could implement in Mac firmware, since Apple Silicon and the iDevices share a similar boot process and Secure Enclave, or whether they would do it all in macOS proper, or with a combination of software and hardware. And if there were unofficial third-party hacks to disable CSAM detection within macOS, would that not be Apple's responsibility, since they followed government regulations with the shipped product?

Again, thank you for your insights, @Cmaier, always appreciated.

Yeah, I think what they will do is that, prior to anything being allowed to touch Apple's servers (via iMessage or iCloud backup or whatever), it will get scanned on the Mac. You may be able to disable the scanning pretty easily by finding the process and killing it, or by disabling kernel protections and hacking the kernel. But that might not do you much good. Presumably, when you send the file via Apple's servers, you also have to send a resulting hash value (after all, that's how Apple's CSAM scheme worked). Part of that hash is probably a cryptographically derived value that Apple can use to determine whether you've bypassed the scanner - if you don't have Apple's private key, you can't fake it.

Of course, a lot depends on the implementation, and it may be possible to hack your Mac to skip the scan but still produce a valid cryptographic signature, but I'm guessing it would take some effort.

(And, of course, this is only about stuff touching Apple's servers. If you send something via your Gmail account or whatever, that's not Apple's problem.)
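The "can't fake it without the key" point above can be sketched with a symmetric MAC standing in for the real mechanism. Everything here is hypothetical: the key name, the use of HMAC (Cmaier describes an asymmetric private-key scheme; Apple's actual protocol is not public), and the server logic are mine, just to show why killing the scanner alone gets an upload rejected:

```python
# Toy model: the server only accepts uploads carrying a tag that the
# legitimate on-device scanner can produce. An attacker who bypasses the
# scanner can't forge the tag without the embedded key.

import hashlib
import hmac

SCANNER_KEY = b"key-embedded-in-trusted-scanner"  # hypothetical

def scan_and_tag(file_bytes):
    """Legitimate path: scan the file, then tag it for upload."""
    # ...the perceptual-hash scan would happen here...
    return hmac.new(SCANNER_KEY, file_bytes, hashlib.sha256).hexdigest()

def server_accepts(file_bytes, tag):
    """Server rejects any upload whose tag doesn't verify."""
    expected = hmac.new(SCANNER_KEY, file_bytes, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

photo = b"some photo bytes"
print(server_accepts(photo, scan_and_tag(photo)))  # True
print(server_accepts(photo, "deadbeef"))           # False: forged tag rejected
```

A real deployment would use an asymmetric scheme (so the verification key on the server can't be used to forge tags), which matches the "Apple's private key" framing above; the rejection logic is the same either way.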
 

Runs For Fun

Masochist
Site Donor
Posts
2,057
Reaction score
3,034
Location
Ohio
I think Apple could have done a better job of explaining their reasoning and implementation, but I'm not certain Apple is willing to come out and say directly that they wanted to do this before governments forced them. Apple's solution, as much as privacy advocates dislike it, is going to be more privacy-focused than whatever some ham-fisted politician or tech-illiterate bureaucrat believes is in the public's best interest. Apple's secrecy probably hurt them here: they got caught flat-footed by the negative response, and should have at least attempted to consult organizations like the EFF. They botched the rollout, hurting both themselves and their customers. Perhaps nothing would sway the likes of the EFF, but some level of support from advocacy groups would have been welcome. Regardless of what ultimately arises from this mess, I trust Apple more than the likes of Google or Microsoft to protect user privacy, but there's only so much they can do about "I'm from the government, and I'm here to help" mandates.
I agree with this. I will also say Apple wasn't exactly clear about all of the technology behind their proposed system and the measures in place to prevent abuse by governments. If they had been more transparent about the technical details from the beginning, I think the backlash wouldn't have been as bad. There has been a push for years to put back doors in encryption and break E2EE. Apple's system is certainly way, way better than whatever some tech-illiterate politician is going to propose. If something is forced on everyone, I would much rather it be Apple's system.
 

Cmaier

Site Master
Staff Member
Site Donor
Posts
5,291
Reaction score
8,447
I agree with this. I will also say Apple wasn't exactly clear about all of the technology behind their proposed system and the measures in place to prevent abuse by governments. If they had been more transparent about the technical details from the beginning, I think the backlash wouldn't have been as bad. There has been a push for years to put back doors in encryption and break E2EE. Apple's system is certainly way, way better than whatever some tech-illiterate politician is going to propose. If something is forced on everyone, I would much rather it be Apple's system.

I feel like they were pretty transparent. I mean, this document is around 10 pages of technical detail: https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf
 
