Today, the detection algorithm is used only for photos being uploaded to iCloud.
Now that the technology is on the device, how many lines of code do you think it would take to scan the entire photo roll?
Do you think this capability won't tempt law enforcement agencies, lawmakers, and governments to push for that ever-so-small change to the code, whether for blanket monitoring (see whether China isn't tempted, using its own database of "illegal" content) or for targeted monitoring (specific users, with or without valid court orders)?
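To make that question concrete, here's a toy sketch (Python; every name in it is hypothetical, and nothing here is an actual Apple API). The point is that once hash-and-match machinery ships in the OS, widening its scope is a change of input set, not of capability:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Photo:
    path: str
    queued_for_icloud: bool

def fuzzy_hash(photo: Photo) -> int:
    # Stand-in for an on-device perceptual hash (NeuralHash-like);
    # hashing the path keeps this sketch self-contained and runnable.
    return hash(photo.path) & 0xFFFF

def scan(photos, blocklist):
    """Flag photos whose hash appears in a blocklist of known-bad hashes."""
    return [p for p in photos if fuzzy_hash(p) in blocklist]

library = [
    Photo("IMG_0001.jpg", queued_for_icloud=True),
    Photo("IMG_0002.jpg", queued_for_icloud=False),
]
blocklist = {fuzzy_hash(library[1])}

# Today: only photos headed to iCloud are matched.
flagged = scan([p for p in library if p.queued_for_icloud], blocklist)

# The feared change: scan everything. One line of difference.
flagged_all = scan(library, blocklist)

print(len(flagged), len(flagged_all))  # 0 1
```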
The main issue is that the wall has been breached: monitoring data that previously never left the device is now possible with little to no change, because the feature is embedded in the OS.
You can argue we're not there yet, that we can trust Apple to do the right thing, and that the rule of law will protect citizens against abuse. But this is a big step in a worrying trend, and not all countries follow the rule of law or have checks and balances to prevent misuse. Don't forget that Apple abides by the laws of every country where it sells its devices. That means that, forced or not, they will do what they are told.
Based on this article (and my first rebuttal[0]), I actually think the whole "only photos syncing to iCloud Photos" part is the defining factor making it legal: if Apple specifically sent themselves only the photos detected as CSAM, that would likely be a felony, whereas under the planned system the reasoning I laid out means they would likely not be charged with one.
>Today, the detection algorithm is used only for photos being uploaded to iCloud.
Yes, which is what I was saying in my comment. If or when Apple changes this, I would agree it's a battle worth fighting, but that is not what is happening here, and it's not what I was correcting in the article itself.
>The main issue is that the wall has been breached:
The wall was breached when we opted to run proprietary operating systems. You have zero clue what is going on in that OS and whether it's reporting anything; you have to trust the vendor on some level, and Apple is being fairly transparent here. I would be far more worried if they did this without saying anything at all.
>Yes, which is what I was saying in my comment. If or when Apple changes this, I would agree it's a battle worth fighting, but that is not what is happening here, and it's not what I was correcting in the article itself.
Isn't it too late to fight the battle then? They've already built the infrastructure to make it trivial to scan any file on your device.
I'd imagine they do that by writing the bits to storage without computing a fuzzy hash to match your picture against those in some organization's list?
Sorry, perhaps I misunderstood what you meant by "infrastructure".
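For anyone unfamiliar with the "fuzzy hash" mentioned above: a perceptual hash is built so that visually similar images produce similar hash values, unlike a cryptographic hash where one changed pixel scrambles everything. Here's a minimal toy sketch (a 2x2 average-hash in Python; this is an illustration of the general idea, not Apple's actual NeuralHash):

```python
# Toy "fuzzy" (perceptual) hash: small edits to an image barely change
# the output, so near-duplicates of a known picture still match.

def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255) -> hash as an int."""
    flat = [v for row in pixels for v in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for v in flat:
        bits = (bits << 1) | (v >= mean)  # 1 if pixel brighter than average
    return bits

def hamming_distance(a, b):
    """Differing bits; a small distance means 'probably the same image'."""
    return bin(a ^ b).count("1")

original = [[10, 200], [220, 30]]
recompressed = [[12, 198], [221, 29]]  # same picture after mild editing

h1, h2 = average_hash(original), average_hash(recompressed)
print(hamming_distance(h1, h2))  # 0: the edited copy still matches
```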
Apple have always had the ability to do great evil to a lot of people; this update doesn't change that. They haven't gained any power they didn't already have.
The government, for example, do not currently have the infrastructure to push updates to iPhone. If they passed laws, built servers, etc., to allow this, then that would be a meaningful change worth all this chatter.
>The government, for example, do not currently have the infrastructure to push updates to iPhone
That's the point: this is a slippery slope. Without this system, governments have no way to compel Apple to scan for objectionable photos; Apple could claim, rightly so, that due to encryption and privacy protections, they have no way to do it. But now they've removed both the technological and the privacy hurdles, and it's just a matter of logistics.
Now governments know that all they need is a database of banned photos; they can go to Apple and say, "In our country, it's illegal to share photos that put our government in a bad light. Here's a database of banned photos. If you don't comply, you can't sell your phones to our 1.4 billion citizens."
Is the problem here, as so many commenters complain, that Apple won't resist the corrupt government, or is it that society as a whole won't overthrow the corrupt government?