That's not the point of the outrage though (at least not for me). They enabled by default a feature that analyzes my pictures (which I never upload to iCloud) and sends information about them to their (and others') servers. That is a gross violation of privacy.
To be clear, I don't care about any encryption scheme they may be using, the gist is that they feel entitled to reach into their users' most private data (the photos they explicitly said they don't want to upload to iCloud) and "analyze" them.
This is the same as that time Microsoft enabled OneDrive "by mistake" and started slurping people's private documents and photos saved in default locations (arguably worse since no one takes pictures with their PC's webcams).
If you really didn't want your photos to be analyzed, would you be using an iPhone? Or any modern smartphone? Google Photos doesn't have nearly the same privacy focus and no HE whatsoever, but I rarely see that mentioned here. It almost seems like Apple gets held to a higher standard just because they have privacy-preserving initiatives. Do you use a keyboard on your iPhone? You may not have heard, but Apple tracks which emojis you type most often [0], and that data gets sent to Apple's servers.
> Google Photos doesn't have nearly the same privacy focus and no HE whatsoever, but I rarely see that mentioned here. It almost seems like Apple gets held to a higher standard just because they have privacy-preserving initiatives.
What is so surprising about this? If you make big claims about anything, you are held to your own standards.
> It almost seems like Apple gets held to a higher standard just because they have privacy-preserving initiatives
It doesn't seem this way at all. It is rare to see someone talking about current behavior; they are always talking about the slippery slope, such as landmark detection obviously being the first step in detecting propaganda on a political dissident's phone.
This isn't driven by trying to hold them to a higher standard; it is an emotional desire of wanting to see them caught in a lie.
Yes it does, and the blog post specifically explains why.
In short, both iOS and macOS are full of bugs, often with the potential of exposing sensitive information.
Also, it’s on by default; nobody in their right mind would have bits of their photos uploaded somewhere, regardless of “we promise we won’t look”.
Finally, Photos in iOS 18 is such a bad experience that it seems the breach of privacy was fundamentally unjustified as no meaningful improvement was introduced at all.
Encryption does not automatically mean secure. Encryption schemes can and will be broken. Any flaw in their implementation (which nobody can verify) would render the encryption useless…
Apple is a group of people. It's not run by AI. I was specifically talking about the people in Apple who work on the feature and wrote the documentation for the feature.
Yes, of course, the concern is the data being exfiltrated to begin with. Like someone else in this thread mentioned, if they upload a single pixel from my image without my consent, that is too much data being uploaded without my consent.
I'd want an explanation of why they want to send this data. They need to seek informed consent, and the default needs to be no data collection. Opt-in, not opt-out.
If I do opt-in, I can withdraw that consent at any time.
I can also expect them to delete any collected data within a reasonable (to me) time frame, tell me what they do with it, who they share it with, supply me with any personally identifying data they have collected, and allow me to correct it if it's wrong. And if they use the data to make important decisions automatically, e.g. bank loan yes/no, I have the right to make them use human reasoning to reconsider.
There is no reason to let businesses non-consensually collect any data from you, even if that's their entire business model. Don't let their self-serving lies about "you can trust us" or "it's normal and inevitable" swindle you out of your privacy.
Incidentally, "a completely randomly generated integer" could describe Apple's Advertising Identifier, which allows third parties to track the bejesus out of you.
> If the data is encrypted, does the concern still apply?
Yes! For so many reasons!
If an adversary is able to intercept encrypted communications, they can store it in hopes of decrypting it in the future in the event that a feasible attack against the cryptosystem emerges. I don't know how likely this is to happen against homomorphic encryption schemes, but the answer is not zero.
I'm not suggesting everyone should spend time worrying about cryptosystems being cracked all day long, and I'm not saying Apple's encryption scheme here will prove insecure. Even if this particular scheme is cracked, it's very possible it won't reveal much of great interest anyways, and again, that is simply not the point.
The point is that the correct way to guarantee that your data is private is to simply never transmit it, or any metadata related to it, over a network in any form. This definitely limits what you can do, but it's a completely achievable goal: before smartphones, and on early smartphones, this was the default behavior of taking pictures with any digital camera. It's pretty upsetting that it's becoming so hard as to be nearly impractical to get modern devices to behave this way and not just fling data around all over the place willy-nilly.
And I know people would like Apple to get credit for at least attempting to make their features plausibly private, but I feel like it's just the wrong thing right now. What we need today is software that gives agency back to the user, and the first part of that is not sending data off to the network without some form of user intent, and without dark patterns to coerce that intent. At best, I can say that I hope Apple's approach becomes the new baseline for cloud services, but in my opinion, it's not the future of privacy. The future of privacy is turning the fucking radio off. Why the fuck should we all buy mobile devices with $1000 worth of cutting-edge hardware just to offload all of the hard compute problems to a cloud server?
I'd also like to ask a different question: if there's no reason to ever worry about this feature, then why is there even an option to turn it off in the first place?
I worry that what Apple is really doing with pushing out all these fancy features, including their maligned CSAM scanning initiative, is trying to get ahead of regulations and position themselves as the baseline standard. In that future, there's a possibility that options to turn off features like these will disappear.
> I'd also like to ask a different question: if there's no reason to ever worry about this feature, then why is there even an option to turn it off in the first place?
I mean, for one, because of people like you who are concerned about it. Apple wants you to have the choice if you are against this feature. It's silly to try to use that as some sort of proof that the feature isn't safe.
My iPhone has a button to disable the flash in the camera app. Does that imply that somehow using the camera flash is dangerous and Apple is trying to hide the truth from us all? Obviously not, it simply means that sometimes you may not want to use the flash.
They likely chose to make it opt-out because their research shows that this is truly completely private, including being secure against future quantum attacks.
> If an adversary is able to intercept encrypted communications, they can store it in hopes of decrypting it in the future in the event that a feasible attack against the cryptosystem emerges. I don't know how likely this is to happen against homomorphic encryption schemes, but the answer is not zero.
Also, if you're going to wildly speculate like this, it is at least (IMO) worth reading the research press release, since it does answer many of the questions you've posed here [0].
> It's pretty upsetting that it's becoming so hard as to be nearly impractical to get modern devices to behave this way and not just fling data around all over the place willy-nilly.
And honestly, is turning off a single option in settings truly impractical? Yes, it's opt-out, but that's because their research shows that this is a safe feature. Not every feature needs to be disabled by default. If most users will want something turned on, it should probably be on by default unless there's a very strong reason not to. Otherwise, every single iPhone update would come with a 30-question quiz where you have to pick and choose which new features you want. Is that a reasonable standard for the majority of non-tech-savvy iPhone users?
Additionally, the entire purpose of a phone is to send data places. It has Wi-Fi, Bluetooth, and Cellular for a reason. It's a bit absurd to suggest that phones should never send any data anywhere. It's simply a question of what data should and should not be sent.
> I mean for one, because of people like you that are concerned about it. Apple wants you to have the choice if you are against this feature. It's silly to try to use that as some sort of proof that the feature isn't safe.
If they know some people will be against the feature, why not ask instead of enabling it for them?
> My iPhone has a button to disable the flash in the camera app. Does that imply that somehow using the camera flash is dangerous and Apple is trying to hide the truth from us all? Obviously not, it simply means that sometimes you may not want to use the flash.
Do you really not see how this is not a good faith comparison? I'm not going to address this.
> They likely chose to make it opt-out because their research shows that this is truly completely private, including being secure against future quantum attacks.
So basically your version of this story is:
- Apple knows some users will not like/trust this feature, so they include an option to turn it off.
- But they don't bother to ask if it should be turned on, because they are sure they know better than you anyway.
I agree. And it's this attitude that needs to die in Silicon Valley and elsewhere.
> Also, if you're going to wildly speculate like this, it is at least (IMO) worth reading the research press release, since it does answer many of the questions you've posed here [0].
I don't need to; what I said generalizes trivially to all cryptosystems. The only encryption technique that provably can never be cracked is the one-time pad, with a key of truly random data of size equal to or greater than the data being encrypted. No other cryptosystem, under any other set of conditions, has ever been proven impossible to crack.
Homomorphic encryption is very cool, but you can't just overwhelm the user with cryptosystem design and mathematics and try to shrug away the fact that it is not proven to be unbreakable. That homomorphic encryption is not proven to be unbreakable is absolutely not wild speculation; it is fact.
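The one-time pad property mentioned above is easy to demonstrate. Here's a toy sketch (not production crypto; `otp_xor` and the sample strings are just illustrative names I picked): XOR with a truly random key at least as long as the message is its own inverse, and any same-length plaintext is consistent with the ciphertext under *some* key, which is why the ciphertext alone reveals nothing.

```python
import secrets

def otp_xor(data: bytes, key: bytes) -> bytes:
    # One-time pad: XOR each message byte with a key byte.
    # Requires a key at least as long as the data.
    assert len(key) >= len(data), "key must be at least as long as the message"
    return bytes(b ^ k for b, k in zip(data, key))

message = b"meet at dawn"
key = secrets.token_bytes(len(message))  # truly random, message-length key

ciphertext = otp_xor(message, key)
recovered = otp_xor(ciphertext, key)  # XOR is its own inverse
assert recovered == message

# Perfect-secrecy intuition: for ANY other plaintext of the same length,
# there exists a key that maps it to this exact ciphertext, so an
# eavesdropper cannot tell which message was sent.
other = b"flee at noon"
fake_key = otp_xor(ciphertext, other)  # key that "decrypts" to `other`
assert otp_xor(ciphertext, fake_key) == other
```

The catch, of course, is key distribution: you need as much pre-shared random key material as data, and you can never reuse it, which is exactly why practical systems accept computational (unproven) security instead.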
> And honestly, is turning off a single option in settings truly impractical?
We all just learned about this today! We don't even need to speculate about whether it is impractical; we know it can't be done, and that's before we consider that loss of privacy and agency over devices is a death-by-a-thousand-cuts situation.
> Additionally, the entire purpose of a phone is to send data places. It has Wi-Fi, Bluetooth, and Cellular for a reason. It's a bit absurd to suggest that phones should never send any data anywhere. It's simply a question of what data should and should not be sent.
Clearly I don't think the internet is useless and I don't disable networking on all of my devices because I'm talking to you right now. But the difference is, when I reply to you here, I'm never surprised about what is being sent across the network. I'm typing this message into this box, and when I hit reply, it will send that message over the network to a server.
The difference here is agency. Steve Jobs had a quote about computers being "bicycle[s] for the mind". Well, if you just found out today that your device was sending meta-information about your private photos over the network, you would be right to feel like it's not you controlling the bike anymore. The answer to this problem is not throwing a bunch of technical information in your face and telling you it's safe.
Honestly, I'm a little tired, so I'm not gonna completely or perfectly address everything you said here, but:
> If they know some people will be against the feature, why not ask instead of enabling it for them?
Honestly I would just say this is because you can only ask so many things. This is a hard to explain feature, and at some point you have to draw a line on what you should and shouldn't ask for consent on. For many people, the default reaction to a cookie popup is to hit "accept" without reading because they see so many of them. Consent fatigue is a privacy risk too. Curious where you'd choose to draw the line?
> Do you really not see how this is not a good faith comparison? I'm not going to address this.
Yes, my point is that your reasoning isn't good faith either. We both know it's silly and a bit conspiratorial to imply that Apple adding a setting for it means they know a feature is secretly bad. If they wanted to hide this from us, neither of us would be talking about it right now because we wouldn't know it existed.
> We all just learned about this today! We don't even need to speculate about whether it is impractical; we know it can't be done, and that's before we consider that loss of privacy and agency over devices is a death-by-a-thousand-cuts situation.
That's fair, but there's honestly no perfect answer here. Either you appease the HN crowd on every feature but overwhelm most non-technical users with too many popups to the point they start automatically hitting "yes" without reading them, or you make features that you truly consider to be completely private opt-out but upset a small subset of users who have extremely strict privacy goals.
How do you choose where that dividing line is? Obviously you can't ask consent for every single feature on your phone, so at some point you have to decide where the line between privacy and consent fatigue is. IMO, if a feature is genuinely cryptographically secure and doesn't reveal any private data, it probably should be opt-out to avoid overwhelming the general public.
Also, how would you phrase the consent popup for this feature? Remember that it has to be accurate, be understandable to the majority of the US population, and correctly state the privacy risks and benefits. That's really hard to do correctly, especially given that "21 percent of adults in the United States (about 43 million) fall into the illiterate/functionally illiterate category"[0].
> Honestly I would just say this is because you can only ask so many things. This is a hard to explain feature, and at some point you have to draw a line on what you should and shouldn't ask for consent on. For many people, the default reaction to a cookie popup is to hit "accept" without reading because they see so many of them. Consent fatigue is a privacy risk too. Curious where you'd choose to draw the line?
Cookie consent fatigue is both good and bad. Before cookie consent, you simply had no idea all of this crap was going on; cookie consent took something invisible and made it visible. I agree it sucks, but learning that basically everything you use collects, and wants to continue collecting, data that isn't essential has opened a lot of people's eyes to how absurdly wide data collection has become.
> Yes, my point is that your reasoning isn't good faith either. We both know it's silly and a bit conspiratorial to imply that Apple adding a setting for it means they know a feature is secretly bad. If they wanted to hide this from us, neither of us would be talking about it right now because we wouldn't know it existed.
Apple doesn't add options for no reason. It's useful to control whether the camera flash goes off for potentially many reasons. Similarly, if this option was absolutely bullet-proof, it wouldn't need an option. The option exists because having derivatives of private data flowing over to servers you don't control is not ideal practice for privacy, and Apple knows this.
And of course Apple isn't trying to hide it, they're currently trying to sell it to everyone (probably mainly to regulators and shareholders, honestly) that it's the best thing for user privacy since sliced bread.
> That's fair, but there's honestly no perfect answer here. Either you appease the HN crowd on every feature but overwhelm most non-technical users with too many popups to the point they start automatically hitting "yes" without reading them, or you make features that you truly consider to be completely private opt-out but upset a small subset of users who have extremely strict privacy goals.
> How do you choose where that dividing line is? Obviously you can't ask consent for every single feature on your phone, so at some point you have to decide where the line between privacy and consent fatigue is. IMO, if a feature is genuinely cryptographically secure and doesn't reveal any private data, it probably should be opt-out to avoid overwhelming the general public.
> Also, how would you phrase the consent popup for this feature? Remember that it has to be accurate, be understandable to the majority of the US population, and correctly state the privacy risks and benefits. That's really hard to do correctly, especially given that "21 percent of adults in the United States (about 43 million) fall into the illiterate/functionally illiterate category"[0].
I honestly think the answer is simple: All of this cookie consent bullshit exists because the real answer is relatively simple, but it's inconvenient. If online behavioral tracking is so bad for our privacy, and we can't actually trust companies to handle our data properly, we should full-on ban it under (almost?) any circumstances. There. Cookie consent fixed. You can't track users for "non-essential" purposes anymore. And no, they don't need to. There was a time before this was normal, and if we can help it, there will be a time after it was normal too. Protecting someone's stupid business model is not a precondition for how we define our digital rights.
This is exactly why I worry about Apple's intentions. For what it's worth, I don't believe that the engineers or even managers who worked on this feature had anything but good intentions, and the technology is very cool. Obviously, nobody is denying that Apple is good at making these "privacy" technologies. But Apple as a whole seems to want you to think that the root cause of the privacy problem is just that our technology isn't private enough and it can be fixed by making better technology. Conveniently, they sell products with that technology. (I do not think that it is any shocker that the first uses of this technology are as innocuous as possible, either: this is a great strategy to normalize it so it can eventually be used for lucrative purposes like advertising.)
But that's wrong, and Apple knows it. The privacy problem is mostly an effect of the loss of agency people have over their computers, and the reason is that the end user is no longer the person that software, hardware, and services are designed for; they're designed for shareholders. We barely regulate this shit, and users have next to no recourse if they get pushed and coerced into doing things they don't want to do. You just get what you get, and you have to pray to the tech gods that they don't turn the screws even more, which they ultimately will if it means bringing more value to the shareholders. Yes, I realize how cynical this is, but it's where we're at today.
So yes, it's nice not to inundate the user with a bunch of privacy prompts, but the best way to do that is to remove and replace features that depend on remote services. And hell, Apple already did do a lot of that; it's Google who would have absolute hell if they had to put an individual prompt on every feature that harms your privacy. Apple devices don't have very many privacy prompts at all, and in this case that's almost to a fault.
(P.S.: I know I used the term and not Apple, but even calling it "privacy" technology feels a bit misleading. It's not actually improving your privacy; it's just making a smaller dent in your privacy posture than the leading alternative of just sending shit to cloud services raw. It's a bit like how electric vehicles aren't really "green" technology.)