
Even the App Store version of Signal is allegedly not the same as what's in the open source project. So unless you compile and install the applications yourself, there's no way of knowing anything.


That is why we must support initiatives like F-Droid. They put a special focus on reproducibility.


We already know Signal team doesn't like F-Droid. They've got 100 totally outdated reasons why they won't put it there.


So Apple has their Xcode build service; why not add a badge to verify that an app was built from a linked public GitHub/GitLab repo?


If you can't trust Meta, why would you trust Apple?


Apple has been building their brand on privacy and trust for at least a couple of years now. Can you be sure they're not sending everything to the NSA? Of course not. But they also make their money by directly charging users for services, unlike the ad-based companies. There have also been many public attempts by various governments to force Apple to insert backdoors, or to prevent them from fixing security vulnerabilities, which have failed.


> But they also make their money by directly charging users for services, unlike the ad-based companies.

this does not make them more trustworthy

> There have also been many public attempts by various governments to force Apple to insert backdoors, or to prevent them from fixing security vulnerabilities, which have failed.

Except in China, I suppose.


Apple's privacy is a marketing farce. They run data centers in China that provide full access to the government. Their anti-ad campaign was simply a push to gain dominance in the space themselves. They make a big fuss about end-to-end encryption but don't even bother to end-to-end encrypt your photos and backups!

I actually worked at Apple a few years ago in security, and I wondered why we didn't E2EE photos. The reason, from what other engineers told me, seemed to be that it was at the behest of law enforcement. It's a lot easier to cooperate with LE and comply with NSLs when you can simply hand over the data they need.

Until Apple end-to-end encrypts these two things, it's all for naught. It doesn't fucking matter if your HomeKit data is E2EE if someone can take a look at your nudes without any cryptographic barrier.

Take that for what you will. Having worked at both companies during my career in a security capacity, I see no reason to trust one over the other wrt cloud services.

N.B. There are people at Apple who are very passionate about security and privacy. I was privileged to work with these people during my career. They really try to - and do - make a difference. My post is not an attack on them, but on the wider vision of the company, which is somewhat hypocritical.


I really need you to understand the difference between their marketing claims and reality. Apple is really not the champion of privacy they claim to be, beyond the extent that it lets them hurt Google in their marketing.


Why would I think there is any truth in something Apple's marketing department is saying?


Apple has a multi-billion dollar ads business and is going all-in to expand it.


That'd be cool.


If you're running iOS, I always assume the random number generator is backdoored by the NSA anyway. That's got to be the single juiciest target going; frankly, if the NSA haven't backdoored that, then what are they even spending tax dollars on?


That's interesting. Do you have any links for more info?


It's not a realistic danger, just fear mongering. I'm not sure why people on HN feel the need to go after Signal so hard. I do think criticism is important (and Signal definitely deserves plenty), but these types of criticisms are off base and not specific to Signal, nor are they that relevant (kind of like how people post on Signal's tweets about Iran complaining about the lack of usernames. Not the time nor the place).

It isn't meaningfully different from saying that Google/Apple could pretend to put the real app in the App Store but replace it with one that has a backdoor. This is entirely possible. But the risk of getting caught is extremely high, and people do decompile apps like Signal, WhatsApp, and Telegram (albeit this can only go so far). These are all high-profile and highly scrutinized apps. It is just fear mongering.


Before any backdooring purposes, there are probably marketing/analytics reasons: keys, OTF updates, etc.


Even if you compile it yourself, you can't be sure. [Reflections on Trusting Trust](https://www.cs.cmu.edu/~rdriley/487/papers/Thompson_1984_Ref...)


Has that attack ever been observed in the wild?

While I don't know if the current incarnations of Nix/Guix will succeed, I think we are slowly making progress towards reproducible builds everywhere.


No one knows for sure, though compromised compilers are not far-fetched - there has long been implicit trust in compiler toolchains. Reproducible builds are a few years out from full general adoption.
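The check at the heart of reproducible builds is simple to sketch: if the toolchain is deterministic, anyone rebuilding from the same source must get a byte-identical artifact, so its hash can be compared against the one the project (or another independent builder) publishes. A minimal illustration in Python - the artifact bytes here are hypothetical stand-ins for a rebuilt binary or APK, not any real project's output:

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Hex-encoded SHA-256 of a build artifact."""
    return hashlib.sha256(data).hexdigest()

def verify_reproducible(local_build: bytes, published_sha256: str) -> bool:
    # With a fully deterministic build, an independent rebuild must
    # hash to exactly the published value; any mismatch means either
    # non-determinism or a tampered artifact.
    return sha256_hex(local_build) == published_sha256

# Hypothetical stand-ins for two independent builds of the same source.
published = sha256_hex(b"deterministic build output")
local_ok = b"deterministic build output"
local_tampered = b"deterministic build output + backdoor"

print(verify_reproducible(local_ok, published))        # True
print(verify_reproducible(local_tampered, published))  # False
```

A trusting-trust-style compiler backdoor would have to survive this comparison across every independent builder's toolchain, which is what makes reproducibility a meaningful defense.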


The compiled assembly can be read to see if it matches the source.


> Has that attack ever been observed in the wild?

Yes: https://www.quora.com/What-is-a-coders-worst-nightmare/answe...

Also, I remember people in the '90s talking about a virus that infected Pascal source code files. My memory of it is spotty.

> While I don't know if the current incarnations of Nix/Guix will succeed, I think we are slowly making progress towards reproducible builds everywhere.

Fortunately, the answer is also positive here.


Not with Guix and Mes.


Reproducible builds make an attack like this as likely as "the whole world is a big conspiracy".



