
Sure, but privacy isn't black or white. A donation to Signal doesn't compromise the content of your messaging.

So what you've leaked is that you have an interest in private conversations. That might be a problem in some countries, but I think it's fair to ask folks in affluent countries with working (sorta) democracies to shoulder that burden. That is, you don't donate if there's an elevated threat to your safety; there are enough people who aren't under elevated threat.

There's also the possibility of using a donation mixer like Silent Donor, though I'd evaluate that very carefully. (There's a record of the transfer in, and the mixer needs to keep temporary records for transferring out. There's also the question of how you verify the mixer doesn't skim.)

Some donation mixers accept cryptocurrency, so for maximum paranoia, I suppose crypto -> crypto mixer -> donation mixer -> charity might be workable. Or hand cash to a friend who donates in your stead.

As always, the best path is to set aside paranoia and build a threat model instead to see what the actual risks are.



There's never enough talk like this, and I'm not sure why. It's always about the threat model. I like to think of it in terms of probability: probabilities and likelihoods aren't just about capturing randomness like quantum fluctuations or rolling dice, they're fundamentally about capturing uncertainty. Your threat model is your set of conditions, and you can only calculate likelihoods because you don't know everything. There are no guarantees of privacy or security.

This is why I always hated the conversations when Signal was discussing message deletion and people said it was useless because someone could have saved the message before you deleted it. Deletion is standard practice in industry precisely because of the probabilistic framing: there's a good chance you delete before they save. Treating privacy and security as binary/deterministic isn't just a poor-but-good-enough approximation of them; it actually leads you to make decisions that decrease your privacy and security!

It's like brute forcing: we just want something we'd be surprised anyone could accomplish within the lifetime of the universe, even though it's technically possible to get it on the very first try with extraordinary luck. And that's an extreme understatement. It's far more likely that you could walk up to a random door, put the wrong key in, have the door's lock fall out of place, and open it to find a bear, a methhead, and a rabbi sitting around a table drinking tea and playing cards, with the rabbi holding a full house. I'll take my odds on 256-bit encryption.
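To put rough numbers on that intuition, here's a back-of-the-envelope sketch in Python. The trillion-keys-per-second rate and the ~4.35e17-second age of the universe are assumed, deliberately generous figures for illustration, not claims about any real attacker:

```python
# Keyspace of a 256-bit key
keyspace = 2**256  # ~1.16e77 possible keys

# Probability of guessing the right key on the very first try
p_first_try = 1 / keyspace
print(f"{p_first_try:.2e}")  # ~8.64e-78

# Assumed, wildly optimistic attacker: a trillion guesses per second,
# running nonstop for the age of the universe (~4.35e17 seconds)
guesses = 1e12 * 4.35e17
fraction_searched = guesses / keyspace
print(f"{fraction_searched:.2e}")  # ~3.76e-48
```

Even under those absurdly generous assumptions, the attacker covers a vanishingly small sliver of the keyspace, which is the whole point of the "I'll take my odds" line above.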



