There seem to be many privacy solutions, but little adoption. Sorry, but I have to blame the consumers, not the developers. I am actually amazed at how much manpower is being poured into doomed privacy projects. How many encrypted messenger apps are out there that never stand a chance against WhatsApp and Facebook?
I'm not sure you can really blame the consumers. All products claim to be "secure", and consumers have no way of knowing what is and isn't secure. Plus of course security isn't a binary and all security solutions have problems of one kind or another, which makes things extra-confusing.
But how can you change the situation, if consumers don't care enough to educate themselves? A government decree wouldn't help in this case, as governments can't be trusted with protecting privacy.
Since you ask, I think a neglected part of the solution might lie in getting businesses to communicate securely. Unlike individuals, businesses actually care quite a lot if their data leaks.
I think maybe people have wasted a lot of time trying to peddle crypto to hippies and politicos, when lawyers and insurance companies might have been a more receptive audience. The only way PGP was ever going to get any adoption was if people feared getting fired for sending unencrypted private info.
And of course once there's a critical mass of people who know what a private key is due to their work, it's a smaller step to get individuals to encrypt things voluntarily.
>businesses actually care quite a lot if their data leaks.
How are you coming to that conclusion? Companies may say they take security seriously and they want to avoid becoming the next Sony or Home Depot, but how many actually allocate resources accordingly? It's much more efficient to just issue a press release and offer to pay for credit monitoring services that virtually nobody will actually use.
To be fair this is HN and that's undoubtedly true of most startups. But from my experience large, established, boring companies spend a lot of money on covering themselves against this sort of thing. Or at least on CYA security rituals. If they have money to spend on security theatre, why not try to sell them something that actually works?
I would speculate that it's because they are more concerned with checking boxes for their auditors or insurers than they are about the actual data. As for convincing the KPMGs of the world to take security seriously instead of calling for security theater, well, "It is difficult to get a man to understand something, when his salary depends upon his not understanding it".
What do you think that lawyers and insurance companies have to gain from better crypto (than HTTPS)? Most leaks come from poorly secured servers and compromised credentials. I have a hard time thinking of a realistic threat that an insurance company or law firm could mitigate with PGP everywhere.
I'm curious how other industries handle similar issues. It's possible this is one of those things where we just need enough people to be hurt by it before anyone pays attention. I hope we can stop it before it gets to that level, though.
I'm the opposite of you: I blame the developers. They are the ones who build and understand the tech that powers online tracking.
If a user navigates from their Google account to an unrelated, non-Google website, how are they to know they are still being stealthily tracked thanks to an invisible bit of Google Analytics code?
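To make the mechanism concrete, here's a toy simulation of how a third-party analytics beacon can link visits across unrelated sites. This is a deliberately simplified sketch, not Google's actual implementation; all the class names, IDs, and domains are made up for illustration. The key point is that every embedding site reports back to the same collector with the same persistent client ID:

```python
# Hypothetical sketch: how a cross-site tracker joins visits into one profile.
# Names (AnalyticsCollector, beacon, client IDs, domains) are illustrative only.

class AnalyticsCollector:
    """Simulates a tracker's collection endpoint, the thing an invisible
    1x1 pixel or embedded script 'phones home' to from many unrelated sites."""

    def __init__(self):
        self.hits = []  # (client_id, site) pairs received from embedded snippets

    def beacon(self, client_id, site):
        # Each embedded snippet sends a persistent client ID (typically stored
        # in a cookie or localStorage) plus the page currently being viewed.
        self.hits.append((client_id, site))

    def browsing_history(self, client_id):
        # Because the SAME client ID arrives from every embedding site,
        # the tracker can join the visits into one cross-site profile.
        return [site for cid, site in self.hits if cid == client_id]


tracker = AnalyticsCollector()

# The user leaves Google entirely, then visits two unrelated sites
# that both happen to embed the analytics snippet:
tracker.beacon("cid-12345", "news-site.example")
tracker.beacon("cid-12345", "health-forum.example")
tracker.beacon("cid-99999", "news-site.example")  # a different visitor

print(tracker.browsing_history("cid-12345"))
# -> ['news-site.example', 'health-forum.example']
```

The user never interacts with the tracker directly; the linkage falls out of the fact that thousands of independent sites all embed the same third-party code.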
Do the consumers who purchase a Chromebook realise that everything they do in the OS is tracked and recorded by Google (even something as simple as printing to their desktop printer)?
Are the parents of school children aware of the privacy implications of their kids using ChromeOS in the classroom? Chromebooks are becoming ubiquitous in US classrooms, and yet there's barely any discussion of the privacy aspects.
Even if these companies assure us they only aggregate the data they collect (never personally identifying individuals), that's still a frighteningly large volume of personal information being captured. Can you imagine the humongous volume of aggregated data that Google must hold on its users? They can probably mine that data in ways that most of us can't even imagine.
What do developers do about this? Nothing. They built this tech and few ever call out these companies on their behaviour. In fact, a great many rush to the defence of these companies and happily recommend their products ("just bought my mom a Chromebook!").
So no, I don't think it's fair to blame consumers, but yes you can certainly blame developers.
I read it differently: the article blames lazy developers for not providing good privacy solutions. Of course the particular developers who implemented Facebook and the like are to blame, but "developers" in general cannot influence how big companies handle people's data. I cannot change how Facebook or Google work, at least not without some genius idea (which may not exist).