Attorney General Barr and Encryption (schneier.com)
328 points by hsnewman on Aug 14, 2019 | 220 comments



> Nor are we necessarily talking about the customized encryption used by large business enterprises to protect their operations. We are talking about consumer products and services such as messaging, smart phones, e-mail, and voice and data applications.

This is very naive. Just consider the scenario where a CEO or politician has a certain fetish that he/she would rather not have splattered over the front page of a national newspaper. This person is likely going to use his personal email, phone, voice and data applications to indulge in this fetish. If there is a backdoor, it is almost guaranteed China, Russia or any other country will eventually break it and potentially use it for blackmail to get their hands on corporate or national secrets.


Yes. We went through this same fight with the Clipper chip already, but we can have it again with these incompetent, ignorant, appointee bureaucrats.

It's a solved political, human-access issue that doesn't need an easily-hackable technological hammer in search of a nail: if they can get a warrant, then they demand the password or one goes to jail. Adding backdoors would inevitably give unimaginable power to a host of foreign governments, random hackers and political adversaries... little good could ever come from it that couldn't come from warrants and traditional channels of legally-compelled disclosure, without the damage to the already tenuous image of the integrity, security, and trust in most every type of system. The attack surface of modern, complex systems is big enough without a hole that punches right through it for whoever holds the master key to everyone. For example, think of the immense risk and target posed by a master escrow key database, and some random subcontractor losing or "losing" a laptop, compromising everyone.


>but we can have it again with these incompetent, ignorant, appointee bureaucrats.

I agree with your sentiment completely, but do want to serve as a footnote on the sentence I am quoting: Being an "appointee" does not imply competence or incompetence on technological details. Being elected does not imply competence or incompetence on matters of technological details.

When the officials in charge of the decision are entirely elected, it is not as if they are elected because they have a technical pedigree, presented at DEF CON, or have a high score on leetcode. They are elected because of a myriad of complex factors, most of which relate to the prevailing cultural memes of the day. Heck they only stand for election based on a similar set of complex factors and impulses, and awareness of technological specifics is eternally low on the list.

I say this having known one low-level federal bureaucrat well (he might even qualify as part of the "deep state", if such a thing existed), and interacted with a few others in passing. In some ways they are ridiculously competent in their area of expertise. In other ways, they are depressingly in favor of the inertia of the status quo. But in all cases, I can't say that replacing their hiring process with an election would change a damned thing.


Or let's forget a fetish. Let's just say they sent someone a nude. Or how about an inappropriate joke (in or out of context)? These are pretty common things that people do and can easily be used as blackmail. Everyone has things that they are embarrassed about.


Plausible deniability.

Mutual transparency.

Assured secrecy.

Non-indulgence.

The last two aren't realistic. The first two might be.


Those things work for nation state actors, not quite so well for just some random hacker. Or even if a nation state actor goes rogue (or "rogue").


>it is almost guaranteed China, Russia or any other country will eventually break it and potentially use it for blackmail to get their hands on corporate or national secrets.

In fairness, this is pretty naive too.

The idea that a backdoor exists, and our intelligence community is not monitoring the proclivities of, say, Halliburton's chairman, is fanciful in the extreme.

I think it's safe to assume that everyone would be recorded, and some people would be watched in such a scenario. Anyone above VP level at Oscar Mayer, Boeing, Procter and Gamble, Booz Allen Hamilton, etc would be pretty high up on that "watch" list. Probably even anyone who knows the CEO at Boeing, or Intel, or whatever would be high up on that list.

I always look at it this way: the intelligence guys are at least as smart as I am, and if I had a back door, that's the minimum list of who I would watch. I think it very likely that we would have gotten to those guys and gals long before any Chinese, Indian, Israeli, or whatever nation's operative would.

This is part of the insidiousness of backdoors.


> and our intelligence community

What happens when a foreign or hostile intelligence community gets their hands on some of those secrets? Just because "your own" know them already doesn't mean they're completely worthless for all the others.

An adversary armed with such secrets can change the course of an election, undermine certain people and initiatives to promote others or change a company's leadership, etc. And they can do it with 0 accountability since they never have to ever directly leverage the information, just release it in very targeted ways to the people who will then do all the work, knowingly or not.

Having backdoors is a bad idea, and this has been proven time and time again. And coming out in public to say "corporations can have their way but regular people have to be kept on a leash" is exactly the kind of thing you'd expect from someone on the wrong side of such a backdoor, someone with no choice but to prostrate themselves.


There's no way there aren't corporate secrets, and probably national security secrets too, going over those same email, phone, voice and data applications already, whether considered "personal" devices or not. You don't even gotta get to finding something else to use as blackmail.


Or even an unscrupulous domestic agent of the law. If you look at the handling of the Silk Road investigation by the FBI, you'll see that people are easily compromised (Secret Service agents investigating it tried to pocket some of the bitcoin). They may tell you they are only using backdoors when necessary, but who's liable when they abuse it?


I was thinking of another counter-argument:

Should we allow prisons in the United States to have "master keys" or "back doors" to allow firemen to evacuate prisoners in the case of fire?


The issue there is that it is too easy for your opponent to say that no, we should not have those master keys, because convicts are not worth saving whereas I am a regular average Joe, etc.


Not just China, could be any country really.


Including their own. How soon before the government is utilizing its security apparatus to blackmail its own wealthy and powerful?


Please allow me to introduce you to J. Edgar Hoover.

https://en.wikipedia.org/wiki/J._Edgar_Hoover

Not that he was the first in US history, by a long shot.


I think that ship likely sailed long ago.


The fact that something has gone wrong before (even if repeatedly!) does not imply that we should not attempt to prevent it from happening again, or that it's no longer possible to take steps that would move us in that direction.


Could even be our own country. Given that Barr is the son of a CIA scion I'm sure he wouldn't mind if U.S. intelligence agencies were given the power to pick and choose who is electable.


This is the greatest threat. Existing administration hacks into their opponents digital devices and then exposes all of their secrets to the world. This ensures that the party in power never loses an election again.


Forget about parties, it allows them to control who wins the primaries for both parties.


> If there is a backdoor, it is almost guaranteed China, Russia or any other country will eventually break it and potentially use it for blackmail to get their hands on corporate or national secrets.

Hell yes. Solidarity.


Also, everyone knows, only people break the laws, businesses never do.


I've never seen a business in jail. They're filled with people.


When a business "goes to jail" it simply ceases to exist. Enron, Theranos, etc.


Those went bankrupt. They didn't go out of business as punishment for wrongdoing.


Fines so large that they lead to bankruptcy is how the death penalty works for businesses.


I think the opposite is more common: the government steps in to save a company that would otherwise be forced out of business due to the harm they caused others.

See for example tobacco and asbestos companies.

And did Enron really die if all their assets were sold to competitors and continue to operate?


No, they get fined, limited in where and how they can act, broken up, dissolved.

Cells make no sense for a business entity; justice, however, definitely does.

When we regain sanity and quit insisting corporations are people, and recognize them as the constructs and devices they are, then these discussions can make more sense.


I guess Arthur Andersen (Enron's accountants) is the only one that comes to mind. They were criminally convicted and had to fold, if memory serves me right.


But I thought corporations are people too?


> China, Russia or any other country will eventually break it and potentially use it for blackmail

Interesting choice of bad guys there. Snowden already showed us that you've omitted the biggest bogeyman.


As Schneier says, it's better than the "nerd harder" attitude exhibited previously.

However, I take issue with this part:

> After all, we are not talking about protecting the Nation's nuclear launch codes. Nor are we necessarily talking about the customized encryption used by large business enterprises to protect their operations. We are talking about consumer products and services such as messaging, smart phones, e-mail, and voice and data applications.

I think he is intending to argue that the quality of the encryption should be tiered, that "consumer" comm crypto should be weaker than business operations crypto should be weaker than gov't crypto. If so, I think this is a bad policy.

I just got through a security audit by one of our customers at work. One of the points that their security team made was that they want us to do more to protect the mobile devices used at our company. Even though said mobile devices do not have, and never will have, access to the customer's sensitive data, the customer argued that a mobile device being breached gives an attacker a foothold, and allows said attacker to move closer to getting access to their data.

I think the same principle applies to this notion of lower-tier crypto for the peasants. Lower-security systems interact with higher security ones, and if those lower security systems are breached because their security was artificially lowered, they become threats to the higher security systems with which they interact.

I don't doubt that if crypto for Not-Government were kept artificially weak, China or Russia would leverage that to their advantage, more so than they already are. I think it would also make things easier for not just state actors, but lower-tier criminals as well. If Barr wants to have an easier time catching bad guys, he should probably not make it easier for bad guys to do bad things.


There are ways to backdoor encryption without weakening encryption. In fact, DEF CON hackers are constantly talking about how vulnerabilities in encryption implementations can make decrypting easier.

Allow me to think out loud about this:

Take an image, for example. If I hash it and encrypt it, I can still recognize similar pictures. If I use a different kind of analysis, like color distribution, I can also identify the picture given more time. If I release tiny bits of the encryption key, I weaken the encryption but still leave a significant amount of work to break the code. Meaning it still won't be easy without effort.

I believe the NSA tried to weaken encryption in a way that only they could break, given enough time. They imposed some sort of limitation on the kinds of keys that were generated.

This still leaves a high cost to decrypting, but streamlines it so that only certain people have the power to do so.
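To make that concrete, here is a rough stdlib-Python sketch. The 32-bit keyed-hash setup and the numbers are invented purely for the demo (not a real cipher or any scheme that has been proposed): each key bit that is "released" halves the brute-force work, and it does so for everyone with a CPU, not only for whoever the release was intended for.

    # Toy illustration (stdlib only, numbers invented for the demo): a keyed-hash
    # "cipher" with a 32-bit key, where some high bits of the key are made public.
    # Every revealed bit halves the brute-force work -- for anyone with a CPU.
    import hmac, hashlib, secrets, time

    KEY_BITS = 32
    key = secrets.randbits(KEY_BITS)
    plaintext = b"hello"
    tag = hmac.new(key.to_bytes(4, "big"), plaintext, hashlib.sha256).digest()

    def crack(revealed_bits: int) -> float:
        """Brute-force the key when its top `revealed_bits` bits are public."""
        unknown = KEY_BITS - revealed_bits
        prefix = key >> unknown
        start = time.time()
        for low in range(2 ** unknown):
            guess = ((prefix << unknown) | low).to_bytes(4, "big")
            if hmac.compare_digest(hmac.new(guess, plaintext, hashlib.sha256).digest(), tag):
                return time.time() - start
        raise RuntimeError("unreachable")

    for revealed in (12, 16, 20):
        print(f"{revealed} key bits released -> key recovered in {crack(revealed):.2f}s")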


And then you will have actors who build their own dedicated hardware (like, e.g., Bitmain did for Bitcoin mining) that is on par with state-level actors -- many of whom are likely now comparable to the NSA.

A little farther, you'll have services where you bring your own message ("really really mine. I just forgot the encryption key!"), pay, and get a decryption.

There is no way to select who the encryption is "weak" for and who it is "strong" for, except with a backdoor (e.g. Dual_EC_DRBG, or backdooring an RSA private key generation so the public key includes an encrypted version of the seed used for the private key) -- but a backdoor, once open, is open for everyone.


> I think he is intending to argue that the quality of the encryption should be tiered, that "consumer" comm crypto should be weaker than business operations crypto should be weaker than gov't crypto. If so, I think this is a bad policy.

I don't think this is strange, since this is how it works in the physical world. A home has a simple lock, a business a few better ones and a security system, and a military base has 24/7 armed soldiers on guard.

Impossible to do right now for encryption of course.


There are issues of scale that make that a poor analogy. You can't leverage a botnet to automate attempts to pick the locks of every home in the country. Nor can you pick a normal lock from thousands of miles away. Even if you get inside, you can't duplicate the entire contents of a house in a matter of seconds to rifle through later, and it's darn near impossible to break in and rummage around without being detected.


Indeed, and I suspect that's part of where the mentality comes from. One could argue that said simple hierarchy is appropriate for the threat model in most cases in the real world.

Where this falls down is that if my threat model is different, rightly or wrongly, I can still change the locks on my home to something more fitting that threat model. If we were to try to fit the crypto policy that Barr seems to want to real-world locks, then that could get problematic, quickly. I want to lock my hunting rifle's case with an Abloy, because using the toy locks from Home Depot is fucking irresponsible? Nope, not allowed. That's military-grade; civilians don't need military-grade tools. Requiring one be a business before one can get/use moderately strong encryption could all too easily become a regulatory burden that causes a lot of small and medium businesses to become less secure, and further entrench the largest companies. And as the current top root-level comment points out, it would make getting blackmail on the rich and powerful that much easier.

I think it should remain impossible to do for crypto because, as another reply to you points out, the real world differs from the digital world in important ways. Trying to force the digital world to be more like the real world is likely to make both suck more, not less.


If you put secret things in a military base but then move them away, and five years later the locks are broken, there's nothing there anymore.

The physical world can't have all locks, past, present and future, broken simultaneously across the world, possibly without you knowing.

Knowingly and arbitrarily weakening encryption is risking those stakes for, IMO, not a good enough reason.

About the only thing physical and digital security have in common is the word "security".


But that's just an artifact of the available resources. There is no law that prevents you from putting a better lock on your home or even hiring armed guards to protect your residence. Are you arguing there ought to be?


> I don't think this is strange, since this is how it works in the physical world. A home has a simple lock, a business a few better ones and a security system, and a military base has 24/7 armed soldiers on guard.

Yes, but this is not how it works in cryptography and there's no reason it has to. Should we deliberately weaken crypto systems so that they work similarly to some other arbitrary system? The question seems to answer itself.


That does seem to be the policy for copyright..


It’s better for the user to assume they’re totally pwned and should act like they’re in the open, than for them to assume the bad encryption works.


I have found that the best way to educate people on the importance of no government backdoors in encryption is to use the example of TSA approved locks.

Everyone knows when they put a TSA lock on their luggage it does almost nothing to improve the security of their luggage. Any serious criminal has a key to TSA locks.

Adding backdoors is like putting a TSA lock on your bank password. It keeps honest people from seeing it but doesn't do much else.


Actually you can 3-D print a set of TSA keys now.

https://github.com/Xyl2k/TSA-Travel-Sentry-master-keys


> Actually you can 3-D print a set of TSA keys now.

Unless that repo is backdoored and the code has been altered.


The point though is why would anyone even bother, since no one uses TSA locks as anything resembling legitimate security anyway.


I've never bothered locking my luggage. I figured it would cause more problems than it solves.


I use TSA-approved locks that have a little red indicator that pops up if the lock is opened with a key. You can reset the indicator only if you know the combination.

So, when I collect my bags, I can tell whether they've been opened since I last saw them.


Decoding the combinations (and thus resetting the indicators) on those is surprisingly easy. They're actually great for locksport beginners, the key lock is really easy to pick and the combination is easy to decode, AND they're cheap!


It's a great way to make simple and tangible all this nebulous talk of "keys" and "backdoors." I like to include this part about how it only takes one mistake to permanently ruin it all:

>The Washington Post inadvertently published a photograph of all seven of the TSA master keys in an article about TSA baggage handling. The photograph was later removed from the original article, but it still appears in some syndicated copies. In August 2015 this gained the attention of news sites. Using the photograph, security researchers and members of the public have been able to reproduce working copies of the master keys using 3D printing techniques.

https://en.wikipedia.org/wiki/Transportation_Security_Admini...


You can just buy them on amazon. Not even joking. No need for 3d printing or other tomfoolery.


" As computers continue to permeate every aspect of our lives, society, and critical infrastructure, it is much more important to ensure that they are secure from everybody -- even at the cost of law enforcement access­ -- than it is to allow access at the cost of security."

This.

I love that Schneier weighed in on this.


Nice quote.


I don't like to be the tin-foil hat guy, but I'm getting tired of the gov't rhetoric around how this is "to catch criminals". No, it's not. It's to protect and further enable passive mass surveillance. And nothing good comes from that. I don't trust anyone with that power not to break the rules and do bad things sometimes.


Exactly. So the police and government were never able to catch criminals before 1990? Terrorism and kiddie porn are just red herrings to scare people into giving up control. When someone asks for this kind of backdoor, I want to ask them to hand me their unlocked phone. "But you have nothing to hide, right? You aren't breaking any laws, right? I promise not to share this information with anyone!"


We've been bombarded by the same FUD in the UK and across Europe since 9/11.

Anyone capable of rational thinking knows it's nonsense, a power-grab by politicians, law enforcement and the security services looking for mass surveillance; overreaching powers that will inevitably be misused, abused and slowly creep into other areas of government.

And yet the bombardment continues, banging on about terrorists, paedophiles, the Russians, the Chinese, the bugbear du jour.

Honestly, I fear for what is happening to Western democracies just now - it feels like we're on a slow but steady march to an Orwellian nightmare I don't want to live in.


Barr seems intelligent despite all the news and politics we have going on right now. I agree with Schneier’s assessment about having a policy discussion.

Let’s pretend the Big Tech companies build something robust and unbreakable (impossible) for the US govt. Now the EU and former English colonies want the same.

Now Syria wants the same access and full history of anyone in Syria. China would like the same for Hong Kong.

It’s a slippery slope that goes downhill very fast. The line between criminal investigation and persecution is blurred.


That's just the nature of the beast. As long as strong encryption without a backdoor is physically possible, people will use it online regardless of legality. The thought that we could regulate that effectively is laughable. Even if you take down every single project, it's already out there. Heck, it's built into Java, which is installed on "over 2 billion devices" or whatever... It's also built into pretty much every decent programming language out there (Rust, nim, go, crystal, others). Mirrors of old versions of those are everywhere, and tons of people already have them installed. This would be as pointless as deciding to censor all .pdf files. Hell even WinZip has AES built into it. This would require significantly more oppressive censorship and monitoring than even China is capable of.

And that's not even touching on SSH and HTTPS and GDPR complications.

If we don't laugh at and discredit the idiots pushing this, people are going to take them seriously and we will have to deal with the consequences.

Regarding companies (like facebook) that collude with the gov to bypass encryption with MITM snooping, we need to continue to expose them, and major players like Google and Apple need to actively disobey any orders from the gov, and send an army of lawyers at it, and I think that's what will happen.


A government doesn't need to launch an all-out war on encryption to get its way.

They only need to lean on relatively few people - the humans who live in the US and who run Facebook, Apple, etc. to put back-doors in their services. Or worse, they put back-doors in their services while denying that they have done so.

The fact that strong encryption still exists will be of little use if it's not what the bulk of people actually use.


That is a legitimate concern, which is why we have to fight this.


There is also the possibility that using less secure communications hardware and software could be exposing individuals and companies to criminal and civil liabilities due to other legal requirements. In the US this is already the situation with regards to handling classified information.

Despite all of the rhetoric and public statements, the legal burden and actual enforcement seems to be going very much the opposite direction in the US and EU. There is good reason to suspect that the monitoring (and centralized censorship) infrastructure built up in other nations could end up producing the opposite desired goals in the long term.


Yes, I would absolutely love to see a GDPR suit against FB for allowing US gov to see some European person's data. Would be tremendous.


> laughable

This is seriously wrong, and that word "laughable" is a key-word for others who share your larger assumptions.

Let's assume that protocol enforcement is heaped upon those who can do nothing about it, not "laughing" techies with time and skills... until it is. Common access to network infrastructure is clearly being monitored, and more and more requirements and restrictions are added each year, not fewer, in a dizzying number of ways. The ability to find a transaction from a "regulated IP address" used by those who do not have the skills or background to understand what is happening only increases each year.

This cavalier analysis is counter-productive to someone who wants to a) stay out of the security swamp and b) live life somewhat un-monitored, with individual choices.


You're missing my point. People should be laughing at Barr, in his face, in his inner circle, every day for this. If we don't facilitate that on the outside, how the hell will it happen on the inside? So laughable as in a call to action, not laughable as in negligible.


It's more complicated than that. Let's say the US adds back doors. What is to stop China, Russia, or some other nation state from using it in the US to intercept the communications of lawmakers, CEOs, and others? These are policy implications worth raising and discussing. Do many realize implications like this?


Exactly. An actual strong encryption backdoor would be impractical and/or would be immediately broken and leaked / dead on arrival (the entire crypto community would be motivated, on a personal level, to undermine this). The thing we have to worry about is players like Facebook allowing MITM snooping on behalf of the government without having an actual backdoor, as they have already agreed to do. I think Apple and others would sooner disobey a court order and throw lawyers and money at the problem than snoop for the gov, though, which would be a very good thing. We need to keep pressure on Facebook and keep exposing companies that give in to requests like this.


Yeah, Cloudflare is already MITM-ing tons of traffic every day. And their CEO has admitted that they scan the traffic for law enforcement!


> their CEO has admitted that they scan the traffic for law enforcement!

I'm not disagreeing, but you gotta cite that one. To educate the readers.


> What is to stop China, Russia, or some other nation state from using it in the US to intercept the communications of lawmakers,

Because NSA can't legally "spy" on US citizens, they would be happy to allow UK, or say NZ, to access those backdoors and record everything US citizens do. Then they just have to find a loophole for how to access that information later. Granted, they can probably just redefine what "searching" and "access" mean, which they do already, but this would open even more loopholes and possibilities.

It's sometimes useful to think of these government agencies not as working for the US citizens but as adversaries who work against our interests.


> Because NSA can't legally "spy" on US citizens, they would be happy to allow UK, or say NZ, to access those backdoors and record everything US citizens do.

This is, modulo using the hypothetical backdoors under general discussion, pretty much exactly what the "Five Eyes" member states (which set includes .nz and .uk) are doing for one another already.


> communications of lawmakers, CEOs

> Nor are we necessarily talking about the customized encryption used by large business enterprises to protect their operations. We are talking about consumer products and services such as messaging, smart phones, e-mail, and voice and data applications.

It is AG Barr's suggestion (as far as I can tell from the quote above) that these are not regular consumers so they should get "the real thing". Everyone else would get the "inspectable encryption" version.


That's my understanding as well, but it's a pretty insulting suggestion. Why is a business's privacy more important than my own?


Because businesses donate more money to politicians than you do.


I think most of us around here on HN and "tech people" in general understand that particular set of issues around encryption backdoors.

It seems to me that certain lawmakers and people like Barr either don't put the time in to understand or are willfully ignorant to further their own goals.


Supposedly Upton Sinclair said, "It is difficult to get a man to understand something, when his salary depends on his not understanding it."

I think that applies here.


How about perfectly informed and extremely malicious?


What is hilarious is that they already do. Apple services in China are not run by Apple, but by Guizhou-Cloud Big Data Industry Co Ltd.

[0] https://gizmodo.com/apple-moves-chinese-icloud-encryption-ke...


Same with Azure and AWS


> Let’s pretend the Big Tech companies build something robust and unbreakable (impossible) for the US govt.

Didn't Chaum write up a quick-and-dirty spec specifically to stop that little parenthetical from being repeated?

Edit: I guess it was more than a quick-and-dirty spec:

https://www.wired.com/2016/01/david-chaum-father-of-online-a...


After a quick scan, I could not find any spec linked in that article. Claims are nice, but an actual spec to confirm those claims would be nicer!


I think this is it, but I don't have time to read it to confirm:

https://eprint.iacr.org/2016/008.pdf


> “Here, some argue that, to achieve at best a slight incremental improvement in security, it is worth imposing a massive cost on society in the form of degraded safety. This is untenable.”

I’d suggest if they really want to push this line of reasoning, then we should also abolish the 2nd amendment. It is there in order to enable revolt against the government. But its place also degrades the security of the country, as evidenced by the constant shootings and mass shootings that occur in American life.

Encryption is much the same way. It protects individuals against government spying, whatever their situation may be. And these products exist not just for the US, but are made by Americans for export with the American ideal of individual liberty in mind. If we remove that, it’ll get moved for China, Burma, Russia, etc.

It’s quite a price to pay for society.


The 2A and crypto have been linked for a long time. The original crypto fight in the 90s was over exportable encryption (because it was listed along with munitions and other ITAR-restricted items).

One idea I saw floated around the time of the San Bernardino iPhone was arguing that encryption is protected by the second amendment, as a means of not having to fight against backdoored encryption every few years. I don't have the case-law familiarity with the 2A & encryption to say if it would go well, but I did find it a fun idea.

Edit: I should add that this would be in addition to the clear 1A protection that code, and therefore encryption, has.


The 2nd amendment was to defend against Indians.


Bad guys will certainly know that a given phone includes backdoor access, and they’ll figure out devices that do not.

Law enforcement will end up spying on regular citizens and catching small fish, while big fish and bad hackers laugh and watch.


> they’ll figure out devices that do not

Just owning one of them could then become probable cause for a search of everything else you have, and then you'll be held in contempt and jailed until you hand over the password.

Or it could just be made illegal to own one at all. Anyone with a secure smartphone could probably be tracked just by traces it leaves when it connects to a cellular network.


> > they’ll figure out devices that do not

> Just owning one of them could then become probable cause for a search of everything else you have, and then you'll be held in contempt and jailed until you hand over the password.

Could you explain your reasoning? Just owning a safe in my trunk doesn't give people probable cause to search my trunk, so it's not clear how you came to that conclusion.


Check this out: https://www.nytimes.com/2019/03/30/us/politics/dea-money-cou...

DEA was looking though records of who bought money counters.


One cheap phone per message and criminals are back. But honest people are now under constant mass surveillance AND the agency only has metadata anyway, because the message was encrypted.


They will also catch political trends, which I suspect is the whole point.


I liked this comment from below the article.

> I prefer to live in a free but insecure world than in a perfectly safe but not free world.

Which I interpret to mean, we're free to use whatever protections we see fit. But institutionally we can't promise security, since bad actors will always exist.


I don't like it because it suggests that freedom and security are somehow at odds with each other, which is already an authoritarian framing of the situation, and really just nonsense. I want to live in a perfectly secure world. I don't think that that is actually achievable, but there is absolutely nothing wrong with that goal.

But: There is no such thing as "security" in and of itself; security is always relative to something that you value and try to protect. So, if you value freedom, you can not achieve security through limiting freedom, because that would mean destroying what you value, supposedly in order to protect it ... but that would obviously be a total failure at achieving that goal.

The point is: Part of living in a world that is as close as possible to perfectly secure is that you have to mitigate the risk of concentration of power in the hands of authoritarians and corrupt people. Limiting the power of the state is a security mechanism, and authoritarians who want to obtain more and more power by calling their power "security" are simply lying.

If you accept that authoritarians dismantling security mechanisms is somehow an increase in security, you have already fallen for their propaganda.


> I don't like it because it suggests that freedom and security are somehow at odds with each other, which is already an authoritarian framing of the situation, and really just nonsense.

Agreed. I think part of the issue stems from how people define freedom, seemingly thinking that freedom means "no rules".

For example: If having rules is intrinsically against freedom, then why would anyone who desires freedom play sports, where rules define the game. If you eliminate the rules, you eliminate the game and your freedom to actually be able to play it.


Freedom means you get to pick the rules; they can't be imposed on you by others. The thing is, you can't pick one set of rules for yourself and a different set for everyone else. Whatever rules you choose to live by must apply equally to everyone. Don't like private property? Fine, but you can't object when others retaliate by seizing the fruits of your labor. Think kidnapping for ransom is harmless fun? Locking you up in prison is essentially the same thing.

The problem is when certain people want others to live by their rules and are willing to apply disproportionate force to get their way. Capital punishment for theft, fines and imprisonment for copyright infringement, penalties for refusing to aid an official investigation, etc.


Freedom and security are at odds with each other, though.

If you’re free to own guns, you’re not free from someone else owning guns and shooting you with one.

If you’re free to drive a car, you’re not free from someone else who drives a car not running into you and killing you.

Sure, they will be punished for it, but the harm to you has already been done. You are not safe from it. It is an unlikely, but possible danger.

The only protection from guns and cars is the universal disallowance of guns and cars and the immediate catching and banning of all people that begin the process of creating or thinking about guns or cars.

This example can be spread to almost anything that has any potential of harm at all.


> Freedom and security are at odds with each other, though.

No, they are not.

> If you’re free to own guns, you’re not free from someone else owning guns and shooting you with one.

Or in other words: Different freedoms are at odds with each other.

> If you’re free to drive a car, you’re not free from someone else who drives a car not running into you and killing you.

Or in other words: Different freedoms are at odds with each other.

You might as well be saying that if you are not free to own guns, you are not secure from having your guns taken away. None of that is fundamentally about security vs. freedom, it is only about conflicts between different freedoms that have to be weighed against each other. Arbitrarily labeling one of those freedoms as "security" is a lie.

> The only protection from guns and cars is the universal disallowance of guns and cars and the immediate catching and banning of all people that begin the process of creating or thinking about guns or cars.

No, it's not. The only protection from guns and cars is to have everyone agree that owning guns is bad, so no one does, or that owning cars is bad, so no one does, or whatever. The moment you suggest "catching and banning", you are talking about giving some people guns so that they can use them to force others to get rid of their guns, and that is the moment where everyone is at risk of being shot at using one of those guns, be it by mistake, due to corruption, or whatever the reason might be, so obviously you are not "protected from guns". That is exactly the authoritarian propaganda lie that I was talking about.

There is nothing inherently secure about giving some group of people power, no matter for what purpose you do it. Giving people power is a danger. It's a danger that may be well-justified due to the other dangers that you might be able to control this way, but it is always a danger. It is always about weighing one danger against another, about weighing one freedom against another--framing it as "security vs. freedom" is an authoritarian propaganda lie that tries to convince you that one of those dangers isn't a danger by mislabeling it as "security".


> I prefer to live in a free but insecure world than in a perfectly safe but not free world.

People on Hacker News will upvote that when it is about encryption, but downvote it when it is about financial markets (e.g. the SEC). Just an observation.


Mandatory government access to all communications (including saved encrypted communications) is not compatible with systems where the government or control of the government changes.

The thing that I don't really see discussed in this is the question of who do you trust to have the keys?

Personally I can't imagine entrusting decryption keys to anyone appointed by or simply hired by the Trump Administration given its history of choosing people for sensitive positions. I guarantee that there are a ton of people who think that Hillary should be locked up that would feel exactly the same way about security keys under the control of any Democratic Administration. What's more, unless the policy becomes that all communication must go to the United States government to then be retransmitted onto a final destination then security keys would be vulnerable to disclosure by anyone from a former Administration who had access to them, and it only takes a limited number of compromised or dishonest individuals to compromise the entire system.

Don't discuss whether people are comfortable with the FBI or Department of Justice or William Barr having the authority to get to all of their communications. Discuss whether they'd be fine with Barack Obama or Hillary Clinton or Eric Holder or whoever ends up being the Democratic nominee having that access.

Edit: moved last paragraph to first.


I for one look forward to our federal overlords thinking they have any clue what kinds of transmissions deserve security /s.

Seriously though, the legal tradition just hasn't caught up to the idea that somebody can possess information outside their person that can't be forced into the eye of the law.

I've thought about this a lot and my best conclusion so far is "give up the encryption keys in exchange for immunity against legal cases that aren't yet open that the secured data might reveal, or you're guilty by default". Seems like the kind of thing prosecutors might think twice about, and it gives them a trump card in extraordinary circumstances. I don't love it, but all the other ideas I can come up with leave one party so heavily favored that either freedom dies or the government won't stop whining.


Subpoena individuals and do investigative work. Identified individuals can choose to comply and reveal their digital artifacts that are specifically identified as being part of a crime or be held in contempt of the court order. If they don't use locks or use easily bypassable biometrics then the court order already allows latitude for that.

Just give up on being able to image and examine people's encrypted artifacts, and just give up on trying to tap a firehose of data from the providers now that they have an interest in encrypting what they store.

The state had a good 30 years of unprotected digital artifacts being available, and now it is just going back to the heuristic analysis of the days before, a level of accepted intrusion that government is built around.


> Identified individuals can choose to comply and reveal their digital artifacts that are specifically identified as being part of a crime or be held in contempt of the court order.

Abuse of "contempt of court" to compel third parties not accused of any other crime to assist in an investigation is also a problem. The court can ask for the data, of course, but one ought to have the legal right to refuse; refusal to provide the requested data, on its own, should not be considered evidence of guilt or probable cause to conduct a search.


Agreed. The point is that is a separate issue and the tools for investigation are already here.


To empathize with Barr, I can see how he doesn’t draw much distinction between his experience at Verizon and the question of encryption.

He experienced the pain and consternation of regulation, and now feels erudite calling BS on “his old industry” as the defender of All Things Right.

But of course, when Hacker News is called upon to “nerd harder” we always rise to the challenge. Except when the difference is, in fact, fundamental.

Back doors have a way of becoming front doors for the wrong people. And absolute power corrupts absolutely.

All of that was true before. It’s just that now getting the wrong set of keys could open a billion locks at once.


99% is good enough?

If 1% of all secure public data transmissions were compromised there would practically be no point in trying to secure anything.

And who is Barr to decide what's worth encrypting on a societal level? I'll put my photo album through 5 different algorithms if I want to. Such is my right.

Go ahead Barr, make laws for corporations to follow and watch as all that data you're trying to capture disintegrates from the networks where you gave yourself access and reappears on even more secure, smaller-scale distributed systems.

What will happen is, instead of Google hosting your data for you, they'll just create devices that let you store your data at home, offline, while only beaming back the telemetry and metadata.


There is also a false dichotomy based on the idea that law enforcement are the “good guys”. In the Bay Area, traffic officers routinely get caught blackmailing women. Federal prosecutors routinely use “parallel construction”, where they keep the real evidence secret and just lie to courts about the facts. US government computers are routinely breached by government employees, contractors, and foreign governments, and the results are then used to blackmail our politicians, influence elections, etc.

Barr should know about that last one, since he’s in the middle of the current impeachment proceedings, and Trump has also been routinely calling for the prosecution of his political opponents for doing some of the things I just listed.

It is clear that even Barr is uncomfortable with giving these surveillance powers to the executive branch while a Democrat is serving as president.


>If one already has an effective level of security say, by way of illustration, one that protects against 99 percent of foreseeable threats -- is it reasonable to incur massive further costs to move slightly closer to optimality and attain a 99.5 percent level of protection?

Except that's not the case. Encryption is mathematically proven to be 100% secure. Assuming the application was coded / implemented correctly, the only thing capable of unlocking encrypted secrets is the secret key.

AG Barr wants to move it from 100% secure to "99.5% secure". Not the other way around.


> Encryption is mathematically proven to be 100% secure

Well... no. We're pretty sure that factoring the product of two large primes is intractable, but nobody has actually proven it. And we know it's not that hard for a quantum computer. Modern cryptography is certainly not 100% secure for all time.
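As a toy illustration of why this is an assumption rather than a theorem, here is a minimal RSA-style sketch in Python with deliberately tiny primes (purely illustrative, not real parameters): the private key falls out the moment the modulus is factored, and the only thing protecting real 2048-bit keys is that no fast classical factoring method is publicly known.

    # Toy RSA with laughably small primes (assumed purely for illustration):
    # the entire private key falls out the moment you can factor n = p*q.
    p, q, e = 1009, 1013, 65537
    n, phi = p * q, (p - 1) * (q - 1)
    d = pow(e, -1, phi)              # private exponent (Python 3.8+)
    ct = pow(42, e, n)               # "encrypt" the message 42

    # Attacker: factor n by trial division (trivial here, infeasible at 2048 bits
    # as far as anyone publicly knows -- an assumption, not a theorem).
    f = next(x for x in range(2, n) if n % x == 0)
    d_recovered = pow(e, -1, (f - 1) * (n // f - 1))
    assert pow(ct, d_recovered, n) == 42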


> mathematically proven to be 100% secure

We still don't know how to prove the difficulty of any computing problems which the security of conventional cryptosystems rely on. https://en.wikipedia.org/wiki/Computational_hardness_assumpt...

Some systems, like the one-time pad and quantum cryptography, have security which doesn't rely on computational hardness, however.
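For anyone who wants the one-time pad point made concrete, here is a minimal Python sketch (a toy, assuming a truly random, secret, never-reused pad at least as long as the message):

    # Minimal one-time pad sketch: secure against unlimited computation *if* the
    # pad is truly random, as long as the message, kept secret, and never reused.
    import secrets

    def otp_encrypt(message: bytes) -> tuple[bytes, bytes]:
        pad = secrets.token_bytes(len(message))       # fresh random pad per message
        return pad, bytes(m ^ p for m, p in zip(message, pad))

    def otp_decrypt(pad: bytes, ciphertext: bytes) -> bytes:
        return bytes(c ^ p for c, p in zip(ciphertext, pad))

    pad, ct = otp_encrypt(b"attack at dawn")
    assert otp_decrypt(pad, ct) == b"attack at dawn"
    # Without the pad, every equal-length plaintext is equally consistent with ct,
    # so no amount of computing power helps: the security is information-theoretic.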


IMO, this almost-but-not-quite-100% attitude is still the right frame of mind. The remaining percentage represents exfiltration by means of hacking the device through its various vulnerabilities. Sometimes that exfiltration takes the form of encryption keys, and sometimes it's the user data that the attacker was after in the first place.


>>Yes, adding a backdoor increases our collective security because it allows law enforcement to eavesdrop on the bad guys. But adding that backdoor also decreases our collective security because the bad guys can eavesdrop on everyone.

And the bad guys can be the state. People need to acknowledge the phenomenon of systemic failure. People at large can know that the system is deeply flawed, yet not be able to fix it, because the problem is too complex to fix.

One could argue that situations where members of Congress see their compensation increase by an average of 1,800 percent when they leave office and become lobbyists, where fewer than 3 teachers in an entire state are fired in a year due to sweetheart collective bargaining agreements, where police unions protect officers from facing disciplinary action for misconduct, where the wage gap between federal employees and private sector workers has increased significantly since 1950, where more than half of the 20 wealthiest counties in the US are suburbs of Washington DC, where intelligence agents are known to use the state's surveillance apparatus to spy on lovers and exes through a practice coined LOVEINT, where there are approximately 1 million federal regulations which carry potential criminal penalties for breaching, and where the prison population is 1 percent of the total population, are all examples of systemic failure, and the frequency of such failures requires us to check government power by enshrining legal principles like the right to privacy.

Information is power and centralizing information in the hands of a small government elite through mass surveillance leads to power asymmetry that is dangerous to society.


Keep in mind, while you cheer on ending the tech industry's free rein and bringing in adult supervision and regulation, that this is what it means.


A shallow sentiment. Every policy should be weighed on its merits. This would suck, plenty of other policies would be fine.


As stated, this is simply fallacious - "Supporting some regulation means you must support all regulation."


Can someone help me understand why technologists and entrepreneurs on HN cheer on "ending the tech industry's free rein", when they are part of the very same tech industry?


Because those closest to a thing can see the bad things it does. If you then conclude that doing the bad thing is game-theoretically rational, then regulation is needed. Hence, you cheer on adding regulation.


I have heard several stories about industry groups preferring government regulation to the status quo because game theory made it impossible to self-regulate effectively.

It makes sense to me, no agreement will hold if any one person can violate it with nothing to stop them.


One of the problems with the regulation debate framed in terms of "more vs. less" is that it's easy to assume increased regulation will fix what you consider bad and preserve what you consider good, or that deregulation will only cut pointless bureaucracy.

It's distressing that the support is for more regulation in general (which can be easily coopted to support stuff like this), rather than specific policies or outcomes.


It's a response to the conservative voice saying 'deregulate everything'. You first need to argue for regulation per se before you can argue for specific regulations. The nuanced position of 'regulation isn't inherently good, but this specific policy is' is blown away in the public debate by 'REGULATION BAD'.


Contrarian voices are amplified. No one writes, or at least upvotes articles saying "everything is pretty ok."


Well that and the fact that everything is not OK. But hey, maybe I'm just a contrarian. /shrug


Whether you're right or wrong, status quo still isn't newsworthy.


I think a lot of the entrepreneurs here are having difficulty in competing on the open market with the FAANGs and not-quite-as-big-as-FAANGs-but-still-big companies. They don't really want to end the tech industry's free reign. They just want to see themselves replacing the current top dogs.


Just as the government has been pretending that backdoors are possible without a loss of privacy, the tech community has been pretending that backdoors are indistinguishable from a complete loss of the benefits of encryption.

Don't get me wrong: I'm totally against backdoors, and I do consider them a significant weakening of any encryption.

But it's just disingenuous to insist that, for example, a system with the same sort of requirements as exist for search warrants is entirely equivalent to a system without those. Or that one-out-of-two encryption schemes do not exist. Or that it is totally impossible for an agency to keep a keyfile secure.

As but one example for the last: SSL certificate authorities are already entrusted with keys whose loss has just about the same sort of security implications as breaking messaging encryption might have. And that system is working somewhat decently, including in cases where those keys were breached and certificates had to be revoked.


> SSL certificate authorities are already entrusted with keys whose loss has just about the same sort of security implications as breaking messaging encryption might have.

A leak of the private key for a SSL certificate authority (or even for the server itself with modern TLS) doesn't allow decrypting past messages, while these backdoor proposals aim to allow precisely that.
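A minimal sketch of that property, using ephemeral X25519 key agreement via the third-party `cryptography` package (the helper name below is ours, for illustration only, not any particular protocol's API):

    # Sketch of forward secrecy via ephemeral X25519 key agreement. Uses the
    # third-party `cryptography` package (pip install cryptography).
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    def new_session_key() -> bytes:
        # Each side makes a throwaway key pair, they exchange public halves,
        # derive a shared session key, then discard the private halves.
        client_eph = X25519PrivateKey.generate()
        server_eph = X25519PrivateKey.generate()
        shared = client_eph.exchange(server_eph.public_key())
        return HKDF(algorithm=hashes.SHA256(), length=32,
                    salt=None, info=b"session").derive(shared)

    assert new_session_key() != new_session_key()   # unrelated key per session
    # Compromising the server's long-term certificate key later reveals none of
    # these session keys, because the ephemeral secrets that produced them are gone.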


Relevant.

* Honest Government Ad | Anti Encryption Law - YouTube || https://www.youtube.com/watch?v=eW-OMR-iWOE

Also worth noting that Australia already went ahead with their anti-encryption laws. Feels inevitable that we'll lose encryption to the nanny state. Really sucks, but I haven't got a clue how to convince the grandmothers out there why the cops shouldn't be trusted here. It's so frustrating.

* Government Surveillance: Last Week Tonight with John Oliver (HBO) - YouTube || https://www.youtube.com/watch?v=XEVlyP4_11M

I think it'll be one of those things that society won't wake up to needing until it's long gone. Depressing as fuck.


I feel like it is inevitable that any back door you legislate ... will get more use by the enemies of that given state than by the state itself.

Any state with strong individual rights and courts will be limited in their usage to some extent. Outside actors, unlimited.


I guess it should be illegal for Ransomware providers to generate encryption without government backdoors too, eh?

On a side note, with the new Australian laws, is it now illegal to use tools like LUKS and Veracrypt there?


“Yes, adding a backdoor increases our collective security because it allows law enforcement to eavesdrop on the bad guys.” No, it does not. By definition, if the backdoor exists, the bad guys won’t use it. This is clearly an excuse for monitoring all people, not just the bad guys, and has little to do with security and everything to do with control. Please stop accepting the argument that collective security is improved with eavesdropping; it really, really isn’t.


Why would anyone listen to Attorney General Barr regarding information security issues instead of actual experts?

> This is exactly the policy debate we should be having

Why is this debate needed? It's beating the same dead horse, since this debate already happened in the past. It was made clear that backdoors are not an option.


Because AG Barr is the expert on law enforcement capability, which is being weighed against security. Nobody including Barr himself denies that his proposal would weaken security.


I meant in the context of "it's worthwhile to weaken information security". This debate is over and he is just beating the dead horse.


DES was built to be brute forced by govt computers.

Don't trust any cryptographic standard put forth by the NSA, ever. They have always been about backdoors.


Source? Wikipedia says exactly the opposite: https://en.m.wikipedia.org/wiki/Data_Encryption_Standard

NSA made DES more resistant.


No they didn't.

They reduced the key space down to 56 bits, which was small enough for government computers to brute-force even decades ago. The original proposal by IBM was for 128 bits, IIRC.

https://en.m.wikipedia.org/wiki/56-bit_encryption

It is interesting that the Wikipedia page you linked claims the contrary, and it makes me suspect tampering.

Because this story has been told over and over again, even in textbooks. See Springer's Understanding Cryptography chapter on DES for instance.

EDIT: In fact, the Wikipedia page you linked states

> DES, as stated above, is insecure. This is mainly due to the 56-bit key size being too small.

So where is this nonsense about the NSA making DES more secure?


http://swarm.cs.pub.ro/~mbarbulescu/cripto/Understanding%20C...

The original cipher proposed by IBM had a key length of 128 bits and it is suspicious that it was reduced to 56 bits.

The official statement that a cipher with a shorter key length made it easier to implement the DES algorithm on a single chip in 1974 does not sound too convincing


From https://www.schneier.com/blog/archives/2004/10/the_legacy_of...

"It took the academic community two decades to figure out that the NSA "tweaks" actually improved the security of DES. This means that back in the '70s, the National Security Agency was two decades ahead of the state of the art."


Talk about cherry picking info. From TFA, if you had bothered to read it:

> By the mid-1990s, it became widely believed that the NSA was able to break DES by trying every possible key. This ability was demonstrated in 1998, when a $220,000 machine was built that could brute-force a DES key in a few days


Two NSA issues are getting conflated:

(1) Keeping the key size within brute-forcing range. Agreed, they did that and surely wittingly. By the mid-90s nobody doubted they had the computational strength for it; the 1998 demonstration was that any modestly-funded organisation could too.

(2) Improving the resistance to differential analysis by changing the S-boxes. This is what the Schneier quote is about: the NSA making a change and refusing to disclose the basis, causing suspicion that this installed a backdoor for them, whereas in fact it was (as far as any public cryptographer has disclosed) strengthening it against a cryptanalytic technique that the NSA didn't want to reveal.
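Rough numbers for point (1), assuming a throughput somewhere in the ballpark of the 1998 EFF machine (the exact figure below is an assumption for illustration, not a measured benchmark):

    # Back-of-the-envelope arithmetic for point (1). The keys-per-second figure
    # is a rough assumption in the ballpark of the 1998 EFF "Deep Crack" machine.
    KEYS_PER_SECOND = 1e11
    SECONDS_PER_YEAR = 3600 * 24 * 365

    for bits in (56, 64, 128):
        years = 2 ** bits / KEYS_PER_SECOND / SECONDS_PER_YEAR
        print(f"{bits}-bit keyspace: ~{years:.1e} years of exhaustive search")

    # 56 bits: on the order of a week, matching the "few days" demo quoted above.
    # 128 bits: ~1e20 years, far beyond any exhaustive search.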


Perhaps OP was thinking of the criticisms around Dual-EC-DRBG?

https://en.wikipedia.org/wiki/Dual_EC_DRBG


That is another example, yes.

But the NSA purposely dictated that DES use 56-bit keys (a strange choice of key length to begin with), so that brute-force attacks using government mainframes would be possible. The original proposal was for 128 bits, IIRC.

This was way back before many people studied cryptanalysis, which is why no one cared at the time.


Let's not pretend that Law Enforcement (aka The Government) is asking for these powers in a vacuum. Every time a terrorist event happens, or pedophile ring is broken up the public asks why it was not prevented or stopped sooner.

The argument of data security vs physical security needs to take place with your neighbors and not just your Senators.

Don't get too mad when Law Enforcement does what your neighbors ask.


I see this trope a lot around these kinds of discussions, but I don't really see it playing out much when such events actually happen. Sure, it happens to some extent, but I wonder whether the claim that large numbers of people demand more and more surveillance whenever a major crime happens is hugely exaggerated, and pushed by the very people trying to push more surveillance...


The thing I have never heard anyone talk about is what the repercussions should be for those asking for a back door when that door is used by malicious actors. We have seen companies like Wells Fargo commit company-wide bank fraud... no one goes to jail. We have seen companies like Equifax lose the personal and financial information of millions of people... no one goes to jail.

When Barr and Trump push through legislation that requires back doors in all our security, who is going to do jail time when all our personal information is leaked again? I try to put the least amount of info out there, but when I do, I use crypto that I know works, where I know who designed it and how, and who audited it and how. When the government comes in and some intern loses a USB stick with keys to the back doors, I wanna know whose head is gonna roll for it.


I see this as fundamentally un-American. To quote Benjamin Franklin

*Those who would give up essential liberty to purchase a little temporary safety, deserve neither liberty nor safety.*

Additionally I think there is a fundamental misunderstanding, or lack of understanding, of Pareto[0] and of basic statistics. At this point we are scraping the bottom of the barrel for safety. We are in the safest time in world history, and one of the safest times in American history too^, certainly within the last couple of decades. Yet for some reason we are treating issues of safety as if they are worse than in the 90's. And if the best we can say is that a system is 99% secure, then we're pretty much screwed: 1% of attacks/opponents/people being protected (whatever that measurement of "threats" means) sounds low but isn't. If it's attacks, well, give it a few minutes (that'd be consistent with previous back-door implementations). If it's opponents, then really anyone we actually care about is going to have access (there are far more than 100 countries with computers). If it's people being protected, well, there are 350m Americans, which leaves 3.5m of them vulnerable. Any of these cases is unacceptable, and I'd argue would actually set us back.

The other thing is that even implementing mass surveillance wouldn't help us; in many ways it has more potential to harm us. Like many in the thread have said, it isn't just personal privacy at stake. Politicians, high-profile business people, etc. can easily be blackmailed. It doesn't even have to be some kink (as others have suggested); just something like sending nudes to a partner, or a charged joke taken out of context (how many of you have dark humor or use jokes to illustrate a point?). Nation states will definitely gain access to these backdoors, and it's highly likely hackers will as well. Additionally, everyone does have something to hide: banking passwords, sensitive information, personal thoughts and feelings^^.

So we are going to give up a fair amount of liberty for a minute amount of security? (Possibly negative security!) This does not sound like a good deal for anyone involved. I don't think it even helps law enforcement; they already can't handle the information they have. We've seen, with data they already have or could easily obtain, that things become obvious only post hoc (like someone on 4chan saying they are going to shoot up a school).

How does this help us as the American people? That needs to be honestly answered. Otherwise, all I see is a ploy built on fear, and overreach. We used to fear Big Brother. I'm not sure why or how we have come to embrace him.

[0] https://en.m.wikipedia.org/wiki/Pareto_principle

^ I do not believe the issues at hand would be solved in any way by monitoring, because monitoring does not fix the root causes, which are clearly solvable.

^^ not being able to share these will only increase our problems.


I wanted to write a post about how the encryption would interact with free speech, especially given the leaked memo to censor the Internet.

Let me start by stating my views on free speech and rights in general, and then how they are shaped by these events. I think that human rights and freedoms are just that: personal freedoms. Freedom of religion is about personal religious observance without harming others. These freedoms philosophically should not mean entitlement to unlimited exercise thereof. The right to bear arms doesn't mean you should be able to stockpile unlimited amounts of ammunition and incendiary devices, etc.

Similarly, FREEDOM of speech to me is a PERSONAL human freedom. You can say what you want, and not be punished by the government for it. You can say it in a car, you can say it in a bar, you can say it very far, you can wish upon a star. But there are limits to how many people can hear you. Maybe 10 or 100 people at an event.

Once you get into situations where 5,000,000 people can hear a tweet, that’s clearly not about FREEDOM of speech in its strict sense. It is about entitlement to use a PLATFORM, maintained by an ORGANIZATION that involves many people, to broadcast arbitrary, unfiltered one-to-many messages to everyone.

I think this latter thing is toxic, in both directions. Society listening to the tweets of celebrities cheapens public discussion and civic thought. And being reachable by the whole world via email (rather than through networks of shared invites/capabilities) leads to constant spam and paparazzi for celebrities. What happened here is an ORGANIZATION put on a show or movie and catapulted this celebrity into the limelight, and carefully maintains their stature, along with their own publicists, social media team on twitter, etc. This is the society we live in, where we have heroes. But entitlement to unlimited unfiltered megaphones is NOT the same as freedom of speech, any more than being the leader of a paramilitary group of unlimited size is the same as the right to bear arms. So, freedoms and rights have limits. Where those limits lie is the heap paradox: as you take away grains, when is a heap no longer a heap? So what is the alternative to this type of misnamed "free speech", aka megaphones run by organizations, super PACs, mainstream media, and so on? It is COLLABORATION.

  Look at Wikipedia.
  Look at peer reviewed journals and science.
  Look at large open source projects
There, individual contributions are filtered and often butt up against changes, revisions, etc. The result is that when the general public sees something, it is the result of a collaborative process of filtering and refining the presentation of information, citing sources, etc. There are no heroes on wikipedia, and only a few in science and open source. Most contributions are filtered by a community of experts, not state governments or platforms employing boiler rooms of low paid workers to determine what’s true. I would like to see more of that COLLABORATION and less of COMPETITION.

I would like to see a patentleft movement in drug research, instead of big pharma. I would like to see news reported like Wikipedia with footage submitted by everyday people on the ground instead of “intrepid reporters in a warzone”. CNN used to have a motto that they have “no celebrities”. News agencies tried to stay lukewarm and neutral. FOX News changed the game, lots of people copied the model. The Internet eliminated newspapers and classifieds. News had to adapt because capitalism and cutthroat competition for the same ad dollars means MORE clickbait and MORE lockin to one type of audience. For-profit Social networks further use this content to herd us into echo chambers of outrage, because that’s what drives the most engagement, which the social networks need to monetize. They send notifications in an increasingly desperate attempt to grab your attention in a tragedy of the commons where the commons is our attention.

This has had a corrosive effect on society. The capitalist (competition based) news has made us more polarized and outraged, while the capitalist (competition based) social networks have made us more addicted to our notification slot machine, with smaller attention spans and self control, responding to that stranger on the internet over that latest outrage.

THIS is the culture that leads to more mass shootings. The fact that we have giant platforms instead of peer-to-peer is another problem. When extremist people are banned from the big platforms, a platform can pop up that attracts the worst extremists and feeds them. Such a platform should ABSOLUTELY be a honeypot for the FBI to watch these people. In our world of centralized platforms, platforms like this should be RUN by the FBI. Instead, our government takes the wrong approach. They shut down the Craigslist and Backpage hookers sections instead of using them to entrap and catch traffickers. Then they threaten large platforms with SESTA (2018) when they should be the ones out there catching these people. The platforms should be honeypots!

Anyway. So although I feel my stance is correct, and beneficial to society, there are three practical problems with it:

1. The First Amendment is not interpreted the way I interpret it. In fact, Citizens United even allowed our politics to be run by PACs with huge money and megaphones (although nonprofits could always have done that). So legally, my literal understanding of the limits of freedoms does not match the traditional ones (slander, yelling fire, etc.).

2. This may be the more serious one. As we get more end-to-end encryption and better personal technology, all well-meaning ideas about the limits of freedom of speech and arms melt away. Imagine Alex Jones on SAFE Network with 1,000,000 people subscribed to his encrypted feed. Or imagine 3D-printed guns from illegally shared 3D models, stored in 10% of the homes in NYC. You can't stop people from using a Turing-complete language to turn out banned material.

3. Even with numerical limits on each person's audience, a hateful message can attract people who make plans to use technology to asymmetrically perpetrate criminal acts. And end-to-end encryption means we won't know what they're saying. However, I believe that if we took freedoms in the way I defined them, and moved to collaborative platforms instead of competitive ones, our society's health would measurably improve.


> Nor are we necessarily talking about the customized encryption used by large business enterprises to protect their operations. We are talking about consumer products and services such as messaging, smart phones, e-mail, and voice and data applications.

This very American attitude of separating "business enterprises" from "consumers", which to me sounds like separating the noble and important from peasants or cattle, is utterly sickening. I am not a "consumer", I am a person and I deserve and demand more privacy and freedom than a corporation.

We are not consumers, we are people.


Please don't post unsubstantive comments to HN, and especially not indignant riler-uppers. Those lead to noticeably worse discussion, and we're trying to drive in the other direction.

https://news.ycombinator.com/newsguidelines.html


Noted, but in what way was this not substantive? I'm expressing my opinion on a particular subpoint which is quite relevant to this discussion. If Barr was not pushing for an artificial separation between corporations and consumers, there wouldn't be any discussion to have, since taking encryption away from corporations is a non-starter.

Just as a disclaimer, I've read the guidelines and know them well.


If I take out the nationalistic slight ("This very American attitude"), the sarcastic rhetoric ("separating the noble and important from peasants or cattle"), the denunciatory venting ("utterly sickening") and megaphone language ("demand"), it's not clear to me what information is left. What is the comment really saying? Something about how businesses aren't people? To me it reads like an angry reaction to some shallowly triggering phrasing in the article. Angry reactivity is the opposite of thoughtful reflection, which is what we're hoping for here.

It's also off topic. Whimsical off-topic digressions can be interesting, but generic rhetorical ones are never interesting. Those discussions have been repeated countless times already, thus are predictable, thus are tedious, so we ask people to avoid them. The more generic a subject, the more shallow its discussion—and when it's angry as well, that's much worse. Angry plus shallow equals riler-upper, which is close to flamebait.


You're right, I was a bit angry when I read this. It's rather hard not to get angry given the topic. I agree I could have phrased it more tactfully.

I understood your point and I'm not trying to prolong the discussion. However, I consider the implication that my comment was entirely devoid of a point a bit unfair, so I'll try to rephrase.

I think there is no good argument to be made for stripping away the privacy of citizens, with the implication that it's okay since they are somehow less important than businesses. The fact that this is now getting somewhat regularly proposed is scary and a danger to liberty. To me, Barr's statement reads as a long-winded way of saying "it's not too bad if we punch holes in your encryption because your systems weren't secure to begin with and you're also not that important since you are just consumers". My point was to call this out explicitly and try to invalidate it as an argument for breaking encryption.


If you had posted that last paragraph originally it would have been a fine way to make your point. That's the basic idea here.


You can mentally tone down the rhetoric and easily see the point. Corporations do not deserve more rights than people.


From your comment I gather that you can, but that's not true of all readers, and in any case the use of fiery rhetoric to dress up a point like that is flamebait and so against the site guidelines.

https://news.ycombinator.com/newsguidelines.html


It may not be your favorite comment, and it may have been slightly inflammatory, but I disagree wholeheartedly that it's unsubstantive. We should be concerned about this sort of rhetoric passing uncriticized.


Ok, it's against the site guidelines to post that way, i.e. with such a high ratio of indignation to information. We can have different ideas about what's substantive but that comment was flamebait in the sense the guidelines ask people not to post to HN.

https://news.ycombinator.com/newsguidelines.html


Don't you know the Fourth Amendment?:

>The right of large business enterprises to be secure in their employees, locations, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the large business enterprises or things to be seized.


They broke the 4th Amendment into 300 million pieces, lied about it to congress, got caught by Snowden, got locked out by the nerds, and now they don't merely want back in, they want a key to the back door for the express purpose of continuing to break the rules they swore to uphold! The mind boggles.

Who do I (continue to) donate to in order to express my opinion that this is inexcusable? EFF?


The dialogue has been dual monologues for so long that buying more strongly-worded letters is unlikely to help anyone but the ink-sellers.

Donate directly to the open-ecosystem cryptographers, and seek a technical solution whenever a political solution is impossible.


EFF, Demand Progress, and FFTF are great ones


And by extension the FSF and free software in general.


>We are not consumers, we are people.

I often base my vote, when all the candidates in a given election have similar policies, on how they refer to people.

Use of "Consumers" loses my vote instantly. "Taxpayers" is a yellow flag - if used in specific situations that involves concern over major public funds going to private businesses (like building a stadium for a pro sports team) I'll let it slide but otherwise it only raises suspicion in me. But if a politician consistently uses "citizen", I'll give them a little mental gold star.


Citizen sounds like a pretty awful word choice in a country with so many non-citizen immigrants, though (and non-immigrants, for that matter). It sounds like they're deliberately being excluded.

I used to hate the word consumer too (I don't love it now either) but honestly basing your vote on it seems like a really coarse way to distinguish between good and bad candidates...


>Citizen sounds like a pretty awful word choice in a country with so many non-citizen immigrants, though (and non-immigrants, for that matter). It sounds like they're deliberately being excluded.

I was referring to its use in the context of an election, which is often limited to the citizenry (in any democracy).

Also, where I live, public figures who use the word "citizen" to refer to ordinary people often don't use it in the literal sense of the word when discussing public-interest matters. They're often using it to emphasize the responsibilities and rights that all people have in relation to the broader society. In other words, where I live it's not often used in an exclusionary xenophobic way, but instead in an inclusive common-purpose way that emphasizes our shared humanity. The xenophobes here tend to prefer "taxpayer".


Some countries allow legal non-citizen residents to vote (NZ for example) - you know, that whole no taxation without representation thing.

In the US, some states or cities allow non-citizen legal residents to vote in some state and local elections (SF, for example, allows non-citizen residents to vote for school boards).


> The xenophobes here tend to prefer "taxpayer"

The irony, I guess, being that the xenophobes probably intend to raise a distinction better delineated by the word "citizen".


I think it's because those with political influence who espouse xenophobic views also tend to dislike the poor. "Taxpayer" as an identity has connotations of "only the landed gentry should have a voice in society's management", whereas "citizen" connotes equal rights and a voice in society regardless of income.


It's also incredibly outdated, as business enterprises today rely on the exact same computing and encryption technologies that consumers do. We all carry iPhones, use chat applications to have business conversations, etc., and of course TLS is TLS.

Heck, the President and many members of Congress, the military, intelligence services, etc. use an iPhone. Maybe let's think about that before we stick a backdoor in that piece of "consumer" technology.


> the President and many members of Congress, the military, intelligence services, etc. use an iPhone

But with extra software that isn't on the iPhones ordinary consumers get, to implement and enforce the government's security policies for its officials.


So we need backdoors to catch a subset of consumers who commit crime. Is there not a subset of corporations that also commit crimes? Are those crimes not important enough for us to investigate using the same tools?


The only way you (the person) will ever gain freedom from them (any large group organized well enough to consider your person worth exploiting for profit), is to keep making a new world for yourself, creatively, artistically, formatively and personally - i.e. become the non-consumer and build your own things.

Corporations are people too. Groups of people, ganged together and agreeing to do big and enormous things to other big, enormous groups of people, is how we get into the corporate nightmare.

The spirit of the individual is what needs to persist in all of this. If you want privacy, you have to be willing to gain it and maintain it yourself.

We are not consumers, we are people - but we can be consumed as well as any other resource. To stay ahead of that, produce!


I agree with this, but it is essential to realize that everyone (except perhaps a handful of hermits) is a consumer in most situations, even if they are a producer. There is no running away from being a consumer in modern society and this is not a bad thing: it means we are cooperating for mutual benefit, so that we don't have to build our own mud shacks and pottery.

However, when I read the word "consumer" in pronouncements like this, it seems like something else. The people using it usually don't mean it in a way that implies they are themselves consumers. They mean it in a class-forming way. The only way I would personally be able to refer to other people in this way would be to become the controller of a large global corporation and be willing to exploit and subjugate others for my personal gain. This is not something I would ever want to do.


Ironic considering corporations are people.


They've always been legal persons, otherwise they couldn't be on either end of a court case. They can even be charged criminally. That's what corporate personhood generally used to mean.


I think it's the use of the word "person" in any sense that is confusing and foments anger. Better to call them "legal entities" or something.

It's a very weird contortion of logic to say something like, "Only people can be the subject of a legal instrument; We need corporations to be the subject of a legal instrument; We must make corporations people."

I mean, we have control over the underlying premise here. We can change it. Maybe that makes some people less angry. Maybe that makes it easier to reason about corporate issues without accidental conflation of the subtly different aspects of "person".


I think it's a kind of path-dependency in how Anglo-American common law was agglomerated. It was easier to patch in corporations as "legal persons" than to create a parallel designation.


Easy is not always appropriate or best.

That patch in has had grave consequences.


Very importantly, it also means that a corporation's assets are separate from those of its owners, so if you own a piece of a company and it gets sued, you don't lose your own home and assets if the company loses.


Perhaps the hope is that corporations would behave more ethically if their owners were personally liable for them, for more than just their initial investment.

A liability shield is a fantastic invention for capital owners, because when things go pear-shaped, they can leave the rest of us holding the bag.

Most of us aren't capital owners, in any meaningful sense, though.


If only the super wealthy could create large companies... I'm not sure things would be a lot better. Indeed, I could imagine things being significantly worse.

This is an interesting book looking at how legal structures may have held back otherwise very advanced cultures in the middle east:

https://www.economist.com/business/2011/01/27/the-crescent-a...

It refers to this book:

https://press.princeton.edu/titles/9273.html


Retirement savings, IRAs, and pensions tend to make many people minor owners of capital.


Most Americans have less than a year's wages invested into capital, and won't be receiving a lick of pension upon retirement.

If you look at the developing world, the situation is even bleaker.

So, as far as ownership structures of capital go, for 95% of the population, it is a rounding error.

Also, the damages that gave rise to the liability don't just magically disappear if you fail to pass them on to a company's owners.

The harm that caused the liability has already been done, the shareholders have already been paid from the profits of it - but the victims aren't getting compensation for it. In any just world, these profits would be clawed back.


>Most Americans have less than a year's wages invested into capital,

That's still investment. Make up your mind on whether an investment implies criminal and financial liability or not.

>and won't be receiving a lick of pension upon retirement.

That's because pensions aren't very popular in the US. You should look into IRAs, 401(k)s, etc. to understand how the current working class has to prepare for retirement. Pensions (at least the ones that used to be popular in the US, offered by companies) are complete garbage for the very reason that they can be raided.

>The harm that caused the liability has already been done, the shareholders have already been paid from the profits of it

No, it's actually pretty rare for modern companies to immediately issue all profits to shareholders as dividends.

>In any just world, these profits would be clawed back.

Well the company gets hit with a huge fine and all of the shareholders (and bondholders in bankruptcy) lose their investment. It's very rare to find a company that did something illegal long enough to generate 100% dividends to cover the initial shareholder investment and then get caught and wiped out.


I think you are right - https://www.cnbc.com/2018/05/11/how-many-americans-have-no-r... - so, I guess I should have said "some people" have capital.


You don't find OP's wording ironic in light of that fact? I do.


Corporations are legal persons, which is a legal designation and a sort of fiction. Humans are natural persons, which is both a legal reality and an actual reality. If someone says "I'm a person, corporations aren't" they're definitely talking about natural personhood, in which case they're completely correct.

"Humans and corporations are both persons" is kind of facile- corporations don't have quite the same set of rights that humans do.


> This very American attitude of separating "business enterprises" from "consumers", which to me sounds like separating the noble and important from peasants or cattle, is utterly sickening

I'm kind of on the fence about this. My counterargument to your comment is that the iPhone, Google search, global shipping, oil production, etc. could not be created by individuals. These things exist because huge groups of humans organized into things called "businesses".

Protecting businesses is like protecting the economies of states. There are massive effects for lots of people if businesses are not treated well. So yes, businesses get special deals sometimes. Unfortunately, this is a slippery slope, and people often fall down it.

To me, the truth in your comment is that US culture tends to give credit to CEOs for the efforts of all the other people in the organization. And this can be easily conflated with all kinds of other issues like "fines are just things that are illegal for poor people".


My counterargument to that is that businesses do not need protection to function, certainly not as much as they are given. In fact, the amount of "protection" given to large businesses today is nothing but the result of a runaway process of allowing businesses to acquire too much power, which they then use to rig the system to acquire even more.

Businesses are a good and necessary concept, but that does not mean they are above the people. Businesses are servants of the people and of society.

To get back on the topic at hand, businesses should definitely not get special allowance to use some technology that "ordinary" people are outlawed from using. That is a path to disaster.


Based on your argument then, Linux is a business and not created by an open-source community.


More than 90% of contributions to Linux now are from developers sponsored by corporations: https://thenewstack.io/contributes-linux-kernel.


I actually appreciate Barr's point. Honestly, weighed against the cost (lives that could have been saved, abducted children, potential active shooters, terrorist threats), all surrendered for the benefit of me getting end-to-end encryption to ask my wife what kind of drink she wants me to buy from Starbucks, I, speaking only for myself, find lawful access to such personal messaging, given probable cause of a crime being committed and a warrant signed by a judge, tolerable and reasonable.

I also get the sentiment that people have more of a problem with the idea of government access to private communications than with weakening the encryption to allow it. I see a lot of the conversation being all about privacy, and not about encryption.

In discussion of the latter I don't have a lot to add, but on the former, the courts have decided many times over that law enforcement and the state can access information about you if they have probable cause that you're committing a crime. If you enjoy living in a society governed by laws and don't tend towards libertarian extremes of personal freedom, that is something worth accepting. Yet the highest-level discussion of whether we should or shouldn't do this doesn't even consider that side of the argument, despite it filling up 90% of the user-generated discussion about this topic whenever it comes up.


This perspective really bothers me; I'm sure I'm not the only one. It seems to presume that the current state of affairs, where the police lack the ability to access some of the evidence that is in principle available, is a novel situation demanding a novel response.

But, like, it’s not like the original ratifiers of the Bill of Rights didn’t understand what they were doing. They _knew_ that by outlawing common law enforcement practices like arbitrary searches and coerced confessions, they would be giving up some of their ability to “punish” certain types of “criminals”. They got that it meant they couldn’t listen in on a man’s conversation with God. That’s the deal they struck to try to form a country governed by laws, not by men.

It really seems like it’s not such a new situation, after all; we just have new values.


The Founding Fathers had no way of knowing the current state of affairs, technologically speaking.

Imagine billions of unbreakable safes containing secret messages to and from anyone globally, instantly transported to anywhere in the world.

Like, dude, anyone with the appropriate security clearance that is aware of the implications of the encryption status quo isn’t spending time posting here.


People being able to communicate securely while leaving no trace behind is not new.

Granted, it is new that we can communicate worldwide this way, but I don't see how that changes much. "The world," to an individual, has always been, and still mostly is, filled with people they could easily meet for a secret conversation.

Also, are not our minds like an unbreakable safe? Human minds have been around for a long time.


No wonder you're using a throwaway to post this


I can’t downvote you but your response doesn’t add any value to the conversation.


I was going to respond, but there is no point arguing with trolls posting pro-authoritarian opinions from a fake account because they're too scared to have their thoughts tied to their identity in any way


This response is exactly what I'm talking about. The conversation about allowing government access to private conversations is already addressed by court precedent.

The weakening of encryption, which may allow bad actors to exploit what the government has access to, is the actual discussion that lawmakers are considering.

People have a real hard time coming to terms with the powers that police have. Those powers are not in question, though. You can get mad about it, downvote me, but powerful law enforcement is a fact of modern life and it's not going away. In the US you'd need a constitutional amendment, no one is calling for one, and it's not an issue on any of the major parties' platforms.


>You can get mad about it, downvote me, but powerful law enforcement is a fact of modern life and it's not going away

You're right. The US, China, Russia, and many other authoritarian regimes and politicians are working on further militarizing law enforcement and chipping away at the protections of citizens, such as supporting backdoors in encryption. Not a world I want to live in


History teaches us that methods used for access with a warrant will be abused to get access without a warrant. It teaches us that warrants will be obtained fraudulently. It teaches us that law enforcement personnel are no more trustworthy than any other person.

The instinct of these organizations is to vacuum up every bit of communication that they can, and to figure out how to justify access to it later. In such a world there is no true privacy. The only method we have to combat this instinct is widely available and widely used strong encryption.


If we allow this, we hand control of all future politicians to whichever party is in power. If the administration in power does not like what you are saying, or the fact that you are running against them, then without encryption it would be trivial for them to leak to the world what type of pornography you like, or the fact that you are a closeted homosexual, etc. This becomes especially dangerous in countries where your sexual preferences can be a death sentence.

Imagine China's social credit score policy expanded in the US to factor in every text message you send or website you visit. It does not end well.


The techies who agree tend to frequent circles other than hacker news.


Such as?


As a security researcher, I would like to present a counterpoint to the general discussion here, unpopular though it may be: Access to the internet is a privilege, not a right. There is nothing in the US Constitution regarding the internet. An individual is not required to use the internet to communicate with another person; this is a matter of convenience, ergo a privilege, but not a right. Along with face-to-face meetings, letters, and phone calls, individuals are also, through the power of software, allowed to implement their own encryption over existing communication channels.

The internet has been a force for good, but also a force for evil, in this world. Rapid dissemination of personal opinion masquerading as fact has led to extremism and polarization across the globe; this is undeniable. Some degree of accountability needs to be introduced into the system for the internet to reach the next level of maturity. The government is allowed to access your telephone records. The corporation retains your records for a certain period, as mandated by law, and hands them over when a lawful request (warrant) is made. Full-disk encryption and end-to-end encryption make it impossible for the government to access those records even when a lawful request is made. Note that the Fourth Amendment prohibits unreasonable searches and seizures. That does not mean the individual is allowed to be impervious to searches and seizures. The reasonableness clause protects the interests of the state and allows courts to decide yea or nay on a case-by-case basis. That is the very intent and spirit of the law.

Currently technology, not law, is the gatekeeper, and technology is controlled by corporations. In a lawful society, this is untenable in the long term. If anything, it enables tyranny by corporations, since they are unelected and not accountable to the public, whereas elected governments, in fact, are. The history of the US is replete with cases where corporations grew too powerful and governments required new laws to counter the threat they presented to society.


> Access to the internet is a privilege, not a right. There is nothing in the US Constitution regarding the internet. An individual is not required to use the internet to communicate with another person, this is a matter of convenience, ergo privilege, but not a right.

I think the recent 8-0 ruling by the Supreme Court completely invalidates this statement, at least in the context of the government being able to block one's access to the internet in general.

> Justice Anthony Kennedy began by outlining what he described as a “fundamental principle of the First Amendment”: that everyone should “have access to places where they can speak and listen, and then, after reflection, speak and listen once more.” And even if once it may have been hard to determine which places are “the most important” “for the exchange of views,” Kennedy concluded, it isn’t hard now. Instead, he reasoned, it is “clear” that the Internet and, in particular, social media provide such opportunities, with “three times the population of North America” now using Facebook. Emphasizing that Packingham’s case “is one of the first this Court has taken to address the relationship between the First Amendment and the modern Internet,” Kennedy warned that the court should “exercise extreme caution before suggesting that the First Amendment provides scant protection for access” to ubiquitous social-networking sites like Facebook and Twitter.

Source: https://www.scotusblog.com/2017/06/opinion-analysis-court-in...


That's an excellent ruling, thank you for sharing it. It brings the internet closer to a "right", certainly, but it does not require the government to provide access to the internet.

And, it may not matter: When making its case, the Justice Department will disassociate the use of encryption technology from access, arguing that limiting encryption options on consumer devices does not limit participation in an online forum.


A lot of what you describe is only true/relevant to the US.

In other countries, access to the Internet is a right [1]. Of course, it depends on how one defines a "right". It's not a right in the same way that everyone has the right to food or clean water (even though we cannot provide even those consistently to everyone), but it is a right, and an increasingly important one.

The fact that many international companies are based in the US is indeed quite unfortunate, and I find it extremely appalling that they can just hand over my data to anyone asking, especially if that "anyone" can also define what is "lawful".

Things should be end-to-end encrypted for everyone, with absolutely no back doors or any kind of weakening of the encryption.

The argument that "this enables bad guys to do bad things" can only stand if they can show us hard data about how many bad guys they have caught because they were communicating in clear text and how much this number decreased because of the increased adoption of https and end-to-end encryption. But of course they cannot do that, because "National Security".

Well, they can't have it both ways.

1: https://en.wikipedia.org/wiki/Right_to_Internet_access


I'm really disappointed by this comment. I'm not sure it's as bad as the newspaper analogy but let me try another.

In the US you do not have a right to drive a car (s/car/transportation). But lack of access to a car greatly hinders you in almost every aspect of life. Time spent traveling can easily be 2x-100x longer without access to a car, and in some places it's impossible to travel without one. This is such a problem that in America driving is all but a right. Drivers licenses are easy to obtain because of this need for access. Overall, this low barrier provides much more production and utility than it does downsides (crashes and fatalities from inexperienced drivers, drunk driving, etc). Access to transportation is all but a right. In today's day and age it is an essential part of life; without it, life becomes extremely difficult in modern society.

As for your second paragraph, I'll remind everyone that this is not the first time in history we've had these problems. They are generally rooted in other societal problems. I'm not convinced that the internet amplifies the problem; there are historical examples of rumors spreading faster than a horse could travel between cities. The difference here is really the number of people. It's the number of node connections, after all, that is the issue, not the speed of information between nodes.


When Daniel Bernstein sued the government, he settled the question over whether code is protected by 1A. It is. Publishing cryptographic algorithms, publishing ciphertext, and privately recording ciphertext are all constitutionally protected rights. Given the right circumstances the government may search your property to recover that ciphertext, or intercept it in some way, but if they are unable to find the keys, you also have a constitutionally protected right to not provide them.

If you think the government should have the right to regulate speech via ciphertext, then that would require a constitutional amendment.


Let me edit this in the way I read it

>Access to the [grocery store] is a privilege, not a right. There is nothing in the US Constitution regarding [grocery stores]. An individual is not required to use [grocery stores] to [obtain food], this is a matter of convenience, ergo privilege, but not a right.

People will always do bad things, and taking away the tools to do bad things isn't going to stop them. Without even touching privacy rights (where Americans have fewer rights than people in many other countries), there are many other ways the government can lawfully spy on you. Encryption technology is great; the genie is out of the bottle, and I doubt anything can really stop it.


This is a strawman argument.

I do not believe (and have not said) that Congress will outlaw encryption. As you said, the genie is out. However, they can, and eventually will, restrict technology companies from providing high-quality, turn-key, encryption solutions on consumer devices. This is what Barr is building up to. Individual users will be free to implement custom encryption on top of said platforms.


This is like saying that access to newspapers is a privilege and not a right. What use is free speech if there is no common medium on which to distribute it? This is another case of people blaming the tool instead of the people that are doing bad things.


If we lived in a lawful society, your argument would have some merit. Unfortunately in the case of the USA, agents of the state can literally throw hand grenades into the cribs of sleeping toddlers and face no consequences. We need to minimize the power of the state to harm the people, weakening encryption does not do that.

https://www.nbcnews.com/news/us-news/ex-georgia-deputy-acqui...


How do you propose law to become the gatekeeper of tech when most US laws today are written and controlled by corporations, via campaign donations and lobbyists?

How can US law understand/regulate tech, given lawmakers are woefully uninformed, especially relative to those tech companies they seek to regulate?

How do you defend looking back to the Constitution in your argument, given that its authors couldn't have envisioned the Internet or its technological brethren? Do you think that male slave-owners in 1776, "representatives" chosen in a plutocratic fashion, should define what our rights are for all time, or is it possible we may gain new rights to previously unthought-of inventions?


This comment is woefully wrong on all fronts, but the most galling is the final paragraph. The Fourth Amendment is touted time and time again when full-disk encryption is challenged. To turn around and say "How could the founders have foreseen the Internet??" and therefore imply that it wasn't meant to cover the Internet is hypocrisy of the highest order. You can't have your cake and eat it too!


My personal files in my home are protected, but my data on my phone is not. How convenient.


The first two points are absolutely not "woefully wrong". Many laws are essentially written and controlled by corporations via lobbyists, and most politicians are incredibly uninformed/misinformed about tech


Something not being in the US Constitution doesn't make it not a right.

>If anything, it enables tyranny by corporations, since they are unelected and not responsible to the public, whereas elected governments, in fact, are.

I feel the opposite. Corporations can easily be taken down when you stop paying them. Governments are extremely hard to fight against, especially with rampant corruption and gerrymandering. There's also the fact that almost all tyranny in the world is enacted by governments telling people what their "rights" are.


> Full-on disk encryption and end-to-end encryption make it impossible for the government to access those records even when a lawful request is made.

Is this not what courts are for? The government has plenty of tools to enforce lawful orders against companies.


They can have access to the records - they don't have the ability to decode them. Their powers are still intact but they are useless to them.


I believe the state has methods to compel obedience. Again, this is what the courts are for. Do you think any actor in this scenario can resist the power of the government, who has the ability to jail or kill you?



