Hacker News | vhcr's comments

This is not hypothetical; people already do it this way in my country (Argentina): you send your money to someone who buys tokens using cryptocurrency. Since these websites don't comply with local regulation, even kids are addicted to gambling.

When your government is as incompetent as it can get while still avoiding a revolution, people start to gamble.

The rise of gambling in the US does indicate an economic hopelessness that mirrors Argentina, but it’s not quite to the same level yet.


That's one less third party to depend on, but you still depend on the DNS root servers, your ISP / hosting, the domain registry, etc.

Third-party root servers are generally used for looking up TLD nameservers, not for looking up domain names registered to individuals publishing personal blogs^1

Fortunately, one can publish on the www without using ICANN DNS

For example http://199.233.217.201 or https://199.233.217.201

1. I have run my own root server for over 15 years

An individual cannot even mention choosing to publish a personal blog over HTTP without being subjected to a kneejerk barrage of inane blather. This is truly a sad state of affairs

I'm experimenting with non-TLS, per-packet encryption with a mechanism for built-in virtual hosting (no SNI) and collision-proof "domain names" on the home network, as a reminder that TLS is not the only way to do HTTPS

It's true we depend on ISPs for internet service but that's not a reason to let an unlimited number of _additional_ third parties intermediate and surveil everything we do over the internet


> inane blather

And this is why it's a good thing that every major browser will make it more and more painful, precisely so that instead of arguments about it, we'll just have people deciding whether they want their sites accessible by others or not.

Unencrypted protocols are being successfully deprecated.


You have some weird definition of "root".

See https://en.wikipedia.org/wiki/Alternative_DNS_root: you could run your own root server (and people have, and do).

Definition of "root server"

Authoritative DNS nameserver that serves root.zone, e.g., the one provided by ICANN, or maybe a customised one

In my own case it is served only to me on the local network

Many years ago, one of the former ICANN board members mentioned on his personal blog running his own root


People using the web can choose what software to use. This includes both client software and server software. Arguably the latter ultimately determines whether HTTP is still available on the internet, regardless of whether it is used by any particular client software, e.g., a popular browser

One advertising company through its popular "free browser", a Trojan Horse to collect data for its own purposes, may attempt to "deprecate" an internet protocol by using its influence

But at least in theory, such advertising companies are not in charge of such protocols, nor of whether the public, including people who write server software or client software, can use them


Let's Encrypt pushes me to run its self-updating certbot on my personal server, which is a big no-go.

I know about acme.sh, but still...


They're focused on the thing that'll get the most people up and running for the least extra work from them. When you say "push" do you just mean that's the default or are they trying to get you to not use another ACME client like acme.sh or one built in to servers you run anyway or indeed rolling your own?

Like, the default for cars almost everywhere is you buy one made by some car manufacturer like Ford or Toyota or somebody, but usually making your own car is legal, it's just annoyingly difficult and so you don't do that.


As a car mechanic, you could at least tune... until these days, when you can realistically tune only models that are 10 to 15 years old, because newer ones are just locked-down computers on wheels.

>usually making your own car is legal

It may be legal but good luck ever getting registration for it.


It's actually not that bad in most states, some even have exceptions to emissions requirements for certain classes of self-built cars.

Now, getting required insurance coverage, that can be a different story. But even there, many states allow you to post a bond in lieu of an insurance policy meeting state minimums.


Usually making one car, or millions of cars, is doable.

It’s trying to make and sell three or four that is nearly impossible.



I counted by hand, so it might be wrong, but they appear to list and link to 86 different ACME client implementations across more than a dozen languages: https://letsencrypt.org/docs/client-options/

I've used their stuff since it came out and never used certbot, FWIW. If I were to set something up today, I'd probably use https://github.com/dehydrated-io/dehydrated.


Plus, it's one of the easier protocols to implement. I implemented it myself, and it didn't take long.

So you're absolutely not dependent on the client software, or indeed anyone else's client software.
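To give a sense of why ACME is considered easy to implement, here is a small stdlib-only sketch of one core piece of the protocol: computing the HTTP-01 key authorization from a JWK thumbprint (RFC 7638), which the client must serve back to the CA. The account key below is a made-up placeholder, not a real key; a real client would also do JWS-signed requests against the CA's directory.

```python
import base64
import hashlib
import json

def b64u(data: bytes) -> str:
    """Base64url without padding, as ACME (RFC 8555) requires."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def jwk_thumbprint(jwk: dict) -> str:
    """RFC 7638 thumbprint: SHA-256 over the required JWK members,
    serialized with lexicographically sorted keys and no whitespace."""
    required = {"EC": ("crv", "kty", "x", "y"),
                "RSA": ("e", "kty", "n"),
                "OKP": ("crv", "kty", "x")}
    subset = {k: jwk[k] for k in required[jwk["kty"]]}
    serialized = json.dumps(subset, sort_keys=True, separators=(",", ":"))
    return b64u(hashlib.sha256(serialized.encode()).digest())

def http01_key_authorization(token: str, jwk: dict) -> str:
    """The body the client serves at
    /.well-known/acme-challenge/<token> for an HTTP-01 challenge."""
    return f"{token}.{jwk_thumbprint(jwk)}"

# Hypothetical account key: the x/y coordinates are dummy bytes for
# illustration only, not a valid P-256 point.
account_jwk = {"kty": "EC", "crv": "P-256",
               "x": b64u(b"\x01" * 32), "y": b64u(b"\x02" * 32)}
print(http01_key_authorization("some-token-from-the-ca", account_jwk))
```

The thumbprint serialization order matters: re-ordering the JWK's members must not change the result, which is why the members are sorted before hashing.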


There is a plethora of other clients besides certbot or acme.sh.

Let's Encrypt does not write or maintain certbot

ISRG (Let's Encrypt's parent entity) wrote Certbot, initially under the name "letsencrypt" but it was quickly renamed to be less confusing, and re-homed to the EFF rather than ISRG itself.

So, what you've said is true today, but historically Certbot's origin is tied to Let's Encrypt, which makes sense because initially ACME wasn't a standard protocol: it was designed to become one, but it was still under development, and the only practical server implementations were both developed by ISRG / Let's Encrypt. RFC 8555 took years.


Yes, it started that way, but complaining about the current auto-update behavior of the software (not the ACME protocol) is completely unrelated to Let's Encrypt; it is instead an arbitrary design decision by someone at EFF.

As far as I remember, the certbot / Let's Encrypt client has been a piece of crap since the beginning, especially regarding the auto-download and auto-update (a.k.a. auto-break) behavior.

And, at the opposite end, I couldn't praise acme.sh enough: it is simple, dependency-free, and reliable!


Honestly, I abandoned acme.sh, as it was largely not simple (it's a giant ball of shell), and that has led to it not being reliable (e.g., silently changing the way you configure acme-dns instance URLs, the vendor building their compatibility around an RCE, etc.)

Host an onion website at home using solar energy, and the only third party your website will depend on is your internet provider :)

Onion websites also don't need TLS (they have their own built-in encryption) so that solves the previous commenter's complaint too. Add in decentralized mesh networking and it might actually be possible to eliminate the dependency on an ISP too.

> they have their own built-in encryption

What does this mean? Is that encryption not reliant on any third parties, or is it just relying on different third parties?


The onion URL is itself a public key - https://protonmailrmez3lotccipshtkleegetolb73fuirgj7r4o4vfu7... for example.

Proton Mail burned CPU time until they found a public key that started the way they wanted it to.

So that is the public key for an HTTPS equivalent as part of the tor protocol.

You can ALSO get an HTTPS certificate for an onion URL; a few providers offer it. But it’s not necessary for security - it does provide some additional verification (perhaps).
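To make "the onion URL is itself a public key" concrete, here is a stdlib-only sketch of how v3 onion addresses are built and validated per the Tor rend-spec-v3: the 56-character base32 hostname encodes the 32-byte ed25519 public key, a 2-byte checksum, and a version byte. The key below is a dummy value for illustration, not a real service identity key.

```python
import base64
import hashlib

VERSION = b"\x03"  # v3 onion addresses

def onion_checksum(pubkey: bytes) -> bytes:
    # Per rend-spec-v3: first two bytes of
    # SHA3-256(".onion checksum" || pubkey || version)
    return hashlib.sha3_256(b".onion checksum" + pubkey + VERSION).digest()[:2]

def encode_onion(pubkey: bytes) -> str:
    """Turn a 32-byte ed25519 public key into a .onion hostname."""
    assert len(pubkey) == 32
    raw = pubkey + onion_checksum(pubkey) + VERSION
    return base64.b32encode(raw).decode().lower() + ".onion"

def decode_onion(addr: str) -> bytes:
    """Validate a v3 onion address and return the embedded public key."""
    raw = base64.b32decode(addr.removesuffix(".onion").upper())
    pubkey, checksum, version = raw[:32], raw[32:34], raw[34:]
    if version != VERSION or checksum != onion_checksum(pubkey):
        raise ValueError("invalid onion address")
    return pubkey

# Dummy key for illustration; a real hidden service uses its ed25519 key.
key = bytes(range(32))
addr = encode_onion(key)
assert decode_onion(addr) == key
```

"Burning CPU time until the key starts the way you want" (as Proton Mail did) is just regenerating keypairs until `encode_onion` happens to produce a desirable prefix.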


If everyone who wants a human-readable domain did this, it would be environmentally irresponsible. Then 'typo' domains would be trivial: protonmailrmez31otcciphtkl or protonmailrmez3lotcciphtkl.

It's a shame they didn't put in a better built-in human-readable URL system. Maybe a free-form text field 15-20 characters long appended to the public key, somehow made part of that key. Maybe the key contains a checksum of those letters to verify the text field. So something like protonmail.rmez3lotcciphtkl+checksum.

But this being said, I think the sort of independent, 'not needing of third parties' ethic just isn't realistic. It's the libertarian housecat meme writ large. Once you're communicating with others and taking part in a shared communal system, you lose that independence. Keeping a personal diary is independent. Anything past that is naturally communal and involves some level of sharing, cooperation, and dependency on others.

I think this sort of anti-communal attitude is rooted in a lot of regressive stuff: myths of the 'man is an island' and 'great man' nonsense. It then leads to weird stuff like bizarre domain names and services no one likes to use. Outside of very limited use cases, Tor just can't compete.


>If everyone who wants a human readable domain did this, it would environmentally irresponsible

Could we finally stop acting like we know how other people's energy is being produced?


And an army of volunteers and feds to run relays

What about the Tor directory authorities?

There is no magic do it all yourself. Communicating with people implies dependence.


I gave up trying to build a solar panel.

What about all the third parties running relays and exit nodes?

If you think about it the spirit of the internet is based on collaboration with other parties. If you want no third parties, there's always file: and localhost.

CAs are uniquely assertive about their right to cut off your access.

My hosting provider may accidentally fuck up, but they'll apologise and fix it.

My CA fucks up, they e-mail me at 7pm telling me I've got to fix their fuck-up for them by jumping through a bunch of hoops they have erected, and they'll only give me 16 hours to do it.

Of course, you might argue my hosting provider has a much higher chance of fucking up....


So what does "CA fixes the problem" look like in your head? Because they'll give you a new certificate right away. You have to install it, but you can automate that, and it's hard to imagine any way they could help that would be better than automation. What else do you want them to do? Asking them to not revoke incorrect or compromised certificates isn't good for maintaining security.

Imagine if, hypothetically speaking, the CA had given you a certificate based on a DNS-01 challenge, but when generating and validating the challenge record they'd forgotten to prefix it with an underscore. That could have led to a certificate being issued to the wrong person if your website was a service like DynDNS that lets users create custom subdomains.

Except (a) your website doesn't let users create custom subdomains; (b) as the certificate is now in use, you, the certificate holder, have demonstrated control over the web server as surely as an HTTP-01 challenge would; (c) you have accounts and contracts and payment information all confirming you are who you say you are; and (d) there is no suggestion whatsoever that the certificate was issued to the wrong person.

And you could have gotten a certificate for free from Let's Encrypt if you had automatic certificate rotation in place; you paid $500 for a 12-month certificate because you don't.

An organisation with common sense policies might not need to revoke such a certificate at all, let alone revoke it with only hours of notice.
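For context on the underscore in this hypothetical: per RFC 8555 §8.4, DNS-01 validation queries a TXT record at the `_acme-challenge` label under the domain, whose value is the base64url-encoded SHA-256 of the key authorization. A minimal sketch (the token and thumbprint strings below are placeholders, not real challenge material):

```python
import base64
import hashlib

def dns01_record(domain: str, key_authorization: str) -> tuple[str, str]:
    """Return the (name, value) of the TXT record that DNS-01
    validation expects, per RFC 8555 section 8.4."""
    digest = hashlib.sha256(key_authorization.encode()).digest()
    value = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
    # The underscore prefix is the detail the hypothetical CA forgot:
    # validation must query _acme-challenge.<domain>, not <domain>.
    return f"_acme-challenge.{domain}", value

name, value = dns01_record("example.com", "some-token.some-thumbprint")
print(name)  # _acme-challenge.example.com
```

The underscore matters precisely because ordinary hostnames cannot contain it, so a user of a subdomain service cannot register `_acme-challenge.<something>` themselves and hijack validation.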


You didn't answer my question. What would the CA fixing it look like? Your hosting example had the company fix problems, not ignore them.

And have you seen how many actual security problems CAs have refused to revoke in the last few years? Holding them to their agreements is important, even if a specific mistake isn't a security problem [for specific clients]. Letting them haggle over the security impact of every mistake is much more hassle than it's worth.

> if you had automatic certificate rotation in place - you paid $500 for a 12-month certificate because you don't

Then in this hypothetical I made a mistake and I should fix it for next time.

And I should be pretty mad at my CA for giving me an invalid certificate. Was there an SLA?


CAs have to follow the baseline rules set by Google and Mozilla regarding incident response timelines. If they gave you more time, the browsers would drop them as a supported CA.

The CAs have to follow the baseline rules set by the CA/Browser Forum which CAs are voting members of.

Mark my words, some day soon an enterprising politician will notice the CA system can be drawn into trade sanctions against the enemy of the day....


The BRs already have a deliberate carve out where a CA can notify that their government requires them to break the rules and how they'll do that, and then the browsers, on behalf of relying parties can take whatever action they deem appropriate.

If you're required to (or choose to) not tell us about it, then, because of active monitoring, when we notice it your CA will likely be distrusted for not telling us. This is easier to enforce because there's a mechanism to tell us about it, the same way there's a way to officially notify the US that you're a spy: when you don't (because, duh, you're a spy), you're screwed because you didn't follow the rules.

The tech centralization under the US government does mean there's a vulnerability on the browser side, but I wouldn't speculate about how long that would last if there's a big problem.


Swimming and Jefferson curls.


That should only happen once, you should store the password for the second domain too.


Second best option is to whitelist USB's PID and VID.


You also have to take into account the browser and OS call stack.


This does not change if you write pure Javascript that directly mutates DOM without calling any intermediate functions.

Given the speed of rendering that browsers achieve, I would say that their call stack during this is highly optimized. I don't see OS doing much at all besides sending the drawing buffers to the GPU.


And also, with React you are not only buying into React but also into a JavaScript dependency manager / package manager, be it NPM or any other. Installing JS packages already comes with its own problems. And then people probably buy into more stuff to install through that package manager: some component library and a "router" here, some material style and a little library to wrap the root node with some styling provider or whatever it is called there... Before you know it, a typical FE dev will have turned your stack into 80% React and related dependencies, and the maintenance on that will continue to grow, as new features "can be solved sooo easily, by just adding another dependency" from the NPM/React world.


No, JOINs are pretty much always faster than performing N+1 queries.
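The N+1 pattern being criticized is one query for the parent rows and then one additional query per row; a JOIN returns the same data in a single round trip and lets the database choose an efficient plan. A small sqlite3 sketch with hypothetical `authors`/`posts` tables, purely for illustration:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE posts (id INTEGER PRIMARY KEY, author_id INTEGER, title TEXT);
    INSERT INTO authors VALUES (1, 'ada'), (2, 'bob');
    INSERT INTO posts VALUES (1, 1, 'p1'), (2, 1, 'p2'), (3, 2, 'p3');
""")

# N+1: one query for the authors, then one query per author.
n_plus_1 = []
for author_id, name in db.execute("SELECT id, name FROM authors ORDER BY id"):
    rows = db.execute(
        "SELECT title FROM posts WHERE author_id = ? ORDER BY id", (author_id,))
    for (title,) in rows:
        n_plus_1.append((name, title))

# Single JOIN: one round trip for the same result set.
joined = list(db.execute("""
    SELECT a.name, p.title
    FROM authors a JOIN posts p ON p.author_id = a.id
    ORDER BY a.id, p.id
"""))

assert joined == n_plus_1  # same rows, one query instead of N+1
```

With a remote database the difference is dominated by per-query network latency, which is why the gap grows linearly with N.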



I don't want to check my luggage, I had them damaged or lost more than once.


This is solved with better infrastructure.

https://en.wikipedia.org/wiki/Curb_extension


Great solution but until then I think the gp's point still stands.

We as humans need to ensure our actions are done with care and forethought. You can't control others, but you can control yourself and influence others (like this comment attempts to do)

Plus, it's a lot cheaper for all of us if we don't need to constantly redesign once someone figures out how to "beat the game" (see Goodhart's Law). We're social creatures and the Tragedy of the Commons is a much more common occurrence than people think, especially in large cities.

Our actions affect others.


Yes, I've seen some of this. While it certainly helps, it seems like a waste of limited resources. Why can't some people just follow such a simple rule?


Waste of what resources?

If anything I'd expect the sidewalk to be cheaper.


There's a difference between starting from scratch and modifying existing infrastructure


There is.

But the post saying it's a better method isn't suggesting extra labor to do modifications. That's useful just as pure knowledge, and also it can be applied into future designs or when parts of the road wear out.


In Seattle, I've mostly seen approaches such as this as a modification. Though I see your point in a new construction situation.


the picture on that article looks like a nice stripy parking space


A car could push into the first third of it, but visibility would be fine in that case. Trying to use the whole thing in a car would mean you're jutting into the traffic lane, and anyone willing to do that is causing bigger problems. And if a bike parks in the stripes that's fine for visibility too.

