Hacker News

how the hell do we create a webserver that will work reliably for the next 30 years if we have to embed SSL certificates with built-in expiration dates in the range of 1 to 3 years

If there's a requirement that your code needs to run for 30 years without an update then current web technology is probably the wrong choice.




A common use case is a device serving a status report as a single, simple, static, read-only HTML file (i.e. the backend has no intended ability to receive data) with limited or no markup. This can reasonably be expected to run for 30 years without an update - there's nothing to update; you have a fixed hardware system showing the same kind of numbers to the user until it dies.

Serving this over http in a secure network would be reasonable.
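The whole HTTP side of such a device fits in a few lines of Python's standard library - a minimal sketch, where the page content and the in-process client fetch are made up for illustration (a real device would render its own sensor values):

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Hypothetical fixed status page; a real device would fill in live numbers.
STATUS_PAGE = b"<html><body><pre>temp: 21.3 C\nuptime: 42 d</pre></body></html>"

class StatusHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Read-only: every request gets the same static document.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(STATUS_PAGE)))
        self.end_headers()
        self.wfile.write(STATUS_PAGE)

    def log_message(self, *args):
        pass  # keep the demo quiet

server = HTTPServer(("127.0.0.1", 0), StatusHandler)  # port 0: any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# Fetch the page once, the way a browser on the same network would.
body = urlopen(f"http://127.0.0.1:{server.server_port}/").read().decode()
print(body)
server.shutdown()
```

Nothing here depends on anything newer than HTTP/1.1 and HTML, which is exactly why it has a chance of still being readable in 30 years.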

Serving this over https would be reasonable if you can embed a certificate with an appropriate lifetime. Which you can't.
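To be precise, nothing in X.509 itself forbids a long lifetime - you can mint a 30-year self-signed certificate with openssl (the filenames and CN below are placeholders). The catch is the client side: publicly trusted CAs are capped at 398-day certificates, so browsers will only accept something like this if you distribute and trust it yourself, e.g. via a private root:

```shell
# Self-signed cert valid ~30 years (10950 days). Publicly trusted certs
# are capped at 398 days, so this only works with a root you control.
openssl req -x509 -newkey rsa:2048 -nodes \
    -keyout device-key.pem -out device-cert.pem \
    -days 10950 -subj "/CN=device.local"

# Inspect the expiry date that was baked in.
openssl x509 -in device-cert.pem -noout -enddate
```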


30 years ago browsers didn't exist, so the assumption that in 30 years' time they're still going to be compatible with today's servers seems wrong to me. The equivalent would be that you were building this device in 1987 and you embedded a Gopher server on it. It'd still be accessible today, but users wouldn't be very happy about how they access it.

As another user said, you own both sides of this problem so write your own client software.


My browser can show a 26 year old web page (http://info.cern.ch/hypertext/WWW/News/9211.html) just fine, and I have every reason to assume that it'll still be able to do so after another 26 years.

If I write my own client software, that's far less likely to be the case. Writing your own is exactly the thing you should avoid if long-term compatibility is important; sticking to open standards is a much better way to achieve it.

Your own example illustrates this perfectly: if I had a 30 year old device running a Gopher server, it would still be usable and I could easily get a client running on a modern computer/OS - but if it needed 30 year old custom client software, porting that would likely be quite a nasty project.


That example you linked uses absolutely NO CSS or JS! Which, yes, is fine, and in theory it should work "forever" - but with any extra functionality you can't be sure.

Open standards change, software changes and things move on. If you are seriously relying on anything technological working for a decade or more, you're out of your mind.

You need to keep up and adapt, because nobody gives a crap about you or your use case. Here we see Google doing what it thinks is best, and you need to live with it.


Of course it doesn't use CSS or JS; neither existed back in 1992 when the page was made.

However, content using the CSS1 and JavaScript 1.0 standards (1996?) should also work in modern browsers. I couldn't easily find any sites to test this that actually use them and haven't been updated in the last 20+ years; sites from e.g. 1997-1999 seem to avoid them like the plague, since back then you couldn't rely on CSS1 and JavaScript 1.0 being properly supported.

So yes, modern browsers support all web technologies going back to the very start - a CSS3-compliant browser should also be compatible with CSS2 and CSS1 (20+ years old) - so I'm quite convinced that 20 years into the future, when we might have CSS 5 or 6, current CSS3 content will still work just as well. New JavaScript doesn't run on ancient browsers, but ancient JavaScript works well in modern browsers; that's a core design principle of web standards - we build them for compatibility.


> The equivalent would be that you were building this device in 1987 and you embedded a Gopher server on it. It'd still be accessible today but users wouldn't be very happy about how they access it.

By using Firefox 56? I'm not seeing the issue.

> As another user said, you own both sides of this problem so write your own client software.

Running 30 year old DOS software is a lot more painful than accessing a Gopher server. Is that really the comparison you want to make?


Sadly, it's more of a 'wrong' requirement from the customer. Today's enterprise customers expect things to Just Work. They say 'just sell us a goddam appliance, we'll point a browser at it and we'll call it a day!'. I'm quoting real customers' documents here: 'it should be as easy as Apple'. We've done it to ourselves.


I expect things to just work, and I expect them to work for a reasonable lifespan. I often think a piece of software ought to live as long as a car, and something commercially created (e.g. that costs as much as a house) ought to last at least a generation (30 years).

I don't think I'm being unreasonable here. In a closed system this should be doable.

edit: one of the reasons I think this is that we are supposed to be engineers, and other engineering disciplines manage it (and their disciplines involve computers too). Consider jets and boats.


Jets, boats, and cars all require regular servicing. Parts need fixing and updating; it isn't one and done. The analogy to software updates isn't exact, but neither is the analogy of software systems to cars.


Our devices do not normally need servicing. They do age, but that is compensated for.


I get what you're saying. But are you, let's say, a Catia ISO programmer who has been thrown in as the leader of a project team whose purpose is to buy some piece of software or project that will automagically cut your programming time in half?

He may be an engineer, whatever that means, but still: the average Joe in that position will not spend a single second thinking about whether he expects the solution he's buying to work for a reasonable lifespan. He's been thrown into a project, that's all. If you force him to state a requirement for the expected lifespan, he will surely say 'Well, as long as possible seems good to me'.

Disclaimer: statistics apply here; of course some people will care, but I'm speaking about the 80%, if not more.


Until there is tangible motivation (i.e. multiple players get publicly put out of business by lax security), customers will continue to refuse to understand things they cannot put their hands on.


But I recently had to watch from the sidelines as our management decided against .NET and Qt as the future stack for user interfaces. In the case of Qt the arguments were clearly misinformed, but I only got to comment on that after the fact. So now we will have to deal with web stacks by decree.


It may at least serve as a fallback, considering that the vendor may cease to exist and client software is rendered permanently unavailable.



