Plus, Firefox is implementing HTTPS-Only by default soon, if I remember correctly. Around 2016 there was a big push for SSL; before that, the majority of the web, even login and payment pages, was plain HTTP. Now only a small percentage of the web isn't HTTPS. I have HTTPS-Only enabled in Firefox, and I rarely have to click the 'Continue Anyway' button to browse an HTTP page. For general users who stick to popular services, I'm sure it's even rarer.
I have a site from 1997, pure html, with drivers, install disks, documentation for computers from the 80s/90s.
It works. It's fine. No, it does not need SSL. What, someone is going to hack a floppy driver for a computer which doesn't even have a built-in network stack?!
No, I am not going to do work on it, any work, at all.
Depending on what the drivers are for, you may be a prime candidate for MitM. People already go to your site to download software they're going to run in the most privileged mode. This is a perfect candidate for a type of watering hole attack.
Considering you're providing those for 90s machines, you could be the last resort website for a few interesting industry computers with no security restrictions around them.
> Depending on what the drivers are for, you may be a prime candidate for MitM.
Doing that MitM is technically very easy, but in practice pretty hard. You'd have to have an adversary on your network path watching for connections to this particular esoteric low-volume site hosting drivers for machines from the 80s and 90s.
That is extremely unlikely.
I have a much easier way to target that content: Just put up a new site hosting the same content with malware attached. No need for MitM shenanigans.
Security isn't about absolutes; it's about risk management, and being aware of the likelihood and consequences of each risk is important.
> No, I am not going to do work on it, any work, at all.
Without HTTPS, the content can be replaced entirely. Last time this happened at scale, it was injected JavaScript that DDoS'd GitHub. If you don't want to serve content over HTTPS, then you don't care what your users receive. You might as well delete the site and let them all get 404s instead, since you already admit you don't care either way.
If it makes you feel any better, HTTP without HTTPS was a mistake we all made together. It should never have happened.
Given that plain HTTP provides backwards compatibility, while TLS versions keep getting deprecated and the whole stack is too complex for most people to maintain on their own, I respectfully disagree that plain HTTP was a mistake.
You're at a coffee shop or library using their WiFi. Your computer sends a plaintext HTTP message. The attacker just needs to be able to see that message and get a response back to you before the real site does, and the real site is a lot further away than the guy sitting at the table next to you (or the hacked router, if he doesn't want to be there in person). Then they can feed your browser whatever they want.
A login form to phish you, perhaps?
They can even start replying, then go off and fetch from the actual site before finishing the response, if it helps to incorporate the real data.
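To make the above concrete, here's a minimal sketch of why plaintext HTTP is trivially rewritable in transit. The "network path" is just a function here; on real WiFi an attacker reaches the same position with a rogue AP or ARP spoofing. All names and the response body are made up for illustration, not a real attack tool.

```python
# Illustrative only: the "attacker" is anyone who can see your plaintext
# traffic and answer before the real server does.

ORIGIN_RESPONSE = (
    b"HTTP/1.1 200 OK\r\nContent-Type: text/html\r\n\r\n"
    b"<a href='/driver.zip'>Download driver</a>"
)

def on_path_attacker(plaintext: bytes) -> bytes:
    # With no TLS, the full request and response are visible and
    # modifiable; here the download link is swapped for a hostile one.
    return plaintext.replace(b"/driver.zip", b"http://evil.example/driver.zip")

delivered = on_path_attacker(ORIGIN_RESPONSE)
print(delivered.decode())
```

The point is that nothing in HTTP lets the browser notice the swap; only an authenticated transport (TLS) does.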
That is fine. The site itself is safe. Accessing it over untrusted transits is not. What has changed since 97? Well, attacks became far more sophisticated, and the transits that people access stuff over became far less trustworthy.
There is nothing wrong with your website. However, you shouldn't be surprised when modern browsers stop working with it. Progress doesn't come free.
You are hosting executable data of some kind on a non-authenticated protocol. That's totally not dangerous at all. A MITM definitely couldn't cause any damage by altering executable data in transit on unsuspecting users. This has never happened to anyone.
>are safe
No, they are not.
>No, I am not going to do work on it, any work, at all.
If you are too lazy to do it securely maybe you just shouldn't do it at all.
HTTPS everywhere by default can't come fast enough. There is no excuse at all to not have HTTPS support today and browsers should deny access to these lazy and careless sites by default. Anyone who can't spend the 5m to set it up for their website can go kick rocks as far as I'm concerned.
It is all fun and games until one of the downloads from your site picks up malware in transit and the user goes "why did this web admin infect my computer? Sue!"
Not caring about whether some segment (possibly even a majority) of users can or are willing to jump through hoops to access your site is a valid choice, just like publishing through gopher is. You do you.
You could host hashes of the downloads on an HTTPS page; that should be quite simple. Malware can still work on a computer without a built-in network stack, and if users are getting downloads onto that computer, then data can leave through the same means.
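The hash approach above can be sketched in a few lines; the file name and the "published" value standing in for the HTTPS-served checksum page are made up for the demo.

```python
# Sketch: verify a file fetched over plain HTTP against a SHA-256 checksum
# that was published on an HTTPS page. Names here are illustrative.
import hashlib

def sha256_of(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Demo stand-ins: a fake download and the checksum as published over HTTPS.
with open("floppy-driver.zip", "wb") as f:
    f.write(b"demo payload")
published = hashlib.sha256(b"demo payload").hexdigest()

assert sha256_of("floppy-driver.zip") == published
print("checksum OK")
```

Of course this only helps users who actually check the hashes, which is exactly the hoop-jumping the grandparent doesn't want to impose.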
And update all links to not go back to the HTTP site...
And troubleshoot weird issues (TLS errors are generally not helpful)...
And maintain that setup for years...
Not an insurmountable effort for sure, but if you estimate 30 min for the total additional effort of adding HTTPS to a site then I have a bridge to sell you.
Recently I noticed that FF doesn't even let you accept invalid certificates (meaning ones no longer recognized as valid by FF because they changed the rules to require SAN) for HSTS-enabled sites. The bug report's response was that the HSTS standard specifies that. Fuck that; users should always be the ones in control of such decisions in the end.