I’m aware of all of that; I’m just saying that, interestingly enough, this has not translated into more reliable and better software in the end. As I said, I’m not blaming Docker, at least not directly. It’s more that the whole “ecosystem” around it seems to have so many disadvantages that they end up outweighing the advantages of Docker.
It has translated into reliable legacy software. You can snapshot a piece of software, together with its runtime environment, at the point when it's still possible to build it; and then you can continue to run that built OCI image, with low overhead, on modern hardware — even long after building the image from scratch has become impossible because, e.g., the package archives the image originally fetched from have gone offline.
(And this enables some increasingly wondrous acts of software archaeology, due to people building OCI images not for preservation, but just for "use at the time" — and then just never purging them from whatever repository they've pushed them to. People are preserving historical software builds in a runnable state, completely by accident!)
Before Docker, the nearest thing you could do to this was to package software as a VM image — and there was no standard for what "a VM image" was, so this wasn't a particularly portable/long-term solution. Often VM-image formats became unsupported faster than the software held in them did!
But now, with OCI images, we're nearly at the point of, e.g., convincing academic science to publish a paper's computational apparatus as an OCI image, so that it can be pulled 10 years later when someone attempts to replicate the paper.
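To make that concrete, here is a minimal, hypothetical sketch of how such an apparatus might be frozen (the base image, file names, and registry URL are made up for illustration): a Dockerfile bakes the code and its dependencies into an image while they can still be fetched, and the resulting image can be pulled and run years later.

    # Hypothetical Dockerfile for a paper's computational apparatus
    FROM python:3.8-slim               # pin the runtime the code was written against
    WORKDIR /app
    COPY requirements.txt .
    RUN pip install --no-cache-dir -r requirements.txt   # only works while these packages are still published
    COPY . .
    CMD ["python", "analysis.py"]      # the entry point readers would re-run

    # Build and publish once, while everything still resolves:
    #   docker build -t registry.example.com/paper-apparatus:v1 .
    #   docker push registry.example.com/paper-apparatus:v1
    # A decade later, on different hardware:
    #   docker run --rm registry.example.com/paper-apparatus:v1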
The price for that kind of backwards compatibility is a literal army of engineers working for a global megacorporation. Free software could not manage that, so having a pragmatic way to keep software running in isolated containers seems like a great solution to me.
There’s an army of developers working on Linux as well, employed by companies like IBM and Oracle. I don’t see a huge difference from Microsoft here, to be honest.
You'd have a better time working with Windows 7 than with a 2.x Linux kernel. I love Linux, but Microsoft has longer support windows for its operating systems.
What are you even talking about? Being able to run 10-year-old software (on any OS) is orthogonal to being able to build a piece of software whose dependencies are completely missing. Don't pretend this doesn't happen on Windows.
My point was that a lot of older software, especially desktop apps, did not have such wild dependencies. Therefore this was less of an issue. Today, with Python and with JavaScript and its NPM hell, it of course is.
> My point was that a lot of older software, especially desktop apps, did not have such wild dependencies. Therefore this was less of an issue.
Anyone who worked with Perl CGI and CPAN would tell you that managing dependencies across environments has always been an issue. Regarding desktop software: the phrase "DLL hell" predates NPM and pip by decades, and it is fundamentally the same dependency-management challenge that Docker mostly solves.
I think the disconnect is in viewing your trees and not viewing the forest. Sure, you were a responsible, disciplined tree engineer for your acres, but what about the rest of the forest? Can we at least agree that Docker made plant husbandry easier for the masses worldwide?
I’m not sure I would agree here: in my personal experience, increasing containerisation has definitely nudged lots of large software projects to behave better; they don’t spew so many artifacts all over the filesystem anymore, for example, and they increasingly adopt environment variables for configuration.
Additionally, I think lots of projects became able to adopt better tooling faster, since the barrier to using container-based tools is lower. Just think of GitHub Actions, which suddenly enabled everyone and their mother to adopt CI pipelines. That simply wasn’t possible before, and it has led to more software adopting static analysis and automated testing, I think.
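For instance (purely an illustrative sketch, not something from this thread; the tool choices like ruff and pytest are placeholders), a bare-bones GitHub Actions workflow that runs static analysis and tests on every push is only a few lines of YAML:

    # .github/workflows/ci.yml (hypothetical example)
    name: CI
    on: [push, pull_request]
    jobs:
      test:
        runs-on: ubuntu-latest
        steps:
          - uses: actions/checkout@v4
          - uses: actions/setup-python@v5
            with:
              python-version: "3.12"
          - run: pip install -r requirements.txt
          - run: ruff check .   # static analysis
          - run: pytest         # automated tests

That low barrier to entry is a big part of why so many small projects now run CI at all.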
This might all be true, but has this actually resulted in better software for end users? More stability, faster delivery of useful features? That is my concern.
For SaaS, I'd say it has definitely improved and sped up delivery of software from the development machine to CI to the production environment. How this translates to actual end users is totally up to the developers/DevOps/etc. of each product.
For self-hosted software, be it for business or personal use, it has immensely simplified how a software package can be pulled and run in an isolated environment.
Dependency hell is avoided, and you can easily create/start/stop/delete a specific piece of software without affecting the rest of the host machine.
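As a rough sketch of that lifecycle (using the Nextcloud image purely as an example; the container name and port mapping are arbitrary):

    docker pull nextcloud:latest
    docker run -d --name mycloud -p 8080:80 nextcloud:latest   # runs isolated from the host
    docker stop mycloud                                         # stop it
    docker rm mycloud                                           # delete the container
    docker rmi nextcloud:latest                                 # remove the image; the host is untouched

Nothing here touches the host's package manager or system libraries, which is exactly how dependency hell gets sidestepped.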