Is there something I'm missing on the Docker story? Seems like they built something that everyone uses as an integral part of their workflow, and they're looking to get paid for it, which, I may be out of date here, was the entire ethos of this entire site for quite some time. Did the founder of Docker kick a bunch of puppies or something? Is there some reason I'm missing why we should be angry about being asked to pay for something that everyone uses for everything and derives a lot of value from?
(This isn't to take away from Rancher, by the way - good on 'em, I'm all for competition in developer tools)
> Seems like they built something that everyone uses as an integral part of their workflow, and they're looking to get paid for it, which, I may be out of date here, was the entire ethos of this entire site for quite some time
1. Not every widely-used tool needs to be a VC-powered unicorn startup. I'd be pissed if Linus/the Linux Foundation started to demand a per-core licensing fee. I'd probably convince my organization to switch to a BSD, on principle.
2. No one likes a bait-and-switch, and lately, lots of companies see Free/Open Source Software as a "growth hack technique" rather than an actual philosophy, because they'd otherwise face headwinds with a closed-source product. This is akin to the underpants gnomes strategy:
1. Author Open Source product
2. Get wide adoption
3. ???
4. Profit
The problem with this is that if Docker Inc goes under, you can say goodbye to Docker Hub: https://hub.docker.com/
Sure, there are alternative registries, and for your own needs you can use anything from Sonatype Nexus, JFrog Artifactory, or GitLab Registry to any of the cloud-based ones, but Hub disappearing would be 100 times worse than the left-pad incident in the npm world.
Thus, whenever Docker Inc releases a new statement about some paid service that may or may not get more money from large corporations, I force myself to be cautiously optimistic, knowing that the community of hackers will pick up the slack and work around those tools on a more personal scale (e.g. Rancher Desktop vs Docker Desktop). That said, it might just be Stockholm syndrome of sorts, but can you imagine the fallout if Hub disappeared?
Of course, you should never trust any large corporation unless you have the source code that you can build the app from yourself. For example, Caddy v1 (a web server) essentially got abandoned with no support, so the few people still using it potentially had to build their own releases and fix the bugs themselves, which was only possible because of source code availability, before eventually migrating to v2 or something else.
Therefore, it makes sense to always treat external dependencies, be they services, libraries, or even tools, as if they're hostile. Of course, you don't always have the resources to do that in depth, but seeing, for example, that VS Code is not the only option and we also have VSCodium (https://vscodium.com/) is encouraging.
Docker Hub going down would be a disaster for sure, but I consider "pull image/library from a 3rd-party hub over the internet on every build" to be an anti-pattern (which is considerably worse with npm than with Docker). That said, if this is where the value is being provided, perhaps they ought to charge for this service? I guess it's difficult because it's easily commoditized.
> but can you imagine the fallout if Hub disappeared?
I wish that would actually happen - not forever, but if it went down for a day or two with no ETA for a fix, the thousands of failed builds/deploys would force organizations to rethink their processes.
I think Go's approach on libraries is the way forward - effectively having a caching proxy that you control. I know apt (the package manager) also supports a similar caching scheme.
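As a sketch of both approaches (the internal hostnames below are hypothetical placeholders for whatever you run in-house):

```shell
# Go: route module downloads through a caching proxy you control
# (e.g. a self-hosted Athens instance), falling back to the origin
# for anything the proxy can't serve.
export GOPROXY=https://goproxy.internal.example,direct

# apt: a similar effect via a caching proxy such as apt-cacher-ng;
# drop this line into /etc/apt/apt.conf.d/01proxy:
#   Acquire::http::Proxy "http://apt-cacher.internal.example:3142";
```

Either way, the origin going down only stops you from fetching things your proxy has never seen before.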
Large orgs started hitting the rate limits since many devs were coming from the same IP. Most places probably put in a proxy that caches to a local registry.
That's what we did, put a proxy in front that caches everything. Now that Docker Desktop requires licensing, we're going down the road of getting everyone under a paid account.
I'm sure Rancher is great for personal desktop use, but there's no reason large companies can't pay for Docker.
Or even small. At work, I advised that we just pay for Docker Desktop. We got it for free for a long time. Our reason for not paying is that we're an Artifactory shop, so their Docker Enterprise offering wasn't really attractive to us. But we're easily getting $5/dev/mo worth of value out of Docker Desktop.
And I don't really see this as an open source bait and switch, either. Parts of Docker are open source but Docker Desktop was merely freeware.
That said, I believe in healthy competition, and so it was quite worrisome to me that Docker Desktop seemed to be the only legitimate game in town when it came to bringing containerization with decent UX and cross-platform compatibility to non-Linux development workstations. So I'm happy to see Rancher Desktop arrive on the scene, and very much hope to see the project gain traction. Even if we stay with Docker, they desperately need some legitimate competition on this front in order to be healthy.
> but can you imagine the fallout if Hub disappeared?
> I wish that would actually happen - not forever - if it'd go down for a day or 2 with no ETA for a fix
Do people not run their own private registry with proxying enabled? If Docker Hub went down at this point, I think my company would be fine for _months_. Only time we need to hit Hub is when our private registry doesn't have the image yet.
You can already cache Docker Hub via the official registry container very easily. In fact, given the number of builds, it would be foolish not to do this, to avoid GBs of downloads all the time.
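For reference, a minimal pull-through cache can be sketched with the official `registry` image (the port and mirror URL here are illustrative, not a production setup):

```shell
# Run a local registry that mirrors Docker Hub: any image it hasn't
# seen yet is fetched once from Hub, then served from the local cache.
docker run -d --name hub-mirror -p 5000:5000 \
  -e REGISTRY_PROXY_REMOTEURL=https://registry-1.docker.io \
  registry:2

# Then point the Docker daemon at the mirror in /etc/docker/daemon.json
# and restart it:
#   { "registry-mirrors": ["http://localhost:5000"] }
```

After that, pulls go through the mirror transparently and only miss out to Hub on the first fetch.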
> Hub disappearing would be a 100 times worse than the left pad incident in the npm world
This is really overdramatic. If Docker Inc. went out of business and Docker Hub was shut down, the void would be filled very quickly. Many cloud providers would step in with new registries. Also, swapping in a new registry for your base images is really easy, not to mention the tons of lead time you'd get before Docker Hub goes down to swap them. Maybe they'd even fix https://github.com/moby/moby/issues/33069 on their way out, so we could just swap out the default registry in the config and be done with it.
> Also, swapping in a new registry for your base images is really easy.
This is the exact problem! Sure, MySQL, PHP, JDK, Alpine and other images would probably be made available, but what about the other images you rely on, whose developers might simply no longer care about them, or might not have the free time to re-upload them somewhere new?
Sure, you should be able to build your own from the source and maintain them, but in practice there are plenty of cases when non-public-facing tools don't need updates and are good for the one thing that you use them for. Not everyone has the time or resources to familiarize themselves with the inner workings of everything that's in their stack, especially when they have social circumstances to deal with, like business goals to be met.
In part, that's why I suggest that everyone get a copy of JFrog Artifactory or a similar solution and use it as a caching proxy in front of Docker Hub or any other registry. That's also what you should be doing in the first place, to also avoid the Docker Hub rate limits and speed up your builds, not downloading everything from the internet every time.
Otherwise it's like saying that if your Google cloud storage account gets banned, you can just use Microsoft's offering, while it's the actual data that was lost that's the problem - everything from your Master's thesis, to pictures of you and your parents. Perhaps that's a pretty good analogy, because the reality is that most people don't or simply can't follow the 3-2-1 rule of backups either.
The recent Facebook outage cost millions in losses. Imagine something like that for CI/CD pipelines: a huge number of companies across the industry would not be able to deliver value, work everywhere would grind to a halt, and shareholders wouldn't be pleased.
Of course, whether we as a society should care about that is another matter entirely.
Its only job is to run containers on a particular schedule, no more, no less. There are very few attack vectors for something like that, considering that it doesn't talk to the outside world, nor does it process any user input.
Then again, it's not my job to pass judgement on situations like that, merely acknowledge that they exist and therefore the consequences of those suddenly breaking cannot be ignored.
If you depend on it, you should keep a local copy around that you can host if needed.
Things get abandoned all the time. When you make them part of your stack, you now are forever indebted to keeping them alive yourself until the point in which you free yourself from that burden.
If only we could have a truly distributed system for storing content addressed blobs ... perhaps using IPFS for docker images. This way you could swap the hosting provider without having to update the image references
I’d love for others more knowledgeable to chime in, since this feels close to the logical end state for non-user-facing distribution.
At a protocol level, content basically becomes a combination of a hash/digest and one or more canonical sources/hubs.
This allows any intermediaries to cache or serve the content to reduce bandwidth and increase locality, and could have many different implementations for different environments to take advantage of local networks as well as public networks, much like recursive DNS resolvers. In this fashion you could transparently cache at the host level as well as at, e.g., your local cloud provider to reduce latency/bandwidth.
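The content-addressing idea fits in one line: the identifier is derived purely from the bytes, so who serves the bytes is irrelevant as long as the digest matches (much like the `sha256:` digests Docker already uses for image layers):

```shell
# The digest depends only on the content, never on the host it came from,
# so two mirrors serving the same bytes are fully interchangeable.
printf 'FROM alpine\n' | sha256sum
```

Any party that can produce bytes matching the digest is a valid source, which is exactly what makes transparent intermediary caches safe.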
I’m not super well versed, but I thought BitTorrent’s main contribution was essentially the chunking and the distributed hash table. That is perhaps a good analog for the different layers of a Docker image.
Hub disappearing would be the best thing that happened to Docker in years. People really shouldn’t be running the first result from Hub as root on their machines.
I miss a version of hub with _only_ official images.
Given that it is extremely trivial to run your own container registry, I think the focus on this as some great common good is overstated. As it is, 99% of the containers on it are, for lack of a better word, absolute trash, so it is not very useful as it stands.
VSCodium doesn't add anything of its own; it just builds the VS Code source without telemetry to provide a genuinely FOSS build of VS Code. If VS Code development stopped, then VSCodium would stop too.
> The problem with this is that if Docker Inc goes under, you can say goodbye to Docker Hub: https://hub.docker.com/
So you think that Docker Hub is Docker Inc's entire value proposition?
And if Docker Inc is nothing more than a glorified blob storage service, how much do you think the company should be worth?
Oh, not at all! I just think that it's the biggest Achilles' heel around Docker at the moment, one that could have catastrophic consequences on the industry.
- you can no longer use your own images that are stored in Hub
- because of that, you cannot deploy new nodes, new environments or really test anything
- you also cannot push new images or release new software versions, what you have in production is all that there is
- the entire history of your releases is suddenly gone
I don't pass judgement on the worth of the company, nor is there any objective way to decide how much it's worth, seeing as they also work on Docker, Docker Compose, Docker Swarm (maintenance mode only, though), Docker Desktop and other offerings that may be of no relevance to me or others.
Either way, I suggest that everyone have a caching Docker registry in front of Docker Hub or any other cloud-based registry, for example the JFrog Artifactory one. Frankly, you should be doing that with all of your dependencies, be it Maven, npm, NuGet, pip, gems, etc.
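For the non-Docker ecosystems, pointing clients at a caching proxy is usually a one-liner per tool (the Artifactory-style URLs below are hypothetical placeholders for your own instance):

```shell
# npm: resolve all packages through an internal remote-proxy repo
npm config set registry https://artifactory.internal.example/api/npm/npm-remote/

# pip: same idea via the global index URL
pip config set global.index-url https://artifactory.internal.example/api/pypi/pypi-remote/simple
```

Maven, NuGet, and gems have equivalent knobs in their respective settings files; once set, the public registry going down only blocks packages the proxy has never cached.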
Most widely-used tools are not VC-powered unicorn startups and nobody said they needed to be. You're free to create all the tools you want, while others can raise money to develop theirs.
If open-source helps the product grow and the community benefits then what's the problem? Who lost here? And why are there headwinds with closed-source products anyway? Open-source doesn't mean free, so what's the objection?
Docker the company executed poorly in monetizing their product but there's a lot of undue hate compared to the value it has created. If you don't like it when it's closed-source, and you don't like it when it's open-source, then what do you want exactly?
Counterpoint: yes it does. Hardly anyone pays for external open-source products. Managed solutions, yes, but we've seen multiple times that trying to close an open system so you can charge for it is very unpopular. For example, my workplace has a company-wide edict against using or even downloading the Oracle JDK.
As I understand it, that’s not entirely correct: the Docker Desktop we all use and … well, _just use_ … is built from a number of components that are or used to be OSS: Docker, Docker Compose, docker-machine, and Kitematic, amongst others. Granted, Docker Desktop is more polished than Kitematic was, but it’s also had years of VC money thrown at it, so that it’s almost as bloated in appearance as the Dropbox client.
And that’s sort of the problem. I don’t want the Docker Desktop that exists. I want something that does all of the behind-the-scenes stuff that Docker Desktop does and gives me a nearly-identical experience to developing on Linux even though my preference is macOS. I might even pay a _reasonable_ subscription for it.
But the Docker Desktop that is? Not exactly something that I think is worth paying for.
Free means free - but if versions 1.0 to X are free today and version X+1 is paid tomorrow, that is a bait-and-switch. There's no hate here, it's just that I (and any competent client company) have no way of knowing if they "won't alter the deal any further".
The problem is not with the open-source approach: in chasing growth, they commoditized both areas they could have monetized - the client and the service. If they had charged for either (or both) at first, they wouldn't have gained traction, and some other company would have eaten their lunch.
So they commoditized and failed; or they could've been commercial from the start, someone else would've commoditized it, and they'd still have failed. So what? That's the point of a startup: they tried to build something and it didn't work out as a business model.
The community still benefited greatly from all the development and new projects that came from this. And what is this other company that would've eaten their lunch? How would that company survive exactly?
The only objection seems to be the license change, which is still free for the vast majority. Only larger commercial users have to pay, but that seems commensurate with the value they gain from it. Should companies never try to alter terms as the market changes? I don't see why people are entitled to products and services forever, and then hate the company if they try to be sustainable but also hate them if they abandon it.
> I don't see why people are entitled to products and services forever, and then hate the company if they try to be sustainable but also hate them if they abandon it.
Nobody is entitled to anything. Users aren't entitled to free services/products in perpetuity, but the other side of the coin is that companies also aren't entitled to those users. Nor companies are entitled to being free of any criticism.
Let me distill my thinking: a tool does not have to be a company, or be backed by a single-product company.
IMO, the more successful tools tend to be backed by a maintainer and contributors who work on them in their free time, or by a consortium of companies that don't directly make money from the tool but are willing to put money into it. Docker-like functionality can be provided under such models, so we are not stuck in a perpetual cycle of ${ToolName}, LLC.
If CUDA were a startup, it could be a VC-powered unicorn (not sure it would deserve to be, but it would have a decent shot at monetization). Unfortunately, my knowledge of tools isn't broad, limited as it is to the few tech stacks I'm familiar with.
Honorable mentions: R and Rust, maybe? But I don't see how they'd make the money back (which perhaps is the challenge Docker is running into)
edit: Also SQLite!
2nd edit: I completely misunderstood your question, I think. The answer is "none" - there are no tools that need to be unicorns, at least for those that are downloaded and can be run locally. Those that I listed could be.
Community edition and paid Enterprise plug-ins with support is a standard pattern in the OSS market.
Not quite a bait-and-switch as you put it, and frankly, polishing and idiot-proofing tools for production workloads is expensive and requires competent professionals who definitely need to feed themselves and their families.
It's usually a full open source solution and then the enterprise edition gets introduced later to make money. In other words, it's dumping to gain market share and then later trying to use the cornered market to extract profit.
There is absolutely nothing wrong or illegal with companies offering new additional products.
You're not cornered when you can freely choose to buy the enterprise product depending on whether you get any value from it or stick with the open-source version which remains available and has plenty of competition (like Rancher).
In fact that entire issue with Docker is that they have too little value to charge for and too much competition to defend against, the exact opposite of dumping to clear out the market.
Indeed. I’ve worked with several “enterprises” and often the OSS option was discarded for lack of support for certain enterprise use-case or integration with other commercial products; or for lack of professional support for production issues and SLAs.
Obviously this stuff costs effort and time, you can’t expect that to be available for free.
It’s also a great opportunity to commercially exploit your knowledge and taste and make a living out of it, rather than curse “the powers that be” on a daily basis while struggling with closed software whose only purpose has always been milking as much profit as possible.
People hate it if you take the free toys away, especially after they've started using them because they were useful and free and now have grown used to them.
+ opinions about Docker are generally not universally positive, which doesn't help. It's not a niche thing only used by superfans, but something that has been promoted and pushed widely. Not everybody who uses it likes it, and it's more infrastructure than "cool tool".
Many of us pay for developer tools so it's not a question of wanting to monetise their product. It's the way they've done it. Constantly harassing people to update their desktop versions as a means to drive them to pay is unacceptable.
I have never seen such user-hostile behaviour from an established company like this and I for one will never ever give them a cent to reward them for it.
They also have an “upgrade” button in Docker Desktop that means purchase rather than a version upgrade. I understand they aren’t the first to call buying a licence an “upgrade” (although I’m not sure you get much, software-wise, to warrant the term), but when you close the update nag screen and then reopen it, the button sort of looks like it’s for updating.
Chrome, for example, has its update button in a similar placement.
There were competitors back then; I remember when Docker was announced along with their huge investment. To a large degree that broke us: we could not compete with free. Nor was Docker safe to bundle in our application. Now, about a decade later, it barely would be, but at least others have done it by now.
To me they are the definition of worse is better, but clearly only for a while.
Moving from a freely installed tool to a paid, licensed tool changes the workflow. It's a pain in the ass to manage seats.
Not only that, but it was a bait and switch. Lots of people are using docker because of the previous licensing rules and might have chosen another tool if the new licensing was already in place.
I don't fault docker but I can see why people are annoyed and looking for a drop in replacement as well.
Then everybody realized Docker was just idiosyncratic configuration layered on top of Linux cgroups and if they decide to encumber their base software we will just use the latter directly. And this is how podman/containerd/... came to be.
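podman in particular deliberately mirrors the Docker CLI, so for many local workflows the switch can be sketched as little more than (assuming podman is installed):

```shell
# podman accepts the same subcommands and flags for everyday use,
# talking to cgroups/namespaces directly with no daemon in between.
alias docker=podman
docker run --rm docker.io/library/alpine echo 'hello from podman'
```

The fully-qualified image name is deliberate: unlike Docker, podman doesn't assume Docker Hub as the default registry, which is exactly the configurability people wanted out of the issue linked above.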
There are no gifts, Docker isn't owed billions for trying to monetize cgroups.
Docker, as the most-used container tool, became a utility, and hence it is expected to be free and unencumbered so that it remains a helpful tool instead of a hindrance and liability to daily development for, well, perhaps the majority of developers by now. Trying to monetise it would just ensure that a free and open version takes over, which, I guess, is what we're seeing happen now.
People are generally jealous. Docker feels small enough that "anyone could code it over the weekend", and so they think there is no way this is worth so much money. Of course nobody is trying to actually code something like this over the weekend, or once they do they just shut up.
The cognitive dissonance and entitlement with some is just funny to watch. They have no problem buying stuff from billion dollar companies who treat workers as crap and don't pay taxes, but a smaller developer wants to make money off of their hard work? Nah...
It's probably because a good chunk of the community is composed of privileged people, who never experienced the struggle.
Angry implies we're somehow out of line for believing that they are charging too high a price for a product, especially ones that we provide much of the value to.
Think of those "by the pound" frozen yogurt and toppings places. One day they charge a fair price and sometime later they up their prices and start charging different amounts for different stuff, etc.
I'm not angry if I dip on over to the grocery store and get my own stuff instead. Disappointed that we lost something cool, maybe, but that's on THEIR dumb choices.
Some people think developers should always work for free to enrich the executive and investing classes.
I think that making enterprise pay for stuff that matters for enterprises while still providing open source for society should be seen as a positive, sustainable model.
The open-source monetization playbook is pretty simple: you charge for the higher-tier features that the local Joe doesn't need but big-corporate management/lawyers/compliance mandate.
Anything from centralized authentication/auditing/monitoring/provisioning/reporting to the checklist certifications governments demand.
I for one don't expect to get free tools, but I do expect that the licensing of something (cough Elastic, Docker cough) doesn't change overnight in a very hostile manner, so I feel ripped off.
I for one mostly use GPL or similarly licensed tools for my development workflow, to make sure my build infrastructure doesn't rot and can be completely replicated by someone when I open the code (with an xGPL license, no less).
OTOH, I pay for some good developer tools, since they make my life easier. However, they're not irreplaceable and they're definitely not "code pipeline infrastructure" tools.
It's not a misapprehension: selling tools to tool-makers requires that you walk a very fine line. A lot of closed-source tools have pissed off developers enough to be replaced by usually-superior open-source versions, e.g. Git replacing BitKeeper because kernel developers were not happy with BK.
Outside of very specialized tools, open source tools tend to attract more contributors and quickly overtake incumbents (see compilers). JetBrains is one of the few companies that bucks the trend, and one I gladly give money to regardless of my increasing VSCode usage.
Had Dropbox been marketed at developers only, it would have been a spectacular failure in a world where inotify, rsync, cron and scp already existed.
> JetBrains is one of the few companies that bucks the trend, and one I gladly give money to regardless of my increasing VSCode usage.
Large parts of IntelliJ are open-source as well, and sometimes they do get used in other open-source editors. Afaict they're pretty good citizens in terms of F/OSS.
This is the (in)famous comment https://news.ycombinator.com/item?id=9224 though, as I wrote, I don't think it's particularly special, it's just the one that gained notoriety. I actually responded to one just this week, they're rife.
The optimism bias is a problem even for programmers.