Hacker News

Debian is like democracy: the worst way of producing an OS, except all the others that have been tried from time to time.

BeOS, AmigaOS, Solaris, most other 80s OSes - they’re effectively dead. Windows and macOS have effectively died once already. The BSDs can stall for years at times. Most Linux distributions (including Red Hat's) are typically only as good as the fortunes of the commercial (or occasionally public) entity behind them. In all this, Debian endures, with its slow but inexorable progress, simply because its ideological foundations - not its technical ones - are eminently superior to all the others. Debian contributors don’t do it for the money, so they will be there when money runs out; and they don’t do it for being cool either, so they will be there when OS work is not cool. People will come and go, but the ideal of the “democratic OS” will always be there - hence, Debian will be too.




> Debian is like democracy: the worst way of producing an OS, except all the others that have been tried from time to time.

How many of us would be happy working at a software company with a bug tracker from the 90s, artifact management done with FTP, little to no tooling to manage large changes and do code review, no standards around source control, etc.? Those are symptoms of a software development culture stuck in the past.

It would be pretty frustrating to go home from my day job, where we have a much better development workflow, and try to make a contribution to the Debian project using Debian's tools and processes. And I'm sure I'm not the only software engineer who would feel that way. That can become a real problem for the future of Debian if it's not addressed.


I can't speak for a company, but in Debian source code control seems to be settling on https://salsa.debian.org. For upstream code it is pretty flexible and expectations are set low: a tarball is enough. It also works with git (branches and/or tags), cvs, hg, etc.

There is no need for complexity in order to transfer files; that is why the File Transfer Protocol works well. If you are referring to the public FTP services, those were deprecated three years ago. See https://www.debian.org/News/2017/20170425


Funnily enough, I feel the exact opposite. I see Debian as stable, while most other development these days is like web design: learn a tool or framework and start using it - but wait! There's a better one out now, and hey, look at this shiny new tool. I like stable.


And to be honest, even if you do find that stack objectionable, the tools you wrote to work around the rough edges 20 years ago still work.


> How many of us would be happy working at a software company with a bug tracker from the 90s

How many of us would be happy deciding things in groups with processes defined 100 or 200 years ago? But that's what representative democracy effectively does, every day, in most of the West. People just find ways to cope and move on, since the process is just a means to an end.

> That can become a real problem for the future of Debian if it's not addressed.

Yes and no. Yes, processes should be improved all the time. No, it's not a real problem in the long run - I've been hearing more or less the same story basically since Debian started, but it's still arguably the biggest and most relevant Linux distribution in existence. People come and go; the ideal endures.


And on top of that you also have to involve yourself in endless political discussions about minor topics while dealing with "complicated" maintainers.

Thanks but no thanks


It's ironic that this is the top comment, since the vibe I got from the post was that people waste way too much time discussing ideological things instead of fixing any of the actual problems he encountered as a maintainer.

The thing that's wrong with the Debian project is that ideological stuff no longer attracts talented engineers who are interested in working for free on something as dry as package management.


In the last two months there have been 6 new Debian Developers and 10 new Debian Maintainers. This information is available at https://bits.debian.org

For me, working with packaging is a joy. Once I learned how it works, I came to see a package as a thin wrapper around an upstream code base that, after being built with whatever upstream tooling, is copied into the package alongside its dependency information.


I got a different vibe, given the association with a Churchill quote[1].

I moved from Ubuntu to Debian a few years back, if I need anything beyond the ordinary I can always set it up manually and I am pretty content with that.

[1] https://www.goodreads.com/quotes/267224-democracy-is-the-wor...


Debian is well known to not be a democracy. It is a do-ocracy. Votes are very rare and, except for systemd (2 in 10 years), are quite non-technical. The second key aspect is that you can't force anyone to do anything. The third key aspect is that the project is unable to reach consensus on anything (there is always someone to disagree and make enough noise for the discussion to go nowhere, and we cannot vote).

Most of the points listed by Michael derive from that. It's impossible to change something. We can only do small things for which no cooperation is needed.


> It's impossible to change something. We can only do small things for which no cooperation is needed.

And yet the migration to systemd, which required lots of changes in disparate packages, happened. And the migration away from python2 will happen too, albeit perhaps not as fast as the people driving it would like. And the new source format happened. And for reproducible builds - Debian leads the world.

Methinks "we can only do small things for which no cooperation is needed" might be overstating the case a smidgen. Lots of things that aren't small happen in Debian on a regular basis.


Migration to systemd was painful, and it needed two GRs and a lot of drama to move forward. We had to wait for debhelper 10 (2016) for it not to be a hack in packaging.

Migration away from python2 is wanted by doko, the Python maintainer. If he didn't want that, nothing would move. We were stuck for a long time with Python 2.6 because he didn't want to migrate to Python 2.7. As he is also maintainer of gcc and Java, nobody wanted to vote him out.

I may have missed the headlines around the new source format. Ack for reproducible builds.

What about bikesheds/PPA? Many discussions a few years back but mostly blocked because FTP masters want it to be integrated into DAK and under various other non-technical constraints.


> Debian contributors don’t do it for the money, so they will be there when money runs out; and they don’t do it for being cool either, so they will be there when OS work is not cool.

This is a great sentence, probably one of the most important (and underrated) ideas in FOSS and engineering more generally. A lot of critical work is not lucrative or glamorous - does your project recognize and support the people who do that work?


Windows and macOS died?


Windows NT and Mac OS X are both scratch rewrites. Though the author of the comment may be referring to the fact that Android and iOS dominate the space now.


No no, I was referring to your point precisely: both Windows and macOS had to be rewritten from scratch at some point.


> macOS died

Classic Mac OS died, and what they call "Mac OS" now is really NeXTSTEP.


“Death” seems a bit dramatic in this context.


> “Death” seems a bit dramatic in this context.

Classic Mac OS is about as dead as software can get: it's no longer developed or developed for, there's no backwards compatibility in its successors, and they don't even make hardware that can run it anymore.



What about Fedora?

I went SLS, Slackware, Debian, then Ubuntu for something like 15 years, and now just switched to Fedora + RPM Fusion, and so far much happier with it.


Fedora is the experimentation lab for Red Hat. Test in Fedora, include in Red Hat Enterprise Linux, retire to CentOS. That's a similar model to Debian's Testing, Stable, and OldStable - just with a different flavor.


Fedora will be around as long as Red Hat is, since it exists primarily as the RHEL unstable branch. Considering RH is now part of IBM, that might well be forever, but still, it’s largely about commercial involvement from a given company, like Ubuntu and Canonical, etc.


Arch Linux is much superior to Debian, imo.


While I like and use (and even recommend) Archlinux, Debian’s track record is absolutely venerable. It’s a huge accomplishment to carry so many people and an ecosystem along with you over decades, doing all the unsexy tasks (the number of packages!) and serving as a stable platform in support of user freedom, on top of which others can build nimbler and sexier offerings. The Debian project deserves our utmost respect for its effectiveness in organizing a community around a goal.


I'm a huge fan of Arch Linux and would say that it is the best for personal use.

However, I wouldn't dream of running it on a fleet of servers. I currently run 6 nspawn containers, and it's harder to be sure an update won't break them.

Debian is great if you are running many servers. Its slow rate of change comes from taking care not to break the world.


I like Debian as a community, but I think it would benefit from decentralization to speed up development.

The community is great, but current package management techniques and processes are the equivalent of SVN, with modern approaches like Nix or Guix being the equivalent of Git. In Debian, the whole tree of packages has to be in sync. That works well for Arch, as it is a rolling release, but IMHO it slows down Debian, as it doesn't use their manpower efficiently.

A long time ago, when Nix was not popular, there was a discussion on debian-devel about adopting Nix. It was probably premature. This discussion has resurfaced a number of times. I think they would currently benefit enormously from Nix, or from rolling their own tooling that implements equivalent ideas.

With such a big community and large package set, packages should be decoupled from each other so that they can depend on different library versions and move at their own pace. Nix-like tooling would also make it possible to automate and test most package updates when upstream changes, or to find common vulnerabilities and exposures (CVEs) automatically. Currently, a lot of manual intervention is needed to do this.
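To make the decoupling idea concrete, here is a toy sketch of a content-addressed store. This is not Nix's actual hashing scheme (which hashes the full build inputs, not just a name and version); it only illustrates why hash-named directories let multiple versions of the same library coexist without conflict:

```shell
# Toy content-addressed store: each (name, version) pair gets its own
# hash-prefixed directory, so two versions never collide on disk.
store=$(mktemp -d)

install_pkg() {
    pkg_name=$1
    pkg_ver=$2
    # Derive a short digest from the package identity.
    digest=$(printf '%s-%s' "$pkg_name" "$pkg_ver" | sha256sum | cut -c1-12)
    path="$store/$digest-$pkg_name-$pkg_ver"
    mkdir -p "$path"
    printf '%s\n' "$path"
}

old=$(install_pkg libssl 1.1.1)
new=$(install_pkg libssl 3.0.0)
ls "$store"   # both versions sit side by side, each under its own hash
```

Because the paths never overlap, one package can link against libssl 1.1.1 while another uses 3.0.0, which is the property the comment argues Debian's single in-sync tree lacks.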

This would also be advantageous for end users, as they could mix and match packages from different channels. PPAs are an inferior solution.


These are great points. You might consider bringing it up with Debian developers again.

Despite some of the difficulties the author mentioned in the article, Debian has successfully spearheaded some ambitious project-wide initiatives, like reproducible builds. So I don't think it's out of the question that they could vastly improve the packaging experience for both users and developers with something like Nix or Guix.

Of course the biggest question is: how does one get there from here? For example-- can the Nix packaging approach coexist and play nice with the current Debian packaging system for years to come?


> can the Nix packaging approach coexist and play nice with the current Debian packaging system for years to come?

Yes. Nix, or an equivalent implementation like Guix, stores all packages in a separate tree (e.g. /nix). In fact, Nix can be used outside NixOS; it's quite popular on some distros and on macOS.

Hence, rolling out Nix or an equivalent tool can be done smoothly. Both can co-exist nicely.


It really sounds quite workable as a solution, then. IIUC anyone in Debian could start work on this at any time, with really no disruption of the current system.

Of course the devil is in the details-- graphics drivers, bootstrapping, etc.


Debian is one of the only distributions releasing images for 32-bit x86, old PPC, and other less popular architectures[1]. Arch just targets 64-bit x86.

[1] https://wiki.debian.org/SupportedArchitectures


At this point there are no porters for i386 and it no longer has the porter waiver from the release team, so it is likely that Debian bullseye will not support 32-bit x86. Old PPC and many other architectures were dropped from releases many years ago. There have been no replies to the roll call for porters yet, so it looks like Debian bullseye will just be amd64, unless people are replying privately for some reason.

https://lists.debian.org/msgid-search/CAM8zJQvyaL0quk57Tyzqb... https://news.ycombinator.com/item?id=25166634 https://news.ycombinator.com/item?id=24974822


I expect all replies are private. There are plenty of people working on ARM, just to mention one other arch.


Arch isn’t x64-only. Manjaro is an Arch derivative, and is the preloaded OS on Pinebook Pros (they used to preload Debian).

The switch is disappointing for me, since I’d prefer Debian with a minimalist WM. However, Manjaro + KDE is good enough for light usage, and definitely easier for more mainstream users.


Sounds like Arch is still x64 only, if you have to use a derivative to use other architectures?


Technical comparisons are not relevant to my comment.

What is Arch’s charter? How are its leaders elected? How are its processes defined?


I'll start off with hyperbole: we don't have any of that.

But that isn't really true. Arch has historically always been a DIY distribution with an equally DIY contribution structure. Our leaders have been BDFLs for close to two decades, until the process was formalized and we held our first project leader election this year.

https://wiki.archlinux.org/index.php/DeveloperWiki:Project_L...

https://www.archlinux.org/news/the-future-of-the-arch-linux-...

There isn't any RFC process, just some consensus-building on the mailing list around who wants to work on what.

The only other formalized structure is the Trusted Users, who are elected in a formalized process.

https://aur.archlinux.org/trusted-user/TUbylaws.html

There are probably a lot of downsides to a less formalized process, but it allows Arch to move fairly rapidly and decide things without a lot of internal politics.


Let me make a prediction: if Arch survives as long as Debian has, gaining the same number of contributors as Debian got, by the end its internal organisational structures will look a lot like Debian’s. At the moment, it looks like it’s where Debian was about 20 years ago.


Debian is only 9 years older than Arch, though. Debian was 7 years old 20 years ago; Arch is 18 years old this year.


What does any of that matter if it doesn't lead to technical superiority? You use an OS for its technical qualities, not for its charter.


When did technical superiority ever matter when it came to an OS?

You're living in a world where OS/2 killed Windows 3.11 for Workgroups, where Sega's Master System outsold the Super Nintendo.


Yes, but in neither of those cases did the winner get there because of "better democracy" or governance.

Win 3.11 was more user friendly than OS/2.

Not sure about the Sega/Nintendo comparison; the Master System was comparable to the NES, and the SNES to the Genesis/Mega Drive.


For personal use, maybe. For fleet usage, production, set'n'forget servers and any critical role, no.

I can provision 100+ Debian servers in any configuration I want in under 15 minutes by utilizing the features of the OS itself, and forget about them after setting them up.
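For context, the stock Debian mechanism for this kind of hands-off provisioning is installer preseeding: answer the installer's questions up front so each host installs unattended. A minimal sketch might look like the following (all values are illustrative examples, not a recommended configuration):

```
# Example preseed.cfg fragment for unattended Debian installs.
d-i debian-installer/locale string en_US.UTF-8
d-i netcfg/get_hostname string unassigned
d-i mirror/http/hostname string deb.debian.org
d-i passwd/root-login boolean false
d-i partman-auto/method string lvm
d-i partman/confirm boolean true
d-i pkgsel/include string openssh-server
d-i finish-install/reboot_in_progress note
```

A file like this can be served over the network or baked into installer media, so the same image provisions an entire rack identically.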

We actually lost one Debian server in a system room (in an unlabeled rack of identical cluster servers), and it was working flawlessly when we found it again months later.


I love the idea of Arch Linux and it probably is worth everyone who's really interested in Linux trying it at least once. But it's also the only distro I have used for probably more than 10 years where I found myself having to edit my X config to try and get something to work. At that point I just backed away from the keyboard slowly and realised it wasn't worth my time.

I do still have a throwaway cheap VPS with Arch, but even then I can't recommend it because the security story is largely non-existent.


Can you at least expand on why that's your opinion?


Up-to-dateness, no unnecessary distro-specific patching.

EDIT: My comment was ambiguous, I didn't mean that there are no Archlinux-specific patches; rather that there's more of an effort with Archlinux to let upstream be upstream.


No "unnecessary" patching, like seriously? I see so many patches in the Arch packages repo I can't even count them all:

https://github.com/archlinux/svntogit-packages/search?q=patc...

https://github.com/archlinux/svntogit-packages/search?q=sed

But putting that aside, all distros need huge amounts of patching to make each package get along with the rest of the system. Without patching, many of them won't even build in the first place.


There are duplicate PKGBUILD files in the repository, so depending on the PKGBUILD and where it is, there might be 3 results for every patch. In many cases there are two hits per patch.


You have not seen what Debian did to exim, yet.


> Up-to-dateness

I've read this at least a dozen times, mostly on HN, and mostly from Archlinux advocates. Many people seem to ignore that Debian testing and Debian unstable are continuously updated (rolling releases). Please stop propagating false claims that taint the Archlinux community's reputation.


There are some reasonably good metrics available at Repology:

Pretty picture: https://repology.org/repositories/graphs

Numbers for the X axis: https://repology.org/repositories/statistics/total

Numbers for the Y axis: https://repology.org/repositories/statistics/newest

Summary for people who like neither pictures nor tables:

* Debian Unstable (31k) has way more packages than Arch (9k without AUR), but the AUR (57k) has way more packages than Debian.

* The total number of packages that are at the latest upstream version is about equal for Debian (17k) and the AUR (15k). Arch (without the AUR) has far fewer up-to-date packages in total (7k).

* Arch has about the highest percentage of fully updated packages (85%), Debian is lower (72%), and the AUR is even lower (69%).

* NixOS rivals the AUR in total number of packages (53k) and has a big margin in total latest upstream versions over everything else (24k vs. Debian's 17k), but does not have as high an update percentage (79%) as Arch.

The numbers are not perfect because of split packages and alternative packages (e.g. the AUR often has additional `-git` variants), but they give a rough idea.

Hope this helps!


I used to run Debian testing, and for a short time sid, before switching to Arch, and I had to reinstall them every 6 months at worst to every two years at best, because of packages always breaking, the system becoming unbootable after updates, etc...

In contrast, Arch has been both up-to-date and rock-solid - my current install has been carried over through three PCs since 2015.


I used to run Arch (2011-2014ish) on a personal server. I'd generally go a few months without updates, and large batches of updates were often painful, requiring manual steps... like the move to systemd, merging /bin and /usr/bin, and others I've forgotten.

I have also had update issues with Ubuntu. There was a bug with Ubuntu 20.04 where a server would lose its default route when it had multiple network interfaces. And another bug where, after an update, network interfaces were renamed on a reboot rendering the server inaccessible. Is having a server with more than one network interface that unusual?

I have yet to find a distribution where updates are not problematic.


NixOS is designed so that updates won’t break the system in non-reversible ways. If an update didn’t work out well for you, you can always roll back to the previous version and withhold the update until you’re ready. I’ve used NixOS as a daily driver myself for years, and haven’t needed a reinstall even once.


That's neat and good to know! I think I'll check it out.


Never been brave enough to run Arch, but I've had Manjaro in a VM as a torrent/vpn/media server and it's been rock solid for like 4 years. I use ubuntu LTS for most things but I can't complain about Arch/Manjaro stability.


Have you tried running Debian unstable (Sid)? If not, I'd suggest you try it before advocating that it's comparable to running Arch.

I think the parent comment was silly, but let's not pretend that Sid is meant to be used as a daily driver.


Testing is usually around 2 weeks behind unstable, and I use it as my daily driver.


I'm not familiar with any evidence of that, but I'd like to point out that Debian unstable has a higher percentage of outdated packages than Arch.

Compare "outdated projects percentage":

- https://repology.org/repository/debian_unstable

- https://repology.org/repository/arch


Looking at that data, Debian unstable maintains roughly 3x more packages than Arch official.


That's correct, although "maintains" is questionable since Debian testing/unstable aren't meant to be used as daily drivers.


That's not how it works. Debian maintainers maintain packages from the very beginning of the process. They don't just wait until a package has entered stable.

Moreover, when comparing different distributions, it would make more sense to have a closer look at the release process rather than compare how they label their packages. Since Debian tests its packages for a longer period of time than Arch, Debian testing should be just as stable as Arch stable.


I think we're using the word "maintains" differently. Packages in Sid have no guarantees that they'll work, no security team, and no support system if you get stuck. Sid isn't meant to be used as a daily driver, and if your computer stops working that will be expected in Sid but a gigantic bug in Arch.

> Debian testing should be just as stable as Arch stable

Sure, but how up-to-date is Debian testing when compared to Arch?


> Packages in Sid have no guarantees that they'll work

Guarantee is a strong word. Can Arch guarantee this? Occasional breakage is bound to happen with bleeding-edge rolling releases.

> no security team

Weaker guarantees than stable, but that doesn't mean Debian doesn't handle security issues in unstable or testing. It'll be too late if they start dealing with security issues once a package enters stable.

> no support system

Actually, support is the same for any Debian release. https://www.debian.org/support

> Sid isn't meant to be used as a daily driver

That shouldn't matter much for people who're willing to use Arch as a daily driver.

> if your computer stops working that will be expected in Sid but a gigantic bug in Arch

A gigantic bug, but one that still happens nonetheless.

> Sure, but how up-to-date is Debian testing when compared to Arch?

According to Repology, Debian testing has twice the number of latest packages as Arch official [1]. Considering that packages of higher importance tend to be more actively maintained, I'd assume that Debian won't be significantly behind the latest release for packages that exist in both Arch official and Debian.

[1] https://repology.org/repository/debian_testing


Except the kernel, unlike in Arch.


I've run it on all my desktop and laptop computers for 20 years and it's fine. However, the only package I upgrade automatically is Chrome. I do a full upgrade once or twice a year, and in the meantime I only upgrade packages as needed. The whole point of versioned dependencies is that you don't have to adhere to one particular snapshot.

Automatically upgrading every day is not smart, since then you're virtually guaranteed to catch every breaking change. See https://wiki.debian.org/DebianUnstable#What_are_some_best_pr...
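For reference, the selective-upgrade workflow described above can be done with stock apt; the package names below are only examples:

```shell
# Refresh package lists without upgrading anything.
apt update
# Review what would change before committing to a full upgrade.
apt list --upgradable
# Upgrade only the packages you actually need right now.
apt install --only-upgrade chromium
# Keep a sensitive package pinned during future upgrades.
apt-mark hold linux-image-amd64
```

`apt-mark unhold` releases the pin again when you are ready for the full once-a-year upgrade.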


I'm a fan of both operating systems, but I've had a much more pleasant time with the updated versions of packages Arch ships by default than with heading over to testing or unstable on Debian - in other words, the newer packages on Arch felt far more robust than the unstable packages on Debian. This leaves aside the fact that stable Debian was far more stable than Arch for me.


Your comment is absurd. From the very names of "testing" and "unstable" you can tell that they're not meant for normal usage.


By Arch standards Debian “unstable” or “testing” would be branded “stable.” If you can choose Arch stable for normal use, then you can do the same for Debian unstable too.


Words can mean different things in different contexts - “stable” and “unstable” in Debian refer to whether or not the major version numbers of included packages are going to change, not to how buggy they are.


https://www.debian.org/doc/manuals/debian-faq/choosing.en.ht...

> If security or stability are at all important for you: install stable. period. This is the most preferred way.

> If you are a new user installing to a desktop machine, start with stable. Some of the software is quite old, but it's the least buggy environment to work in.

https://www.debian.org/doc/manuals/debian-faq/choosing.en.ht...

> Testing has more up-to-date software than Stable, and it breaks less often than Unstable. But when it breaks, it might take a long time for things to get rectified. Sometimes this could be days and it could be months at times. It also does not have permanent security support.

> Unstable has the latest software and changes a lot. Consequently, it can break at any point. However, fixes get rectified in many occasions in a couple of days [...]

https://wiki.debian.org/DebianUnstable#What_are_some_best_pr...

> The most important thing is to keep in mind that you are participating in the development of Debian when you are tracking Testing or Unstable.

...

Since we're comparing Debian with Arch, I'll add that Arch also has testing and staging repositories, in addition to the ones meant for normal usage.


Superior documentation and AUR. You can find almost anything in AUR, including all the proprietary crap, and install it all with one command. It's also very easy to write a PKGBUILD and upload it to AUR if you don't find what you need, because the package format is so much simpler.

Here are two specific examples which other distros might struggle with:

repackaging a tarball to a proper system package which is tracked by the package manager

https://aur.archlinux.org/cgit/aur.git/tree/PKGBUILD?h=dotne...

building a proper system package from the source (with one command! no `configure; make; make install` lunacy):

https://aur.archlinux.org/cgit/aur.git/tree/PKGBUILD?h=sway-...
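To illustrate how simple the format is, here is a minimal hypothetical PKGBUILD; the package name, URL, and checksum are invented placeholders, not a real AUR package:

```shell
# Hypothetical PKGBUILD repackaging an imaginary "hello-tool" upstream
# tarball; name, URL, and checksum are invented for illustration.
pkgname=hello-tool
pkgver=1.0.0
pkgrel=1
pkgdesc="Example: repackage an upstream release tarball"
arch=('x86_64')
url="https://example.com/hello-tool"
license=('MIT')
depends=('glibc')
source=("https://example.com/hello-tool-$pkgver.tar.gz")
sha256sums=('SKIP')

package() {
    cd "$srcdir/hello-tool-$pkgver"
    # Install the prebuilt binary into the package root.
    install -Dm755 hello-tool "$pkgdir/usr/bin/hello-tool"
}
```

Running `makepkg -si` in a directory containing a file like this builds the package and installs it through pacman, so the result is tracked like any other system package.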


> Superior documentation and AUR. You can find almost anything in AUR, including all the proprietary crap, and install it all with one command. It's also very easy to write a PKGBUILD and upload it to AUR if you don't find what you need, because the package format is so much simpler.

I'm a Debian fan, but these two points are very true. Arch documentation is great, and writing PKGBUILD files is easier than packaging for distribution via Apt. I don't even use Arch, but I still release for it because it's easy.


How do you know if someone uses Arch Linux.....


The best part of not using Arch Linux is not having to deal with the Arch Linux community.



