The FSF's insistence on "GNU/Linux" has always been childish, and by now it's just silly. For years the proportion of GNU software that actually gets run has been dwarfed by code from Mozilla, KDE, GNOME, and other major desktop projects. The GNU/Linux argument has never been persuasive, and I'd say it's high time they let it go.
Just so you know, GNOME stands for GNU Network Object Model Environment. Not that that negates your point, but I still disagree with you for other reasons. Firefox is in no way part of the operating system itself; I would argue that Bash and the coreutils, on the other hand, are. Also keep in mind that the GNU Project created the whole idea of a Free alternative to Unix, and created the culture and licenses that made it possible. For example, Linux itself is licensed under the GNU General Public License v2. They provided every part of the operating system except the kernel (again, the coreutils and Bash, but gcc, emacs, and screen are probably also worth mentioning). Linux as we know it would absolutely not exist if it weren't for the GNU Project. I see people disrespecting the FSF constantly and I really don't understand why you would when you know how much they've done for Free software.
GNU and the free software movement aren't going to die just because most people think GNU/Linux is a ridiculous thing to call an operating system.
The core of GNU's argument is that they need "GNU/Linux" so people will recognize the importance of the movement, but the way people conduct this naming campaign is completely counterproductive.
I don't mean to marginalize the GNU project, but there were already free *nixes out there. Starting over from scratch isn't much of an accomplishment beyond the license...
>and created the culture and licenses that made it possible
The GPL has ultimately done more harm than good for open source. The only time proprietary software vendors are forced to release anything is when they accidentally mess up because they didn't understand what they were getting into. This happens rarely, and when businesses get burnt they become more apprehensive about doing anything with open source in the future. I know because I still see this happening today.
When people aren't scared of a codebase, they can start using it, and if they can modify it without worrying about legal repercussions, they're not afraid to modify it. Sure, they don't have to give back, but if you force them to, they're not even going to make the modifications in the first place.
If someone wants to build something closed source, they're going to build something closed source even if it means re-inventing the wheel to avoid the GPL. I'd rather have the wheel be a BSD wheel, so even if product X will never be open source, at least it will be built with good wheels. I'm a realist who knows closed software isn't going anywhere, and at the end of the day I'd like all software to be as good as it can be.
>The GPL has ultimately done more harm than good for open source.
This claim seems to defy the simplest explanation given the historical success of Linux, so I think you need to be a bit more rigorous in substantiating your argument.
I'll agree that strictly enforcing 'giving back' is an unnecessary imposition and cost for adopters when (a) there are many similar, substitutable goods and (b) open source has momentum: there is now evidence that giving back can be positive and in adopters' interest.
However, neither of those were the case in the era when the GPL was born. The GPL should at least be given credit for helping bootstrap the present state of open source.
From paragraph 2:
"The development of Linux is one of the most prominent examples of free and open source software collaboration; typically all the underlying source code can be used, freely modified, and redistributed, both commercially and non-commercially, by anyone under licenses such as the GNU General Public License."
Also from paragraph 2:
"In 1983, Richard Stallman, longtime member of the hacker community at the MIT Artificial Intelligence Laboratory, announced the GNU project, saying that he had become frustrated with the effects of the change in culture of the computer industry and its users.[8] Software development for the GNU operating system began in January 1984, and the Free Software Foundation (FSF) was founded in October 1985. An article outlining the project and its goals was published in March 1985 titled the GNU Manifesto. The manifesto also focused heavily on the philosophy of free software. He developed The Free Software Definition and the concept of "copyleft", designed to ensure software freedom for all."
These kinds of attribution are infinitely better than a "GNU/" tacked thoughtlessly onto a generic term.
> This happens rarely, and when businesses get burnt they become more apprehensive about doing anything with open source in the future.
And why is that a bad thing? When they "accidentally" copy a random image on Google Image Search and publish it in a flyer, they get sued by the maker for copyright infringement, and they become more apprehensive to copy random images in the future. This is a good thing. Businesses need to learn that there are laws and that software is subject to licenses that they should read. If they fuck up it's their own fault. Do you really want your user base to consist of people who don't read your license and don't try to honor your rights and only care about their own profit?
I think the point is that the GPL has hurt open source because of the businesses' reactions. Of course it's vaguely good when businesses learn to be careful with software licensing, but if being careful entails not using open source, open source loses /because of the GPL/.
>Businesses need to learn that there are laws and that software is subject to licenses that they should read
I'm not saying burnt in a lawsuit against them for infringement; it could be as simple as abandoning technologies based on a recommendation or clarification from their legal counsel.
What the GNU folks brought to the table is the GPL. It negates the incentive to make proprietary forks. The ecosystem in which Canonical, Red Hat, IBM, and others thrive would not be possible without it.
> The GPL has ultimately done more harm than good for open source
citation needed
> Sure, they don't have to give back
And that's probably why *BSD is such a rich ecosystem when compared to Linux distros.
> if product X will never be open source, at least it will be built with good wheels.
I would prefer not creating such incentive to proprietary software. As a user, I value my rights more than I, as a programmer, value my power to restrict my users' rights.
> If they can modify it without worrying about legal repercussions, they're not afraid to modify it
Oh, they can modify it. They just can't redistribute the modified code as proprietary software (or under a non-compatible license). But there's nothing stopping you from taking emacs code, for example, and modifying it any way you want. Actually, that's the point of Free software.
I never said they couldn't; I said they couldn't without worrying about legal repercussions. Even in the cases where they're in the right, they have to worry carefully about the license. If I develop on an open source stack, it's prohibitively expensive to deal with the overhead of fixing the underlying components rather than putting workarounds in my own code, even if the former would benefit everyone the most.
I read your comment a couple of times and I must admit that I just don't get what you're trying to say. As I said before, developers don't have to worry about legal repercussions unless they intend to distribute the modified code under a license that would violate the terms of the GPL.
So if you're developing an open source stack for your own personal needs, modify the underlying components any way you want to. If you want to redistribute it, though, you have to give the users of your software the same rights that you were given when you used those components. It's as simple as that.
Not true; the various "Free" BSDs came much later. As far as I know, BSD existed but wasn't Free. If it had already been free, RMS would probably not have launched the GNU project.
In case anyone's wondering about the actual timeline:
September 1983: RMS announces the GNU project. (Work actually begins in January 1984.)
March 1987: first release of GCC.
June 1989: first release of bash.
May 1991: FSF announces that work on GNU Hurd (their own kernel) "has begun".
June 1991: first free-as-in-speech BSD release ("Net/2").
September 1991: first release of Linux (0.01) -- not free-as-in-speech.
October 1991: first release of glibc.
March 1992: first release of 386BSD (0.0). July 1992: release 0.1, much more usable.
April 1992: USL v BSDi lawsuit filed. (This was a big obstacle to early adoption of the BSDs.)
December 1992: first free-as-in-speech release of Linux (0.99).
April 1993: first official release of NetBSD (0.8).
November 1993: first official release of FreeBSD (1.0).
January 1994: USL v BSDi lawsuit settled.
April 1994: FSF announces that the Hurd boots (but doesn't do anything much else).
May 2011: still waiting for the first truly usable release of GNU Hurd. No one seems to think it's very likely that there will ever be one.
So ... when the GNU project was announced, there weren't any free Unixes of any sort out there. It's possible that GNU really did create the whole idea of a free alternative to Unix. But the first free Unixoid kernel that was actually released was a BSD, and the first free whole Unixoid OS that was actually released was a BSD.
I think he still would have, because what he set out to create is a 100% copylefted system. It's in the manifesto:
> Everyone will be permitted to modify and redistribute GNU, but no distributor will be allowed to restrict its further redistribution. That is to say, proprietary modifications will not be allowed. I want to make sure that all versions of GNU remain free.
The FSF actively embraces non-copyleft or weak-copyleft licenses. The GPL3 was explicitly designed to be compatible with the Apache license, which is a permissive license, far weaker than the GPL2's copyleft.
Proprietary doesn't mean secret. There are codebases that are made available to customers for inspection, under NDAs and with all copyrights "reserved" by the owner. Looking at code is not the primary benefit of open or free software.
I use the command line a lot, but really, how big a contribution are cat, ls, find, etc.? It's not like these are extremely complex or difficult programs; the GNU version is still the dominant one mainly because it works fine and it had momentum. I acknowledge that these have grown to be relatively advanced and that rigging up a replacement may not be an overnight job, and I appreciate the effort GNU developers have put into them, but it certainly wouldn't leave us desolate if we all had to stop using GNU code for some reason. There are already non-GNU implementations of these programs.
Other than that, how much of my normal CLI usage is attributable to GNU? I'll give screen. What about OpenSSH, Python, htop, pacman, etc.? I use these a lot and they are not from GNU. If I am primarily running Python programs on my Linux box, should I call it Python/Linux since the Python Foundation is the most significant contributor to my userspace experience?
> It's not like these are extremely complex or difficult programs
Have you looked at cat.c?
You'll be surprised how complex and optimized even the supposedly simple ones are.
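For contrast, here's roughly what a naive cat looks like -- a minimal sketch of my own, not GNU's code. GNU's cat.c is many times this size once you add the -A/-n/-s/-v family of options, a fast path that reads in multiples of the output device's block size, and thorough error handling:

    #include <stdio.h>

    /* Naive cat: copy each named file (or stdin if none) to stdout. */
    static int copy(FILE *f)
    {
        int c;
        while ((c = getc(f)) != EOF)
            if (putchar(c) == EOF)
                return -1;
        return ferror(f) ? -1 : 0;
    }

    int main(int argc, char **argv)
    {
        int status = 0;
        if (argc < 2)
            return copy(stdin) ? 1 : 0;
        for (int i = 1; i < argc; i++) {
            FILE *f = fopen(argv[i], "r");
            if (!f) { perror(argv[i]); status = 1; continue; }
            if (copy(f)) { perror(argv[i]); status = 1; }
            fclose(f);
        }
        return status;
    }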
Multiple decades of optimization and cross-platform polish are nothing to sneeze at. Try working against the Darwin or Solaris userland sometime if you want to experience what the alternatives look like. Not pretty at all.
The point isn't that GNU programs aren't useful -- it's just that the claim that GNU is so important it should be included as a mandatory prefix any time anyone mentions "Linux" in a context that is not kernel exclusive is grating and silly.
What about clang? It isn't part of the OS. It can't even build the OS yet. It's a nice project, and it might be a solid competitor to gcc in the near future, but the fact that it exists hardly refutes the point that gcc is a large, crucially important component of a modern Linux distribution.
(I should point out that I too find the whole GNU/Linux tiff silly, but equally silly are the folks who try to write the FSF out of the picture out of spite.)
gcc isn't really "part of the OS" either. Many distributions don't even include it in the default install; Ubuntu is a prominent example.
I'm not trying to write FSF out of the picture. I appreciate their contributions and surely things like gcc, gdb, and emacs are hefty achievements. I am grateful to the FSF and the GNU Project for its legacy and its direct benefit to myself in terms of code provided. However, I don't feel that these contributions entitle GNU to a special prefix on the OS name any more than it entitles KDE, Mozilla, Xorg, or anyone else to a special prefix. I use code from all of those parties and quite appreciate that code, too.
No one wants to write them out of the picture. It's just not realistic to credit all the vital contributors in the name. Why is GNU not content with the CREDITS file?
The generic term for the Linux ecosystem is not the venue for that. It might have helped if it were done early on, but GNU and the free software movement are in no danger of falling into obscurity at this point.
Anyone who might have followed the GNU/ to find out what it is would have found out about the free software movement in the process of learning about Linux. That was my introduction to it, and this naming campaign has given me a negative impression.
I'm glad people are working on replacements for GNU so I don't see someone saying "you mean GNU/Linux" every time anyone mentions Linux in public.
The point isn't that Linux isn't useful -- it's just that the claim that Linux is so important it should be included as a mandatory suffix any time anyone mentions "GNU" in a context that is not kernel exclusive is grating and silly.
On a more serious note, Android is a great example of a non-GNU Linux (or not-as-much-GNU Linux).
The company that made the table saw that cut the lumber for my house doesn't get partial naming rights to my house either; I don't receive mail addressed to "Table Saw Co./My House".
I looked at the article, and it seems to strengthen the argument for calling it a GNU/Linux distribution. The top contributors, as per the chart:
1. kernel 9%
2. gnu 8%
3. kde 8%
4-5. java, mozilla 6%
6. gnome 3%
If you combine gnu with gnome, we have
1. gnu 13%
2. kernel 9%
Also considering that java/mozilla are platform agnostic, the candidate names we have are
linux-gnu-kde OR gnu-linux(-kde)
If you are not running KDE, then gnu-linux seems like a very strong candidate for the name. The argument for calling it just Linux would revolve around the availability of GNU software on non-Linux systems in some form or other, making GNU a not very distinctive moniker. But Mac OS and the BSDs don't really carry as much GNU as Linux distros do.
But the kernel is running all the time when booted up. A better argument would be that only a fraction of GNU software is active at any moment, so by percentage of GNU vs. kernel code executed in any given time interval, the kernel should win handily. The pie chart, as it stands, actually makes a case for calling it GNU/Linux.
It's all irrelevant because operating systems in general are NOT named after their prominent components. It does often happen that the kernel and the OS have the same name, but that is not because the OS is named after the kernel but rather the other way around--that is, the creators of the OS name the OS, and then the kernel is known as the X-kernel, where X is the name of the OS.
If I choose to take the Linux kernel, GNU utilities, and assorted other free software and put them together to form a complete OS, I can call it whatever I want, and BY DEFINITION that is the correct name for my operating system, since I'm the one who put it together.
Well yes. Debian, Ubuntu, SuSE, etc are the names of the OS distributions.
When I'm generalizing based on the stack people expect, I usually say Linux when I mean the common setup of glibc, GCC (or maybe Clang), and the GNU utils on a Linux kernel (and I'm usually excluding Android when I say that), or I say it's a UNIX variant when I want to include Mac and BSD or any other generic POSIX-compliant OS.
It's true that the kernel is loaded all the time (though not usually running more than a small fraction of the time), but so is glibc. It's far from obvious that more cycles are spent in the kernel than are spent in glibc (though of course in both cases the goal is to run as little as possible, allowing more application code to run).
It's also worth considering that only a fraction of the kernel's LOC are even compiled in the average distribution's kernel (are the LOC dedicated to Itanium, Alpha and PA-RISC support really useful to Ubuntu?).
> It's also worth considering that only a fraction of the kernel's LOC are even compiled in the average distribution's kernel (are the LOC dedicated to Itanium, Alpha and PA-RISC support really useful to Ubuntu?).
Does the kernel spend more lines of code on platform support or on drivers? I was under the impression that drivers made up the vast majority of the kernel, most of which are enabled in distribution kernels.
> A better argument would be that only a fraction of GNU software is active at any moment, so by percentage of GNU vs. kernel code executed in any given time interval.
The gnu tools are a very small part of the overall system, and they are probably some of the most replaceable (and replaced). The kernel is a lot less so, and while it's comparable in size, it's also the common thread that people want to refer to when they say "linux systems."
Besides, neither the kernel nor the gnu tools are used all that much by a regular user. The desktop environment, the browser, office tools, etc. are much, much more visible to the end user.
In any case, it makes a lot more sense to say just "linux" than to arbitrarily tack "gnu/" in front just to placate some pedants. Say "gnu" when you're talking about gnu, not when you're talking about linux.
But does calling it GNU/Linux add any specificity over calling it Linux? I can't think of a major distro that doesn't use the GNU components. Therefore calling it GNU/Linux is rather like calling it Cocoa/Mac OS X. The GNU adds no additional information.
"I picked Ubuntu natty (released in April) as a reference, ... and am considering only the “main” repository, supposedly the core of the distribution, actually packaged by Ubuntu and not repackaged from Debian."
I'm not sure I understand Ubuntu's "main," or what's "packaged by Ubuntu and not repackaged from Debian." Does this mean the author is not counting code that appears in both Debian and Ubuntu? Because I would expect a lot of that code to be GNU, and I would expect GNU code to make up a small proportion of the Ubuntu-only code.
But that must not be what it means, right? I thought nearly all of Ubuntu's "main" was just repackaged from Debian, in which case the remainder would be a very strange sample to draw on.
Pretty much everything that is in Ubuntu is also in Debian. What Ubuntu "main" means is that those packages are actually built by Ubuntu with their own patches and quality control, whereas "universe" has everything that is not in main and is repackaged from Debian to Ubuntu by the community, without official support.
The assumption is that the "main" archive is a fairer representation of the important stuff in the distribution than including "universe" as well. You actually have to manually enable universe on an Ubuntu install to be able to use it.
"What Ubuntu "main" means is those packages are actually built by Ubuntu with their own patches and quality control."
I guess this is where I was confused. My understanding is that the packages in main still originate as .debs from Debian (i.e., they are not directly packaged from upstream sources), even if Ubuntu applies their own patches and such, so it sounds strange to say that these are not "repackaged." (I can't find an Ubuntu page confirming this, but Wikipedia says, "Ubuntu packages are based on packages from Debian's unstable branch..." [1])
So, I think the mapping from your terminology to mine is:
"built by Ubuntu" => "repackaged by Canonical from Debian"
"repackaged from Debian" => "repackaged by Ubuntu community from Debian"
The author is mistaken: as you point out, there would be nothing to look at (well, effectively nothing) if the code from Debian were removed. The methodology is also a bit suspect: the main, contrib, and non-free repositories are divided along licensing lines, so just looking at main doesn't really paint a full picture.
I think this would be very interesting to see. I know Plan 9 from User Space could provide cat, ls, grep, sed, etc., although the parameters are different, so most scripts would probably break. Someone mentioned Clang as an alternative to gcc. The biggest hurdle would probably be glibc.
In reality, I think even if you came up with alternatives that had the same functionality as the GNU software, so many Linux programs have probably assumed GNU that patching everything to work would be a colossal undertaking.
I remember back in the early 90s, I often had the choice of using the local tools or the GNU tools. I went for the GNU versions every time: they were stable, predictable and worked the same everywhere I went. It felt like they'd built their own Unix then, despite the fact that Hurd was still not much more than some really ambitious design documents.
My point is, I fully appreciate the importance of the GNU work to the development of free and open-source software, especially Linux. That said, I can't see that most of the arguments being put forward here make any sense. I mean, should everything compiled with gcc have a GNU prefix? (GNU/Google, anyone?)
If the point is supposed to be "why insist on calling it GNU/Linux when xx% of it isn't even produced by GNU?", it's an ill-made point. Most of the userland (and certainly what we could call the "core" userland, especially in terms of development tools) is GNU software. The system would be pretty unusable without libc, gcc, ld, make, etc.
I don't think the BSD libc supports Linux specifics such as fanotify; a libc is usually tied to a kernel. (I haven't looked at the BSD libc, so I may be wrong.)
My point isn't that there aren't competent replacements. Obviously the BSDs in general have different userlands, as does Solaris, and so forth. But those things -are- the userland on a Linux system.
I tried hard not to make it about that, but about the fragmentation of the sources of software in a modern distribution. But to your point, with llvm we're increasingly at a point where libc, gcc, ld, make, and many other GNU staples actually do have very competent replacements. As I pointed out in the post, of the big GNU projects only gdb really has no replacement.
GNU's argument for why it deserves specific recognition over other equally and more significant contributions is uncompelling. Repeating it isn't going to make it more so.
Some friends of mine were working on a project to make a non-GNU linux distribution (well maybe not actually for distribution) just to see if they could: https://github.com/burke/non-gnu-linux/wiki
I don't think much has happened with it recently, but they were working to build the kernel with icc (the Intel C compiler).
As of a few weeks ago, Gentoo-Bionic is up at https://gitorious.org/gentoo-bionic which is, as far as I know, the first attempt at a Linux system built with a BSD-licensed libc, Bionic being the Android libc. All the other glibc alternatives, like uClibc, tend to also be GPL-licensed.
Bionic is missing a fair amount of stuff, since it only has to run a limited subset on Android, but it has enough to run a lot of software.
PCC would probably be easier, and it is still open source. Although a BSD-userland Linux would be interesting, I think the GCC compiler has become so widespread that it is almost impossible not to use it (except for Windows, of course). Even the BSDs, which are thoroughly against copyleft licenses, use GCC.
The kernel is pretty closely coupled to the GNU userland. stali[1] is trying to accomplish the same thing. I think they are using utilities/libraries from OpenBSD. Not sure if they have a working system, though.
The whole argument for the "GNU/Linux" terminology is that the FSF deserves part of the glory because it contributed the userspace portion of the system's code. I think the persistence of those pushing this point of view is a projection of their disappointment over Hurd and a demand for glory elsewhere, "because we deserve it!" That's a really childish stance from the start.
A kernel is of course useless without a userspace. However, almost since its inception, the percentage and importance of the GNU contribution have been steadily dwindling; in the course of a normal day, my wife, who is a KDE user, uses much more software provided by KDE e.V. than software provided by GNU/FSF. Should we call all of her Linux installs KDE/Linux?
That she 'uses more software provided by KDE' is an unsubstantiated claim that people make way too fast. Is it possible to build KDE without GNU tools? If not, every use of KDE is an indirect use of GNU tools. And while it runs, how many times are glibc and related foundational components called?
The problem with GNU tools is their invisibility. But by that metric (what users see), your wife should call the system KDE, since the kernel is entirely invisible to her. Surely no one here suggests dropping Linux as the appropriate name for the system? But if so, doesn't GNU deserve to be mentioned as well, for making Linux actually usable by things such as KDE? The foundation is Linux + GNU tools. Neither can do without the other, but nothing can do anything without the two of them. That's as foundational as you can get in a GNU/Linux system, which is why people call it that.
Isn't the "you couldn't build it without GNU tools" argument somewhat silly, as no other OS out there includes the toolchain as part of the name? I doubt the folks over at Microsoft would seriously consider renaming Windows "Visual Studio/Windows" or "TFS/Windows", yet they would be hard pressed to build Windows without it. I'm also fairly sure Linux isn't the only OS out there that can be built with GNU tools (I thought Solaris was using it, at least during the brief period of time that they were open), yet it does seem to be the main target of the "GNU's contribution should be specially recognized" crowd - it's hard to see this as anything but sour grapes over Linux pretty much taking over the GPL licensed Unix-like kernel market away from Hurd.
"Surely no one here suggest dropping Linux as the appropriate name for the system?"
Linux-based Android doesn't use the name "Linux". Arguably the distro name is now the "primary" name (though this would still leave the FSF people feeling unloved). Thinking about it, I mostly say I use Ubuntu on the desktop and Android on the phone and tablet, and I barely ever mention that I run Linux nowadays.
I don't think much of KDE uses GNU tools under the covers. I know for instance that they've implemented their own copy functionality instead of just piping to cp or whatever. I don't know if they're dependent on any other GNU tools for runtime (they are probably dependent on GNU build tools), maybe we could ask a KDE developer. It'd be interesting to know.
Why? It is a reasonable proxy for how long it would take to replace something, and it's the effort to replace something that is core to the "you wouldn't be where you are without us" argument. How would you measure contribution?
So wrong. GNU grep may be fast, but it's also the most bloated grep, with the most lines of code. I think that's the opposite of what you're trying to prove.
I'm not the OP, but one way would be to look at the dependency graph for packages. My guess is that a whole heck of a lot of them depend either directly or transitively on GNU code -- for example, GLibC.
That should be an easy change. Do you happen to know where to look up the list of default packages?
I'd assume the percentage of GNU would go down, as gcc and gdb are probably not in the default install. The interpretation is a little more dubious, though, since gcc and gdb are definitely used to produce the software in the default install.
Comparing the size of Ubuntu 'main' isn't interesting.
A more thoughtful assessment would have examined what packages were installed & used rather than which ones existed. That data is really easy to find: http://popcon.ubuntu.com/
I do have that data and have been using it in these kinds of analyses. There are two big problems with it. The first is that it's not given out as any kind of time series or by distro version, so some analyses are very hard to do (this one would work, though). The second is that the data gathering is opt-in, so the sample will have a bias, probably towards power users.
No expert, but I always thought Ubuntu was probably the distro least likely to have a large % of GNU, thanks to all the extra stuff that comes with it. I'd be interested to see this for Debian etc.
I think in terms of a standard desktop user's install, a Debian machine configured for usual use is not going to differ all that greatly from an Ubuntu machine. A server config forsaking much of the user-friendly stuff, maybe.
I think my point there is that the actual distribution "names" don't matter so much as the distributions' intended purposes, which vary the "what" that gets measured for this metric.
Ubuntu doesn't even call itself Ubuntu Linux, let alone Ubuntu GNU/Linux: "Ubuntu is a fast, secure and easy-to-use operating system used by millions of people around the world."
Protip: eglibc is a fork of glibc, considers glibc its upstream, and exchanges patches with glibc, requiring copyright reassignment for contributed code. That makes it roughly as GNU as glibc itself, which is maintained by a Red Hat employee.
Things in "other" are basically all individual projects, not included in any overarching project or grouping. I actually find that very interesting, as it means development is highly fragmented and distributions do an essential job of putting it all together.
There's a line there somewhere. Clearly udev is very tightly linked to the kernel and it was built to replace devfs which was inside the kernel. On the other extreme maybe the filesystem programs (e.g., reiserfsprogs or xfsprogs) aren't as tight a coupling and would make sense outside. I put the split into the footnote to make that clearer.
I don't quite think it makes sense to say that all of the software in a Ubuntu distribution should fall under the label of "GNU/Linux". Just because it runs on it doesn't mean it IS that thing.
That said, it is somewhat interesting to see the proportion of software.
I'm not the author or the submitter, but it seems to me that this article is more or less in response to FSF's "We developed most of the central components, forming the largest single contribution to the whole system."[1]
I tried hard to avoid too much of that controversy getting in the way of looking at the data and overshadowing what I thought were more interesting conclusions. But I was of course aware of the naming controversy and it was part of why I had the curiosity to do the analysis in the first place.
I think it's more useful to look at the impact of a contribution. Apache, KDE, and GNOME are a bigger deal than GNU by that metric. But I also think it's a bad idea to try and put one over another when they all do their part.
I don't know how you measure "impact," but this doesn't sound right to me. If you took away all the GNU code from a GNU/Linux system, my guess is that you would no longer have a working Unix system. I don't think that can be said of the other organizations you mention.
And if you mean social impact, I think it would be hard to overstate the role of GNU in getting the free software movement going. Yes, there is an ecosystem of self-sustaining projects and organizations now; but I have my doubts whether any of that would be there without the initial efforts of GNU.
By that line of thinking, why call it "Linux" (or anything else) either? I agree that "GNU/Linux" sounds kind of dumb, if that's what you're driving at -- but I'd be happy to call the operating system "GNU".
Why do we call it aspirin instead of acetylsalicylic acid? It's familiar, easy to spell, and easy to remember. It's an accident of history. What would be accomplished by renaming it?
Most people in the world don't call it Aspirin, as that's a registered trademark of Bayer AG. The trademark was voided in the U.S. during WWI and never re-granted.
Bayer holds the trademark for Aspirin (capital A) in Canada (and other countries).
The trademark is void in the US (and others).
When Canadians I know (including me) say "aspirin", we mean very deliberately "any kind of ASA - and if it's not the generic stuff you're some kind of weirdo who likes to waste money" - but we still call it aspirin.
Brand names don't, of course - they list ASA as an ingredient... but I've never in my life heard someone ask if I had any ASA.
Different experiences. The first time I had someone ask me for some, I was confused: I didn't know what it was (I had just immigrated from the US). But I've had more than a few people refer to it as ASA in the thirteen years I've been here.
Using lines of code to determine how much "GNU" there is is like counting the number of screws in a car to determine how much "car" it is.
Without GCC to compile the kernel and all the other tools, there would be no kernel. It's a free, open source compiler that has existed for decades now. And its competition, Clang: is it even stable yet?
It still can't build a fully working, out-of-the-box Linux kernel for x86: e.g., no 16-bit early boot stuff, patches needed for various things, some things not working.
It is stable, though. Linux is just a hard target...
1. It has always been targeted at gcc, so no serious portability work has been done; it's been more of a co-development thing. Linux will only compile correctly on a few recent versions of gcc; you can't compile old kernels with a new gcc or vice versa.
2. There is a lot of inline assembler (including oddities like x86 16-bit code) that has to work exactly as specified, for example for creating memory barriers or other low-level constructs (see the sketch after this list). Inline assembler is non-standard, although clang does now implement the gcc syntax.
3. You are making an ABI, so everything must be laid out exactly the same way. Portable C does not give you exact control of padding; again, there are gcc extensions for this.
4. Testing is hard: there are a huge number of optional modules, as well as SMP and non-SMP builds, so even if you build the code with a new compiler, it is hard to be sure it really works without a lot of people using it. There is no test suite...
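To make points 2 and 3 concrete, here is a small standalone sketch (my own illustration, not kernel code) of the kind of gcc extensions involved; both compile with gcc and with modern clang:

    /* Point 2: a compiler memory barrier. The empty asm with a "memory"
       clobber tells the compiler not to reorder or cache memory accesses
       across this point; the kernel's barrier() is essentially this. */
    #define barrier() __asm__ __volatile__("" : : : "memory")

    /* Point 3: exact struct layout via a gcc extension. Without "packed",
       the compiler may insert padding between the fields, which would
       break any ABI that depends on exact byte offsets. */
    struct wire_header {            /* hypothetical example struct */
        unsigned char type;         /* offset 0 */
        unsigned int  length;       /* offset 1 packed; offset 4 otherwise */
    } __attribute__((packed));

    int main(void)
    {
        barrier();
        /* 1 + 4 bytes, assuming a 4-byte int */
        return sizeof(struct wire_header) == 5 ? 0 : 1;
    }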
Our cparser compiler cannot even try to compile Linux, because it does not support gcc's -mregparm and -msoft-float arguments. These are features that do not matter for applications (on x86).
-mregparm changes the calling convention, so the first few arguments are passed in registers instead of on the stack
-msoft-float uses library functions to emulate floating-point operations instead of using the FPU
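For illustration, gcc exposes the same calling-convention change per-function as an attribute. This is a minimal sketch, assuming a 32-bit x86 target (add3 is just a made-up example function):

    /* With regparm(3), the first three integer arguments arrive in the
       EAX, EDX and ECX registers instead of on the stack. Caller and
       callee must agree on the convention, which is why a compiler that
       lacks this feature cannot build objects for such a kernel. */
    int __attribute__((regparm(3))) add3(int a, int b, int c)
    {
        return a + b + c;
    }

    int main(void)
    {
        return add3(1, 2, 3) == 6 ? 0 : 1;
    }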
I think that the GNU+Linux attribution is accurate and deserved. The author of this page thought it fair to use SLOC as his unit of measurement; I think that "fundamentality" is a far fairer unit.
Anyone who thinks that "GNU" should be dropped from "GNU/Linux" should be using alternate cp, rm, ln, etc. Or just use NetBSD (do this anyway).
The biggest GNU contributions are gcc, gdb and emacs.
Command line stuff is trivial. GNU made them for free first, but if they hadn't, someone else would have. The same cannot be said for gcc, gdb and emacs.
> GNU made them for free first, but if they hadn't, someone else would have.
Change that to "GNU re-packaged or re-wrote versions of utilities that had existed for quite a while in the BSD world, but if they hadn't, someone would have just used the BSD version" and you would be closer to reality. GCC is really the only lingering semi-dependency, and as clang improves to the point where it takes fewer patches to build around the GCC-specific bits of the kernel, the remaining contributions will become nice-to-have but not essential.