Slightly off-topic, but since the author mentions it: I personally cannot help but feel like the strong push for Rust in the kernel is overstepping some sort of boundary.
Since the kernel started, you can't begin to count the number of "better than C" languages that appeared.
Why, if C is so bad, wasn't one of the alternatives introduced to slowly replace C in the kernel years ago?
Zig, D, Ada, ... they all offer massive benefits over C.
The only reason there is a push for Rust so much harder than for anything else is the community and its almost aggressive spirit of "if it's not written in Rust, it should be".
No regard for software that works great, or for the engineers who put time and effort into making it secure and fast; if it's not Rust, it must be broken.
I'm excited to see the Linux kernel improve even further, no matter what it takes, but it does rub me the wrong way and makes me stop and think a little when one particular language is held on such a gigantic pedestal.
> cannot help but feel like the strong push for rust in the kernel is overstepping some sort of boundary
There's no "strong" push; the RIIR narrative doesn't actually belong to the Rust community (which I've found very pragmatic), but it's so inherently inflammatory that the general public associates it with the Rust community nonetheless.
What's actually happening is that some major projects are evaluating, slowly, whether Rust could be integrated into some areas. Besides the kernel, another high-profile project is QEMU.
> Why, if C is so bad, wasn't one of the alternatives introduced to slowly replace C in the kernel years ago? Zig, D, Ada, ... they all offer massive benefits over C.
Because of the "well-known value proposition" that Rust brings.
I'm continuously mystified, now that unsafe memory management has been unambiguously shown to be the major factor in vulnerabilities, how safe memory management is still not recognized by everybody as the highest priority (which doesn't imply rewriting everything in Rust/Golang/etc.).
> The first comment on that article complains that Daniel is being pressured by the Rust community, and he responds:
The first comment actually proves, in a typical way, the point of the RIIR misinformation.
Here it is, in full:
> This sounds like the usual EEE playbook. Some powerful party wants to take away your project. The new slightly incompatible curl will then be the new “curl”, and the current curl will be dead. We have seen it so many times.
> Please Daniel stick to C-based curl, and don’t align with such parties. They should rename their work, and we will see if their rusted thingy gets any traction without using the well known and trusted “curl” brand.
The author is not a Rust programmer, and they're thinking on behalf of Curl's author, pushing a pseudo-conspiracy theory narrative ("some powerful party wants to take away your project").
Who's this powerful "party"? The Rust Illuminati? /s
Depending on the interpretation, the Curl author's answer could actually be a confirmation of the nonsense of the comment ("It’s not a conspiracy").
The author has actually been outspoken about the nutty people applying pressure in one way or another. Here's a fun post:
> the RIIR narrative doesn't actually belong to the Rust community
As someone who has been the target of aggressive "rewrite it in Rust" comments and "pressure" (multiple issues and useless discussions) on one of my projects: that probably completely depends on how you define "the Rust community". The highly annoying RIIR crowd certainly does stem from Rust users. As a Rust user myself I'd like to distance myself from them too, but that doesn't really change the fact that they do come from the Rust community.
It annoys me a lot when people try to describe a community as having certain characteristics, especially a completely open one like a programming language community, where literally anyone can come and go at any time.
It's similar to when you try to generalize over nationalities. While there may be some small truth in it over very large numbers, there's always going to be plenty of exceptions so that making such generalizations should ALWAYS be considered inappropriate, be it "negative" or not. Even more so when there are large minorities within the community.
I do agree that the RIIR is likely to be one of these minorities, but that doesn't make them NOT part of the Rust community.
Another "minority" in the Rust community that I have noticed a lot on my projects/blog posts is made up of those who are never satisfied with any Rust code: there's always some better way to write something, or some more efficient implementation, and your code definitely sucks. Rust is a very large language with many features, so there are always many ways to do things, but even two extremely skilled Rust developers are highly likely to disagree on which way is "the best" for a certain non-trivial design decision. The fear of not using the best alternative, and of being ridiculed by other Rust developers claiming my approach is bad, makes me quite cautious about showing Rust code in any blog post I write.
But I still don't want to claim this is a characteristic of "the Rust community", just like I would hesitate to say "the Rust community is very welcoming".
We probably just need to stop the human impulse to generalize about groups of people, that's all.
This is true in politics and religion as well, the outward facing representation of a group is often defined by its loudest proponents. They are often not representative of the group. But external perception is still important and easily shaped by loud / repeat voices.
Disavowing certain behaviours works on a personal level, but otherwise people who wield power in such groups have to try and use that power to try to improve external perception often in spite of the many vocal members of the group.
> I'm continuously mystified, now that unsafe memory management has been unambiguously shown to be the major factor in vulnerabilities, how safe memory management is still not recognized as the highest priority
It is very straightforward to recognize the reason for this as soon as you realize that vulnerabilities themselves almost never have the highest priority either. Therefore, preventing them also can't have the highest priority.
> now that unsafe memory management has been unambiguously shown to be the major factor in vulnerabilities, how safe memory management is still not recognized as the highest priority
It is a shame it isn't taken more seriously, especially when you consider the recent Pegasus discoveries - https://digitalviolence.org/
A memory management error can literally be a matter of life and death for some.
The Therac-25 tragedy was not a memory-management issue. If it had been written in Rust, the state-machine error would still have cooked a lot of people.
I entirely agree that safe memory management is one of the most important topics today. I just wonder if our time would be better spent developing tools to formally verify memory safety of existing languages, instead of trying to "rewrite everything in Rust" to get safety.
I see your point, though, and I was definitely being snarky for the sake of being snarky.
> I just wonder if our time would be better spent developing tools to formally verify memory safety of existing languages, instead of trying to "rewrite everything in Rust" to get safety.
In theory, that makes sense.
Now, as a sibling post points out, such tools have been around for decades. Microsoft started using these tools for drivers around the time of Windows XP, if I recall correctly. I've used such tools. Heck, I've developed such tools. In my experience, they require orders of magnitude more experience (with the tools, with the underlying theory, and with the codebase) than Rust (or Ada, or other safe languages) to achieve comparable levels of safety.
But metadata is needed for proper formal reasoning, and C carries barely any information beyond some very vague “typing” hints; everything else is only conventions, enforced more or less by the Linux community.
So Rust’s lifetime annotations are what makes proper proofs possible in the first place.
(As for proofs, though, they are no silver bullet either: a program of Linux's complexity is unlikely to ever be completely verified, but Rust may indeed give valuable guarantees for most of the program.)
>I just wonder if our time would be better spent developing tools to formally verify memory safety of existing languages,
The industry can't implement your suggestion because, while computer science can formally verify a restricted subset of some C source code, we can't write a generalized tool that scans any C code and proves memory safety.
Consider memory operations in an existing language such as C, with these statements:

    char *dst;   // value is set anywhere in the program
    char *src;   // value is set anywhere in the program
    size_t sz;   // value is set anywhere in the program
    // ... bunch of other code that may alter dst/src/sz ...
    memcpy(dst, src, sz); // is that memory safe?
Nobody has yet written an analysis tool that can prove and verify that the memcpy() is "safe", because predicting with perfect accuracy how the variables dst, src, and sz will be set and changed at runtime is a variation of the unsolvable Halting Problem:
https://en.wikipedia.org/wiki/Halting_problem
That's why there's still no computer-science research paper showing a tool that can scan general C source code and prove that all calls to memcpy(), strcpy(), and all writes to RAM via aliases and pointer dereferencing are "memory safe".
We're hoping for a "Sufficiently Smart Compiler"[1]-type tool that can verify C code is safe, but the source itself doesn't have enough information embedded in it to prove that.
So the alternatives so far have been:
- static approach: extra syntax for manual annotation (C++ unique_ptr<>, Rust's borrow-checker ampersand, or manual mutex lock/unlock)
- dynamic approach: runtime tests with extra memory checks, such as Valgrind + UBSAN + ASAN + debugger memory fences, etc. This can catch some but not all memory errors; a 100% proof is not possible for non-trivial programs because of the combinatorial explosion of runtime execution paths and memory values.
- GC: garbage collection in a different "managed" language. The tradeoff is a higher RAM footprint, extra CPU, and GC pauses for tracing scanners.
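For contrast with the memcpy example above, here is how the same copy looks in Rust, where slice types carry their lengths; this is a purely illustrative sketch, and the function name is mine:

```rust
// Hypothetical sketch: the C memcpy question "is that memory safe?"
// becomes either a compile-time guarantee or a deterministic runtime check,
// because the lengths travel with the slices.
fn copy_bytes(dst: &mut [u8], src: &[u8]) -> Result<(), &'static str> {
    if dst.len() < src.len() {
        // An undersized destination is a recoverable error,
        // not silent memory corruption.
        return Err("destination too small");
    }
    dst[..src.len()].copy_from_slice(src);
    Ok(())
}

fn main() {
    let src = [1u8, 2, 3];

    let mut small = [0u8; 2];
    assert!(copy_bytes(&mut small, &src).is_err());

    let mut big = [0u8; 4];
    assert!(copy_bytes(&mut big, &src).is_ok());
    assert_eq!(big, [1, 2, 3, 0]);
}
```

The point isn't that the check is free; it's that the information needed for the check exists in the types at all, which is exactly what the raw C pointers lack.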
> GC garbage collection in a different "managed" language. The tradeoff is higher RAM footprint, extra cpu and GC pauses for tracing scanners
It would hardly be a problem for most programs, including OS kernels, especially since kernels are themselves GCs for resources like processes, file handles, etc.
Also, GC pauses are overblown: most of the work can be done in parallel.
I work with safety-certified systems written in C. The tooling around such things is extensive and mature: static analysis, extensive test runs with runtime analysis instrumentation (test coverage at the MC/DC level, Valgrind for memory and concurrency analysis); it's all there and used.
It's true that a lot of projects written in C do the equivalent of using Rust's "unsafe". Then again, a lot of Rust projects use Rust's "unsafe".
JOVIAL and ESPOOL were the first system languages with a notion of unsafe code blocks, both about 10 years older than C.
There is a big difference between having unsafe code jump out at you and be easily searchable, versus having every line of code be a possible source of integer overflows, memory corruption, or the UB-optimization exploit of the day.
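A minimal illustration of that difference (hypothetical function, not from any real codebase): the one line that needs auditing is explicitly marked and greppable, while everything around it is checked by the compiler.

```rust
// Illustrative only: `unsafe` is an explicit, searchable marker, so review
// effort concentrates on small, annotated regions instead of the whole file.
fn first_byte(bytes: &[u8]) -> Option<u8> {
    if bytes.is_empty() {
        return None;
    }
    // SAFETY: we just checked that index 0 is in bounds.
    let b = unsafe { *bytes.get_unchecked(0) };
    Some(b)
}

fn main() {
    assert_eq!(first_byte(&[7, 8, 9]), Some(7));
    assert_eq!(first_byte(&[]), None);
}
```

In C, every pointer dereference in the file is implicitly this `unsafe` block.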
> I personally cannot help but feel like the strong push for rust in the kernel is overstepping some sort of boundary.
As a Linux kernel developer, I feel like non-kernel developers like yourself telling me which language I should use is definitely overstepping a boundary.
I couldn't care less about what non-kernel-developers think about this issue, but "since you mention this": please do mind your own business.
> Why, if C is so bad, wasnt one of the alternatives introduced to slowly replace C in the kernel years ago?
Because we, as in the actual people that spend their whole day developing the Linux kernel *do not want* to write and maintain Linux kernel code written in these languages.
> The only reason there is a push for Rust so much harder than anything else can only be explained by the community and their almost aggressive spirit of "if its not written in Rust, it should be".
This is so wrong. We, Linux kernel developers, are already writing drivers and kernel modules in Rust, *because we want to*. We now want to merge this new code so that other devs and users can use it. Most other devs want this too, so now we added Rust support to the Linux tree.
---
It really is that simple. My code, my body, my rules.
I really don't know what's so hard for people to understand here.
Well... indeed that was harsh. However, I do understand: imagine you work on this (the kernel), know the ins and outs, and someone wants to put words in your mouth; that really feels disrespectful.
I'm not a kernel developer; one day I might be (not smart enough yet). What I can do is express my opinions, concerns, and thoughts about the future of the project, though those who actually write the code should be the ones to decide.
Disclaimer: I love Rust, so take my point of view with a gigantic grain of salt.
Somebody told them to get an abortion, and they just did. But I know better, I know what's best for them, and I know and I should decide what they should be doing with their body. They should not be getting abortions, nobody should, and I am sad and disappointed that this is happening.
Does this ring a bell?
Replace abortion with Rust, or a spoken language, or ...
Anyone who works in the kernel knows, that somebody dropping an email on the mailing list saying "you should rewrite your kernel in X" is not something that achieves anything.
The OP claims that this somehow happened for Rust: people from the outside managed to "push" us to use Rust in the kernel against our will; they dropped an email, we were somehow fooled by these people, and we started doing it. The OP is "sad"/"disappointed" that this happened.
If the OP were a kernel developer, (1) they would be accusing themselves of getting fooled (yet their comment does not seem to realize this), and (2) they would know that this is not how the Linux kernel community works.
D and Ada both use a GC in many contexts, and are not half as interesting in their GC-less contexts. They both initially only had a proprietary compiler, which durably harmed adoption. They both seem to cater to fewer domains than C/C++/Rust. Today Ada is only used in high-stakes industry, and D doesn't seem to have any claim to fame. It's a pity that neither succeeded in their time, but there's no good reason to use either of them today.
Zig is very promising, but is just too new.
You failed to mention C++. It's well loved (and hated), has many pros and cons compared to C, and is used for kernel development (just not Linux).
It's silly to think that the only explanation for Rust success is community lobbying. Rust has many concrete advantages, like being safer than C/C++/D/Zig, being fully GC-free and suitable for kernel and embedded development, having actually gained significant traction, having great tooling/docs/ecosystem... And generally being a language that people enjoy. Rust isn't the end-game of kernel programming, but it isn't just "yet another better C".
The only reason it isn't widely adopted was the high cost of its compilers: only SGI and Sun cared to ship compilers on UNIX back in the day, and Microsoft, IBM and Apple doubled down on C++ instead.
Sorry, bad mischaracterisation on my part, annoyingly it's too late to edit my post.
Looking at the Ada docs again, it considers manual deallocation an unsafe operation and suggests avoiding it altogether (IIUC, by restricting yourself to the stack or by leaking to the heap). That seems like a huge restriction, making the "Ada is safer than Rust" claims rather academic.
> Why, if C is so bad, wasnt one of the alternatives introduced to slowly replace C in the kernel years ago?
Couldn't you always say this up until the time it happens? There always has to be a first success.
E.g. pre unix you could ask, if high level languages are so great, why weren't they used instead of assembly a long time ago.
> Zig, D, Ada, ... they all offer massive benefits over C.
So does Perl; there's more to the debate than whether a language has some "benefit". It's all about costs and benefits. I think the way Rust has positioned itself makes it a bit more palatable than those other languages in this domain. That doesn't necessarily mean it will succeed, but it seems more in the right direction for that goal than some of its competitors.
You've dismissed his question without really answering it, which is a little unfair.
The OP's question might be unpopular on HN (who largely seem to be in love with Rust) but it's a valid question and not one that's at odds with Rust either.
To paraphrase his question: "Why is there more momentum behind Rust to replace C than there has been before?"
> So does perl, there's more to the debate than if a language has some "benefit".
It's pretty clear that Perl isn't a systems language, nor performant enough to write a kernel even if it were one. So bringing that up is rather silly. And you don't even answer the question of what "more to the debate" there is than "some benefit".
Zig, D and Ada might all be perfectly valid languages to rewrite Linux in. So might OCaml and countless others. But it is Rust that has the momentum. As someone who can program in more than a dozen different languages, I too have found the momentum behind Rust a little unusual, and I can't really put my finger on why it has proven so popular when others have, relatively speaking, failed. There are aspects of Rust that I like, but I don't find it an order of magnitude better than Ada, OCaml, or Zig.
So the question remains open, what has changed that all of a sudden the time is right to rewrite Linux?
I wonder if a large part of Rust's success isn't down to the language per se but rather down to it emerging just as C++ was finally reaching the critical-mass point where developers are fed up with its complexity and ready to jump ship, with the other languages either peaking too early or simply lacking enough corporate sponsorship for the mainstream to adopt them. Maybe the syntax has a part to play too, where Rust looks more familiar to C/C++ developers in a way that Ada and OCaml do not.
> It's pretty clear that Perl isn't a systems language, nor performant enough to write a kernel even if it were one. So bringing that up is rather silly. And you don't even answer the question of what "more to the debate" there is than "some benefit".
But that's exactly my point. The OP asks, why not a language "better" than C, full stop. That's the wrong question. There are many languages better than C according to someone, but terrible to write a kernel in. Better than C is a terrible metric. The momentum behind rust isn't because it is simply better than c in some unspecified way.
To actually answer your question, I think it's both political and technical. Rust is on an upswing, and in any political movement, momentum is important. Ada is old news; it doesn't have the excitement around it that is critical for a political movement to galvanize its supporters. From a technical perspective, support seems to come primarily from people who hate memory-safety vulns. That's the impetus, but lots of languages prevent those. The aspects of C that make it popular are (syntactic) simplicity and full control. Rust has really good PR suggesting that it does the memory thing very well, while still valuing the other factors that make C popular. Other languages tend to try to improve a lot of things or stray further from the C path, which makes them seem like a bigger switch with higher costs.
So in conclusion, I think Rust does just enough to have a benefit people consider "worth it" (relative to switching costs) while not straying too far from C values, which keeps switching costs low. It also came at the right time, and has momentum and good propaganda, which is critical for any political "movement".
While I have no issue with people picking languages for personal reasons, if we are going to start presenting RIIR (Rewrite it in Rust) as a technical argument then we need to move past the "it has momentum" part of the argument because trends have a habit of changing (as you've acknowledged yourself).
Not taking anything away from this specific project though as it's a hobby project. Just making a wider point about the calls for deprecating C and C++ in GNU/Linux.
Fair enough. I just worry that this momentum is a little premature and we'll end up creating as many problems as we solve. Rust is still quite a young language, and while it has already proven itself, it has also already taken on a lot of complexity despite its age, and gone through several iterations of breaking changes too. I'm not entirely convinced that in 10 years' time Rust won't end up as painful to audit as C++ is now. While Rust does offer some memory-safety guarantees out of the box, that still depends on the abstractions not needing to fall back to `unsafe`, which isn't going to be avoidable in kernel development. And as the complexity of the code base and the language semantics grows, the developers' ability to audit `unsafe` will diminish. And that's without even addressing the risk of introducing new bugs as part of the rewrite. Taking those points into account, I can't help feeling there's a bit of "the emperor's new clothes" going on here, where the promises being made are greater than the reality.
I'm all in favour of a "rewrite it in $SAFE", but the risk-averse part of me feels like Rust is a combination of too young and lacking in formal guarantees to justify the massive undertaking we're asking for here. Particularly when you consider this is going to be a multi-decade project, and one where we can't trivially switch to another language if we regret the decision partway. To give context to that point: it's already been 30 years of C with Linux, longer if you count BSD or Unix. So the commitment we're asking from Rust is massive.
Maybe I'm being overly cautious. Unfortunately I've found the discussions on the topic are usually underwhelming in terms of people being prepared to honestly answer the hard questions about Rust's readiness for such a critical role. And that has left me feeling more nervous than reassured that the whole thing isn't just a fad.
Maybe, to try to answer some of the OP's question: Rust is past its "1.0" release. Other alternatives are also plenty viable, and I would love to see more technical conversation considering them, but I think Rust is viewed as more mature (not a Rust user here). As an example, I find Zig really interesting; however, even Zig's main creator says it is not production-ready yet (though it is getting closer to that "1.0" release). D may be the only other one I know of (besides Ada) with that level of maturity.
Also, for people who hate memory safety issues, does it really matter which language? Rust is a solution with momentum. For the people motivated by that concern, fighting over language choice is a distraction.
For user land it doesn't matter at all as people can mix and match whatever languages they want as long as there is an ELF entry point into the runtime. However rewriting the kernel, which is what this discussion is about, is a whole other matter.
But as bawolff said, the main issue is moving towards safer tech. If Ada had good enough momentum, we'd be discussing Ada in the kernel instead. C++ has been rejected from Linux, leaving Rust as the only kernel-capable language with enough momentum, so Rust it is.
That has nothing to do with memory safety though let alone Rust specifically. What we need is a smarter way to manage authentication (non-persistent session cookies and usage a password manager to handle the log in is a step in the right direction).
We are stuck in a local minimum with C and C++. Rust is a particle with high enough energy to tunnel through the barrier to a lower, safer energy level.
To some degree, it doesn't matter how Rust acquired the energy; if you want to hop the gap, gotta hitch a ride on the highest energy wave.
Not necessarily. Language does matter as it is a construct that forms how problems are thought about. One thing I like about D is how it is much more multi-paradigm. I would like to see some more analysis of using D in Linux.
As much as I like OCaml, I doubt we'll ever see a tracing garbage collected language in the Linux kernel.
Also neither OCaml nor D make any attempts at stopping data races, which is sort of a selling point of Rust. Ada/SPARK, I suppose do, but I don't know at what cost; I'm sure though Rust syntax will feel more familiar to C developers than Ada, even if a lot of Rust syntax comes from ML. But it has {} :).
I don't know Zig, Ada and D very well, but isn't Rust the only language that can guarantee huge amounts of safety with zero runtime overhead, due to the borrow checker?
No, you can guarantee much more with regard to safety using Ada/SPARK. I have a table somewhere. So if you want safety guarantees, it is Ada/SPARK all the way. Check out https://blog.adacore.com.
I cannot emphasize it enough: if people really wanted so much safety, they would have chosen Ada/SPARK. I also find Ada easier to read and write than Rust. I know C, OCaml, Erlang, Forth, and Factor, yet I have difficulties with Rust. Am I really that bad of a programmer? Welp, at least I know Ada and I can write actually super safe stuff. :P
A lot of those guarantees are only available in the allocation-free subset of Ada, which makes them much less attractive. The devil is in the details, making Ada's guarantees not always better than Rust's.
Hm, so can Ada/SPARK bring the same memory safety/data race guarantees to the table as Rust, yet not being more difficult to express and maintain those guarantees? I'm not familiar with the language.
No, Ada has no lifetimes and borrow checking and thus cannot possibly offer the same expressiveness as Rust while supporting safe memory deallocation without garbage collection.
> The goal is to allow a pattern of use of pointers that avoids dangling references as well as storage leaks, by providing safe, immediate, automatic reclamation of storage rather than relying on unchecked deallocation, while also not having to fall back on the time and space vagaries of garbage collection.
    with Ada.Unchecked_Deallocation;

    procedure Test is
       type Int_Ptr is access Integer;
       procedure Free is new Ada.Unchecked_Deallocation
         (Object => Integer, Name => Int_Ptr);
       X : Int_Ptr := new Integer'(10);
       Y : Int_Ptr;
    begin
       Y := X;
       Free (Y);
    end Test;

GNATprove output:

    test.adb:8:04: info: absence of memory leak at end of scope proved
    test.adb:9:04: info: initialization of "Y" proved
    test.adb:9:04: info: absence of memory leak at end of scope proved
    test.adb:11:06: info: absence of memory leak proved
Does that proposal allow to express anything that can be expressed in Rust?
That seems quite unlikely given the lack of syntax for lifetimes. It's still in theory possible to infer them, but that would result in API instability and hard to decipher error messages.
Can you, for instance, have a function that, given a reference to a hash table and a reference to a key, returns a reference to the value corresponding to a key equal to the given one? (This requires the return value's lifetime to be the same as the hash table's lifetime, while the search key's lifetime is unrelated.)
Can you refactor code that takes multiple parameters by reference to taking by value a single structure including those parameters as fields with independent lifetimes?
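For reference, the hash-table lookup described above is expressible in today's Rust; a minimal sketch, where the function name is mine:

```rust
use std::collections::HashMap;

// The returned reference borrows from the map (lifetime 'm), while the
// key reference has an unrelated lifetime ('k). This is exactly the
// distinction that explicit lifetime syntax lets the API state.
fn lookup<'m, 'k>(map: &'m HashMap<String, i32>, key: &'k str) -> Option<&'m i32> {
    map.get(key)
}

fn main() {
    let mut map = HashMap::new();
    map.insert("answer".to_string(), 42);

    let value = {
        let key = String::from("answer"); // key is dropped at end of block...
        lookup(&map, &key)
    };
    // ...but the returned reference is still valid, because its lifetime
    // is tied to `map`, not to `key`.
    assert_eq!(value, Some(&42));
}
```

Without syntax like `'m`/`'k`, a checker would have to infer this relationship, which is the API-instability problem mentioned above.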
That is why SPARK exists, and RAII offers safe memory deallocation.
Ada never used GC, although Ada 83 allowed for an optional implementation, which no compiler ever made use of, so it was removed from the Ada standard in the 2005 revision.
It has a package manager now! :D Anyways, there are some things that could make Ada/SPARK more desirable but it has nothing to do with the language itself. :(
Nope, Rust still cannot provide formal proofs Ada/SPARK-style, and using String, Vec, Rc, Arc, or RefCell imposes only runtime checks, as in other type-safe systems languages.
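A small illustration of that runtime-check point, using `RefCell` (illustrative only): the aliasing rule is still enforced, but the enforcement moves from compile time to run time, where a violation becomes a checked failure rather than a proof obligation.

```rust
use std::cell::RefCell;

fn main() {
    let cell = RefCell::new(5);

    // Hold one mutable borrow...
    let first = cell.borrow_mut();

    // ...and a second mutable borrow is rejected at run time,
    // not at compile time as with plain &mut references.
    assert!(cell.try_borrow_mut().is_err());

    drop(first);
    // Once the first borrow ends, borrowing succeeds again.
    assert!(cell.try_borrow_mut().is_ok());
}
```

So the guarantee still holds, but as a dynamic check with a runtime cost, which is the distinction being drawn against SPARK-style static proofs.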
> but rather down to it emerging just as C++ was finally reaching that critical mass point where developers are fed up with it's complexity and ready to jump ship.
> I personally cannot help but feel like the strong push for rust in the kernel is overstepping some sort of boundary.
Assuming you're not a kernel developer, I don't understand why you feel the need to be offended in their stead, as they don't really seem to mind at all.
> Zig, D, Ada
(i) not yet stable; (ii) torn between two runtimes, not really GC-free; (iii) has a... complicated toolchain story, and basically didn't exist in a free/libre form on PC until very recently; it also didn't/doesn't compile on 90% of Linux targets.
> The only reason there is a push for Rust so much harder than anything else can only be explained by the community and their almost aggressive spirit of "if its not written in Rust, it should be".
No, the only reason there is a push for Rust in the kernel is because people are interested enough to write the code to get Rust in the kernel.
> it does rub me the wrong way and makes me stop and think a little bit when one particular language is held on such a gigantic pedestal.
Maybe you should stop and think a little bit on why kernel developers, arguably among the most competent ones on this subject, put Rust on what you perceive as a ‶gigantic pedestal″.
So one should not rewrite programs out of some form of respect for the original author's language choice? That doesn't sound reasonable. It is actually worse: it sounds like a socially unsound approach.
It would be really nice to be able to rewrite the Linux kernel in something that is better suited to writing reliable low level software than C. That's not very controversial. The problem is that there hasn't been many clear candidates for this.
For that investment to pay off the language you rewrite it in has to be sufficiently better than C in ways meaningful to the project to make the investment worth it.
I think it is very good that people do these experiments. And even though I am not a Rust developer, I really want people to explore Rust as an alternative low level language for writing operating systems. Including Linux. Part of this comes from having worked off and on with embedded code for the past few years. The state of much embedded code is just awful. And it is clear that languages that offer more safeguards are desperately needed.
Rewrites are a misguided idea. You can hardly rewrite a project this big at all. My favorite example of how difficult rewrites are is the Rust rewrite of the GNU coreutils: a project going since 2014, usable, but nowhere close to the completeness of the original. It's way, way harder than you'd think to rewrite mature software (especially software that is heavily tested).
If I think about it: most of my career has consisted of re-creating something that already exists. And the one constant is that there is never any lack of people who line up to tell me what a bad idea this is.
They aren't always wrong. But they also are not always right. And when they aren't right interesting things tend to happen that sometimes involve important changes.
Linux itself was a "misguided" project as Minix already existed for the low end and lots of UNIXen existed for the higher end. The exact same sentiments, plus a lot less charitable ones, were put forward back then.
And I remember people thinking me silly for working for a web search company in the late 90s "because Altavista already exists".
Rewrites are perhaps a misguided idea for people who are highly entrenched in a "worker" mentality. But there are other modes of thinking and working. There exist people with what one could call a "creator" mentality.
But I would distill one point from your post that I think is important: the goal of re-creating Linux may not be the correct goal. But I think staying close to Linux has merit. Usability over fashion has proven to be a deciding factor for success as far as design goes. The rest is down to leadership and organizational skill.
I generally agree with you, but on the other hand, the Linux ecosystem is held back a lot by this everything-in-C mentality. Like, ripgrep, a grep clone written in Rust with proper parallelism, runs circles around the C version.
Rewrites of smallish mature software are hard more because of reliance on backwards compatibility — for such long-living programs, even certain bugs/strange behaviors are part of their specification.
(this is a slightly different discussion since I think most people who work on the Linux kernel see it as something entirely separate from the userspace. It isn't a given that the kernel must absolutely be the same language as the userspace or vice versa)
I think it might be a good idea to examine whether there is indeed such a mentality. I suspect that much of this comes down to "tradition" (whatever that is) and convenience. In particular, what it takes to set up a build environment for the tools in userspace, and to package and distribute them.
For instance, how many people care that Docker and Kubernetes are written in Go? Does it have any practical consequence for the user? Would people notice if, say, Ubuntu were to distribute a `grep` that was written in Rust? Or better yet: if unixen were to adopt a replacement for grep that can be trusted to behave in a defined manner?
As for backwards compatibility, well, that ship has kind of sailed since you can't really trust utilities to behave the same way across systems and configurations. For instance if you change locale on machines things may start to behave differently.
("Back in the day", we used to lean quite heavily on "sort", "uniq" and "grep" and a few other utilities in a system that essentially was a hilbilly, shellscript version of map-reduce, and quickly found that these do not behave identically across UNIXen - or even across the exact same version of the same distribution if someone has messed with the locale. So we wrote our own versions of all of these small utilities and made them militantly ignorant of their surroundings. If I learned something from this exercise it is that any promises of "compatibility" are highly dubious and the design of these utilities isn't always so good that one wouldn't be served better by perhaps augmenting the menagerie of utilities with ones that have properly defined, and sensible, behaviors)
The push to use a language comes from the mindshare, and Rust is more popular than Zig, D, or Ada. It does seem that some boundary was crossed: the attempt has decent manpower, and is being taken seriously.
I also think you've gotten the wrong impression. "if its not written in Rust, it should be" is rather wrong, no one is suggesting to replace Python with Rust. Instead, there's a spirit of "if it's written in C, it should be replaced with Rust". Since a lot of free software is written in C, I can't blame you for generalizing.
> Zig, D, Ada, ... they all offer massive benefits over C.
> The only reason there is a push for Rust so much harder than anything else can only be explained by the community and their almost aggressive spirit of "if its not written in Rust, it should be".
Nonsense. Zig is even newer than Rust and still not ready. D pretty much requires garbage collection. Ada probably would have been a decent option, but it still doesn't solve heap memory safety the way Rust does.
GNAT is based on GCC and is as such very portable. The kernel is GPL, so no problem even using GNAT GPL, right? But even then, GCC's Ada frontend has been there for a long time. My guess is that Rust's rise is more about mindshare and coming from a 'cool' company, and probably timing...
> but it still doesn't solve heap memory safety like Rust does
Care to elaborate on this?
> Ada also prevents dangling references to objects on the stack or the heap, by providing automatic compile-time checking of "accessibility" levels, which reflect the lifetimes of stack and heap objects.[1]
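For concreteness, the Rust-side claim being compared here is the borrow checker's compile-time rejection of dangling heap references. A minimal sketch (a toy example, not kernel code); the commented-out line is exactly the class of bug that fails to compile:

```rust
fn main() {
    let boxed = Box::new(42); // heap allocation, owned by `boxed`
    let r = &*boxed;          // shared borrow of the heap value
    // drop(boxed);           // uncommenting this is a compile error:
    //                        // cannot move out of `boxed` while it is borrowed
    assert_eq!(*r, 42);       // the borrow is provably still valid here
}
```

Ada's accessibility checks, as the quote says, target the same class of bug; the debate above is about how completely each language covers the heap case.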
People love writing Rust, if they want to try rewriting stuff why not? Might end up better might end up worse, can't hurt. I doubt it's out of spite for other languages, think it's just out of personal passion and curiosity.
I actually feel like Rust makes more sense for a kernel than user mode-apps. If there's any software that you want provable safety guarantees on, it's the kernel. What bothers me is the indiscriminate push to rewrite any and all user-mode apps in Rust. That one feels far less necessary to me in general, though there are exceptions obviously.
Rewrites in general are usually a terrible idea, unless the original is horribly broken (and beyond fixable).
If you wrote your app in C++, instead of rewriting it for the next decade in Rust just to get to the same point, slap some AddressSanitizer, ThreadSanitizer, -fanalyzer, cppcheck, etc. on it. You will get 98% of the way there.
Instead of spending 8+ years(!) rewriting[1] the GNU coreutils in Rust, you could spend half that time to ensure full coverage (branch-coverage, condition-coverage) in the existing ones. I will actually have a breakdown if someone tries to rewrite SQLite in Rust[2].
There are obviously different people doing this stuff than the original GNU coreutils people.
I'd be very wary of trying to contribute anything to an old GNU project like that, since, despite having the right licensing ideas, they attract the biggest assholes. This might be another reason why different people are doing rewrites.
And sometimes, it turns out extremely well, like ripgrep.
The contributions of those assholes have been a massive factor in the success of the Linux/GNU ecosystem. I'd strongly urge potential contributors to speak with the real people behind the project rather than blindly trust the advice of somebody on the internet whose affiliations and interests can never be reliably ascertained.
"Kills 98% of viruses" is good enough for hand soap, but it's not good enough for the vital infrastructure that basically all software relies on.
Now that we have self-driving cars, rockets, medical devices, etc relying on this infrastructure, I don't think a push towards total provable memory safety is a bad idea.
(Though notice I said "total provable memory safety". If there's a way to get there without a rewrite in Rust, that's preferable a million times over.)
I hear you. But, to take my example of the GNU coreutils rewrite, even after more than 8 years it only passes 32% of the test cases that the C GNU Coreutils pass. That's not 98%, that's not even 90%. It's hard to rewrite good software.
And something like SQLite shows that you can get 100% coverage (as in, every single statement is hit in the tests).
That's 100%.
The coreutils rewrite is essentially a hobby project, which nobody is going to switch to until it can boast 100% compatibility. So of course it's slow coming, it's not a fair comparison. There's also the issue that writing a drop-in replacement is harder and less attractive than making something better (see ripgrep and other coreutils alternatives).
That is not to say SQLite does not have 100% coverage — it of course does — but you are building on top of infrastructure which is really, really hard to keep safe.
Rust makes it harder to introduce certain kinds of bugs, but that comes at its own cost.
PS: I don't like or dislike Rust; it's just that I have seen the SQLite example brought up a lot, and a lot of people didn't expect an SQLite file to be an attack vector :) so nobody was even looking there.
> Rewrites in general are usually a terrible idea, unless the original is horribly broken (and beyond fixable).
I would say, it depends on the ROI.
For example, my current millstone is a 20-year-old C++ program written to the C++98 spec. The physical hardware is held together with spit and best wishes, with current management stating that we need to move to a Linux-based, Intel platform, as that's what 99% of the org is on.
And, we're not licensing Vendor A or B's product, therefore "patch it till the major rewrite is completed".
In the intervening 20 years, C++11, 17, and even 20 have become very different from 98.
Will I rewrite the whole thing? Nope: there's another project to rewrite the client/server portion (again, written 20 years ago), and at that time I will eliminate that C++ piece.
My feeling is that for coreutils-like user-mode apps, the push is not to rewrite them in Rust, but rather to rewrite them at all.
Coreutils and the standards they are based on are far behind what's possible with CLI UX, and I feel like that accounts for some of the effort. That's certainly why I myself cheer this movement on. I want to see a nicer Unix.
Rewriting/porting the Linux kernel would be a massive undertaking, and guaranteed to introduce new bugs. Not something you would do without massive benefits.
Rust with its safety guarantees has those benefits in a way no other language does.
While Zig enables writing safe code, it does not enforce it. D and C++ mainly improve productivity.
> Rust with its safety guarantees has those benefits in a way no other language does.
Excuse me? Do you know about Ada/SPARK? Rust is super far from Ada in terms of safety. I see misconceptions about Ada here all the time. Well, either that, or they do not know much about the language. Seriously, if you are interested in safety, you must check Ada/SPARK out.
I think you're right, but there's very little chance Ada/SPARK will ever become as popular as Rust because of politics. Rust managed, somehow, to get the backing of tech giants like Microsoft and Amazon... the fact it started at Mozilla, a well-regarded company with lots of influence in tech (despite its downwards trajectory in the browser wars), probably means more than any technical advantage Rust has over the competition (though I have to say Rust's type system and the borrow checker are really innovative on their own and deserve the general praise they receive).
If Ada/SPARK were to be successful, they would need this corporate backing. Maybe that's possible outside the USA? Say, the EU probably has enough influential tech companies and weight to push forward with a Rust competitor that's perhaps superior (as you claim; I don't know enough about SPARK yet to make that claim myself)? I think that would be extremely beneficial to both the tech world AND to Rust and Ada themselves... competition is a great way to make things evolve fast.
EDIT: Ada/SPARK does seem to have a really good advantage over Rust: it's an actual specification, not just a single compiler implementation where "whatever the compiler does" is how the language behaves. This is very important in many industries. I know there's work ongoing to fix this for Rust, but that may take several years and even never amount to anything.
I am an outsider who has not personally written anything in Ada/SPARK or Rust.
But I have worked at companies that transitioned (or started to transition) from C/C++ to Ada/SPARK and from C/C++ to Rust. I suspect that there are a few additional factors besides politics that prevented some C developers from taking Ada/SPARK as seriously as they seem to be taking Rust.
1) Ada and SPARK still retain a whiff of academic/military design-by-committee which prejudices a certain kind of developer. Whereas the Rust team has been open-source from the start.
2) The military projects I am familiar with started switching to Ada (and later SPARK) due to the US DoD mandate in the 90s. Of the people I know who use it, many of them started using it because their job required it. (Though to be honest many of them now love it!) Whereas Rust adoption seems to be largely driven by a hearts-and-minds initiative to convince regular developers that memory safety is a problem in C, and that they should choose Rust to solve that problem.
3) Syntax. Some C developers don't seem to consider a language a Serious Programming Language unless it uses curly braces. (Just ask a C/C++ developer about Lisp's parentheses or Python's whitespace.) Whereas the Rust team decided to hew close to the conventions used in C wherever appropriate.
I have to say that I love curly braces, but given some of Ada's features I can forgive the language for the lack of them. :) But yeah, those points are fair.
I apologize if I was unclear. Obviously other languages have type safety. But a language like Haskell is not very suitable for writing a kernel. I believe Rust is the language that has the right feature set and momentum to even have any hope of being taken seriously for such a task.
Rust isn’t even the only mature systems language that offers memory safety.
In this thread people have thrown out Haskell and Perl as examples of languages not suitable for writing a kernel, and while they're right, it's still a straw-man argument because they're citing languages that nobody is advocating for kernel development.
C was already bad when it appeared, the only reason it won, was UNIX being made available for a symbolic price with source code.
Had AT&T been allowed to take commercial advantage of UNIX, we would probably be coding OSes in PL/I, BLISS, Mesa, Modula-2, Ada or some offspring of them.
I never understood how C took over the desktop space as well. Pascal was a way better language and already dominant for home computing back in the 80s. Then by the 90s everywhere was C and C++.
I never quite understood how or why that happened.
>Pascal was a way better language and already dominant for home computing back in the 80s.
Not sure what you mean there ("dominant"?) but in the 1980s, BASIC was the dominant language on home computers because that's what they included. Commodore VIC-20 & 64, Atari 400/800, Texas Instruments TI-99 TI-BASIC, and IBM home computers like the PC jr had IBM-BASIC which was licensed from Microsoft's GW-BASIC.
Something like Turbo Pascal was only purchased by a minority of 1980s computer users. That's why 1980s home computer enthusiast magazines like Compute! published their code lists in BASIC: https://www.google.com/search?q=%22compute!%22+magazine+code...
>I never understood how C took over the desktop space as well.
I'm guessing it's because C Language was already popular on the commercial side because of UNIX implementations which spread into microcomputers. Microsoft dabbled in XENIX which was AT&T UNIX. In the 1980s when popular desktop programs such as DOS, Lotus-123, and WordPerfect migrated from pure assembly to <high_level_language> ... they all ended up choosing C instead of Pascal. Why did industry converge on C? Maybe because there were more C compilers for various platforms. (The portability and cross-platform angle.)
> Not sure what you mean there ("dominant"?) but in the 1980s, BASIC was the dominant language on home computers because that's what they included.
That was the late 70s / early 80s, and the firmware was all assembly, with most software being written in machine code because the speed of those BASIC interpreters left a lot to be desired.
I’m talking later in the decade when operating systems became the norm. Early versions of Windows and Mac OS were written in Pascal.
>Early versions of Windows and Mac OS were written in Pascal.
Windows 1.0 in 1985 was written in C Language and assembly. The x86 alternative os to MS-DOS such as Digital Research GEM was also written in C and assembly.
> The x86 alternative os to MS-DOS such as Digital Research GEM was also written in C and assembly.
Minor nitpick from what you’ve posted above (I’m sure you know this and perhaps had a brain fart when you posted):
GEM was DR's GUI, so an alternative to Windows. Their DOS equivalent (in fact, technically speaking, the predecessor to DOS) was CP/M.
I do take your larger point though, and thank you for the links. It was very informative. That example aside, however, it's still worth noting that Pascal was a popular language that fell out of favour in the 90s as C and its derivatives swept through the industry. Now it seems uncool to have ALGOL-like syntax, with languages either opting for C-braces or being whitespace-driven like Python. And I see that as a loss in terms of readability.
I probably just sound old now though — moaning about "the good old days" lol
Not in what concerned 16 bit platforms, at least in Europe.
Turbo Pascal, and compilers for Basic dialects were everywhere.
On Amiga, most stuff was being done in Assembly, Modula-2, AMOS, and yes if you bought the Commodore SDK, there was a C compiler.
Mac OS was written in Object Pascal, and eventually moved into C++ with MPW and Symantec PowerPlant around 1992.
On OS/2 we had Smalltalk and C++ with CSet++, on Windows Delphi (TPW still managed an appearance on Windows 3.1), VB, C++ with OWL, VCL, MFC.
Even if the kernel for Windows and OS/2 was written in C, the upper layer was all about C++ with those SDKs, which meant using proper strings, vectors and collection classes with bounds checking was already quite an improvement anyway.
C and UNIX only took new wind into their sails thanks to the GNU Manifesto that all FOSS software should be written in C.
I’m sure Pascal was much more widespread than your post suggests though. I’m sure I read somewhere that early Windows (as well as Lisa) was written in Pascal. And I do remember Pascal compilers for the Atari ST too. I’d be amazed if there wasn’t one for the Amiga.
But even just going back to DOS, Turbo Pascal was a real game changer for me.
These languages, when used without garbage collection and with dynamic memory allocation, are not memory safe.
Gaining memory safety at no runtime cost is the whole point of rewriting in Rust: it dramatically changes the behavior of the resulting program (it no longer has memory-related bugs and vulnerabilities).
The other languages might have slight benefits over C, but they either also have drawbacks (which makes them unsuitable) or the benefits are not worth a rewrite.
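To make the "memory safety without garbage collection" claim concrete, here is a minimal sketch of Rust's ownership model (a toy example, not kernel code). Freeing happens deterministically when the single owner goes out of scope; the compile errors in the comments are what rule out double frees and use-after-free without a GC:

```rust
fn main() {
    let s = String::from("hello"); // heap-allocated buffer, owned by `s`
    let t = s;                     // ownership moves to `t`; `s` is now invalid
    // println!("{}", s);          // compile error: borrow of moved value `s`
    assert_eq!(t, "hello");        // exactly one owner will free the buffer:
                                   // no double free, no use after free, no GC
}
```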
If it were some other open source project I could see where you're coming from, but if there's one community I do not expect to kowtow to language hype and zealotry it's the kernel devs. You know, the people who still coordinate their development through mailing lists and sending unified diffs around.
I write a lot of C, some of it in the kernel (although I don't think any of my code made it upstream since I mostly do vendor code for vendors who don't care for mainlining, so I don't think I qualify as a true kernel dev).
Out of the languages you mention, I think only Zig could be a serious contender, but it's probably way too novel and niche at this point. D has a GC and is probably less popular than Rust nowadays, and Ada uses garbage collection (or other forms of memory management that don't seem very suitable for kernel work, AFAIK).
But most importantly there seem to be people willing to actually write Rust code in the kernel and add support for it, whereas I don't see a very strong initiative to push Ada upstream...
C was a terrible language from day 1. But cheap high-performance compilers were rare until fairly recently, and language ergonomics or expressiveness are hard to measure objectively, whereas "my language is faster on these unrepresentative microbenchmarks" is easy to measure. If unix culture had cared about safety or quality or really anything other than microbenchmarks and the feeling of superiority you get from doing something gratuitously difficult they'd have rewritten everything in OCaml back in the '90s, sure. But kernel developers love the macho culture of being a hardcore elite even more than regular developers, and once you've framed the question "how fast a language should we use?" it's very hard to answer anything other than "the fastest language possible".
Rust is the first decent language that can match C on the silly microbenchmarks. D doesn't, Ada doesn't, and Zig is far less mature than Rust. That's the real reason it's succeeding where previous alternatives failed.
> The only reason there is a push for Rust so much harder than anything else can only be explained by the community and their almost aggressive spirit of "if its not written in Rust, it should be".
It could also be that Rust is just much better for the kernel use-case than the alternatives we've seen so far?
> No regard for software that works great, the engineers that put time and effort into it to make it secure and fast, if its not Rust it must be broken.
I wasn’t aware that this was a widespread attitude in the rust community, my personal experience has been the exact opposite..
It’s probably a view driven by the fact that there’s a lot of “Show HN” that mention Rust, or are just pet projects to rewrite something in Rust. It can definitely give the appearance of “if it’s not Rust, it should be”
The idiopathic priapism manifested by Rusties is pretty much identical to the enthusiasm every junior developer seems to show for rewriting everything. Toy examples, software that hasn't yet had contact with production — it's all just so easy. Then come shifting and unclear requirements, change orders, prioritizations, schedules, resource constraints, career changes, cowboys and petty dictators, and the next thing you know your mature C-based code mess is replaced with a less mature Rust-based code mess, and the few technical edge cases it was designed to avoid haven't solved any of the big important problems in software development.
Writing secure C code is a fool's errand. The sheer volume of C code and the vulnerabilities therein is perhaps the biggest obstacle to computer security today. I don't care if it's Rust or some other language, but it has been clear for some time that C needs to go.
I'm pretty sure that if you compare total "lines of code written in C/C++" or "projects written in C/C++" versus "vulnerabilities found", you'd be surprised by how small the result is.
See, not all code is "vulnerable", connected to the net, or facing a user.
But let's leave out of the count every project that is not a triple-A videogame written in C/C++. That set of projects alone would probably be larger than the number of "big" projects that will be written in Rust in the next 10 years. And they are not all "packed with vulnerabilities" — or at least, with nothing big enough to be read about here on HN.
Nothing can beat Ada when it comes to safety, and it has all the constructs in the language you want, and it has all the positives that you want. Heck, there is a reason for why Ada/SPARK is used for a lot of important projects, say, a cryptographic library.
In any case, Ada would be the perfect choice, IMO.
> Why, if C is so bad, wasnt one of the alternatives introduced to slowly replace C in the kernel years ago?
There are very few alternatives, and even fewer of them gained much popularity, so Rust is an exception here. Being better than C technically is not enough; you need something that other people want to use (and ideally already use). The kernel is more of a social than a technical thing.
> if its not Rust it must be broken
That's simply true, especially for operating systems. The amount of safety bugs they have is staggering, even after decades of work, and Rust largely fixes that.
I think it will matter where the model in C does not match the hardware in terms of CPU, cache, and auxiliary processors. Maybe the Rust developers will more quickly adjust to the assembly and machine language of chip developers.
As an outsider, it sounds like C just tries to be a good enough assembly language to save people some time.
Kind of. Unix was originally written in assembly. B was then created to write some userland applications, but it lacked some features that the PDP-11 supported, and thus it was slow. Eventually, as those features were added, that language was re-invented as C. However, it wasn't until Version 4 that the Unix kernel was rewritten in C. That is still very early in Unix's timeline, but certainly after the creation of Unix.
The kernel IMHO needs to be consistent, and the tools to build it need to be widely available. Many distributions have everything you need to build the kernel already installed. I think Rust will be a good choice for the kernel when people start talking about replacing Rust software with some new language "X".
I think Golang was put on a similar pedestal for a time, and then it got a lot quieter again. I really think it is a phase, but Google is also pushing for this. I talked with someone about it being mainly a way to get a better pipeline of new developers, but I don't yet see how that is the case.
This is HN... every few days there is another post about "$old_software in $new_language_of_the_week", years go by, languages of the week change, from ruby, to go, to rust, to...., and the old, original software is still the only useful variant.
Note that this submission is about a learning project where someone aims to write a (minimal?) Linux ABI compatible kernel.
To directly answer your question though. It's not that 'Rust memory allocations' panic. Rather it was the case that allocations widely used when the initiative was started could panic. This is being solved by writing a library with non-panicking memory allocations to be used in kernel development [1].
However, I'm not involved with this in any way, so I may be wrong.
Anyway, I don't see panicking as something a Linux kernel must never ever do. After all, we are all familiar with kernel oopses and panics, so clearly the existing body of code already does it in some situations. Just have the Rust code do that, if it wasn't its current behavior.
One can argue Rust is one way to get rid of those panics/oops by e.g. statically guaranteeing a pointer is not null after a certain point.
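For readers wondering what "non-panicking allocations" look like in practice, here is a hedged userspace sketch using the stable `Vec::try_reserve` API (the actual kernel Rust library has its own allocation types, so this only illustrates the pattern, not the kernel API). The last assertion illustrates the "statically not null" point: an `Option<NonNull<T>>` still fits in a single pointer-sized word, so the null case is handled in the type system at zero cost.

```rust
use std::collections::TryReserveError;
use std::ptr::NonNull;

// Fallible allocation: report OOM as a `Result` instead of aborting the
// process, which is the behavior a kernel needs.
fn alloc_buffer(len: usize) -> Result<Vec<u8>, TryReserveError> {
    let mut buf = Vec::new();
    buf.try_reserve(len)?; // fails gracefully on allocation failure
    buf.resize(len, 0);
    Ok(buf)
}

fn main() {
    let buf = alloc_buffer(16).expect("16 bytes should be available in a test");
    assert_eq!(buf.len(), 16);

    // `NonNull<T>` can never be null, and the compiler uses that null value
    // as the `None` niche, so the option costs no extra space.
    assert_eq!(
        std::mem::size_of::<Option<NonNull<u8>>>(),
        std::mem::size_of::<*mut u8>()
    );
}
```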
He (and other prominent kernel developers) has manifested his interest many times, but is taking a strict "wait and see" approach (as you would expect from any big project manager). Whether it happens or not, Rust has gotten closer to inclusion in Linux than any other language.
Where doesn't Rust 'compete' with C, and where is C the better choice in a kernel?
Sometimes an OS rewrite, even in a different language, can be faster for some tasks (IIRC Singularity and/or Midori outperform(ed) Windows on some tasks, but I can't find the link).
> Where doesn't Rust 'compete' with C, and where is C the better choice in a kernel?
My opinion as someone who writes C for embedded devices and has started to play with rust:
C is better than Rust primarily when the C code already exists and the Rust code doesn’t.
Rewriting the kernel will probably have little impact in terms of performance for the user; the main benefits would be for developers, because Rust can be more expressive (less boilerplate and simpler ways to do some things), and it should have fewer bugs, since it's performing more complex checks than C code would.
For me as an individual, I’d prefer to be writing Rust, I find it much more ergonomic (for my workflow at least). The main drawback I have is that no one else in my team really knows Rust (although that’s slowly changing).
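As a small illustration of the "less boilerplate" point (the function and path here are made up for the example): where a C caller would need an explicit `if (ret < 0) goto cleanup;` after every call, Rust's `?` operator propagates the error in one character while the type system forces the caller to deal with it.

```rust
use std::{fs, io};

// Hypothetical helper: read and trim a config file. Any I/O error is
// early-returned to the caller by `?` instead of being silently ignored.
fn read_config(path: &str) -> io::Result<String> {
    let text = fs::read_to_string(path)?;
    Ok(text.trim().to_string())
}

fn main() {
    // A path that should not exist: the error propagates as a value
    // instead of panicking or being dropped on the floor.
    assert!(read_config("/no/such/config/file").is_err());
}
```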
I'm wondering more about accessing hardware primitives - if you spend a dozen lines marshalling unsafe code around and ultimately writing the equivalent C code in Rust, and slightly slower (perhaps the compiler optimizes anyway?), is it worth the effort?
> if you spend a dozen lines marshalling unsafe code around and ultimately writing the equivalent C code in Rust, and slightly slower (perhaps the compiler optimizes anyway?), is it worth the effort?
The thing is, that's not generally where most of the bugs and development blockers lie here. When embedded development gets stuck, it's generally because:
a) A part of the hardware doesn't work like you think it works.
b) Two or more independent parts of code (think ISRs) are conflicting in some non trivial way.
Having well structured HALs helps with (a), while memory safety can prevent some errors on (b).
One of the nice things I've seen with Rust is how high quality the HALs look, even if some of the tooling looks a little shaky for my taste. Of course part of that is that manufacturer provided HALs are generally quite bad...
Also having lots of people share the same unsafe blocks is the best way to keep the bugs away, and sharing libraries with cargo beats the real way C code gets shared in embedded development (mostly through copy/paste from StackOverflow...).
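A toy sketch of what such a shared, audited `unsafe` block can look like (the `Reg` type is invented for illustration; real Rust HALs are more elaborate and use fixed device addresses behind singletons): the volatile access is confined to one small type, and every caller gets a safe API.

```rust
use std::ptr;

/// Hypothetical HAL-style register wrapper: the `unsafe` volatile access
/// lives in one auditable place instead of being scattered across drivers.
struct Reg(*mut u32);

impl Reg {
    fn read(&self) -> u32 {
        // SAFETY: whoever constructs `Reg` promises the pointer is valid,
        // aligned, and mapped for the lifetime of the wrapper.
        unsafe { ptr::read_volatile(self.0) }
    }
    fn write(&self, v: u32) {
        // SAFETY: same contract as `read`.
        unsafe { ptr::write_volatile(self.0, v) }
    }
}

fn main() {
    // Stand-in for a memory-mapped register so the sketch runs in userspace.
    let mut fake_register = 0u32;
    let reg = Reg(&mut fake_register as *mut u32);
    reg.write(0xDEAD_BEEF);
    assert_eq!(reg.read(), 0xDEAD_BEEF);
}
```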
I would like to see more innovative kernels and not toy projects here on HN, and I mean kernels implemented by people with years of real-world experience who bring new ideas (any dude can read some blog or book and badly re-implement a mini subset of Linux).
Hello everybody out there using minix - I'm doing a (free) operating system (just a hobby, won't be big and professional like gnu) for 386(486) AT clones. This has been brewing since april, and is starting to get ready.
Written by Linus Torvalds on Sunday, 25 August, 1991 in comp.os.minix
You should then search out every ShitApp made in Rust and post it on HN, because the chance that it will be good in 15 years is not zero — probably 1/10^10^10^10.
Eventually, a full OS or OS kernel written in Rust is going to compete with Linux whether you like it or not, unless either the maintainers want to have fun fixing the same old C vulnerabilities and bugs for more years, or they adopt Rust themselves.
You may not like it, Linus or his maintainers may not like it, but that is where the future is heading.
Downvoters: I know. The future is Rust, and change is scary for something like Linux. Good luck with the mountains of use-after-frees and memory corruption vulnerabilities; fixing those in every part of the kernel is a losing battle, and it's one Rust eliminates in the first place.
What I want is for HN to have less toy stuff upvoted because the title contains a specific keyword. I will submit tomorrow my X11 re-implementation in Rust: it is a thing I made on weekends, full of bugs, it is missing everything, and I am not experienced in anything graphics-related, but I read half a book about X11 and a few Rust blogs, and now I will be on the HN front page and get tons of stars on GitHub.
Long story short, HN should not, IMO, be a Rust forum where any toy or shit re-implementation is admitted, but it is what it is: fanboys will upvote anything until the Rust image gets completely destroyed.
Exactly. That also goes for the Rust upvote rings artificially upvoting any project with 'written in Rust™' in the title, or every single Rust patch release.
Chances are, the crate is not even production ready.