
It's a matter of economics already.

There are safer, better alternatives for almost everything we do, but economics dictates that we end up with a compromise. Rust would be just another compromise: a slightly different trade-off, no huge difference, and potentially a huge cost.

Silver bullets in software development do not exist, rust is no exception to this and the irrational hyping of rust as being a silver bullet actually has the opposite effect.

If rust is that much better at all aspects of software development (and not just in preventing one class of bugs) then it will find mainstream adoption. But you don't get that effect by ramming it down other people's throats, you get that effect by showing it in practice. And this is where rust - at least so far - is underwhelming.

Also, and this is another point of irritation with me, the rust community makes it seem as if theirs is the only language that will avoid this kind of bug, which is far from true: there are other platforms / languages with far wider adoption that have these traits as well.



I'm not promoting Rust, nor have I actually written anything in it. I do think though it's a step in the right direction.

> But you don't get that effect by ramming it down other people's throats, you get that effect by showing it in practice.

I disagree. I've seen many C programmers who think that C is the be-all and end-all of programming languages. Those people will not be convinced by showing them another technology that is better in practice. The fact that it's actually unbelievably hard to write correct software in C is somehow a point of pride for many people, though I speculate that few of them can actually follow through.

Of course there will always be logic bugs in software, even with formal systems and whatnot (i.e. bugs in the specifications). But memory bugs and the resulting security exploits could in many cases be a thing of the past already.

Many industries have similar regulations, e.g. seatbelts in cars. If somebody dies because of a faulty or non-existent seatbelt, there will be consequences. If somebody dies because they were talking on the phone and the car didn't prevent it, well, you can't control everything.

Is this a buffer overflow? Why was this written in C? No good answer - pay up.
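Not from the thread, just a sketch of the class of bug being argued about: the kind of out-of-bounds access that silently corrupts memory in C is caught in safe Rust, either as a `None` from a checked accessor or as a runtime panic on a direct index.

```rust
fn main() {
    let buf = [0u8; 4];

    // In C, reading buf[4] would silently walk past the array.
    // In safe Rust, the checked accessor returns None instead:
    assert_eq!(buf.get(4), None);
    assert_eq!(buf.get(3), Some(&0u8));

    // A direct out-of-bounds index like buf[4] would not corrupt
    // memory either; it would abort with a bounds-check panic.
    println!("no overflow possible here");
}
```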


> There are safer, better alternatives for almost everything we do. But economy dictates that we end up with a compromise. Rust would be just another compromise, slightly different stage, no huge difference and potentially a huge cost.

I think the argument is that the economics need to change. People need to stop being irresponsible - we need liability when someone is negligent, just like in any other engineering discipline.

> Silver bullets in software development do not exist, rust is no exception to this and the irrational hyping of rust as being a silver bullet actually has the opposite effect.

I keep seeing this, and yet I have not once seen anyone call it a silver bullet.

> Also, and this is another point of irritation with me, the rust community makes it seem as if theirs is the only language that will avoid this kind of bug, which is far from true, there are other platforms / languages with far wider adoption that have these traits as well.

Show me another language with no garbage collection, memory safety, and C/C++-level performance that has been anything other than academic.


> I think the argument is that the economics need to change. People need to stop being irresponsible - we need liability when someone is negligent, just like in any other engineering discipline.

I totally agree with that and have been a long-time proponent of liability for damage caused by software bugs.

But that would have to be all bugs, not just some classes of bugs.

> Show me another language with no garbage collection, memory safety, C/C++ level performance that has been anything other than academic.

D.

And by the way, rust is only 'memory safe' as long as you don't disable the safety mechanisms, so I think 'memory safe by default' would be a better way to describe it.


> I totally agree with that and have been a long time proponent of liability for damage caused by software bugs. But that would have to be all bugs, not just some classes of bugs.

Why? That goes against everything we've done to categorize bugs - we do not treat all bugs the same, and we would not consider all bugs to be due to negligence.

> D.

Only with a garbage collector or a 'safe' annotation. Rust has the opposite - a safe language by default.

> And by the way, rust is only 'memory safe' as long as you don't disable the safety mechanisms, so I think 'memory safe by default' would be a better way to describe it.

That's fine, memory safe by default is an acceptable way to refer to it. It's worth noting that I have written thousands of lines of rust code and never published code with a single line of unsafe. None of my projects have required it. So 'by default' is pretty powerful, since I've never ever needed to opt out.
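To illustrate what "memory safe by default" means in practice (a sketch, not code from either commenter): the opt-out is explicit and locally scoped, so the unsafe surface of a codebase is easy to find and audit.

```rust
fn main() {
    let v = vec![10, 20, 30];

    // Safe by default: indexing is bounds-checked.
    let first = v[0];

    // Opting out requires an explicit `unsafe` block.
    // get_unchecked skips the bounds check, so the programmer
    // takes responsibility for the index being valid.
    let second = unsafe { *v.get_unchecked(1) };

    assert_eq!(first + second, 30);
}
```

Because `unsafe` must be written out at every opt-out site, "grep for unsafe" gives a reasonable first pass at the audit surface, which is the point the commenter is making about never having needed it.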


> Why? That goes against everything we've done to categorize bugs - we do not treat all bugs the same, and we would not consider all bugs to be due to negligence.

That would be something you could only know by evaluating this on a bug-by-bug basis, the important thing to realize is that to an end user it simply does not matter what class of bug caused their data-loss, loss of privacy or loss of assets.


My point is that not all bugs lead to data loss or loss of privacy. But yes, I agree.

That said, I feel that we're now talking about memory safety vs semantic safety.

We can certainly prove a program is entirely memory safe. Rust's type system has been proven sound for the safe subset of the language, so the remaining work is to prove the unsafe code safe (an area of active research).

There is no way to prove all semantics of a program (Rice's theorem). Therefore I would argue that a bug due to a semantic issue is not necessarily negligence, whereas we could easily see memory safety issues from using C as a case of negligence.

But in terms of liability they would likely both fall into the same bucket.


Negligence in the legal sense of the word revolves around care. If someone is maintaining some codebase and causes a memory safety related bug you would have to look at that bug in isolation and what the person did to avoid it. Saying 'you should have rewritten this in rust in order to avoid liability' is not a standard of care that anybody will ever be held to.

So the ideal as you perceive it is so far from realistic that I do not believe pursuing that road is fruitful.

What we can do is classify bugs (or rather their results: the symptoms that cause quantifiable losses) in general as a source of (limited) liability. This will put out of business those parties that are unable to get their house in order, without mandating one technology over another (which would definitely be seen as giving certain parties a competitive edge, something I don't think will ever be done).

So that's why I believe that solution is the better one.

But in a perfect world your solution would - obviously - be the better one, unfortunately we live in this one.


> Saying 'you should have rewritten this in rust in order to avoid liability' is not a standard of care that anybody will ever be held to.

Why is 'you should not have exposed C code to the internet' an unreasonable basis for care?

I'm not saying your approach is wrong; I think they're not mutually exclusive. Your solution would provide an incentive to not use C in a technology-agnostic way. But at some point isn't it just irresponsible to write critical infrastructure, or code that exposes user information, in a language that has historically been a disaster for security?


> Why is 'you should not have exposed C code to the internet' an unreasonable basis for care?

Because it does not mesh with reality. There are hundreds of millions of lines of C code exposed to the internet in all layers of the stack. So if you start off with a hardcore position like that you virtually ensure that your plan will not be adopted.

It's the legacy that is the problem, not the future.



