1. Overflow doesn't cause memory unsafety in Rust, so trying to imply that it's not a genuinely safe language is incorrect.
2. Even if it did, saturating on overflow is exactly as safe as trapping, and has the added benefit of not introducing an untold number of new ways that your program can panic.
3. Trapping on overflow is oversold as a solution to logical errors because you're still opening yourself up to approximately four billion (or 18 quintillion, depending on width) invalid states before your program bites it. If you want a real solution, stick numeric bounds in your type system.
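For clarity, here's a minimal sketch of the distinction between those behaviors, using the explicit methods current Rust exposes on its primitive integer types:

```rust
// Minimal sketch of wrapping vs. saturating vs. trapping/checked arithmetic,
// using the explicit methods on Rust's primitive integer types.
fn main() {
    let x: u8 = 250;

    // Wrapping: silently wraps around (250 + 10 = 260 mod 256 = 4).
    assert_eq!(x.wrapping_add(10), 4);

    // Saturating: clamps at the type's maximum instead of wrapping.
    assert_eq!(x.saturating_add(10), u8::MAX);

    // Checked ("trapping" made explicit): returns None on overflow, so the
    // caller decides whether to panic, bail out, or recover.
    assert_eq!(x.checked_add(10), None);

    // Overflowing: returns the wrapped value plus an overflow flag.
    assert_eq!(x.overflowing_add(10), (4, true));
}
```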
So, memory safety isn't really my concern here--there are lots of reasons to care about it, mind you, but it's not what I'm getting at.
It's that, quite simply, not being able to easily catch integer arithmetic errors is a really annoying flaw in a language that needs to do systems work--especially if it's used in places where, say, you are decoding buffers or handling IO or basically getting data from an untrusted/faulty source.
It could even be done as a basic addition to the standard library...I frankly don't care how. It's just that it's a really obvious wart to anyone who's ever had to do safe integer work in the lingua franca of languages, C.
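Concretely, this is the kind of thing I mean when pulling a length field out of an untrusted buffer. The decoder and its constants are invented for illustration; the checked_* methods are what the standard library already provides:

```rust
// Hypothetical length-field decoder for an untrusted buffer; the framing and
// constants are made up, but checked_mul/checked_add are real std methods.
fn read_record_len(header: &[u8]) -> Option<u32> {
    // Length is a little-endian u32 in the first four bytes.
    let b = header.get(..4)?;
    let raw = u32::from_le_bytes([b[0], b[1], b[2], b[3]]);

    // Attacker-controlled value: every arithmetic step is checked, so a
    // hostile length can't silently wrap into a "small" size.
    raw.checked_mul(8)? // assume each record is 8 bytes
        .checked_add(16) // plus an assumed fixed-size trailer
}

fn main() {
    assert_eq!(read_record_len(&100u32.to_le_bytes()), Some(816));
    // A length that would silently wrap with plain `raw * 8`:
    assert_eq!(read_record_len(&u32::MAX.to_le_bytes()), None);
}
```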
Just because the program doesn't crash doesn't mean it's being useful.
It sounds like there is confusion between "trapping" and "wrapping" here.
"Introducing 4 billion [...]" sounds like a description of the failure modes of wrapping. In contrast, trapping interrupts the flow of the program, so there are zero "invalid states before your program bites it".
Trapping seems like a very appealing solution, from the perspective of application code. The primary downsides are that it's somewhat inefficient on today's popular hardware architectures, and somewhat inconvenient for optimizers.
They are invalid states because, with the exception of indexing into an array or doing something else directly related to the memory size of your platform, INT_MAX is never the point at which the value stored in a fixed-width numeric type ceases to make sense. If I have an RPG where I want to cap a character's stats at 99 (been playing too much Dark Souls, I think...) yet fail to implement a manual sanity check, I open myself to, at best (with an i8), the 127-99=28 invalid states above the cap (and in the worst case, with a u64, the aforementioned 18 quintillion invalid states). My program is in an invalid state during all of this, and trapping arithmetic can't help me. This is why I said that to actually solve this problem you need numeric bounds in your type system.
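To make that concrete, here's a hypothetical sketch of the kind of bounded type I mean. Rust has no built-in ranged integers, so the invariant has to live in a newtype's constructor and operations:

```rust
// Hypothetical bounded stat type: the 0..=99 invariant is enforced by the
// type, not by sanity checks scattered around the code.
#[derive(Debug, Clone, Copy, PartialEq)]
struct Stat(u8); // invariant: value <= 99

impl Stat {
    const MAX: u8 = 99;

    fn new(value: u8) -> Option<Stat> {
        if value <= Self::MAX { Some(Stat(value)) } else { None }
    }

    // Buffs saturate at the cap instead of ever producing one of the
    // out-of-range values the parent comment worries about.
    fn buffed(self, amount: u8) -> Stat {
        Stat(self.0.saturating_add(amount).min(Self::MAX))
    }
}

fn main() {
    let dex = Stat::new(95).unwrap();
    assert_eq!(dex.buffed(40), Stat(99)); // capped, never invalid
    assert_eq!(Stat::new(120), None);     // can't even construct one
}
```

A const-generic Bounded<MIN, MAX> could generalize this, but the point stands: the type owns the invariant, rather than every call site remembering to check it.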
When I was in high school, my friend and I played a robot fighting game, where you had a fixed amount of energy and every action would consume some of it. My friend found a way to perform a large number of expensive actions in one turn, resulting in energy underflow. With some care, he found a way to achieve the maximum possible energy. At no point was the energy out of bounds.
Sanity checks can detect invalid states, but not invalid calculations (which have intermediate values that may exceed the bounds - consider calculating the average stat in your Dark Souls example). You can get an unexpectedly valid state from a calculation that overflows, and thereby pass a manual safety check. Trapping arithmetic will catch these.
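A made-up illustration of that: every input is a valid stat, and the final "average" lands back in range, so a bounds check on the result sails right past the bug:

```rust
fn main() {
    // Ten valid stats, all under the 99 cap; the true average is 93.
    let stats: [u8; 10] = [90, 95, 88, 99, 92, 97, 91, 94, 96, 89];

    // Accumulating in a u8 with wrapping arithmetic: the intermediate sum
    // (931) wraps to 163, so the "average" comes out as 16. That value is
    // perfectly in-bounds for a stat, so a manual sanity check on the
    // result accepts it.
    let wrapped_sum = stats.iter().fold(0u8, |acc, &s| acc.wrapping_add(s));
    assert_eq!(wrapped_sum / 10, 16);

    // Trapping/checked arithmetic refuses to produce the bogus intermediate.
    let checked_sum = stats.iter().try_fold(0u8, |acc, &s| acc.checked_add(s));
    assert_eq!(checked_sum, None);
}
```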
You are correct that trapping on overflow isn't useful as a substitute for bounds checking on variables that have a well-defined maximum and/or minimum value. However, that doesn't mean overflow checking is useless. It remains extremely useful to prevent silent incorrect behavior when an otherwise unbounded value hits implementation limits. In fact, I'd call it essential to any language that aims for the level of robustness Rust aspires to.
I'm not trying to say it's useless. :) I think it's very valuable, but we need to be honest about its shortcomings rather than viewing it as an unqualified win. A language like Rust is completely worthless if it's not fast; we've got a dozen languages that are already memory-safe and within 2x the speed of C. Given that Rust's plethora of safety mechanisms mean that numeric overflow can't cause memory unsafety, I completely understand why they would take the practical route rather than the theoretically-perfect route (even if, as predicted, they get crucified for it by armchair language designers).
3. What numeric bound should YouTube have placed on their view counter (in the case I linked earlier)? 2 billion? Why? If the answer is "Because that's all that fits in an i32" then we're right back where we started: overflow takes many programs directly from a valid state to an invalid one.
Some values that you wish to model don't have any logical upper bound. In this case the correct solution is to forgo fixed-size integers entirely and use an arbitrary-precision integer. If that is impractical for your given domain, then you need to select the largest fixed-size numeric type that is practical and resign yourself to some degree of potential incorrectness.
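For example, using the third-party num-bigint crate (the standard library has no arbitrary-precision integer, so this assumes the crate is added as a dependency):

```rust
use num_bigint::BigUint; // third-party crate, listed under [dependencies]

fn main() {
    // Start just past the range that bit the 32-bit view counter.
    let mut views = BigUint::from(u32::MAX);
    views += BigUint::from(1u32);
    // Keeps growing without wrapping, saturating, or trapping.
    println!("{views}");
}
```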
What's the problem with just trapping overflow by default again?
I'm willing to bet many would happily take the performance hit in favor of having a genuinely safe language.
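For what it's worth, that option already exists: overflow checks are on by default in debug builds, and release builds can opt in via `overflow-checks = true` under `[profile.release]` in Cargo.toml (or `-C overflow-checks=on`). A tiny sketch of the difference:

```rust
fn bump(count: u8) -> u8 {
    count + 1 // checked build: panics with "attempt to add with overflow"
              // unchecked release build: silently wraps to 0
}

fn main() {
    println!("{}", bump(u8::MAX));
}
```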