Nope. You don't need to "solve the halting problem" and I guess I'll explain why here.
Firstly, let's attack the direct claim. The reason you'd reduce to the halting problem is via Rice's theorem. But Rice's theorem only matters if we want to accept exactly the set of programs with the desired semantics. In practice you do one of two things: accept some incorrect programs too (as C++ does; that's what IFNDR is about, so now some programs have no defined behaviour at all, but oh well, at least they compiled), or reject some correct programs too (as Rust does). Now what we're doing is merely difficult rather than provably impossible, and people do difficult things all the time.
This is an important choice (there is technically a third option, but it's silly: you could do both, rejecting some correct programs while accepting some incorrect ones, the worst of both worlds), but in neither case do we need to solve an impossible problem.
Now, a brief aside on what Rust is actually doing here, because it'll be useful in a moment. Rust's compiler does not need to perform a static bounds check on an array index; what it needs to statically check is only that somebody wrote the runtime check. That satisfies the requirement.
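To make that concrete, here's a minimal sketch (mine, not the actual standard library code; checked_index is a hypothetical name) of what a checked access like a[i] on a slice amounts to:

fn checked_index<T>(slice: &[T], i: usize) -> &T {
    // The compiler only has to see that a check like this exists on the path
    // to the access; it does not have to prove that `i` is always in bounds.
    match slice.get(i) {
        Some(value) => value,
        None => panic!(
            "index out of bounds: the len is {} but the index is {}",
            slice.len(),
            i
        ),
    }
}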
But now back to laughing at you. While in Rust it's common to have fallible bounds checks at runtime (only their presence being tested at compile time), in WUFFS it's common for all the checks to happen at compile time, and so for your code to be rejected if the tools can't see why your indexing is always in bounds.
When WUFFS sees you've written arr[k], it treats this as a claim that you've proved 0 <= k < arr.len(). If it can't see how you've proved that, then your code is wrong and you get an error. The result is that you're going to write a bunch of math when you write software, but the good news is that instead of going unread because nobody reviewing the code bothered with the maths, the machine reads your math and runs a proof checker over it, so if you were wrong it won't compile.
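That's not WUFFS syntax, but here's a Rust-flavoured sketch of the discipline, assuming a checker that only accepts an index it can see is bounded by an enclosing branch (read_at and the fallback value are my own invention):

fn read_at(arr: &[u8], k: usize) -> u8 {
    if k < arr.len() {
        // The branch above is the "proof" that 0 <= k < arr.len()
        // (the lower bound is free because k is unsigned).
        arr[k]
    } else {
        // Hypothetical fallback for this sketch; WUFFS would instead refuse to
        // compile code where it can't see such a bound at the access site.
        0
    }
}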
I'm glad that I provide amusement, but rustc-1.63.0 still compiles this program that panics at runtime. My experience with Rust is 5 minutes, so the style may provide further amusement:
fn f(n: usize) -> usize {
    if n == 0 {
        10
    } else if n % 32327 == 0 {
        1
    } else {
        f(n - 1)
    }
}

fn main() {
    let a = [1, 2, 3, 4, 5];
    // f(12312) never hits the `% 32327` branch, so it recurses down to f(0) and returns 10.
    let n = f(12312);
    // `a` has length 5, so a[10] is out of bounds and panics at runtime.
    println!("{:?}", a[n]);
}
> but rustc-1.63.0 still compiles this program that panics at runtime.
An index OOB error? It's important to remember that a Rust panic is still memory safe. Perhaps you should read the article, or read up on what undefined behavior is.[0] Here, the Rust behavior is well defined: it will either abort or unwind.[1]
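As a small illustration of my own (not from the article), the failure mode under the default panic strategy is an unwind you can even catch; black_box just keeps the index opaque so the bounds check really runs at runtime:

use std::hint::black_box;
use std::panic;

fn main() {
    let a = [1, 2, 3, 4, 5];
    let n: usize = black_box(10);
    // The out-of-bounds access is defined behaviour: with the default panic
    // strategy it unwinds (caught here); with panic=abort it aborts. It is
    // never a wild out-of-bounds read.
    let result = panic::catch_unwind(|| a[n]);
    assert!(result.is_err());
    println!("the access panicked, and the process is still in a well-defined state");
}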
If you prefer different behavior, there is the get method on slices.[2]
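For example (my own sketch, reusing the array from the program above), get returns an Option instead of panicking:

fn main() {
    let a = [1, 2, 3, 4, 5];
    let n: usize = 10;
    // `get` returns Option<&T>: Some(&value) if the index is in bounds,
    // None otherwise, so the out-of-bounds case is handled without a panic.
    match a.get(n) {
        Some(value) => println!("{}", value),
        None => println!("index {} is out of bounds for length {}", n, a.len()),
    }
}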
Statically asserting at compile time that all memory accesses are either in bounds, or will result in a controlled unwind or exit of the process, guarantees there are no memory safety violations.
We understand you're saying it's not possible in the general case to assert that all memory accesses are in bounds. Instead, if you ensure all memory accesses are either in bounds or at least do not violate memory safety, you've achieved the requirement of "memory safety", regardless of runtime inputs.
> This subthread is about claims of STATIC checking of memory safety. A panic is not static checking. Perhaps you should read what you respond to.
Oh, I read it.
Rust, and for that matter the person you are replying to above, never claimed that Rust could statically check array bounds. You created that straw man. Yes, Rust does use static analysis as one important method of achieving memory safety, but Rust doesn't require static analysis in every instance for it to be Rust's big step function, or for Rust to achieve memory safety.
Yes, certain elements of memory safety can only be achieved at runtime. Fine. But using static analysis to achieve certain elements of memory safety at compile time, where possible, is obviously better than achieving them only at runtime, as in, say, Java or Fil-C.
If the only way of triggering spatial memory unsafety in C++ were vector[i], and that operation were defined to always interrupt execution, then yes, C++ would be considered memory safe. But that is not the case.
The equivalent of vector[i] in Rust is Vec::get_unchecked, which is marked as unsafe and is not the default that people normally reach for.
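A small sketch of my own to illustrate the opt-out: get_unchecked skips the bounds check entirely, so the unsafe block is where the caller takes on the proof obligation instead of the compiler or a runtime check.

fn main() {
    let v = vec![10, 20, 30, 40, 50];
    let i: usize = 2;
    // No bounds check is emitted here; `unsafe` is the caller's promise that
    // `i` is in bounds. With an out-of-bounds index this would be undefined
    // behaviour, unlike v[i], which panics.
    let x = unsafe { *v.get_unchecked(i) };
    println!("{}", x);
}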
We are, however, talking in this subthread about the compiler inserting bounds checks and (incorrectly) calling the process "static checking".
I refuted that point by pointing out that the same process, if done manually in C++, would not be considered "static analysis that provides memory safety for array access".
At compile time, Rust guaranteed memory safety here by ensuring a runtime check was present, which your code triggered, resulting in a panic.
A panic is not a violation of memory safety; if you wanted to violate memory safety you'd need to have caused that deref to succeed, and println to spit out the result.
EDIT: let me guess, did the panic message include "index out of bounds: the len is 5 but the index is 10"?