I thought TypeScript was able to do this kind of static analysis, as in the example below:
const alwaysTrue = true
if (alwaysTrue === false) { }
// This comparison appears to be unintentional because the types 'true' and 'false' have no overlap.(2367)
The back and forth on that thread seems to end up advocating for what we've already got: the compiler (TypeScript, in your example) not caring, but the linter caring.
The reason not to flag it in the transpiler is that it can be very helpful during testing to block or force certain code paths.
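For instance (a hypothetical snippet, not from the thread): pinning a flag to a constant while debugging forces one branch, and the compiler stays out of the way:

// Hypothetical debugging edit: pin the flag to a constant to force the fallback branch.
const useCache = false; // normally computed at runtime, e.g. from a feature flag
if (useCache) {
  console.log("serving from cache"); // deliberately skipped for now
} else {
  console.log("recomputing result"); // the path we want to exercise in this test run
}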
TypeScript is a superset of JavaScript, so it can't just stop you from writing valid JS. Generally speaking, any valid JS is valid TypeScript. That code isn't invalid, so TS won't do anything to stop you from writing it. Fortunately, linters exist for exactly this reason: catching these sorts of "gotchas" and "probably not what you intended" situations.
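For example, a minimal ESLint sketch along these lines (package and rule names are from ESLint and typescript-eslint; adjust to whatever config format and versions your project already uses) turns on two rules aimed at always-constant conditions:

// eslint.config.mjs -- minimal sketch, assuming a flat-config setup with typescript-eslint
import tseslint from 'typescript-eslint';

export default tseslint.config(
  ...tseslint.configs.recommendedTypeChecked,
  {
    languageOptions: {
      parserOptions: {
        projectService: true, // give the linter type information from your tsconfig
      },
    },
    rules: {
      // flags conditions that the types prove are always truthy or always falsy
      '@typescript-eslint/no-unnecessary-condition': 'error',
      // core ESLint rule for literal constant conditions such as `if (true)`
      'no-constant-condition': 'error',
    },
  },
);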
TypeScript often seems to make arbitrary decisions about what it considers a type error and what it doesn't. The usual argument is "TypeScript supports everything in JavaScript", but that is applied inconsistently.
A typical example is that string + number produces no warning. I understand that some people actually want that for "convenience", but it often turns out to be a mistake (e.g. passing the wrong variable). However, if you have a Map<string, number> and try map.get(3), that's an error, even though you can absolutely do that in JavaScript -- it just always returns undefined, no error thrown. (Both cases are sketched below.)
You can find a Stack Overflow thread about this. To me the current behavior doesn't make sense: I use TypeScript for static typing and to avoid implicit type conversions, and I want that applied consistently, with no exceptions -- that's why I use TypeScript in the first place. I'm a bit disappointed that there are a number of such holes in TypeScript.
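A minimal sketch of both cases (variable names are made up; run it through tsc with default strict settings):

const label: string = "count";
const n: number = 3;

// string + number: accepted by the compiler; the result is the string "count3"
const combined = label + n;

const map = new Map<string, number>();
map.set("three", 3);

// Passing a number key to Map<string, number>.get is a compile error, roughly:
// "Argument of type 'number' is not assignable to parameter of type 'string'."
// Plain JavaScript would accept the call and simply return undefined.
// @ts-expect-error -- demonstrating the rejected call described above
const missing = map.get(3);

console.log(combined, missing);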