
> It's no nonsense in the terms that it is explained: although a problem may be mathematically undecidable, it may still be a solved problem in practical terms - i.e. as finding good enough solutions for real world instances.

This is well known; it's the whole reason for formal methods in CS and for languages like Coq, Idris and so on. There is nothing revolutionary about it.
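To make that point concrete (a toy sketch, not anyone's proposed workflow): termination is undecidable for programs in general, yet a conservative checker that only certifies a restricted class of programs is easy to write and sound in practice, which is exactly the "undecidable in theory, solved for real instances" situation the quote describes.

```python
# Toy illustration: the halting problem is undecidable in general,
# but a conservative checker for a restricted pattern is easy and sound.
# We only "certify" loops of the form `for ... in range(...)`, which
# always terminate; anything else is conservatively rejected.
import ast

def certainly_terminates(src: str) -> bool:
    """Return True only if every loop in `src` is a `for ... in range(...)`."""
    tree = ast.parse(src)
    for node in ast.walk(tree):
        if isinstance(node, ast.While):
            return False  # cannot certify general while-loops
        if isinstance(node, ast.For):
            it = node.iter
            if not (isinstance(it, ast.Call)
                    and isinstance(it.func, ast.Name)
                    and it.func.id == "range"):
                return False  # iterating over something we can't bound
    return True

print(certainly_terminates("for i in range(10):\n    print(i)"))  # True
print(certainly_terminates("while True:\n    pass"))              # False
```

Real termination checkers (as in Coq or Idris) are far more sophisticated, but the shape is the same: give up completeness, keep soundness.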

The author should probably spend more time explaining why they think we will see a golden age of formal methods when the industry has consistently pointed in a different direction for decades, rather than making vague associations or insinuating that we "misunderstand" Turing.

> As for point 3, I'd say you're making the same mistake; a model may be mathematically powerful yet practically unwieldy, and computing models from the 50's suffer this problem.

Compilers, interpreters, and the like can easily translate between equivalent computational models (e.g. the lambda calculus and Turing machines), and the formal results, including undecidability, apply to all of them. So it doesn't matter that a Turing machine is hard to program; nobody wants to actually use it as a programming language. It's a computational model.
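As a toy illustration of that equivalence: the same arithmetic can be expressed in a pure lambda-calculus encoding (Church numerals) and "compiled" down to ordinary machine integers in a couple of lines, giving the same answer in either model.

```python
# Church numerals: a pure lambda-calculus encoding of the naturals.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    """'Compile' a Church numeral down to an ordinary machine integer."""
    return n(lambda k: k + 1)(0)

def from_int(k):
    """Lift a machine integer back into the lambda-calculus encoding."""
    n = zero
    for _ in range(k):
        n = succ(n)
    return n

two, three = from_int(2), from_int(3)
print(to_int(add(two)(three)))  # 5 -- same result in either model
```

The translation is mechanical in both directions, which is why properties like undecidability carry over between equivalent models.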

> Similarly, the Von-Neumann architecture was a great solution for computing physics problems with the electronics available in the 50's, but basing every modern programming language on that paradigm is a mistake.

I would say that, if you make such a claim, you should state why it is a mistake and what should be done instead.



> I would say that, if you make such a claim, you should state why it is a mistake and what should be done instead.

Others have explained the problem with the von Neumann bottleneck,[1][2] and ways to overcome it or even redefine it.[3]

The industry is evolving toward programming paradigms less affected by that original computation model, such as functional, functional-reactive, or agent-based programming(*), yet most of these are still compiled down to von-Neumann-style runtimes for execution. Now that machine learning is increasingly used to solve new problems, many computations may be done on dedicated non-von-Neumann hardware like tensor processors or analog computers (which are coming back with a vengeance).

(*) Listed in increasing order of real-world expressivity, although the computational expressivity of all of them is mathematically equivalent to that of Brainfuck.

[1] https://dl.acm.org/doi/10.1145/1283920.1283933

[2] https://www.techtarget.com/whatis/definition/von-Neumann-bot...

[3] https://www.sigarch.org/the-von-neumann-bottleneck-revisited...



