YES! This is exactly my complaint about math. I find the abstract proof-oriented math kinda interesting (although How to Prove It is an exception that I discuss below) but I _really_ want practical, real-world applications of this math.
Even though I learned linear algebra decades ago, Andrew Ng's example of using a matrix to encode 5,000 images and then doing linear algebra on it blew my mind. I've since used that perspective in many other fields. Not once have I applied a proof to solve a programming problem.
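To make that perspective concrete, here's a small sketch (the shapes and names are mine, not from Ng's course) of stacking a dataset of images into one matrix and then doing ordinary linear algebra on all of them at once:

```python
import numpy as np

# Hypothetical dataset: 5,000 grayscale images, each 20x20 pixels.
n_images, height, width = 5000, 20, 20
rng = np.random.default_rng(0)
images = rng.random((n_images, height, width))

# One matrix encodes the whole dataset: each row is one flattened image.
X = images.reshape(n_images, height * width)   # shape (5000, 400)

# Linear algebra now operates on every image simultaneously:
mean_image = X.mean(axis=0)                    # the "average" image
centered = X - mean_image                      # center the data
# Pixel-covariance across the dataset (a 400x400 matrix),
# the starting point for things like PCA/eigenfaces:
cov = centered.T @ centered / (n_images - 1)

print(X.shape, mean_image.shape, cov.shape)
```

The win isn't any one operation; it's that "5,000 images" stops being a loop and becomes a single object you can reason about algebraically.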
I've thought of publishing, i.e. blogging, examples I've come across, but that would just be a mish-mash of stuff I've read elsewhere with no overarching theme/framework. Besides, someone else must have done this already, no?
> Not once have I applied a proof to solve a programming problem.
There are people in this very thread insisting that proofs are extremely useful in programming. I dunno if I just picked up the same skills elsewhere (logic? Philosophy? Just... IDK, thinking and developing an absolute shitload of heuristics through years of experience?) or am entirely missing out and in fact don't have a clue how to program, but I don't see it (outside some rare niches where it probably is useful—Coq exists, after all).
Sure, the word "exhaustive" can apply both to accounting for all (reasonably) possible problems in a block of code and to proofs, but the former doesn't feel at all like working on proofs to me, to pick just one example. (Some posts have seemed to imply that accounting for e.g. edge cases is exactly one case in which experience with proofs comes in handy, but man, they feel like very different, barely-related activities to me.)
In a way, writing types is proof-esque. It establishes some basic correctness guarantees regardless of the programming language. Some concepts like refinement types make it seem more "mathy", by requiring quantifiers within programs: https://ucsd-progsys.github.io/liquidhaskell/blogposts/2019-...
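A toy illustration of that point (my own, not from the linked post): encode an invariant like "this list is non-empty" in a type, so a function that needs a non-empty input states that requirement in its signature instead of checking (or forgetting to check) at runtime.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class NonEmpty:
    """A list that is non-empty by construction: it always has a head."""
    head: int
    tail: tuple[int, ...] = ()

    def to_list(self) -> list[int]:
        return [self.head, *self.tail]

def maximum(xs: NonEmpty) -> int:
    # No empty-list case to handle: the type rules it out,
    # so max() here can never raise on an empty sequence.
    return max(xs.to_list())

print(maximum(NonEmpty(3, (1, 4, 1, 5))))  # 5
```

Refinement types push the same idea much further, letting the checker verify quantified properties ("every index passed to this function is in bounds") rather than just shapes.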
Someone without any background in the subject would probably find Dafny interesting.
In my experience, being familiar with inductive proofs is pretty useful for programming. For most non-trivial code involving recursion or loops, the way I personally "know it works" usually has the flavour of an inductive proof.
For example, I might not remember how to code bisection search, but I can figure out the loop invariant, and from there it's easy enough to code a working binary search. And even if you have right bisection search down to muscle memory, you can modify the loop invariant to create left bisection search if you need it, while if I tried to keep two binary search algorithms in my head, I feel it would be more error-prone.
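Here's my own rendering of that idea in Python: both variants fall out of the same loop, and the only thing that changes between left and right bisection is the invariant (and hence one comparison).

```python
def bisect_left(a, x):
    """Return the first index i such that a[i] >= x.

    Invariant: everything in a[:lo] is < x, and everything in
    a[hi:] is >= x. When lo == hi, that index is the answer.
    """
    lo, hi = 0, len(a)
    while lo < hi:
        mid = (lo + hi) // 2
        if a[mid] < x:
            lo = mid + 1   # a[:lo] is still all < x
        else:
            hi = mid       # a[hi:] is still all >= x
    return lo

def bisect_right(a, x):
    """Return the first index i such that a[i] > x.

    Same loop, with the invariant tweaked: a[:lo] is <= x,
    and a[hi:] is > x.
    """
    lo, hi = 0, len(a)
    while lo < hi:
        mid = (lo + hi) // 2
        if a[mid] <= x:
            lo = mid + 1   # a[:lo] is still all <= x
        else:
            hi = mid       # a[hi:] is still all > x
    return lo

a = [1, 2, 2, 2, 3]
print(bisect_left(a, 2), bisect_right(a, 2))  # 1 4
```

The "proof" part is just checking that each branch preserves its invariant and that the loop terminates; once those hold, correctness at `lo == hi` follows, and I don't have to memorize either variant.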