> the Unicode string "√5" is representable as 4 UTF-8 bytes
As the other person pointed out, this is representing an irrational number unambiguously in a finite number of bits (8 bits in a byte). I fail to see how your original statement was careful :)
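A quick sanity check of the byte count being discussed ("√" is U+221A, which UTF-8 encodes in 3 bytes, plus 1 byte for "5"):

```python
# Verify that the string "√5" really occupies 4 bytes in UTF-8.
s = "\u221a5"            # "√5"
encoded = s.encode("utf-8")
print(encoded)           # -> b'\xe2\x88\x9a5'
print(len(encoded))      # -> 4
```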
> representable in a finite number of digits or bits
Isn't "unambiguous representation" impossible in practice anyway? Any representation is relative to a formal system.
I can define sqrt(5) in a hard-coded table on a maths program using a few bytes, as well as all the rules for manipulating it in order to end up with correct results.
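As a sketch of that idea (my own illustration, not the commenter's actual program): numbers of the form a + b·√5 can be stored exactly as a pair of rationals, with the single rule (√5)² = 5 doing all the work:

```python
from fractions import Fraction

class Q5:
    """Exact representation of a + b*sqrt(5) with rational a, b."""
    def __init__(self, a, b=0):
        self.a, self.b = Fraction(a), Fraction(b)

    def __add__(self, other):
        return Q5(self.a + other.a, self.b + other.b)

    def __mul__(self, other):
        # (a + b*sqrt(5)) * (c + d*sqrt(5)) = (ac + 5bd) + (ad + bc)*sqrt(5)
        return Q5(self.a * other.a + 5 * self.b * other.b,
                  self.a * other.b + self.b * other.a)

    def __repr__(self):
        return f"{self.a} + {self.b}*sqrt(5)"

root5 = Q5(0, 1)          # sqrt(5) itself, stored in a handful of bytes
print(root5 * root5)      # -> 5 + 0*sqrt(5), exactly
```

All arithmetic stays exact; only the rules of the formal system give the pair (0, 1) its meaning as √5.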
Well yeah but if we’re being pedantic anyway then “render these bits in UTF-8 in a standard font and ask a human what number it makes them think of” is about as far from an unambiguous numerical representation as you could get.
Of course if you know that you want the square root of five a priori then you can store it in zero bits in the representation where everything represents the square root of five. Bits in memory always represent a choice from some fixed set of possibilities and are meaningless on their own. The only thing that’s unrepresentable is a choice from infinitely many possibilities, for obvious reasons, though of course the bounds of the physical universe will get you much sooner.
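The "choice from a fixed set" framing has a standard quantitative form, sketched here as an illustration (not from the thread): picking one of n possibilities takes ⌈log₂ n⌉ bits, which is zero when n = 1.

```python
import math

def bits_needed(n_choices):
    # Bits required to select one item from a fixed, finite set.
    # With a single possibility ("everything means sqrt(5)"), zero bits suffice.
    return math.ceil(math.log2(n_choices)) if n_choices > 1 else 0

print(bits_needed(1))    # -> 0
print(bits_needed(256))  # -> 8 (one byte distinguishes 256 possibilities)
```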
I think what the parent comment is saying is that lobster was likely introduced as an elite/rare dish to people in the current century, increasing its appeal.
That is exactly the point they are trying to make... that you enjoyed it BECAUSE you thought of it as a delicacy and not as peasant food.
I think the point is a little overwrought, really... while our expectation is part of what makes it taste good, it doesn't completely change what we think... there are a lot of foods that are considered delicacies that a lot of people don't like.
> Because our program just consists of a sequence of regular expressions, you can't loop at all! That, technically, means we can't actually perform Turing Complete [computation]. But we can do any bounded computation by just unrolling any loops we may have.
Although some (most?) regex implementations may be Turing complete. Though, going by the above quote, the author didn't make use of that.
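The unrolling trick from the quote can be sketched like this (my own toy example, not the article's program): a straight-line sequence of regex substitutions, with no loop anywhere, performs a bounded computation because the number of passes needed is known in advance:

```python
import re

# Sort the letters of a short string using only a fixed, unrolled
# sequence of regex substitutions. Each re.sub pass swaps every
# non-overlapping "ba" to "ab" (one bubble-sort pass); for inputs of
# length <= 4, three passes are guaranteed to suffice, so we write
# exactly three of them instead of looping.
s = "bbaa"
s = re.sub(r"ba", "ab", s)  # pass 1
s = re.sub(r"ba", "ab", s)  # pass 2
s = re.sub(r"ba", "ab", s)  # pass 3
print(s)  # -> "aabb"
```

The program text grows with the bound, but no control flow beyond straight-line execution is ever needed.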
That's the point, I think: for a large number of real-world algorithms, you don't actually need a Turing Machine. There was a very well-written explanation of this on the front page[1] some time ago, which concluded with:
> Any algorithm that can be implemented by a Turing Machine such that its runtime is bounded by some primitive recursive function of input can also be implemented by a primitive recursive function!
Also, "The Little Typer" book explores a language based on "primitive recursive functions" and shows what can be done in it and how.
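To make the quoted claim concrete, here is an illustrative sketch (mine, not from the linked article): primitive recursive functions are exactly what you can compute with for-loops whose bounds are fixed before the loop starts, with no unbounded `while` or search.

```python
# Primitive recursion sketched with bounded for-loops only.

def add(x, y):
    # add(x, 0) = x; add(x, n+1) = add(x, n) + 1
    acc = x
    for _ in range(y):
        acc += 1
    return acc

def mul(x, y):
    # mul(x, 0) = 0; mul(x, n+1) = add(mul(x, n), x)
    acc = 0
    for _ in range(y):
        acc = add(acc, x)
    return acc

def factorial(n):
    # fact(0) = 1; fact(n+1) = mul(n+1, fact(n))
    acc = 1
    for i in range(n):
        acc = mul(i + 1, acc)
    return acc

print(factorial(5))  # -> 120
```

Every loop bound here is computable before the loop runs, which is the defining restriction of primitive recursion.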
I appreciate the resources and recommendations. I've been interested in Rocq (formerly Coq) recently, and I've seen dependent types mentioned, so I've been curious to learn more.
On that note, I discovered Dafny[1] recently, as a more accessible way to program with proofs. There's a companion book[2] that seems very accessible and down-to-earth (and, unfortunately, quite expensive). I didn't have the time to play with it yet, but it looks like it does what Ada/SPARK does (and more), but with less verbose syntax and more options for compilation targets. It seems to be actively developed, too. Personally, I had a very hard time getting into Coq, which is a proof assistant more than a programming environment - Dafny seems much more welcoming for a "working programmer" :)
I appreciate even more ideas to work with. A more "working" proof language sounds interesting. While I agree that Rocq is decidably not for a "working programmer," I've had a lot of fun working through the book "Software Foundations". Last night, I was able to formally prove the pumping lemma for regular languages, and that was very satisfying and enjoyable. Another reason I'm learning Rocq is due to my (largely uninformed) interest in homotopy type theory.