
Why does the existence or absence of a finite name affect the reality of a number? You can put every real number in a one-to-one correspondence with itself, or into a one-to-one correspondence via some f(x).

This is a really interesting thought experiment, and I think in some ways reframes the problem of "exists" (which is nebulous) to "has a description which is finite (or countably infinite, but not uncountable)".

But I think if you could prove a way to countably label an arbitrary real number, you could probably prove inductively that you could do it for all of them. But you run face-first into the interesting number paradox and/or incompleteness.

https://en.wikipedia.org/wiki/Interesting_number_paradox



If unicorns exist, but humans can never observe them, do you really care whether they exist or not?

I am generally trying to reduce questions that are relevant only to philosophy, such as "Do real numbers exist?", to questions which you can actually answer.

To make something clear, rationals have a finite description. If you allow infinite descriptions, all real numbers can be described by Cauchy sequences. When I talk about countably many names, I am talking about the set of finite names humans use for numbers. There are only countably many finite names over any alphabet humans agree on, so you cannot cover the (uncountable) real numbers with names.
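A minimal sketch of the countability point (Python; the two-letter alphabet is a toy assumption of mine, not from the thread): listing every finite string shortest-first puts each one at a finite position in a single list, which is exactly what "countable" means.

```python
from itertools import count, product

def shortlex(alphabet):
    """Yield every finite string over `alphabet`, shortest first.

    Each string appears at some finite position in this one list,
    so the set of all finite names is countable -- too small to
    cover the uncountable reals."""
    for length in count(1):
        for chars in product(alphabet, repeat=length):
            yield "".join(chars)

gen = shortlex("ab")
first_six = [next(gen) for _ in range(6)]
print(first_six)  # ['a', 'b', 'aa', 'ab', 'ba', 'bb']
```

The same enumeration works for any finite alphabet, including the full set of mathematical symbols.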

The interesting number paradox only works for countable sets unless I am mistaken. You can have "interesting" real numbers.


When the set of "finite names" you discuss requires more space than the universe has atoms and/or Planck lengths, this entire discussion feels incredibly arbitrary.

Consider Graham's number, for example. Generate for me a random number between a googolplex and Graham's number, and describe it to me uniquely.

Despite this random number being a finite value describable by a finite number of digits, there's simply not enough space in the known universe to actually write down the hypothetical random number chosen. I've chosen numbers so large that it's impossible for you to literally describe these finite numbers.


Your insistence here & in the other sub-thread that these numbers are represented as a string of digits suggests that you don't understand the argument being made.

If you don't insist on this, then let me give a quick counterexample to "All the numbers larger than say... a googolplex": googolplex + 1


Sure. But my own understanding of your stance has evolved throughout today. My current counterargument is:

> Generate for me a random number between a googolplex and Graham's number, and describe it to me uniquely

I know you cannot do this, and I presume you also know you cannot do this. Just because there exists a finite representation doesn't mean you can tell me that representation.

-------

I insist upon this because your requirement of "finite representation" is seemingly arbitrary to me. We can enumerate all important numbers as, say, all numbers represented by algebra (addition, subtraction, multiplication, division, roots, exponents, variables, logs, sine, cosine, integrals, derivatives, Knuth's notation, and other functions... etc. etc.) that can be described in fewer than one million symbols.

And now we have a significant number of irrational and imaginary numbers, specifically the set that we'll figure out with modern mathematics. It's a finite set (by bounding it at one million symbols, we have a finite number of numbers), and arguably the more important set of numbers for how modern math actually functions.
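The finiteness claim is just a geometric series, sketched here in Python (the 1,000-symbol vocabulary and the tiny length cap are illustrative parameters of mine, not the thread's one-million-symbol bound): with k symbols and descriptions of length at most L, there are at most k + k^2 + ... + k^L descriptions.

```python
def max_descriptions(symbols: int, max_len: int) -> int:
    """Upper bound on distinct descriptions over a finite vocabulary:
    the geometric series symbols^1 + symbols^2 + ... + symbols^max_len."""
    return sum(symbols ** n for n in range(1, max_len + 1))

# Toy parameters: a 1,000-symbol vocabulary, descriptions up to 4 symbols.
print(max_descriptions(1000, 4))  # 1001001001000 -- finite, however large
```

Plugging in a real mathematical vocabulary and a one-million-symbol cap gives an astronomically large but still finite set.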

--------

I dare say that all "important" numbers follow my (relatively arbitrary) definition above (all numbers describable in one million modern mathematical symbols or fewer), and that this set is certainly more important than, say, most of the numbers between a googolplex and Graham's number.


>your stance

Just to be clear, I'm not the person you were originally replying to. You can describe what you're talking about by writing a program--whether or not it terminates in our lifetime does not change that it is in fact a representation of the number.


I can confidently say that the space needed to uniquely describe a truly uniformly random number between a googolplex and Graham's number is so large that THE PROGRAM cannot be written down in this universe, even with all the atoms in the universe at our disposal.

Graham's number is very, very, very large. It is finite, it is an integer, but it is absurdly huge. Graham's number itself has a compact description, but describing a number picked uniformly at random close to it is basically impossible.


Graham's number can be computed in a few lines of code using Knuth's up-arrow notation.
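A sketch of that claim (Python; my own toy implementation, not from the linked codegolf thread) does fit in a few lines. The catch, as the rest of the thread argues, is that the recursion is only feasible to actually evaluate for tiny arguments:

```python
def up(a: int, n: int, b: int) -> int:
    """Knuth's up-arrow: a ↑^n b (n arrows). Feasible only for tiny inputs."""
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    return up(a, n - 1, up(a, n, b - 1))

def graham(layers: int = 64) -> int:
    """g1 = 3 ↑↑↑↑ 3, g_{k+1} = 3 ↑^{g_k} 3; Graham's number is g64.
    A few lines to *define*, but calling this with layers=64 would
    never terminate in practice."""
    arrows = 4
    for _ in range(layers):
        arrows = up(3, arrows, 3)
    return arrows

# Sanity checks on tiny values only:
print(up(3, 1, 3))  # 27  (3^3)
print(up(2, 2, 3))  # 16  (2↑↑3 = 2^(2^2))
```

So "computed in a few lines of code" here means "defined by a short program", not "evaluated".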


> uniquely describe a truly uniformly random number

This is the hard part: picking a uniformly random number "close to" Graham's number.

Graham's number itself is easily described, of course: I can just say "Graham's number". The numbers "close to it" (say, +/- 1% of Graham's number) are almost all impossible to describe.

If you don't believe me, then please ship me the impossible number of hard drives that describes one such number, but you must have picked it truly at random. Pick one at random.

-------------

EDIT:

> Graham's number can be computed in a few lines of code using Knuth's up-arrow notation.

Also, this is wrong. The first number of the pyramid, g1 = 3↑↑↑↑3, can be described in up-arrow notation. But even the 2nd number of the pyramid requires g1 up-arrows to describe, and g1 is already incomprehensibly large (even 3↑↑3 is ~7.6 trillion).

I can safely say that g3 (which requires g2 arrows to describe) has more up-arrows involved than there are atoms in this universe. So g3 already cannot be described by a computer program using up-arrow notation alone. And Graham's number is g64, sitting on top of a huge pyramid of such numbers.


Again, you're implicitly assuming the representation has to be a string of digits. I'm not sure that I can convince you, or even what to convince you of, as you aren't accepting the premises of the argument. The numbers +/- 1% of Graham's number can be trivially represented with a program. Likewise, here's some people codegolfing programs to output Graham's number[1]. If it seems like I'm cheating by using too powerful of a method, that's the point that the original person was making: there exist real numbers that can't be described even like this.

[1]: https://codegolf.stackexchange.com/questions/83873/theoretic...


> Again, you're implicitly assuming the representation has to be a string of digits

No. I'm implicitly assuming that uniquely describing g64 different numbers requires at least log(g64)/log(alphabet size) symbols, and thus at least that many atoms (assuming each atom in this universe can represent, say, 256 different states, i.e. a byte, then all the atoms in the universe cannot be lined up to uniquely describe the entire set of numbers within g64 +/- 1%).

Let's say you have a program that accurately describes g64; good job. Now g64 - 1. Now g64 + 1. Now... g64 + 2. Etc. etc. I get that you can represent some of these numbers in a compact form. But I'm asking for something far more difficult.

How many symbols do you need before you can describe every number in g64 +/- (1% of g64)? Well... a lot, it turns out. The number of programs you'd have to write would fill all the hard drives in the universe before you're done writing even a smidgen of them.

-------

And my challenge is effectively closer to describing a random number from 50% of g64 to 100% of g64. This will require roughly log(g64) / log(alphabet size) symbols, which is easily beyond the number of atoms, or even the number of arrangements of atoms, in this universe. That's just the innate limitation of language in general.

--------

The numbers within g64 +/- 1% are so numerous that there's no way any set of programs written on any kind of hard drive can describe them all. The sheer number of numbers destroys the capabilities of all the hard drives of the world. It doesn't matter what magical notations you use; there are simply too many of them to describe.

-----------

EDIT: Or maybe this is easier to think about. All programs of size 1 MB or smaller can, at best, represent 2^(8 * 10^6) distinct numbers.

All programs of size 1000 zettabytes or smaller can, at best, represent 2^(8 * 10^24) distinct numbers.

As it turns out, 2^(8 * 10^24) is smaller than 1% of g64. So you cannot possibly uniquely represent those numbers (g64 +/- 1%) even with 1000 zettabytes worth of storage.

If we extend this out by another few hundred orders of magnitude, multiplying by 10^300 or so, it's still smaller than the size of the set g64 +/- 1%, so even if we used all the atoms of the universe in a giant, intergalactic hard drive, we still couldn't represent this set of numbers.
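The counting argument above can be sketched directly (Python; the specific byte counts are illustrative assumptions of mine): a program is just a byte string, so programs of at most N bytes can name at most 256^0 + ... + 256^N distinct numbers, and once the target interval is bigger than that, the pigeonhole principle says most of its members have no program at all.

```python
def distinct_programs(max_bytes: int) -> int:
    """Upper bound on programs of length <= max_bytes: every program is
    a byte string, and there are 256^k strings of exactly k bytes."""
    return sum(256 ** k for k in range(max_bytes + 1))

# Small-scale analogue of the g64 argument: programs of at most 2 bytes
# can name at most 65,793 numbers, so they cannot cover an interval of
# 100,000 integers, no matter how clever the notation is.
bound = distinct_programs(2)
print(bound, bound < 100_000)  # 65793 True
```

The same inequality with max_bytes set to the universe's storage capacity is the 1000-zettabyte argument above.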


> If unicorns exist, but humans can never observe them do you really care whether they exist or not?

I do, actually.

It depends what you are using the unicorns for. If you use them for their blood to extend your lifespan, then no, they don't exist. If you are using the metaphor of unicorns as "the go-to thing used as a counterexample of existence", then they do exist. "Existing" is nebulous.

> The interesting number paradox only works for countable sets unless I am mistaken. You can have "interesting" real numbers.

Ah yeah, right, that relies on sequencing.



