> Again, you're implicitly assuming the representation has to be a string of digits
No. I'm implicitly assuming that distinguishing g64 different numbers requires at least log(g64)/log(alphabet size) symbols for at least one of them. Assuming each atom in this universe can represent, say, 256 different states (i.e. a byte), all the atoms in the world lined up still cannot uniquely describe the entire set of numbers in g64 +/- 1%.
Let's say you have a program that accurately describes g64. Good job. Now do g64-1. Now g64+1. Now g64+2. And so on. I get that you can represent some of these numbers in a compact form, but I'm asking for something far more difficult.
How many symbols do you need before you can describe every number in g64 +/- (1% of g64)? A lot, it turns out. The programs you'd have to write would fill all the hard drives in the universe before you'd finished even a smidgen of them.
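To make the counting concrete, here's a toy sketch in Python (my own illustration, not any standard notation). It just applies the pigeonhole bound, namely that N distinct numbers need at least log2(N) bits for at least one of their descriptions and nearly that many for almost all of them, to a small stand-in window, since g64 itself can't be held in memory:

```python
from math import ceil, log2

def min_description_bits(count):
    """Pigeonhole bound: to give `count` different numbers distinct
    descriptions (programs, strings, whatever), at least one description
    must be this long, and almost all of them must be nearly this long,
    because there aren't enough shorter strings to go around."""
    return ceil(log2(count))

# g64 can't be computed or stored, so use a toy stand-in: every integer
# in a googol-wide window. Even here, clever notation can't beat the bound.
window_size = 10 ** 100
print(min_description_bits(window_size))   # 333 bits, no matter the notation
```

For the real challenge you'd replace the googol-wide window with one roughly 0.02 * g64 wide, and that's exactly the part no amount of notation can get around.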
-------
And my challenge is effectively closer to describing a random number between 50% of g64 and 100% of g64. A typical such number requires about log(g64) / log(alphabet size) symbols, which is easily beyond the number of atoms, or even the number of arrangements of atoms, in this universe. That's just an innate limitation of language in general.
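As a rough back-of-the-envelope sketch (the 10^80 atom count is the usual ballpark estimate, and the one-byte-per-atom assumption is the one above):

```python
from math import log2

ATOMS_IN_UNIVERSE = 10 ** 80   # common ballpark estimate (assumption)
BITS_PER_ATOM = 8              # the one-byte-per-atom assumption from above

# log2 of the total number of distinct descriptions you could ever store,
# even using every atom as a one-byte memory cell: 256**(10**80) of them.
max_descriptions_log2 = ATOMS_IN_UNIVERSE * BITS_PER_ATOM

# Any set with more than 2**(8 * 10**80) members can't all get unique
# descriptions. The range from 50% of g64 to 100% of g64 has about g64/2
# members, and even log2(g64/2) dwarfs 8 * 10**80 beyond comparison.
print(f"log2(descriptions the universe can hold) = {max_descriptions_log2:.2e}")
```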
--------
The numbers in g64 +/- 1% are so plentiful that there's no way any collection of programs written on any kind of hard drive can describe them all. The sheer number of numbers overwhelms the capacity of all the hard drives in the world. It doesn't matter what magical notation you use; there are too many of them, and they're too large, to reasonably describe.
-----------
EDIT: Or maybe this is easier to think about. All programs of size 1 MB or smaller can, at best, represent roughly 2^(8 * 10^6) distinct numbers, because that's roughly how many distinct 1 MB files there are.
All programs of size 1000 zettabytes (10^24 bytes) or smaller can, at best, represent roughly 2^(8 * 10^24) distinct numbers.
As it turns out, 2^(8 * 10^24) is far smaller than 1% of g64. So you cannot possibly represent each of those numbers (g64 +/- 1%) uniquely, even with 1000 zettabytes worth of storage.
If we extend this out by another few hundred orders of magnitude, multiplying by 10^300 or so, it's still smaller than the size of the range g64 +/- 1%, so even if we used all the atoms of the universe as one giant, intergalactic hard drive, we still couldn't represent this set of numbers.
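Here's a sketch of that last comparison, using the power tower 3^3^3^3^3 as a stand-in, since g64 is incomparably larger but can't be manipulated directly; even log2 of the stand-in overflows a float, so the sketch compares one logarithm deeper:

```python
from math import log10, log2

# 1000 zettabytes = 10**24 bytes = 8 * 10**24 bits. There are fewer than
# 2**(bits + 1) distinct files of at most that many bits, so at most that
# many numbers can be named, whatever encoding the programs use.
storage_bits = 8 * 10 ** 24
log2_nameable = storage_bits + 1

# Stand-in for g64 (which is incomparably larger): T = 3^3^3^3^3, a power
# tower of five 3s. Even log2(T) is too large for a float, so compare one
# logarithm deeper: log2(T) = 3**7625597484987 * log2(3).
log10_of_log2_T = 7625597484987 * log10(3) + log10(log2(3))

print(f"log10(log2(numbers nameable with 1000 ZB)) ~ {log10(log2_nameable):.1f}")
print(f"log10(log2(the stand-in tower T))          ~ {log10_of_log2_T:.3e}")
# ~24.9 versus ~3.6e12: log2 of the nameable set is a 25-digit number, while
# log2 of T (and of a 1%-wide window around T) already has trillions of digits.
```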