Hacker News

Actually I'd argue the example you provided is normal, as long as you authorise a particular encoding where every number n you're looking for is encoded as a string of n zeros.

It's then trivial to see that every number you can think of is encoded in there, and therefore so is any piece of data, music or movie that ever existed.

(I'm not sure we're allowed to fiddle with the encoding, but since we already allow ourselves to represent a piece of music as a number, we're talking about encodings anyway, so it doesn't seem like cheating to me...)
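A minimal sketch of the encoding described above, assuming the example number's digits follow the obvious pattern (each 1 preceded by one more zero than the last, i.e. 0.10100100010000...). Under the "n is a run of n zeros" encoding, every number up to the longest run so far trivially "appears" in the digits:

```python
def digit_string(blocks):
    # Fractional digits of 0.1010010001...: block i is i zeros followed by a 1.
    return "".join("0" * i + "1" for i in range(1, blocks + 1))

def encode(n):
    # The proposed encoding: the number n is represented by a run of n zeros.
    return "0" * n

s = digit_string(50)
# Every n up to 49 occurs in the digits under this encoding,
# simply because the zero-runs grow without bound.
assert all(encode(n) in s for n in range(1, 50))
```

This is exactly why the replies call it cheating: the "finding" is done entirely by the encoding, not by any statistical property of the digits.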




Normality of a number is defined with respect to a number base, so your trick with encoding is invalid. Otherwise, every computable number could be considered normal: take an algorithm for generating it, supply a random string (this is the "encoding"), disregard the random string, and you have a perfectly valid "normal" representation of your number. So it is cheating.


I agree that normality is a specific formalized concept, but you could always require that an encoding function like this is injective.


Encoding doesn't count. Normality is a very specific mathematical concept: https://en.wikipedia.org/wiki/Normal_number

Also, 1.01001000100001... is a good example of a number that is both irrational and transcendental but not normal.
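A quick empirical check of that last point, assuming the digits continue in the obvious pattern (each 1 preceded by one more zero than the last). A number that is simply normal in base 10 would have each digit appear with limiting frequency 1/10, but here the frequency of the digit 1 collapses toward 0:

```python
def digit_string(blocks):
    # Fractional digits of 1.01001000100001...: block i is i zeros then a 1.
    return "".join("0" * i + "1" for i in range(1, blocks + 1))

s = digit_string(200)  # 200 blocks = 20300 digits
freq_of_one = s.count("1") / len(s)
# A simply normal base-10 number needs each digit's frequency to tend to 1/10;
# here it is 200/20300, roughly 0.0099, and it tends to 0 as blocks grow,
# so the number cannot be normal.
```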



