Defaulting to bignums would turn every + in the program from O(1) into O(N) in the size of the operands. That makes secure programming pretty difficult - this attack's already bad enough:
http://arstechnica.com/business/2011/12/huge-portions-of-web...
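To make the O(N) point concrete, here's a rough sketch of what a single bignum "+" has to do under the hood (a hypothetical little-endian limb layout, not any particular library): a loop over every limb of the operands plus carry propagation, instead of one ADD instruction.

    #include <stdint.h>
    #include <stddef.h>

    /* Sketch only: add two n-limb little-endian bignums; out must hold n+1
       limbs. One source-level "+" becomes a loop over every limb -- O(N)
       in the size of the operands. */
    static void bignum_add(uint64_t *out, const uint64_t *a,
                           const uint64_t *b, size_t n) {
        uint64_t carry = 0;
        for (size_t i = 0; i < n; i++) {
            uint64_t s  = a[i] + b[i];
            uint64_t c1 = (s < a[i]);   /* carry out of a[i] + b[i] */
            uint64_t s2 = s + carry;
            uint64_t c2 = (s2 < s);     /* carry out of adding the old carry */
            out[i] = s2;
            carry  = c1 | c2;           /* at most one of these can be set */
        }
        out[n] = carry;                 /* result may grow by a limb */
    }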
It's a small thing, but it's nice to do. Or at least detect overflows and promote to a bignum in the default numerical implementation. It's not the end of the world if you don't, but it helps avoid a lot of bugs...
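Roughly what detect-and-promote looks like inside a runtime - a minimal sketch, assuming GCC/Clang for __builtin_add_overflow and using __int128 as a stand-in for a real heap-allocated bignum: the common case stays a single machine add, and only an actual overflow pays for the wider representation.

    #include <stdint.h>
    #include <stdio.h>

    /* Detect-and-promote sketch: __int128 stands in for a real bignum. */
    typedef struct {
        int      promoted;
        int64_t  small;   /* valid when !promoted */
        __int128 wide;    /* stand-in for an arbitrary-precision value */
    } num;

    static num add_num(int64_t a, int64_t b) {
        num r = { 0, 0, 0 };
        if (!__builtin_add_overflow(a, b, &r.small))
            return r;                        /* common case: still O(1) */
        r.promoted = 1;
        r.wide = (__int128)a + (__int128)b;  /* exact result, no wraparound */
        return r;
    }

    int main(void) {
        num x = add_num(INT64_MAX, 1);
        printf("promoted=%d\n", x.promoted); /* prints promoted=1 */
        return 0;
    }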
I really recommend having the program trap on overflow. There are also proposals like as-if-infinitely-ranged (AIR) integers, which give you bignum-like semantics without having to actually store a bignum.
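You can get the trapping behavior on today's compilers without waiting for AIR - e.g. with GCC/Clang's overflow builtins, or at the flag level with -ftrapv or -fsanitize=signed-integer-overflow. A minimal checked add, as a sketch:

    #include <stdint.h>
    #include <stdlib.h>

    /* Trap instead of silently wrapping, assuming GCC/Clang builtins. */
    static int64_t checked_add_i64(int64_t a, int64_t b) {
        int64_t r;
        if (__builtin_add_overflow(a, b, &r))
            abort();   /* overflow: trap rather than produce a wrapped value */
        return r;
    }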
It's really problematic to use them, though. Every integer would turn into a pointer, and the O(N) thing really is a problem. If you're doing secure coding you need careful control over data-dependent timing and memory access, since those are side-channel leaks, and nobody who makes "safe" languages appreciates this.
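Concretely, the classic example of a data dependency leaking: an early-exit comparison runs longer the more bytes match, which is exactly what you don't want when checking a MAC or a password hash. A branch-free version (sketch below) touches every byte regardless; variable-width bignums pull that kind of data-dependent work back into every arithmetic operation.

    #include <stdint.h>
    #include <stddef.h>

    /* Leaky: returns as soon as a byte differs, so timing reveals the
       length of the matching prefix. */
    static int leaky_equal(const uint8_t *a, const uint8_t *b, size_t n) {
        for (size_t i = 0; i < n; i++)
            if (a[i] != b[i])
                return 0;
        return 1;
    }

    /* Branch-free: always processes every byte; the running time does not
       depend on where (or whether) the inputs differ. */
    static int ct_equal(const uint8_t *a, const uint8_t *b, size_t n) {
        uint8_t diff = 0;
        for (size_t i = 0; i < n; i++)
            diff |= a[i] ^ b[i];
        return diff == 0;
    }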
Plus most people don't need numbers that big. I think it was even a mistake to make size_t 64-bit.