The best part of the post was where someone suggested he increase his heap size to 1GB to get a Fibonacci algorithm to run.
"After pointing this out, a member of the Node.js community (post now deleted) suggested I might have an obsolete version of Node with a 1GB heap limit and that I recompile without the 1GB restriction so that this retarded algorithm can continue eating up all my system RAM."
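The original code vanished with the deleted post, so what follows is only a guess at the general shape under discussion, not the actual algorithm: a Fibonacci that pre-allocates a table slot for every value up to n. The function name and the fib(1e6) call are my assumptions.

    // Purely illustrative: a table-based Fibonacci that allocates one
    // slot per value. At n = 1e6 that is a million-entry backing store
    // of doubles, roughly 8 MB.
    function fib(n) {
      const table = new Array(n + 1);
      table[0] = 0;
      table[1] = 1;
      for (let i = 2; i <= n; i++) {
        // As IEEE-754 doubles, the values overflow to Infinity around
        // i = 1476, so most of the table is just Infinity filler; the
        // RAM is spent either way.
        table[i] = table[i - 1] + table[i - 2];
      }
      return table[n];
    }

    console.log(fib(1e6)); // Infinity, after allocating the whole table

An array like this is presumably where the figure of a million doubles below comes from.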
It is curious that an array with 1 million doubles would take anywhere near a gigabyte in V8. 1,000,000 * 8 bytes is about 8 MB. Even with 5000% overhead for the data structure itself, that's only about 400 MB. Where did the other 600 MB go?
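That arithmetic is easy to check in Node itself. A minimal sketch, assuming a stock build; the --expose-gc flag is only there to stabilize the heap reading, the file name is arbitrary, and the exact numbers will vary by V8 version:

    // Sanity-check the estimate above: grow the V8 heap by one million
    // doubles and see how far from a gigabyte it lands.
    // Run with: node --expose-gc measure.js
    function heapUsedMB() {
      if (global.gc) global.gc(); // force a collection for a stable reading
      return process.memoryUsage().heapUsed / (1024 * 1024);
    }

    const before = heapUsedMB();

    // Pushing non-integer values keeps the array in V8's unboxed-double
    // representation: roughly 8 bytes per element in the backing store.
    const arr = [];
    for (let i = 0; i < 1e6; i++) arr.push(i + 0.5);

    const after = heapUsedMB();
    console.log('heap growth for 1e6 doubles: ~' + (after - before).toFixed(1) + ' MB');

The growth should land somewhere around 8 to 16 MB (the backing store plus whatever slack the array's doubling growth strategy leaves), which is the point: the missing 600 MB has to be going somewhere other than the doubles themselves.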