True, but I am convinced that meditation can lead to awakening, which, apparently, involves the cessation of suffering and feeling one with the universe. It's not so far-fetched, since our experience is generated by our brains, and similar things can happen on drugs.
It really is readable, to me at least. I just wanted to say that because some people will perhaps read your comment, take it as true because they too struggle with it at the moment, and then discard Clojure. But I'm just in love with it---yes, even the syntax.
It would thread the (immutable) game state through all the functions, calling a hook (a function) between them to send network messages and the like (to minimize latency), and also collecting and returning any new events the functions created that I needed to handle (e.g. someone died when the characters attacked). It's almost like the state monad, but not quite.
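The shape I mean can be sketched roughly like this (in Python rather than Clojure, and with hypothetical names -- this is an illustration of the pattern, not my actual code):

```python
# Hypothetical sketch: thread an immutable game state through a list of
# update functions, call a hook between steps (e.g. to send network
# messages promptly), and collect the events each step produces.

def run_tick(state, update_fns, hook):
    """Thread `state` through each function; gather events along the way."""
    all_events = []
    for fn in update_fns:
        state, events = fn(state)   # each fn returns (new_state, events)
        all_events.extend(events)
        hook(state, events)         # side effects happen here, between steps
    return state, all_events

# Toy usage: state is a dict treated as immutable (copied, never mutated).
def attack(state):
    hp = state["hp"] - 5
    return {**state, "hp": hp}, (["died"] if hp <= 0 else [])

def regen(state):
    return {**state, "hp": state["hp"] + 1}, []

state, events = run_tick({"hp": 4}, [attack, regen], lambda s, e: None)
```

In Clojure this would be a `reduce` over the update functions, with the accumulator holding both the state and the events so far.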
Big O is not necessarily about execution time. Also, why is it ridiculous to describe execution time as "time complexity"? From http://en.wikipedia.org/wiki/Analysis_of_algorithms "Usually, the efficiency or running time of an algorithm is stated as a function relating the input length to the number of steps (time complexity) or storage locations (space complexity)."
> Also, why is it ridiculous to describe execution time as "time complexity"?
It's unnecessarily different from the normal definition of "complexity." When you say "complex" in normal conversation, you mean "complicated" or "hard to understand." You would never say "the time complexity of getting to the airport is 30 minutes."
It also begs the question. Wrapping your head around the idea of asymptotic complexity is the hard part of understanding big O. Defining big O in terms of complexity doesn't help if you don't understand the concept of asymptotic complexity yet.
I think the best way to explain this, by far, is with a graph.
If the definition uses "complexity" in this way then it's not using it in the "plain English" sense, but in a specialized sense. Quantity isn't complexity.
Branching factor 32 is great for lookups, but isn't it slower for modification? At least, one has to copy more array cells in total (31 untouched slots copied in each node along the path), no?
First of all, O(g(n)) is a set. It is the set of functions f(n) such that there exist positive constants n0 and C with C*g(n) > f(n) for all n > n0.
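A concrete instance of the definition, with a quick numeric spot-check (the function names and witnesses here are just my example): f(n) = 3n + 5 is in O(n), witnessed by C = 4 and n0 = 5, since 4n > 3n + 5 exactly when n > 5.

```python
# Spot-check the big-O definition: does C*g(n) > f(n) hold for all
# sampled n with n0 < n <= up_to?

def in_big_o(f, g, C, n0, up_to=10_000):
    return all(C * g(n) > f(n) for n in range(n0 + 1, up_to + 1))

print(in_big_o(lambda n: 3 * n + 5, lambda n: n, C=4, n0=5))  # True
```

Of course the check only samples finitely many n; the actual claim is for all n > n0, which here follows from simple algebra.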
Second, talking about O(g(n)) does not imply that the time complexity being discussed is the worst-case (or any other case) time complexity. One could for example say that algorithm A's best-case time complexity is in O(n), and its worst-case time complexity is in O(n^2).
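Insertion sort is the classic example of exactly this split: its best case (already-sorted input) is in O(n) and its worst case (reverse-sorted input) is in O(n^2). A hypothetical illustration, counting comparisons rather than measuring wall-clock time:

```python
# Count comparisons made by insertion sort, to contrast best-case
# (sorted input) with worst-case (reverse-sorted input) behavior.

def insertion_sort_comparisons(xs):
    xs = list(xs)
    comparisons = 0
    for i in range(1, len(xs)):
        j = i
        while j > 0:
            comparisons += 1                     # one comparison per inner step
            if xs[j - 1] > xs[j]:
                xs[j - 1], xs[j] = xs[j], xs[j - 1]
                j -= 1
            else:
                break
    return comparisons

n = 100
best = insertion_sort_comparisons(range(n))          # sorted input: n - 1
worst = insertion_sort_comparisons(range(n, 0, -1))  # reversed: n(n-1)/2
```

For n = 100 that's 99 comparisons in the best case versus 4950 in the worst -- linear versus quadratic, for the same algorithm.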