I believe they are using a non-standard crossover operator in this implementation. A standard crossover operator will generate some permutation of the genes of the two parents: either you inherit the gene from your mother or your father. This implementation creates a new gene that is a weighted average of the genes of the two parents. This will probably cause rapid convergence of the gene pool, which may or may not be intended. Changing Math.random() to Math.round(Math.random()) in crossVars would change this to the standard crossover operator. I'd be interested to see how this affects performance.
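To make the difference concrete, here's a sketch of the two operators in JavaScript (the demo is AS2, but the math is the same; the function names are mine, not from its source):

```javascript
// Weighted-average ("blend") crossover, which is what the demo appears to do:
// each child gene is a random convex combination of the two parents' genes.
function crossVarsAveraged(mom, dad) {
  return mom.map((gene, i) => {
    const w = Math.random();               // blend weight in [0, 1)
    return w * gene + (1 - w) * dad[i];    // child gene lies between the parents'
  });
}

// Standard (uniform) crossover: each gene is inherited whole from one parent.
// Rounding the weight to 0 or 1 is the one-character change described above.
function crossVarsUniform(mom, dad) {
  return mom.map((gene, i) => {
    const w = Math.round(Math.random());   // 0 or 1, picked with equal probability
    return w * gene + (1 - w) * dad[i];    // child inherits mom's or dad's gene intact
  });
}
```

The averaging version can only ever interpolate between existing gene values, which is why it tends to collapse diversity quickly; uniform crossover preserves the parents' exact values in the pool.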
I'm sure I've run this as Java code years ago, along with another demo that evolved a car capable of navigating around a course.
Edit: I remember now - I was trying to learn neural networks, and the code demonstrated using a GA to evolve a neural network. Can't find a link, unfortunately, which is a shame as it's an interesting subject.
I just had a look at the source and it's written in pure AS2, which makes me think that it's probably kind of old. Either that or the person who wrote it never made the switch to AS3.
I just thought it was interesting because I haven't personally seen many realtime demos of genetic algorithms, especially ones that run in the browser, though I don't doubt that they've been kicking around for years.
Can someone please explain what the graph is? I can't quite figure it out. I'm assuming that the green is total distance traveled... but I can't figure out what the black one means.
In each generation there are 20 individuals. Each gets a score according to how far it gets. The horizontal axis is the generation. The black line shows the best in each generation, and the green line shows the average over the 20 individuals in that generation.
The black line is always above the green, but sometimes crashes way down. The green line doesn't change dramatically, and generally climbs.
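In code terms, the two plotted values per generation would be something like this (a sketch, assuming each generation is just an array of 20 fitness scores):

```javascript
// Hypothetical per-generation stats matching the graph description above:
// black = best score in the generation, green = average over its individuals.
function generationStats(scores) {
  const black = Math.max(...scores);
  const green = scores.reduce((sum, s) => sum + s, 0) / scores.length;
  return { black, green };
}
```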
I think the green line is average fitness of all iterations and the black line is a logarithmic index of each generation's maximum fitness to the optimal achieved (both relative to time). So the black line pretty quickly gets bounded by an asymptote at the top of the screen, because the improvements in fitness are far more significant early on. Basically an example of diminishing returns...
There is a great discussion of this (and a further mapping of fitness functions and evolutionary algorithms to a 2-dimensional "fitness landscape") in The Origin of Wealth by Eric Beinhocker - http://www.amazon.com/Origin-Wealth-Evolution-Complexity-Eco...
Absolutely phenomenal exploration of complexity economics.
Just checked the source -- farting around with it a bit to try and raise the timeout period among other things.
Nope, it's a lot simpler than that, as said above. Basically the x-axis is generation and the y-axis is distance. The black line is the max and the green line is the average.
In a lot of cases, "GA" just seems to be a fancy name for randomized search. Even when it does use elements from the evolutionary metaphor (crossover operators, etc.), you can often get equivalent results with a non-GA randomized-optimization method.
The interesting part here IMO is how to pose the problem as an optimization problem, and what representation they used for the candidate solutions. The details of how you optimize once you have those two things aren't as exciting, and on a lot of problems don't matter that much, so it's a bit of a shame that they always get emphasized (admittedly, the "genetic" and "evolutionary" metaphors make for nice PR, whereas a lot of randomized-optimization algorithms are really boring sounding).
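For comparison, here's the kind of boring-sounding non-GA method the parent comment has in mind: stochastic hill climbing over a real-valued vector. The `fitness` function and vector representation are placeholders, not anything from the demo.

```javascript
// A deliberately simple randomized optimizer: perturb one coordinate at a
// time and keep the change only if it improves fitness. No population, no
// crossover - yet on many problems it's competitive with a GA.
function hillClimb(fitness, dims, { steps = 1000, stepSize = 0.1 } = {}) {
  let best = Array.from({ length: dims }, () => Math.random());
  let bestScore = fitness(best);
  for (let i = 0; i < steps; i++) {
    const candidate = best.slice();
    const j = Math.floor(Math.random() * dims);        // pick a random "gene"
    candidate[j] += (Math.random() - 0.5) * stepSize;  // small random nudge
    const score = fitness(candidate);
    if (score > bestScore) {                           // greedy accept
      best = candidate;
      bestScore = score;
    }
  }
  return { best, bestScore };
}
```

The point being: once the representation and fitness function are fixed, swapping the GA for something like this often changes the results much less than the framing suggests.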