Technically, this is a supervised learning NN library that implements the canonical backprop algorithm. It also looks like all networks are feed-forward and fully connected, with neurons activated by the logistic sigmoid (1 / (1 + e^-x)). Some form of cross-validation appears to be used as well, but I haven't looked into which kind.
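For anyone unfamiliar, a fully connected feed-forward layer with sigmoid activation boils down to a weighted sum squashed through 1 / (1 + e^-x). Here's a minimal from-scratch sketch; the function names and data layout are illustrative, not the library's actual API:

```javascript
// Logistic sigmoid: maps any real input into (0, 1).
function sigmoid(x) {
  return 1 / (1 + Math.exp(-x));
}

// One fully connected layer: each output neuron takes a weighted
// sum of all inputs plus a bias, then applies the sigmoid.
function layerForward(inputs, weights, biases) {
  return weights.map(function (row, j) {
    var sum = biases[j];
    for (var i = 0; i < inputs.length; i++) {
      sum += row[i] * inputs[i];
    }
    return sigmoid(sum);
  });
}

// Two inputs feeding two output neurons.
var out = layerForward([1, 0], [[0.5, -0.5], [1, 1]], [0, 0]);
```

Stacking a couple of these layers and training the weights with backprop is essentially what such a library does under the hood.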
You probably want to add momentum or some other form of local optima escape/avoidance mechanism.
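For reference, momentum is a small change to the weight update: instead of stepping directly down the gradient, you keep a running "velocity" that accumulates past gradients, which helps the update roll through shallow local optima and plateaus. A minimal sketch (names are illustrative):

```javascript
// Gradient descent with momentum for a single weight.
// velocity accumulates a decaying sum of past gradient steps.
function makeMomentumUpdater(learningRate, momentum) {
  var velocity = 0;
  return function update(weight, gradient) {
    velocity = momentum * velocity - learningRate * gradient;
    return weight + velocity;
  };
}

var update = makeMomentumUpdater(0.1, 0.9);
var w = 1.0;
w = update(w, 2.0); // velocity = -0.2, w ≈ 0.8
w = update(w, 2.0); // velocity = 0.9 * -0.2 - 0.2 = -0.38, w ≈ 0.42
```

With momentum = 0 this reduces to plain gradient descent; typical momentum values are around 0.9.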
This is awesome. I think more resources like this will help spread the ML field to those who otherwise wouldn't be exposed to it. We should really have a core set of tools/libs like pybrain and opencv in every language.
harthur's Bayesian classifier module is also awesome: https://github.com/harthur/classifier. Using it to classify videos based on tags was the first time I'd made anything with ML.
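For the curious, the idea behind that kind of classifier is naive Bayes: score each category by its prior times the (assumed-independent) probabilities of the words in the input. Here's a toy from-scratch sketch of the technique; this is not the classifier module's actual API:

```javascript
// Toy naive Bayes text classifier with Laplace smoothing.
function NaiveBayes() {
  this.counts = {};   // category -> word -> count
  this.totals = {};   // category -> total word count
  this.docs = {};     // category -> number of training docs
  this.vocab = {};    // set of all words seen
  this.totalDocs = 0;
}

NaiveBayes.prototype.train = function (text, category) {
  this.counts[category] = this.counts[category] || {};
  this.totals[category] = this.totals[category] || 0;
  this.docs[category] = (this.docs[category] || 0) + 1;
  this.totalDocs++;
  var words = text.toLowerCase().split(/\s+/);
  for (var i = 0; i < words.length; i++) {
    var w = words[i];
    this.vocab[w] = true;
    this.counts[category][w] = (this.counts[category][w] || 0) + 1;
    this.totals[category]++;
  }
};

NaiveBayes.prototype.classify = function (text) {
  var words = text.toLowerCase().split(/\s+/);
  var vocabSize = Object.keys(this.vocab).length;
  var best = null, bestScore = -Infinity;
  for (var category in this.docs) {
    // log P(category) + sum of log P(word | category), smoothed
    var score = Math.log(this.docs[category] / this.totalDocs);
    for (var i = 0; i < words.length; i++) {
      var count = this.counts[category][words[i]] || 0;
      score += Math.log((count + 1) / (this.totals[category] + vocabSize));
    }
    if (score > bestScore) { bestScore = score; best = category; }
  }
  return best;
};

var nb = new NaiveBayes();
nb.train("cat kitten pets funny", "cats");
nb.train("dog puppy fetch park", "dogs");
var label = nb.classify("kitten funny");
```

Tag-based video classification works the same way: treat each video's tags as the "words" and the label you care about as the category.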