
I am glad you like it. Hyperband relies on early stopping of the model, so it is something you would do yourself inside the objective function. I have plans to add some helper functions for early stopping in the future.
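
As a rough sketch of what that could look like (this assumes the usual Hyperactive pattern of an objective function that receives the sampled parameters via a dict-like argument; the model and the patience logic are just placeholders):

    import numpy as np
    from sklearn.datasets import load_digits
    from sklearn.linear_model import SGDClassifier
    from sklearn.model_selection import train_test_split
    from hyperactive import Hyperactive

    X, y = load_digits(return_X_y=True)
    X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)

    def objective(opt):
        # opt gives access to the currently sampled parameters (assumed dict-like)
        model = SGDClassifier(alpha=opt["alpha"], random_state=0)
        best_score, patience = -np.inf, 0
        for epoch in range(100):
            model.partial_fit(X_train, y_train, classes=np.unique(y))
            score = model.score(X_val, y_val)
            if score > best_score:
                best_score, patience = score, 0
            else:
                patience += 1
            if patience >= 5:  # stop early: no improvement for 5 epochs
                break
        return best_score

    search_space = {"alpha": [1e-5, 1e-4, 1e-3, 1e-2]}

    hyper = Hyperactive()
    hyper.add_search(objective, search_space, n_iter=20)
    hyper.run()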


Thank you very much :-) Yes, you can put anything you want into the search space, even pandas dataframes, numpy arrays or classes. Here is an example: https://github.com/SimonBlanke/Hyperactive/blob/master/examp...
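
To illustrate the idea (hypothetical names, assuming the usual search-space dict whose values are lists of candidates), the listed values can be classes or numpy-derived objects:

    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.ensemble import RandomForestClassifier

    X, y = load_iris(return_X_y=True)

    def objective(opt):
        # the sampled values are ordinary Python objects, here a class and a numpy integer
        model = opt["model_class"](max_depth=int(opt["max_depth"]))
        return cross_val_score(model, X, y, cv=3).mean()

    search_space = {
        "model_class": [DecisionTreeClassifier, RandomForestClassifier],  # classes
        "max_depth": list(np.arange(2, 12)),                              # numpy-derived values
    }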


Check out the Neural Architecture Search Tutorial here: https://nbviewer.jupyter.org/github/SimonBlanke/hyperactive-...

Neural Architecture Search is just one of many optimization applications you can work on with Hyperactive. Check out the examples in the official github repository: https://github.com/SimonBlanke/Hyperactive/tree/master/examp...


Yes, it is quite easy to switch algorithms via the "gpr" parameter; you just have to write a wrapper class. I am currently working on a repository that discusses how to do that in detail: https://github.com/SimonBlanke/surrogate-models I will upload some examples (GPy and GPflow) within the next 24 hours.

I think those wrapper classes will already answer some of your questions. If you have additional or unanswered questions, you can open an issue in the Gradient-Free-Optimizers repository.
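
A very rough sketch of the idea (this assumes the "gpr" parameter expects an object with fit(X, y) and predict(X, return_std=True), as sklearn's GaussianProcessRegressor provides; the exact interface is what the surrogate-models repository will document):

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import Matern

    class GPRWrapper:
        # hypothetical wrapper: adapts a surrogate model to the assumed
        # fit / predict(return_std=True) interface
        def __init__(self):
            self.gpr = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

        def fit(self, X, y):
            self.gpr.fit(np.asarray(X), np.asarray(y))

        def predict(self, X, return_std=False):
            return self.gpr.predict(np.asarray(X), return_std=return_std)

    # then pass something like gpr=GPRWrapper() to the Bayesian optimizer
    # (the exact construction may differ; see the repository above)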


Yes, Hill Climbing and some of its variants are featured in this package.


I have thought about adding a table to the readme that shows some kind of metric for each optimizer describing its performance. I will look into that.


I will look into this algorithm. Thanks for the suggestion. I have some basic explanations of the optimization techniques and their parameters in a separate repository: https://github.com/SimonBlanke/optimization-tutorial

But there is still a lot of work to be done.


Two very interesting questions! I should soon work on a comparison of Hyperactive and GFO with other packages. If some important features are missing, maybe I could add them.

I will also look into Dlib. If you like, you can open an official issue in my repository, and we can discuss why it is useful and whether it should be implemented.


Gradient-Free-Optimizers has a type of genetic algorithm: the Evolution Strategy.
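
Roughly, using it could look like this (the exact class name and parameters are from memory and may differ; the search space follows GFO's pattern of numpy arrays per dimension, and the score is maximized):

    import numpy as np
    from gradient_free_optimizers import EvolutionStrategyOptimizer

    # discrete search space: each dimension is an array of allowed values
    search_space = {
        "x": np.arange(-5, 5, 0.1),
        "y": np.arange(-5, 5, 0.1),
    }

    def sphere(params):
        # the optimizer maximizes the score, so return the negative loss
        return -(params["x"] ** 2 + params["y"] ** 2)

    opt = EvolutionStrategyOptimizer(search_space)
    opt.search(sphere, n_iter=500)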


Evolution Strategy is included. The "covariance matrix adaptation" part is what makes this algorithm work in continuous search spaces: CMA-ES adapts the covariance matrix of a continuous sampling distribution, which does not translate directly to a discrete grid. Gradient-Free-Optimizers, however, uses discrete search spaces.

