I am glad you like it. Hyperband relies on early stopping of the model training, so it is something you would implement yourself inside the objective function. I plan to add some helper functions for early stopping in the future.
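Just to sketch the idea, here is a minimal example of manual early stopping inside an objective function. The data, model, and `patience` logic are all placeholders (not from Hyperactive itself); the point is only the pattern of stopping training early and returning the best validation score:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)

def objective(para):
    # "para" is the hyperparameter dict passed in by the optimizer;
    # "alpha" is just an example hyperparameter
    model = SGDClassifier(alpha=para["alpha"], random_state=0)

    best_score, patience, bad_epochs = -np.inf, 5, 0
    for epoch in range(100):
        model.partial_fit(X_train, y_train, classes=np.unique(y))
        score = model.score(X_val, y_val)

        if score > best_score:
            best_score, bad_epochs = score, 0
        else:
            bad_epochs += 1

        if bad_epochs >= patience:  # stop training early
            break

    return best_score
```

The optimizer never sees the training loop; it only receives the returned score.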
Yes, it is quite easy to switch algorithms via the "gpr" parameter; you just have to write a wrapper class. I am currently working on a repository that explains how to do this in detail: https://github.com/SimonBlanke/surrogate-models
I will upload some examples (GPy and GPflow) within the next 24 hours.
I think those wrapper classes will already answer some of your questions. If anything remains unclear, you can open an issue in the Gradient-Free-Optimizers repository.
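To give a rough preview, a GPy wrapper could look something like the sketch below. It assumes the object passed via "gpr" must expose a scikit-learn-style fit(X, y) / predict(X, return_std=True) interface; the upcoming examples in the surrogate-models repository will show the exact contract:

```python
import numpy as np
import GPy

class GPyWrapper:
    """Adapts GPy.models.GPRegression to a scikit-learn-like
    fit/predict interface (assumed contract for the "gpr" parameter)."""

    def fit(self, X, y):
        kernel = GPy.kern.Matern52(input_dim=X.shape[1])
        self.model = GPy.models.GPRegression(X, y.reshape(-1, 1), kernel)
        self.model.optimize()

    def predict(self, X, return_std=False):
        mean, var = self.model.predict(X)
        if return_std:
            # GPy returns the predictive variance; convert to std
            return mean.ravel(), np.sqrt(var).ravel()
        return mean.ravel()
```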
I will look into this algorithm. Thanks for the suggestion. I have some basic explanations of the optimization techniques and their parameters in a separate repository: https://github.com/SimonBlanke/optimization-tutorial
Two very interesting questions! I should soon work on a comparison of Hyperactive and GFO with other packages. If some important features are missing, maybe I could add them.
I will also look into Dlib. If you like, you can open an official issue in my repository, where we could discuss why it is useful and whether it should be implemented.
Evolution strategy is included. The covariance matrix adaptation (CMA) is what makes this algorithm work in continuous search spaces, but Gradient-Free-Optimizers uses discrete search spaces.
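For example, in Gradient-Free-Optimizers every dimension of the search space is a finite array of values. A minimal sketch (assuming the usual search interface shown in the README, and the EvolutionStrategyOptimizer class name):

```python
import numpy as np
from gradient_free_optimizers import EvolutionStrategyOptimizer

def parabola(para):
    # objective is only ever evaluated on the discrete grid points below
    return -(para["x"] ** 2 + para["y"] ** 2)

# discrete search space: each dimension is a finite array of values
search_space = {
    "x": np.arange(-5, 5, 0.1),
    "y": np.arange(-5, 5, 0.1),
}

opt = EvolutionStrategyOptimizer(search_space)
opt.search(parabola, n_iter=100)
```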