Not so many years ago, this kind of work (developing optimization algorithms) would simply have been called optimization, not AI.
> We develop Urania, a highly parallelized hybrid local-global optimization algorithm, sketched in Fig. 2(a). It starts from a pool of thousands of initial conditions of the UIFO, which are either entirely random initializations or augmented with solutions from different frequency ranges. Urania starts 1000 parallel local optimizations that minimize the objective function using an adapted version of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm. BFGS is a highly efficient gradient-descent optimizer that approximates the inverse Hessian matrix. For each local optimization, Urania chooses a target from the pool according to a Boltzmann distribution, which weights better-performing setups in the pool higher and adds a small noise to escape local minima.
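As I read that paragraph, each worker's step is roughly: Boltzmann-pick a setup from the pool, jitter it, run BFGS. A toy sketch in Python/SciPy of just that step (not the paper's code; `objective`, the pool size, and the temperature `T` are made-up placeholders):

```python
import numpy as np
from scipy.optimize import minimize

def objective(x):
    # Stand-in for the real detector-sensitivity objective (hypothetical).
    return float(np.sum((x - 1.0) ** 2))

def boltzmann_pick(pool, scores, T=1.0, rng=None):
    """Pick a pool index, weighting better (lower) scores more heavily."""
    if rng is None:
        rng = np.random.default_rng()
    s = np.asarray(scores)
    w = np.exp(-(s - s.min()) / T)
    return rng.choice(len(pool), p=w / w.sum())

rng = np.random.default_rng(0)
pool = [rng.normal(size=8) for _ in range(100)]   # random initial setups
scores = [objective(x) for x in pool]

# One local optimization: Boltzmann-pick a target, add small noise to
# help escape local minima, then run BFGS from there.
i = boltzmann_pick(pool, scores, rng=rng)
start = pool[i] + rng.normal(scale=0.01, size=pool[i].shape)
result = minimize(objective, start, method="BFGS")
```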
It does look like the gradient descent is paired with a type of genetic algorithm:
> For each local optimization, Urania chooses a target from the pool according to a Boltzmann distribution, which weights better-performing setups in the pool higher and adds a small noise to escape local minima. These choices add a global character to the exploration. When one of the local optimizations of Urania finds a better parameter setting for a setup in the pool, it replaces the old solution with the superior one. Upon convergence, Urania repeats and chooses a new target from the pool. In parallel, Urania simplifies solutions from the pool by probabilistically removing elements whose removal does not impact the overall sensitivity.
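Continuing the sketch above (same caveats: `objective` and `boltzmann_pick` are the toy stand-ins defined there, and the "simplification" here is a hypothetical placeholder for removing an optical element), the global loop as described reads as: run local optimizations, write back to the pool only on improvement, occasionally try to simplify a pool entry.

```python
def local_run(pool, scores, rng, T=1.0):
    """One local optimization: Boltzmann-pick a target, perturb it slightly,
    run BFGS, and replace the pool entry only if the result is better."""
    i = boltzmann_pick(pool, scores, T, rng)
    start = pool[i] + rng.normal(scale=0.01, size=pool[i].shape)
    res = minimize(objective, start, method="BFGS")
    if res.fun < scores[i]:
        pool[i], scores[i] = res.x, res.fun

def try_simplify(x, score, rng, tol=1e-9):
    """Toy stand-in for simplification: zero out one random parameter
    ('remove an element') and keep it only if the objective is not worse."""
    j = rng.integers(len(x))
    trial = x.copy()
    trial[j] = 0.0
    s = objective(trial)
    return (trial, s) if s <= score + tol else (x, score)

for _ in range(200):                 # the paper runs ~1000 of these in parallel
    local_run(pool, scores, rng)
    if rng.random() < 0.1:           # occasionally attempt a simplification
        k = rng.integers(len(pool))
        pool[k], scores[k] = try_simplify(pool[k], scores[k], rng)
```

That pool-with-replacement, keep-if-better loop is what reads as the "genetic algorithm" flavor: a population, selection weighted toward fitter members, and survival of improved solutions.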
This irks me to no end. If they want to avoid specific terms, why not just call it applied mathematics rather than AI? Is grep AI? Is your web browser AI?
https://journals.aps.org/prx/abstract/10.1103/PhysRevX.15.02...