TY - JOUR
T1 - Escaping local minima with local derivative-free methods: a numerical investigation
AU - Cartis, Coralia
AU - Roberts, Lindon
AU - Sheridan-Methven, Oliver
N1 - Publisher Copyright:
© 2021 Informa UK Limited, trading as Taylor & Francis Group.
PY - 2022
Y1 - 2022
N2 - We investigate the potential of applying a state-of-the-art local derivative-free solver, Py-BOBYQA, to global optimization problems. In particular, we demonstrate the potential of a restarts procedure (as distinct from multistart methods) to allow Py-BOBYQA to escape local minima, where ordinarily it would terminate at the first local minimum found. We also introduce an adaptive variant of restarts which yields improved performance on global optimization problems. As Py-BOBYQA is a model-based trust-region method, we compare it mainly with other global optimization methods for which (global) models are important, such as Bayesian optimization and response surface methods; we also consider state-of-the-art representative deterministic and stochastic codes, such as DIRECT and CMA-ES. We find numerically that the restarts procedures in Py-BOBYQA are effective at helping it escape local minima, compared with using no restarts. Additionally, we find that Py-BOBYQA with adaptive restarts performs comparably to global optimization solvers across all accuracy/budget regimes, in both smooth and noisy settings. In particular, Py-BOBYQA variants perform best on smooth and multiplicative-noise problems in high-accuracy regimes. As a by-product, some preliminary conclusions can be drawn on the relative performance of the global solvers we tested with default settings.
AB - We investigate the potential of applying a state-of-the-art local derivative-free solver, Py-BOBYQA, to global optimization problems. In particular, we demonstrate the potential of a restarts procedure (as distinct from multistart methods) to allow Py-BOBYQA to escape local minima, where ordinarily it would terminate at the first local minimum found. We also introduce an adaptive variant of restarts which yields improved performance on global optimization problems. As Py-BOBYQA is a model-based trust-region method, we compare it mainly with other global optimization methods for which (global) models are important, such as Bayesian optimization and response surface methods; we also consider state-of-the-art representative deterministic and stochastic codes, such as DIRECT and CMA-ES. We find numerically that the restarts procedures in Py-BOBYQA are effective at helping it escape local minima, compared with using no restarts. Additionally, we find that Py-BOBYQA with adaptive restarts performs comparably to global optimization solvers across all accuracy/budget regimes, in both smooth and noisy settings. In particular, Py-BOBYQA variants perform best on smooth and multiplicative-noise problems in high-accuracy regimes. As a by-product, some preliminary conclusions can be drawn on the relative performance of the global solvers we tested with default settings.
KW - Derivative-free optimization
KW - global optimization
KW - trust region methods
UR - http://www.scopus.com/inward/record.url?scp=85100960086&partnerID=8YFLogxK
U2 - 10.1080/02331934.2021.1883015
DO - 10.1080/02331934.2021.1883015
M3 - Article
SN - 0233-1934
VL - 71
SP - 2343
EP - 2373
JO - Optimization
JF - Optimization
IS - 8
ER -