META-SMGO-$\Delta$: Similarity as a Prior in Black-Box Optimization
Busetto, Riccardo; Breschi, Valentina; Formentin, Simone
2023-01-01
Abstract
When solving global optimization problems in practice, one often ends up repeatedly solving problems that are similar to each other. In this work, we incorporate the META-learning rationale into SMGO-Δ, a global optimization approach recently proposed in the literature, by introducing a rigorous definition of similarity that allows priors obtained from past experience to be exploited for efficiently solving new (similar) problems. Through a benchmark numerical example, we show the practical benefits of our META-extension of the baseline algorithm, while providing theoretical bounds on its performance.


