arXiv:2002.04632 [cs.LG]

Differentiating the Black-Box: Optimization with Local Generative Surrogates

Sergey Shirobokov, Vladislav Belavin, Michael Kagan, Andrey Ustyuzhanin, Atılım Güneş Baydin

Published 2020-02-11 (Version 1)

We propose a novel method for gradient-based optimization of black-box simulators using differentiable local surrogate models. In fields such as physics and engineering, many processes are modeled with non-differentiable simulators with intractable likelihoods. Optimization of these forward models is particularly challenging, especially when the simulator is stochastic. To address such cases, we introduce the use of deep generative models to iteratively approximate the simulator in local neighborhoods of the parameter space. We demonstrate that these local surrogates can be used to approximate the gradient of the simulator, and thus enable gradient-based optimization of simulator parameters. In cases where the dependence of the simulator on the parameter space is constrained to a low-dimensional submanifold, we observe that our method attains minima faster than all baseline methods, including Bayesian optimization, numerical optimization, and REINFORCE-driven approaches.
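The abstract describes an iterative loop: sample the simulator in a local neighborhood of the current parameters, fit a differentiable surrogate to those samples, and take a gradient step on the parameters through the surrogate. Below is a minimal PyTorch sketch of that loop under simplifying assumptions: the toy `simulator`, the `Surrogate` network, and all hyperparameters are hypothetical stand-ins, and where the paper fits a deep conditional generative model to the simulator's output distribution, this sketch regresses the expected objective directly. It illustrates the control flow, not the authors' implementation.

```python
import torch
import torch.nn as nn

# Hypothetical stochastic black-box simulator: we only call it and never
# backpropagate through it, mimicking a non-differentiable forward model.
def simulator(psi, n_samples=256):
    with torch.no_grad():
        noise = torch.randn(n_samples, psi.shape[0])
        return (psi + noise).pow(2).sum(dim=1, keepdim=True)

def objective(y):
    # Scalar figure of merit computed from simulator outputs.
    return y.mean()

# Simplified stand-in for the paper's generative surrogate: a small
# regression net mapping parameters to the expected objective.
class Surrogate(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, psi):
        return self.net(psi)

def optimize(dim=2, outer_steps=50, local_radius=0.5, n_local=16):
    psi = torch.randn(dim, requires_grad=True)
    psi_opt = torch.optim.Adam([psi], lr=0.05)
    for _ in range(outer_steps):
        # 1) Sample parameter points in a local neighborhood of psi.
        local_psi = psi.detach() + local_radius * (2 * torch.rand(n_local, dim) - 1)
        targets = torch.stack(
            [objective(simulator(p)) for p in local_psi]
        ).unsqueeze(1)
        # 2) Fit a fresh local surrogate on (parameters, objective) pairs.
        surrogate = Surrogate(dim)
        s_opt = torch.optim.Adam(surrogate.parameters(), lr=1e-2)
        for _ in range(200):
            s_opt.zero_grad()
            loss = nn.functional.mse_loss(surrogate(local_psi), targets)
            loss.backward()
            s_opt.step()
        # 3) Step psi using gradients taken *through the surrogate*,
        #    which is differentiable even though the simulator is not.
        psi_opt.zero_grad()
        surrogate(psi.unsqueeze(0)).mean().backward()
        psi_opt.step()
    return psi.detach()

if __name__ == "__main__":
    print("optimized parameters:", optimize())
```

Refitting the surrogate at each outer step is what makes the approach local: the surrogate only needs to be accurate near the current parameters, which keeps it cheap to train and keeps its gradients meaningful for the next step.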

Related articles:
arXiv:1909.09501 [cs.LG] (Published 2019-09-20)
Trivializations for Gradient-Based Optimization on Manifolds
arXiv:2302.07384 [cs.LG] (Published 2023-02-14)
The Geometry of Neural Nets' Parameter Spaces Under Reparametrization
arXiv:2405.06312 [cs.LG] (Published 2024-05-10)
FedGCS: A Generative Framework for Efficient Client Selection in Federated Learning via Gradient-based Optimization
Zhiyuan Ning et al.