arXiv:1809.03832 [stat.ML]

Learning rate adaptation for differentially private stochastic gradient descent

Antti Koskela, Antti Honkela

Published 2018-09-11 (Version 1)

Differentially private learning has recently emerged as the leading approach for privacy-preserving machine learning. Differential privacy can complicate learning procedures because each access to the data needs to be carefully designed and carries a privacy cost. For example, standard parameter tuning with a validation set cannot be easily applied. In this paper, we propose a differentially private algorithm for adapting the learning rate of differentially private stochastic gradient descent (SGD) that avoids the use of a validation set. The adaptation idea comes from the technique of extrapolation in classical numerical analysis: to estimate the error relative to the gradient flow underlying SGD, we compare the result of one full step with that of two consecutive half-steps. We prove the privacy of the method using the moments accountant, which allows us to compute tight privacy bounds. Empirically, we show that our method is competitive with commonly used, manually tuned optimisation methods for training deep neural networks and for differentially private variational inference.
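To make the extrapolation idea concrete, the following is a minimal NumPy sketch of one adaptive step under stated assumptions: it compares the result of one full step against two consecutive half-steps and rescales the learning rate from the discrepancy. All names (dp_gradient, adaptive_step, tol) and the particular rescaling rule are hypothetical illustrations, not the paper's exact algorithm; dp_gradient stands in for any clipped, Gaussian-noised gradient oracle of the kind used in DP-SGD. Note that the second gradient evaluation per step carries additional privacy cost, which the paper accounts for with the moments accountant.

    # Minimal sketch; names and the lr update rule are illustrative, not the paper's method.
    import numpy as np

    rng = np.random.default_rng(0)

    def dp_gradient(w, X, y, clip=1.0, sigma=1.0):
        """Clipped, Gaussian-noised mean gradient for linear least squares."""
        per_example = (X @ w - y)[:, None] * X            # per-example gradients
        scale = np.maximum(1.0, np.linalg.norm(per_example, axis=1) / clip)
        clipped = per_example / scale[:, None]            # clip each norm to <= clip
        noise = rng.normal(0.0, sigma * clip, size=w.shape)
        return (clipped.sum(axis=0) + noise) / len(y)

    def adaptive_step(w, lr, X, y, tol=0.1):
        """One adaptive step: compare one full step with two half-steps."""
        g = dp_gradient(w, X, y)
        w_full = w - lr * g                               # one full step
        w_half = w - 0.5 * lr * g                         # first half-step
        g_half = dp_gradient(w_half, X, y)                # gradient at midpoint
        w_two = w_half - 0.5 * lr * g_half                # second half-step
        err = np.linalg.norm(w_full - w_two)              # extrapolation error estimate
        # Shrink lr when the estimate exceeds the tolerance, grow it when well below.
        lr *= min(2.0, max(0.5, float(np.sqrt(tol / max(err, 1e-12)))))
        return w_two, lr

    # Toy usage: fit w on synthetic data while the step size adapts.
    X = rng.normal(size=(256, 4))
    w_true = np.array([1.0, -2.0, 0.5, 3.0])
    y = X @ w_true + 0.1 * rng.normal(size=256)
    w, lr = np.zeros(4), 0.5
    for _ in range(50):
        w, lr = adaptive_step(w, lr, X, y)
    print("learned w:", np.round(w, 2), "final lr:", round(lr, 3))

The two-half-step result is kept as the next iterate since it is the more accurate of the two candidates; the ratio-based rescaling with upper and lower caps is one simple stabilisation choice, not the one prescribed by the paper.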

Related articles:
arXiv:2209.04188 [stat.ML] (Published 2022-09-09)
Differentially Private Stochastic Gradient Descent with Low-Noise
arXiv:2405.12553 [stat.ML] (Published 2024-05-21)
Uncertainty quantification by block bootstrap for differentially private stochastic gradient descent
arXiv:1906.03049 [stat.ML] (Published 2019-06-07)
Computing Exact Guarantees for Differential Privacy