arXiv Analytics

arXiv:2211.01852 [cs.LG]

Revisiting Hyperparameter Tuning with Differential Privacy

Youlong Ding, Xueyang Wu

Published 2022-11-03 (Version 1)

Hyperparameter tuning is a common practice in applied machine learning, but it is typically ignored in the literature on privacy-preserving machine learning because of its negative effect on the overall privacy parameter. In this paper, we aim to tackle this fundamental yet challenging problem by providing an effective hyperparameter tuning framework with differential privacy. The proposed method allows us to adopt a broader hyperparameter search space and even to perform a grid search over the whole space, since its privacy loss parameter is independent of the number of hyperparameter candidates. Instead, it correlates with the utility gained from hyperparameter searching, revealing an explicit and mandatory trade-off between privacy and utility. Theoretically, we show that the additional privacy loss incurred by hyperparameter tuning is upper-bounded by the square root of the gained utility. Empirically, however, the additional privacy loss scales like the square root of the logarithm of the utility term, benefiting from the doubling-step design.
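For intuition only, the sketch below illustrates how a private selection step can have a privacy cost independent of the number of hyperparameter candidates. It uses the classical report-noisy-max mechanism rather than the paper's own framework; the function `dp_select_best`, the sensitivity value, and the candidate scores are illustrative assumptions, not details from the abstract.

```python
import numpy as np

def dp_select_best(candidate_scores, sensitivity, epsilon, rng=None):
    """Report-noisy-max: return the index of the highest noisy score.

    Adding Laplace(2 * sensitivity / epsilon) noise to each score and taking
    the argmax satisfies epsilon-DP for the selection step, regardless of how
    many candidates are scored -- the candidate-independence property the
    abstract alludes to (this is NOT the paper's algorithm, just a standard
    mechanism with the same property).
    """
    rng = np.random.default_rng() if rng is None else rng
    scores = np.asarray(candidate_scores, dtype=float)
    noise = rng.laplace(scale=2.0 * sensitivity / epsilon, size=scores.shape)
    return int(np.argmax(scores + noise))

# Hypothetical usage: scores[i] is the validation accuracy of a model trained
# (e.g., with DP-SGD) under hyperparameter candidate i; with 500 validation
# points, one record changes the accuracy by at most 1/500.
scores = [0.71, 0.74, 0.69, 0.73]
best = dp_select_best(scores, sensitivity=1.0 / 500, epsilon=0.5)
print("selected candidate:", best)
```

Note that adding a fifth or a fiftieth candidate leaves the selection's privacy cost at epsilon; only the training runs themselves contribute further privacy loss.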

Related articles:
arXiv:2012.07828 [cs.LG] (Published 2020-12-14, updated 2021-08-23)
Robustness Threats of Differential Privacy
arXiv:2102.08166 [cs.LG] (Published 2021-02-16)
Differential Privacy and Byzantine Resilience in SGD: Do They Add Up?
arXiv:2211.00734 [cs.LG] (Published 2022-11-01)
On the Interaction Between Differential Privacy and Gradient Compression in Deep Learning