arXiv Analytics

arXiv:2011.03900 [stat.ML]

The Cost of Privacy in Generalized Linear Models: Algorithms and Minimax Lower Bounds

T. Tony Cai, Yichen Wang, Linjun Zhang

Published 2020-11-08 (Version 1)

We study the trade-off between differential privacy and statistical accuracy in generalized linear models (GLMs). We propose differentially private algorithms for parameter estimation in both low-dimensional and high-dimensional sparse GLMs and characterize their statistical performance. We establish privacy-constrained minimax lower bounds for GLMs, which imply that the proposed algorithms are rate-optimal up to logarithmic factors in the sample size. The lower bounds are obtained via a novel technique based on Stein's Lemma that generalizes the tracing-attack technique for proving privacy-constrained lower bounds; this argument may be of independent interest, as it applies to general parametric models. Simulated and real-data experiments demonstrate the numerical performance of our algorithms.
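To make the privacy/accuracy trade-off concrete, the sketch below implements noisy gradient descent for logistic regression (a canonical GLM): per-example gradients are clipped and Gaussian noise calibrated to the clipping bound is added at each step. This is the standard DP-SGD recipe, shown purely as an illustration of how a noise multiplier degrades estimation accuracy; it is not the algorithm proposed in the paper, and all function and parameter names here are illustrative.

```python
import numpy as np

def private_logistic_regression(X, y, epochs=50, lr=0.1, clip=1.0,
                                noise_multiplier=1.0, seed=0):
    """Noisy gradient descent for logistic regression.

    Per-example gradients are clipped to L2 norm <= `clip`, averaged,
    and perturbed with Gaussian noise scaled by `noise_multiplier`.
    Illustrative DP-SGD sketch, not the paper's rate-optimal method.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    beta = np.zeros(d)
    for _ in range(epochs):
        z = X @ beta
        # per-example gradient of the logistic loss: (sigmoid(z) - y) * x
        g = (1.0 / (1.0 + np.exp(-z)) - y)[:, None] * X      # shape (n, d)
        # clip each example's gradient so its L2 norm is at most `clip`
        norms = np.linalg.norm(g, axis=1, keepdims=True)
        g = g / np.maximum(1.0, norms / clip)
        # average, then add Gaussian noise calibrated to the clip bound
        noise = rng.normal(0.0, noise_multiplier * clip / n, size=d)
        beta -= lr * (g.mean(axis=0) + noise)
    return beta
```

Increasing `noise_multiplier` (stronger privacy) inflates the estimator's variance, which is the cost of privacy the paper quantifies via minimax lower bounds.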

Related articles:
arXiv:2209.08030 [stat.ML] (Published 2022-09-16)
Detection of Interacting Variables for Generalized Linear Models via Neural Networks
arXiv:2407.13977 [stat.ML] (Published 2024-07-19)
A Unified Confidence Sequence for Generalized Linear Models, with Applications to Bandits
arXiv:2410.08994 [stat.ML] (Published 2024-10-11)
Optimal Downsampling for Imbalanced Classification with Generalized Linear Models