arXiv:2012.01668 [stat.ML]

Online Forgetting Process for Linear Regression Models

Yuantong Li, Chi-hua Wang, Guang Cheng

Published 2020-12-03 (Version 1)

Motivated by the EU's "Right To Be Forgotten" regulation, we initiate a study of statistical data deletion problems in which users' data are accessible only for a limited period of time. This setting is formulated as an online supervised learning task with a \textit{constant memory limit}. We propose a deletion-aware algorithm, \texttt{FIFD-OLS}, for the low-dimensional case, and observe a catastrophic rank-swinging phenomenon caused by the data deletion operation, which leads to statistical inefficiency. As a remedy, we propose the \texttt{FIFD-Adaptive Ridge} algorithm with a novel online regularization scheme that effectively offsets the uncertainty introduced by deletion. In theory, we provide cumulative regret upper bounds for both online forgetting algorithms. In experiments, we show that \texttt{FIFD-Adaptive Ridge} outperforms ridge regression with a fixed regularization level; we hope this work sheds light on deletion-aware learning for more complex statistical models.
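The abstract does not spell out the algorithms themselves, but the first-in-first-out deletion setting it describes can be sketched as follows: keep only the most recent samples in a fixed-size buffer, delete the oldest sample when a new one arrives, and fit a ridge estimator on the retained window. The adaptive regularization rule below (inflating lambda when deletion leaves the retained Gram matrix rank-deficient) is a hypothetical stand-in for illustration, not the paper's actual scheme; the class name and parameters are likewise invented.

```python
import numpy as np
from collections import deque


class FIFOForgettingRidge:
    """Minimal sketch of online ridge regression under a first-in-
    first-out (FIFO) constant memory limit: only the most recent
    `memory` samples are retained, so each arrival implicitly deletes
    the oldest sample, as in the setting the abstract describes.

    The adaptive-lambda rule in `predict` is a hypothetical stand-in,
    not the paper's FIFD-Adaptive Ridge scheme.
    """

    def __init__(self, dim, memory, base_lambda=1.0):
        self.buffer = deque(maxlen=memory)  # deque evicts the oldest sample
        self.dim = dim
        self.base_lambda = base_lambda

    def update(self, x, y):
        """Add one (feature, response) pair; oldest pair is forgotten
        automatically once the buffer is full."""
        self.buffer.append((np.asarray(x, dtype=float), float(y)))

    def predict(self, x):
        """Ridge prediction from the retained window only."""
        X = np.array([b[0] for b in self.buffer])
        Y = np.array([b[1] for b in self.buffer])
        gram = X.T @ X
        # Hypothetical adaptive regularization: inflate lambda when the
        # retained window leaves the Gram matrix rank-deficient.
        rank_gap = self.dim - np.linalg.matrix_rank(gram)
        lam = self.base_lambda * max(1.0, float(rank_gap))
        theta = np.linalg.solve(gram + lam * np.eye(self.dim), X.T @ Y)
        return float(np.asarray(x, dtype=float) @ theta)
```

With a small base lambda and well-conditioned data, the estimator on the retained window behaves like ordinary least squares; the rank check is where a deletion-aware scheme would intervene when forgetting destroys information.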
