arXiv Analytics


arXiv:1708.03366 [cs.LG]

Resilient Linear Classification: An Approach to Deal with Attacks on Training Data

Sangdon Park, James Weimer, Insup Lee

Published 2017-08-10, Version 1

Data-driven techniques are used in cyber-physical systems (CPS) for controlling autonomous vehicles, handling demand responses for energy management, and modeling human physiology for medical devices. These data-driven techniques extract models from training data, and their performance is typically analyzed with respect to random errors in the training data. However, if the training data is maliciously altered by attackers, the effect of these attacks on the learning algorithms underpinning data-driven CPS has yet to be considered. In this paper, we analyze the resilience of classification algorithms to training data attacks. Specifically, a generic metric is proposed that is tailored to measure the resilience of classification algorithms with respect to worst-case tampering of the training data. Using the metric, we show that traditional linear classification algorithms are resilient only under restricted conditions. To overcome these limitations, we propose a linear classification algorithm with a majority constraint and prove that it is strictly more resilient than the traditional algorithms. Evaluations on both synthetic data and a real-world retrospective arrhythmia medical case study show that the traditional algorithms are vulnerable to tampered training data, whereas the proposed algorithm is more resilient (as measured by worst-case tampering).
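
The abstract only sketches the threat model, so the snippet below is a minimal illustration, not the paper's resilience metric or its majority-constrained algorithm: it simply shows how flipping a fraction of training labels can degrade an ordinary least-squares linear classifier. The data generator, the least-squares fit, and the label-flipping heuristic are all illustrative assumptions.

# Minimal sketch (assumed setup, not the paper's method): label tampering
# in the training data shifts a plain least-squares linear classifier.
import numpy as np

rng = np.random.default_rng(0)

def make_data(n=200):
    # Two Gaussian blobs with labels in {-1, +1}.
    X = np.vstack([rng.normal(-1.0, 1.0, (n // 2, 2)),
                   rng.normal(+1.0, 1.0, (n // 2, 2))])
    y = np.hstack([-np.ones(n // 2), np.ones(n // 2)])
    return X, y

def fit_linear(X, y):
    # Least-squares linear classifier: predict sign(X_aug @ w).
    X_aug = np.hstack([X, np.ones((len(X), 1))])  # append bias column
    w, *_ = np.linalg.lstsq(X_aug, y, rcond=None)
    return w

def accuracy(w, X, y):
    X_aug = np.hstack([X, np.ones((len(X), 1))])
    return np.mean(np.sign(X_aug @ w) == y)

def tamper_labels(X, y, w, frac=0.2):
    # Crude attack heuristic (an assumption, not the paper's worst-case
    # formulation): flip the labels of the points classified with the
    # largest margin, which pulls the refit boundary the most.
    X_aug = np.hstack([X, np.ones((len(X), 1))])
    margin = y * (X_aug @ w)
    k = int(frac * len(y))
    idx = np.argsort(-margin)[:k]
    y_t = y.copy()
    y_t[idx] = -y_t[idx]
    return y_t

X_tr, y_tr = make_data()
X_te, y_te = make_data()

w_clean = fit_linear(X_tr, y_tr)
y_tampered = tamper_labels(X_tr, y_tr, w_clean, frac=0.2)
w_attacked = fit_linear(X_tr, y_tampered)

print("test accuracy, clean training data:   ", accuracy(w_clean, X_te, y_te))
print("test accuracy, tampered training data:", accuracy(w_attacked, X_te, y_te))

Running the sketch typically shows a visible drop in test accuracy after tampering, which is the kind of vulnerability the paper's resilience metric is meant to quantify for worst-case attacks.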

Related articles:
arXiv:1504.02141 [cs.LG] (Published 2015-04-08)
Detecting falls with X-Factor HMMs when the training data for falls is not available
arXiv:1910.04214 [cs.LG] (Published 2019-10-09)
Who's responsible? Jointly quantifying the contribution of the learning algorithm and training data
arXiv:2004.11947 [cs.LG] (Published 2020-04-24)
Symbolic Regression Driven by Training Data and Prior Knowledge