arXiv:2011.10797 [cs.LG]

Adversarial Classification: Necessary conditions and geometric flows

Nicolas Garcia Trillos, Ryan Murray

Published 2020-11-21 (Version 1)

Using tools from variational analysis, we study a version of adversarial classification in which an adversary is empowered to corrupt data inputs up to some distance $\varepsilon$. In particular, we describe necessary conditions satisfied by the optimal classifier subject to such an adversary. Using these necessary conditions, we derive a geometric evolution equation that can be used to track the change in classification boundaries as $\varepsilon$ varies. This evolution equation may be described as an uncoupled system of differential equations in one dimension, or as a mean curvature-type equation in higher dimensions. In one dimension we rigorously prove that solving the initial value problem starting from $\varepsilon=0$, which is simply the Bayes classifier, yields the global minimizer of the adversarial problem. Numerical examples illustrating these ideas are also presented.
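As a rough illustration of the one-dimensional setting described in the abstract (this is not code from the paper), the following Python sketch computes the adversarial risk of a single-threshold classifier under two assumed Gaussian class-conditional densities with unequal priors, and numerically tracks how the risk-minimizing threshold drifts away from the Bayes threshold (the $\varepsilon=0$ minimizer) as the adversarial budget $\varepsilon$ grows. The priors, densities, and names (`rho0`, `rho1`, `adversarial_risk`) are all illustrative assumptions.

```python
# Hypothetical 1D illustration (not the authors' code): two Gaussian classes,
# a single-threshold classifier 1{x > t}, and an adversary that may move each
# input by up to eps toward the decision boundary.
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize_scalar

# Assumed class priors and class-conditional densities.
p0, p1 = 0.7, 0.3
rho0 = norm(loc=-1.0, scale=1.0)   # class 0: N(-1, 1)
rho1 = norm(loc=+1.0, scale=1.0)   # class 1: N(+1, 1)

def adversarial_risk(t, eps):
    """Risk of the classifier 1{x > t} when inputs can be shifted by up to eps.
    A class-0 point is misclassified if it can be pushed above t (x > t - eps);
    a class-1 point if it can be pushed to or below t (x <= t + eps)."""
    err0 = p0 * (1.0 - rho0.cdf(t - eps))
    err1 = p1 * rho1.cdf(t + eps)
    return err0 + err1

# Track the optimal threshold as eps grows, starting from eps = 0,
# where the minimizer is simply the Bayes threshold.
for eps in np.linspace(0.0, 0.5, 6):
    res = minimize_scalar(adversarial_risk, bounds=(-3.0, 3.0),
                          args=(eps,), method="bounded")
    print(f"eps = {eps:.1f}  threshold ≈ {res.x:+.3f}  adversarial risk ≈ {res.fun:.3f}")
```

In this sketch, differentiating `adversarial_risk` shows that an interior minimizer balances the shifted weighted densities, $p_0\rho_0(t-\varepsilon)=p_1\rho_1(t+\varepsilon)$, rather than the Bayes balance $p_0\rho_0(t)=p_1\rho_1(t)$; this is only a property of the toy model above, not a statement of the paper's necessary conditions.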

Related articles:
arXiv:2005.13815 [cs.LG] (Published 2020-05-28)
Adversarial Classification via Distributional Robustness with Wasserstein Ambiguity
arXiv:2205.10022 [cs.LG] (Published 2022-05-20)
Towards Consistency in Adversarial Classification
arXiv:1703.01218 [cs.LG] (Published 2017-03-03)
Learning Graphical Games from Behavioral Data: Sufficient and Necessary Conditions