arXiv Analytics

arXiv:1911.05166 [cs.LG]

Negative sampling in semi-supervised learning

John Chen, Vatsal Shah, Anastasios Kyrillidis

Published 2019-11-12, Version 1

We introduce Negative Sampling in Semi-Supervised Learning (NS3L), a simple, fast, easy-to-tune algorithm for semi-supervised learning (SSL). NS3L is motivated by the success of negative sampling/contrastive estimation. We demonstrate that adding the NS3L loss to state-of-the-art SSL algorithms, such as Virtual Adversarial Training (VAT), significantly improves upon vanilla VAT and its variant, VAT with Entropy Minimization. By adding the NS3L loss to MixMatch, the current state-of-the-art approach on semi-supervised tasks, we observe significant improvements over vanilla MixMatch. We conduct extensive experiments on the CIFAR10, CIFAR100, SVHN and STL10 benchmark datasets.
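As a rough illustration of the idea only (the abstract does not give the exact formulation), the sketch below shows a hypothetical negative-sampling-style loss on unlabeled data in PyTorch: classes that the model already assigns low probability are treated as negative labels, and their probabilities are pushed further toward zero with a -log(1 - p) penalty. The threshold, epsilon, and the way the term is combined with an existing SSL objective such as VAT or MixMatch are illustrative assumptions, not values from the paper.

    import torch

    def ns3l_style_loss(logits, threshold=0.05, eps=1e-8):
        """Hypothetical negative-sampling loss on unlabeled logits.

        Classes whose predicted probability falls below `threshold` are
        treated as negatives; the loss penalizes them with -log(1 - p),
        pushing their probabilities further toward zero.
        """
        probs = torch.softmax(logits, dim=1)            # (batch, num_classes)
        negative_mask = (probs < threshold).float()     # 1 where a class is a "negative"
        neg_log = -torch.log(torch.clamp(1.0 - probs, min=eps))
        # Average the penalty over the selected negative classes only.
        return (negative_mask * neg_log).sum() / negative_mask.sum().clamp(min=1.0)

    # Usage (hypothetical): add the term to an existing SSL objective, e.g.
    # total_loss = supervised_loss + vat_loss + lambda_ns * ns3l_style_loss(unlabeled_logits)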

Related articles:
arXiv:1910.02760 [cs.LG] (Published 2019-10-07)
Negative Sampling in Variational Autoencoders
arXiv:1908.09574 [cs.LG] (Published 2019-08-26)
Improvability Through Semi-Supervised Learning: A Survey of Theoretical Results
arXiv:2006.04097 [cs.LG] (Published 2020-06-07)
Optimally Combining Classifiers for Semi-Supervised Learning