arXiv:2002.09067 [cs.LG]

Incremental Sampling Without Replacement for Sequence Models

Kensen Shi, David Bieber, Charles Sutton

Published 2020-02-21 (Version 1)

Sampling is a fundamental technique, and sampling without replacement is often desirable when duplicate samples are not beneficial. Within machine learning, sampling is useful for generating diverse outputs from a trained model. We present an elegant procedure for sampling without replacement from a broad class of randomized programs, including generative neural models that construct outputs sequentially. Our procedure is efficient even for exponentially-large output spaces. Unlike prior work, our approach is incremental, i.e., samples can be drawn one at a time, allowing for increased flexibility. We also present a new estimator for computing expectations from samples drawn without replacement. We show that incremental sampling without replacement is applicable to many domains, e.g., program synthesis and combinatorial optimization.
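To make the abstract's core idea concrete, here is a minimal sketch of one way incremental sampling without replacement can work for sequential models: keep a trie over partial output sequences in which each node records the probability mass of completed samples already drawn beneath it, then sample each next token in proportion to its remaining (unsampled) mass. This is an illustration under assumptions, not the paper's actual implementation; the names `TrieNode`, `sample_incremental`, and `toy_model`, and the `next_probs(prefix)` model interface, are hypothetical stand-ins.

```python
import random

class TrieNode:
    """Trie node over partial sequences; tracks mass of samples drawn below it."""
    def __init__(self):
        self.children = {}       # token -> TrieNode
        self.sampled_mass = 0.0  # unconditional probability mass of drawn samples

def sample_incremental(next_probs, eos, root):
    """Draw one sequence never drawn before via this trie; return (seq, prob).

    next_probs(prefix) -> {token: conditional probability}, a stand-in for a
    sequence model's next-token distribution (hypothetical interface).
    """
    prefix, node, path, prefix_prob = [], root, [root], 1.0
    while True:
        probs = next_probs(tuple(prefix))
        tokens, weights = [], []
        for tok, p in probs.items():
            child = node.children.get(tok)
            used = child.sampled_mass if child is not None else 0.0
            tokens.append(tok)
            # Remaining mass = full mass of this branch minus mass already drawn.
            weights.append(max(prefix_prob * p - used, 0.0))
        tok = random.choices(tokens, weights=weights)[0]  # proportional sampling
        node = node.children.setdefault(tok, TrieNode())
        path.append(node)
        prefix_prob *= probs[tok]
        if tok == eos:
            # Subtract this sample's mass along its path so it cannot recur.
            for n in path:
                n.sampled_mass += prefix_prob
            return tuple(prefix), prefix_prob
        prefix.append(tok)

def toy_model(prefix):
    """Toy next-token distribution: binary tokens, forced stop at length 2."""
    if len(prefix) >= 2:
        return {"<eos>": 1.0}
    return {"0": 0.45, "1": 0.45, "<eos>": 0.10}

root = TrieNode()
for _ in range(4):
    print(sample_incremental(toy_model, "<eos>", root))  # four distinct sequences
```

In this sketch, each draw costs one model call per generated token and the trie grows only along sampled paths, which suggests how such a procedure can stay efficient even when the output space is exponentially large, and why samples can be drawn one at a time rather than in a fixed batch.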

Related articles:
arXiv:2007.06744 [cs.LG] (Published 2020-07-14)
WOR and $p$'s: Sketches for $\ell_p$-Sampling Without Replacement
arXiv:1805.09461 [cs.LG] (Published 2018-05-24)
Deep Reinforcement Learning For Sequence to Sequence Models
arXiv:2002.10400 [cs.LG] (Published 2020-02-24)
Closing the convergence gap of SGD without replacement