arXiv Analytics

arXiv:2103.12198 [cs.LG]

Challenges in Statistical Analysis of Data Collected by a Bandit Algorithm: An Empirical Exploration in Applications to Adaptively Randomized Experiments

Joseph Jay Williams, Jacob Nogas, Nina Deliu, Hammad Shaikh, Sofia Villar, Audrey Durand, Anna Rafferty

Published 2021-03-22 (Version 1)

Multi-armed bandit algorithms have been argued for decades to be useful for adaptively randomized experiments. In such experiments, an algorithm varies which arms (e.g., alternative interventions to help students learn) are assigned to participants, with the goal of assigning higher-reward arms to as many participants as possible. We applied the bandit algorithm Thompson Sampling (TS) to run adaptive experiments in three university classes. Instructors saw great value in rapidly using data to give the students in these experiments better arms (e.g., better explanations of a concept). Our deployment, however, illustrated a major barrier to scientists and practitioners using such adaptive experiments: a lack of quantifiable insight into how much the statistical analysis of specific real-world experiments is impacted (Pallmann et al., 2018; FDA, 2019), compared to traditional uniform random assignment. We therefore use our case study of the ubiquitous two-arm binary reward setting to empirically investigate the impact of using Thompson Sampling instead of uniform random assignment. In this setting, using common statistical hypothesis tests, we show that collecting data with TS can as much as double the False Positive Rate (FPR; incorrectly reporting differences when none exist) and the False Negative Rate (FNR; failing to report differences when they exist)...
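
The following is a minimal simulation sketch, not the authors' code, of the setting the abstract describes: a two-arm Bernoulli experiment allocated either uniformly at random or by Thompson Sampling, followed by a standard two-proportion test. The arm means, sample sizes, Beta(1, 1) priors, and the choice of a Wald test are illustrative assumptions; the paper's own experiments and test choices may differ.

```python
# Illustrative sketch (assumptions noted above), not the paper's implementation.
import numpy as np
from scipy import stats


def run_experiment(p_arms, n, policy="ts", rng=None):
    """Assign n participants to two Bernoulli arms; return per-arm successes and pulls."""
    if rng is None:
        rng = np.random.default_rng()
    successes = np.zeros(2)
    pulls = np.zeros(2)
    for _ in range(n):
        if policy == "uniform":
            arm = int(rng.integers(2))  # traditional uniform random assignment
        else:
            # Thompson Sampling with Beta(1, 1) priors: sample a mean per arm, pick the max.
            samples = rng.beta(1 + successes, 1 + pulls - successes)
            arm = int(np.argmax(samples))
        reward = rng.binomial(1, p_arms[arm])
        pulls[arm] += 1
        successes[arm] += reward
    return successes, pulls


def wald_p_value(successes, pulls):
    """Two-sided two-proportion Wald test, one common 'default' hypothesis test."""
    p_hat = successes / np.maximum(pulls, 1)
    pooled = successes.sum() / pulls.sum()
    se = np.sqrt(pooled * (1 - pooled) * (1 / max(pulls[0], 1) + 1 / max(pulls[1], 1)))
    z = (p_hat[0] - p_hat[1]) / max(se, 1e-12)
    return 2 * (1 - stats.norm.cdf(abs(z)))


def false_positive_rate(policy, n=200, reps=2000, alpha=0.05, seed=0):
    """FPR: fraction of null experiments (equal arm means) declared significant."""
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(reps):
        s, c = run_experiment([0.5, 0.5], n, policy, rng)
        rejections += wald_p_value(s, c) < alpha
    return rejections / reps


if __name__ == "__main__":
    print("FPR under uniform assignment:", false_positive_rate("uniform"))
    print("FPR under Thompson Sampling :", false_positive_rate("ts"))
```

Under these assumptions, the uniform policy's FPR should sit near the nominal alpha, while the adaptive allocation can inflate it, which is the kind of gap the paper quantifies empirically.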

Related articles:
arXiv:cs/0703062 [cs.LG] (Published 2007-03-13)
Bandit Algorithms for Tree Search
arXiv:2411.09900 [cs.LG] (Published 2024-11-15)
Statistical Analysis of Policy Space Compression Problem
arXiv:1306.0811 [cs.LG] (Published 2013-06-04, updated 2013-11-04)
A Gang of Bandits