arXiv:2008.09593 [math.PR]
Hyperbolic Polynomials I: Concentration and Discrepancy
Published 2020-08-21 (Version 1)
The Chernoff bound is a fundamental tool in theoretical computer science and has been used extensively in randomized algorithm design and stochastic analysis. Discrepancy theory, which deals with finding a bi-coloring of a set system such that each set is colored in a balanced way, has numerous applications in approximation algorithm design. A classical theorem of Spencer [Spe85] shows that any set system with $n$ sets and $n$ elements has discrepancy $O(\sqrt{n})$, while the Chernoff bound [Che52] only gives $O(\sqrt{n \log n})$. The study of hyperbolic polynomials dates back to the early 20th century; Gårding [Går59] used them to solve PDEs. In recent years, they have found further applications in control theory, optimization, real algebraic geometry, and other areas. In particular, the breakthrough result of Marcus, Spielman, and Srivastava [MSS15] uses the theory of hyperbolic polynomials to prove the Kadison-Singer conjecture [KS59], which is closely related to discrepancy theory. In this paper, we present two new results for hyperbolic polynomials:

$\bullet$ We show an optimal hyperbolic Chernoff bound for hyperbolic polynomials of any constant degree.

$\bullet$ We prove a hyperbolic Spencer theorem for vectors of any constant rank.

The classical matrix Chernoff and discrepancy results are based on the determinant polynomial, which is a special case of a hyperbolic polynomial. To the best of our knowledge, this paper is the first work to show either concentration or discrepancy results for hyperbolic polynomials.
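To make the comparison concrete, it may help to recall the standard definitions behind these statements; the following is a textbook formulation, with notation chosen here rather than taken from the paper. For a set system $S_1, \dots, S_n \subseteq [n]$, the discrepancy is

$$\operatorname{disc}(S_1,\dots,S_n) \;=\; \min_{\chi : [n] \to \{-1,+1\}} \; \max_{i \in [n]} \Bigl| \sum_{j \in S_i} \chi(j) \Bigr|.$$

Spencer's theorem says this quantity is $O(\sqrt{n})$, whereas a uniformly random coloring, analyzed via the Chernoff bound and a union bound over the $n$ sets, only yields $O(\sqrt{n \log n})$. Likewise, a homogeneous polynomial $p \in \mathbb{R}[x_1,\dots,x_m]$ is hyperbolic with respect to a direction $e \in \mathbb{R}^m$ if $p(e) \neq 0$ and, for every $x \in \mathbb{R}^m$, the univariate polynomial

$$t \;\mapsto\; p(x - t e)$$

has only real roots. The determinant on symmetric matrices is hyperbolic with respect to the identity $I$, since the roots of $t \mapsto \det(X - tI)$ are the (real) eigenvalues of $X$; this is the sense in which the matrix Chernoff and matrix discrepancy results arise as a special case.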