arXiv:1612.07250 [quant-ph]
Contextuality beyond the Kochen-Specker theorem
Published 2016-12-21, Version 1
When it isn't possible to tell two distinct experimental procedures apart purely from their input/output statistics, it seems a plausible hypothesis that the two procedures must be physically identical. We call such a hypothesis "noncontextuality", an instance of Leibniz's principle of the identity of indiscernibles. Read in the contrapositive, this hypothesis entails that any physical distinction between two experimental procedures must necessarily lead to a difference in their operational statistics. The results I present in this thesis concern the failure of this hypothesis -- a failure dubbed "contextuality" -- when one tries to embed an operational theory (such as quantum theory) in the ontological models framework. The Kochen-Specker theorem demonstrates the failure of noncontextuality for deterministic ontological models of quantum theory, i.e., those ontological models where the ontic/physical state of the system fixes the outcome of any projective measurement on the system in a deterministic manner. This thesis goes beyond the Kochen-Specker (KS) theorem by asking what operational facts must be verified in experiments to conclude that Nature does not admit noncontextual ontological models, not even indeterministic ones. This leads to noncontextuality inequalities that are robust to noise in the preparations and measurements. In the particular case of quantum theory, these inequalities are meaningful even when unsharp measurements (or POVMs) are allowed, a feature not shared by the traditional approach to KS-noncontextuality, where unsharp measurements are excluded by fiat: allowing them renders even trivial POVMs (proportional to identity) maximally KS-contextual. The sense in which trivial POVMs are indeed "trivial" (or "noncontextual") is clear in our approach: they are simply too noisy to lead to a violation of our noncontextuality inequalities.
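To make the final point concrete (the notation below is standard but not taken from the abstract itself): a trivial POVM is one whose elements are all proportional to the identity, and a one-line calculation shows why its statistics carry no information about the preparation -- the sense in which it is "too noisy":

```latex
% A trivial POVM: each element is proportional to the identity,
%   E_k = p_k \mathbb{1}, \qquad p_k \ge 0, \qquad \sum_k p_k = 1.
% For any preparation, i.e., any density operator \rho with \mathrm{tr}(\rho) = 1,
% the Born rule then gives
%   \Pr(k \mid \rho) = \mathrm{tr}(\rho E_k) = p_k\,\mathrm{tr}(\rho) = p_k,
% independent of \rho: the measurement reveals nothing about the state.
```

Such a measurement can be simulated by ignoring the system and sampling $k$ from the fixed distribution $\{p_k\}$, so no noncontextuality inequality can be violated with it.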