A Discrepancy Lower Bound for Information Complexity

Authors: Mark Braverman; Omri Weinstein*
Source: Algorithmica, 2016, 76(3): 846-864.
DOI:10.1007/s00453-015-0093-8

Abstract

This paper provides the first general technique for proving information lower bounds on two-party unbounded-rounds communication problems. We show that the discrepancy lower bound, which applies to randomized communication complexity, also applies to information complexity. More precisely, if the discrepancy of a two-party function f with respect to a distribution μ is Disc_μ(f), then any two-party randomized protocol computing f must reveal at least Ω(log(1/Disc_μ(f))) bits of information to the participants. As a corollary, we obtain that any two-party protocol for computing a random function on {0,1}^n × {0,1}^n must reveal Ω(n) bits of information to the participants. In addition, we prove that the discrepancy of the Greater-Than function is Ω(1/√n), which provides an alternative proof to the recent proof of Viola (Proceedings of the twenty-fourth annual ACM-SIAM symposium on discrete algorithms, SODA 2013, New Orleans, LA, USA, 6-8 Jan 2013, pp 632-651, 2013) of the Ω(log n) lower bound on the communication complexity of this well-studied function and, combined with our main result, proves the tight Ω(log n) lower bound on its information complexity. The proof of our main result develops a new simulation procedure that may be of independent interest. In the follow-up breakthrough work of Kerenidis et al. (53rd annual IEEE symposium on foundations of computer science, FOCS 2012, New Brunswick, NJ, USA, 20-23 Oct 2012, pp 500-509, 2012), our simulation procedure served as a building block towards a proof that almost all known lower bound techniques for communication complexity (and not just discrepancy) apply to information complexity as well.
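For reference, here is a minimal formal rendering of the main bound stated in the abstract. The definition of discrepancy below is the standard one over combinatorial rectangles; it is not spelled out in this record and is included only as context.

```latex
% Standard definition of discrepancy (assumed; not quoted from this record):
% for f : X x Y -> {0,1} and a distribution mu on X x Y,
\[
  \mathrm{Disc}_{\mu}(f) \;=\;
  \max_{R = A \times B}
  \Bigl|\, \Pr_{\mu}\bigl[(x,y)\in R \wedge f(x,y)=0\bigr]
        - \Pr_{\mu}\bigl[(x,y)\in R \wedge f(x,y)=1\bigr] \,\Bigr|,
\]
% where the maximum ranges over all combinatorial rectangles A x B.
% Main result as stated in the abstract: the information cost of any
% randomized protocol computing f under mu satisfies
\[
  \mathrm{IC}_{\mu}(f) \;\ge\; \Omega\!\Bigl(\log \tfrac{1}{\mathrm{Disc}_{\mu}(f)}\Bigr).
\]
```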

  • Publication date: 2016-11