Abstract

Does the information complexity of a function equal its communication complexity? We examine whether any currently known techniques can be used to separate the two notions. Recently, Ganor et al. [2014] provided such a separation in the distributional setting for a specific input distribution μ. We show that in the nondistributional setting, the relative discrepancy bound they defined is, in fact, smaller than the information complexity, and hence it cannot be used to separate information and communication complexity. In addition, in the distributional case, we give an equivalent linear programming formulation of relative discrepancy and relate it to variants of the partition bound, also resolving an open question on the relation between the partition bound and information complexity. Finally, we prove the equivalence between the adaptive relative discrepancy and the public-coin partition bound, which implies that the logarithm of the adaptive relative discrepancy bound is quadratically tight with respect to communication.
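For background, relative discrepancy generalizes the classical discrepancy bound; as a minimal sketch in standard notation (not taken from the paper), for a function $f : X \times Y \to \{0,1\}$ and a distribution $\mu$ over the inputs,

\[
\mathrm{disc}_{\mu}(f) \;=\; \max_{R = A \times B} \bigl|\, \mu\bigl(R \cap f^{-1}(0)\bigr) - \mu\bigl(R \cap f^{-1}(1)\bigr) \,\bigr|,
\]

where the maximum ranges over combinatorial rectangles, and any deterministic protocol with distributional error $\epsilon$ under $\mu$ must communicate at least

\[
D^{\mu}_{\epsilon}(f) \;\ge\; \log_2 \frac{1 - 2\epsilon}{\mathrm{disc}_{\mu}(f)}
\]

bits.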