# Learning without Interaction Requires Separation

Vitaly Feldman (Google)

We consider learning in large-scale systems under the constraint of local differential privacy. For many learning problems, known efficient algorithms in this model require many rounds of communication between the server and the clients holding the data points. Yet multi-round protocols are prohibitively slow in practice due to network latency and, as a result, currently deployed large-scale systems are limited to a single round. Very little is known about which learning problems can be solved by such non-interactive systems. The only lower bound we are aware of is for PAC learning an artificial class of functions with respect to the uniform distribution (Kasiviswanathan et al., 2008).
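To make the non-interactive setting concrete, here is a minimal sketch of a one-round locally private protocol: each client applies randomized response to a single bit and sends the result once, and the server debiases the aggregate. This is a standard textbook mechanism used only for illustration; it is not the algorithm from this work, and the function names are my own.

```python
import math
import random

def randomize(bit, epsilon):
    """One client's message: epsilon-locally-private randomized response.

    Reports the true bit with probability e^eps / (e^eps + 1),
    and the flipped bit otherwise. A single message suffices, so
    the protocol is non-interactive.
    """
    p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return bit if random.random() < p else 1 - bit

def estimate_mean(reports, epsilon):
    """Server-side debiasing of the aggregated noisy reports.

    If q is the observed mean of reports, the unbiased estimate of
    the true mean is (q - (1 - p)) / (2p - 1).
    """
    p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    raw = sum(reports) / len(reports)
    return (raw - (1 - p)) / (2 * p - 1)
```

A quick simulation: with 10,000 clients whose true mean is 0.6 and epsilon = 1, the server's estimate concentrates around 0.6, despite each client's report being individually deniable.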

We show that the margin complexity of a class of functions is a lower bound on the complexity of any non-interactive algorithm for distribution-independent learning of the class. In particular, the classes of linear separators and decision lists require an exponential number of samples to learn non-interactively, even though they can be learned in polynomial time by an interactive algorithm. Our lower bound also holds against a stronger class of algorithms for which only queries that depend on labels are non-interactive (we refer to them as label-non-adaptive). We complement this lower bound with a new efficient and label-non-adaptive learning algorithm whose complexity is polynomial in the margin complexity.
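For reference (the abstract does not spell it out), margin complexity can be defined via sign matrices, following the convention of Linial et al.; take this as a hedged sketch of one standard formulation rather than the paper's exact definition. For a sign matrix $A \in \{\pm 1\}^{m \times n}$ whose rows index functions in the class and columns index inputs,

$$
\mathrm{mc}(A) \;=\; \left( \max_{\{x_i\},\,\{y_j\}} \; \min_{i,j} \, \bigl|\langle x_i, y_j \rangle\bigr| \right)^{-1},
$$

where the maximum ranges over unit vectors $x_1,\dots,x_m$ and $y_1,\dots,y_n$ satisfying $\operatorname{sign}(\langle x_i, y_j \rangle) = A_{ij}$. Intuitively, it is the inverse of the best margin achievable by any embedding of the class into a Euclidean ball, so a large margin complexity means every linear representation of the class is badly separated.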

Joint work with Amit Daniely.