Abstract

Motivated by applications in machine learning, we investigate the approximability of several classes of real-valued functions on {0,1}^n by juntas (functions of a small number of variables). Our goal is to approximate a target function, with range normalized to [0,1], within l_2-error epsilon over the uniform distribution. Our two main results are as follows.
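
In symbols (writing g for the approximating junta), for a target f: {0,1}^n -> [0,1] we seek a function g depending on few coordinates such that

( E_x [ (f(x) - g(x))^2 ] )^{1/2} <= epsilon,

where x is uniform over {0,1}^n.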

Every submodular function is epsilon-close to a function of O((1/epsilon^2) log(1/epsilon)) variables. This result is proved using a "boosting lemma" of Goemans and Vondrak. We note that Omega(1/epsilon^2) variables are necessary even for linear functions.
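
For reference, submodularity here is the standard notion for set functions, identifying x in {0,1}^n with a subset S of [n]:

f(S union T) + f(S intersect T) <= f(S) + f(T) for all S, T subseteq [n],

or equivalently, the marginal value f(S union {i}) - f(S) is non-increasing in S.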

Every XOS, self-bounding, or, more generally, low average l_1-sensitivity function is epsilon-close to a function of 2^{O(1/epsilon^2)} variables. This result is proved using the hypercontractive inequality, similarly to Friedgut's theorem for Boolean functions. We show that 2^{Omega(1/epsilon)} variables are necessary even for XOS functions.
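
For reference, XOS (fractionally subadditive) functions are pointwise maxima of non-negative linear functions, f(x) = max_j sum_i w_{j,i} x_i with all w_{j,i} >= 0, and one common definition of the average l_1-sensitivity of f (normalizations vary in the literature) is

sum_{i=1}^n E_x [ |f(x) - f(x^{(i)})| ],

where x^{(i)} is x with the i-th coordinate flipped and x is uniform over {0,1}^n.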

Joint work with Vitaly Feldman.
