Abstract
Motivated by applications in machine learning, we investigate the approximability of several classes of real-valued functions on {0,1}^n by juntas (functions of a small number of variables). Our goal is to approximate a target function, with range normalized to [0,1], within l_2-error epsilon over the uniform distribution. Our two main results are as follows.
Every submodular function is epsilon-close to a function of O((1/epsilon^2) log(1/epsilon)) variables. This result is proved using a "boosting lemma" of Goemans and Vondrák. We note that Omega(1/epsilon^2) variables are necessary even for linear functions.
Every XOS, self-bounding, or, more generally, low average l_1-sensitivity function is epsilon-close to a function of 2^{O(1/epsilon^2)} variables. This result is proved using the hypercontractive inequality, similarly to the proof of Friedgut's theorem for Boolean functions. We show that 2^{Omega(1/epsilon)} variables are necessary even for XOS functions.
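To make the approximation notion concrete, here is a small brute-force Python sketch (an illustration, not part of the results above). It uses the standard fact that, under the uniform distribution, the best l_2 approximation of f by a function of a variable set J is the conditional expectation of f given the coordinates in J. The function names `l2_junta_error` and the example function `f` are made up for this illustration; `f` is a budget-additive (hence submodular) function that depends mostly on its first two coordinates, so a 2-junta approximates it well.

```python
import itertools
import math

def l2_junta_error(f, n, J):
    """Exact l_2 distance, over the uniform distribution on {0,1}^n, between f
    and its best l_2 approximation by a function of the variables in J.
    That best approximation is the conditional expectation of f given x_J,
    i.e., f averaged over the coordinates outside J."""
    J = sorted(J)
    rest = [i for i in range(n) if i not in J]
    total = 0.0
    for a in itertools.product([0, 1], repeat=len(J)):
        # collect the values of f on the subcube where x_J is fixed to a
        vals = []
        for b in itertools.product([0, 1], repeat=len(rest)):
            x = [0] * n
            for i, v in zip(J, a):
                x[i] = v
            for i, v in zip(rest, b):
                x[i] = v
            vals.append(f(tuple(x)))
        mean = sum(vals) / len(vals)
        # within-fiber mean squared deviation, averaged over all fibers
        total += sum((v - mean) ** 2 for v in vals) / len(vals)
    return math.sqrt(total / 2 ** len(J))

# A budget-additive (hence submodular) example on n = 6 variables,
# with range normalized to [0,1].
n = 6
def f(x):
    return min(1.0, 0.5 * x[0] + 0.5 * x[1] + 0.05 * sum(x[2:]))

print(l2_junta_error(f, n, [0, 1]))  # small: f is close to a 2-junta
print(l2_junta_error(f, n, []))      # the standard deviation of f
```

Restricting to the junta set {0, 1} already drives the l_2 error well below the standard deviation of f, which is the error of the best constant (empty-junta) approximation.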
Joint work with Vitaly Feldman.