Abstract

ICA is a classical model of unsupervised learning that originated in, and is widely used in, signal processing. In this model, observations in n-space are obtained by an unknown linear transformation of a signal with independent components. The goal is to learn the transformation and thereby the hidden basis of the independent components. In this talk, we'll begin with a polynomial-time algorithm called Fourier PCA for underdetermined ICA, the setting where the observed signal lies in a lower-dimensional space than the original signal, assuming a mild, verifiable condition on the transformation. Its analysis uses higher-order derivatives of Fourier transforms and anti-concentration of Gaussian polynomials. The running time and sample complexity are polynomial, but of high constant degree, for any constant error; a MATLAB implementation of the algorithm appears to need about 10^4 samples for 10-dimensional observations. We then present a more efficient recursive variant for the fully determined case of standard ICA that needs only a nearly linear number of samples and has polynomial time complexity. An implementation of this variant appears to outperform known heuristics. Its analysis is based on understanding the roots of random polynomials. This is joint work with Navin Goyal and Ying Xiao.
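As a concrete illustration of the ICA generative model described above (a minimal sketch of the model only, not of Fourier PCA; all names and dimensions here are chosen for the example):

```python
import numpy as np

# ICA generative model: observed X = A @ S, where the rows of S are
# independent, non-Gaussian sources and A is the unknown mixing matrix.
# In the underdetermined setting the observation dimension is smaller
# than the number of sources (n_obs < n_sources).
rng = np.random.default_rng(0)

n_sources, n_obs, n_samples = 10, 6, 10_000  # underdetermined: n_obs < n_sources

# Independent uniform sources (non-Gaussianity is essential for ICA)
S = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), size=(n_sources, n_samples))

A = rng.standard_normal((n_obs, n_sources))  # unknown mixing matrix

X = A @ S  # observations; ICA aims to recover A (and thereby S) from X alone

print(X.shape)  # each column of X is one observed sample in 6-space
```

In the fully determined case of standard ICA, one would instead take `n_obs == n_sources`, so that A is square and invertible.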

Video Recording