Abstract
The analysis of tensor data, i.e., multi-dimensional arrays, has been an active research topic in data science. Datasets in the form of tensors arise in a wide range of scientific applications. Tensor methods also provide unique perspectives on many high-dimensional problems in which the observations are not themselves tensors. On the other hand, many tensor problems exhibit intrinsic computational barriers and statistical-computational gaps.
In this talk, we discuss several recent advances on statistical-computational trade-offs in various tensor learning problems, such as tensor PCA, tensor clustering, and tensor regression. Our theory demonstrates a "blessing of the statistical-computational gap" phenomenon: one usually requires conditions stronger than the statistical (or information-theoretic) limit to ensure that computationally feasible estimation is achievable in these tensor problems. Such conditions "incidentally" enable low-rank tensor inference without debiasing and overparametrized estimation without the cost of extra samples.
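As a minimal illustration of the tensor PCA setting mentioned above (a sketch, not the speaker's method), the rank-one spiked tensor model and tensor power iteration can be written in a few lines of NumPy. The dimension `d`, signal strength `snr`, and warm-start scale are illustrative choices, not values from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)
d, snr = 20, 30.0  # illustrative dimension and signal strength

# Rank-one spiked tensor model: T = snr * (v ⊗ v ⊗ v) + Gaussian noise
v = rng.standard_normal(d)
v /= np.linalg.norm(v)
T = snr * np.einsum("i,j,k->ijk", v, v, v) + rng.standard_normal((d, d, d))

# Tensor power iteration from a warm start near v; a purely random start
# would need a much larger snr, which reflects the computational gap
u = v + 0.2 * rng.standard_normal(d)
u /= np.linalg.norm(u)
for _ in range(30):
    u = np.einsum("ijk,j,k->i", T, u, u)  # contract T along two modes
    u /= np.linalg.norm(u)

alignment = abs(u @ v)  # close to 1 when the signal dominates the noise
print(f"alignment with planted spike: {alignment:.3f}")
```

Above the computational threshold, the iterates lock onto the planted direction; below it, the same estimator stalls even though the signal is still information-theoretically detectable.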