Abstract
Linear dynamical systems are the canonical model for time series data. They have wide-ranging applications, and there is a vast literature on learning their parameters from input-output sequences. Moreover, they have received renewed interest because of their connections to recurrent neural networks.
But there are wide gaps in our understanding. Existing works give only asymptotic guarantees or else make restrictive assumptions, e.g., ones that preclude any long-range correlations. In this work, we give a new algorithm based on the method of moments that is computationally efficient and works under essentially minimal assumptions. Our work points to several missed connections, whereby tools from theoretical machine learning, including tensor methods, can be used in non-stationary settings.
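As a toy illustration of the method-of-moments viewpoint (this sketch is not the algorithm from the talk; all matrices, dimensions, and noise levels below are hypothetical): for a linear dynamical system x_{t+1} = A x_t + B u_t, y_t = C x_t + noise, driven by i.i.d. standard-normal inputs, the cross-moment E[y_{t+k} u_t^T] equals the Markov parameter C A^{k-1} B, so empirical averages of simple moments already recover identifiable system parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical small system for illustration only.
n, m, p, T = 3, 2, 2, 200_000
A = 0.9 * np.linalg.qr(rng.standard_normal((n, n)))[0]  # scaled orthogonal => stable
B = rng.standard_normal((n, m))
C = rng.standard_normal((p, n))

# Simulate: x_{t+1} = A x_t + B u_t,  y_t = C x_t + noise,  u_t ~ N(0, I).
u = rng.standard_normal((T, m))
x = np.zeros(n)
ys = np.empty((T, p))
for t in range(T):
    ys[t] = C @ x + 0.1 * rng.standard_normal(p)
    x = A @ x + B @ u[t]

# Method of moments: the empirical cross-moment of y_{t+k} with u_t
# estimates the k-th Markov parameter C A^{k-1} B.
def markov_estimate(k):
    return ys[k:].T @ u[:-k] / (T - k)

for k in (1, 2, 3):
    true_M = C @ np.linalg.matrix_power(A, k - 1) @ B
    err = np.linalg.norm(markov_estimate(k) - true_M)
    print(f"k={k}: estimation error {err:.3f}")
```

With enough samples the moment estimates concentrate around the true Markov parameters; recovering (A, B, C) themselves from these moments, under minimal assumptions, is where the real algorithmic work lies.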
Based on joint work with Ainesh Bakshi, Allen Liu, and Morris Yau.