Abstract

Randomized methods for computing eigenvalues of large matrices are governed by a number of fundamental spectral properties that affect both algorithm performance and the quality of the computed answers. Should one gauge convergence by eigenvalue errors, residual norms, or subspace angles? What matrix properties dictate the convergence rate? How does performance depend on the location of the sought-after eigenvalues relative to the rest of the spectrum? How sensitive are eigenvalues and eigenvectors to changes in the matrix? How does such sensitivity influence convergence? We shall address these questions in the context of traditional (deterministic) eigensolvers, in the hope that such an overview will provide a helpful framework for designers of randomized algorithms.
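
The three convergence gauges mentioned above can be compared concretely. The following NumPy/SciPy sketch is not from the talk; the symmetric test matrix, the use of subspace iteration with Rayleigh-Ritz extraction, and all parameter choices are assumptions made purely for illustration. It computes eigenvalue errors, residual norms, and subspace angles for one approximate invariant subspace.

```python
import numpy as np
from scipy.linalg import subspace_angles

# Hypothetical setup for illustration only: a symmetric positive
# semidefinite test matrix, so the largest eigenvalues dominate.
rng = np.random.default_rng(0)
n, k = 200, 5
B = rng.standard_normal((n, n))
A = B.T @ B

# "Exact" eigenpairs for reference: the k largest eigenvalues.
evals, evecs = np.linalg.eigh(A)
exact_vals, exact_vecs = evals[-k:], evecs[:, -k:]

# A crude approximate invariant subspace from a few steps of
# subspace iteration (an assumed stand-in for any eigensolver).
V = rng.standard_normal((n, k))
for _ in range(10):
    V, _ = np.linalg.qr(A @ V)

# Rayleigh-Ritz: approximate eigenvalues/eigenvectors from the subspace.
theta, S = np.linalg.eigh(V.T @ A @ V)   # Ritz values (ascending)
X = V @ S                                # Ritz vectors in the original space

# Gauge 1: eigenvalue errors |theta_i - lambda_i|.
print("eigenvalue errors:", np.abs(theta - exact_vals))

# Gauge 2: residual norms ||A x_i - theta_i x_i|| for each Ritz pair.
R = A @ X - X @ np.diag(theta)
print("residual norms:  ", np.linalg.norm(R, axis=0))

# Gauge 3: principal angles between the Ritz subspace and the
# true invariant subspace.
print("subspace angles: ", subspace_angles(X, exact_vecs))
```

The three gauges need not agree: a Ritz value can be accurate while its Ritz vector still makes a large angle with the true eigenvector, which is one reason the choice of convergence criterion matters.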
