Abstract
Sampling methods have long been ubiquitous in data science and machine learning. Recently, due to their complementary algorithmic and statistical properties, sampling and related sketching methods have become central to randomized linear algebra and stochastic optimization. We'll provide an overview of the structural properties underlying key results in randomized linear algebra, highlighting how sampling and sketching methods can lead to improved results. These improvements are typically achieved in quite different ways, depending on whether one is interested in worst-case theoretical bounds, numerical implementations, statistical bounds, machine learning applications, or uses within iterative stochastic optimization algorithms. We'll describe how sampling and sketching methods are used in randomized linear algebra in each of these cases, highlighting similarities and differences.