Abstract

What does it mean to *compute* something, and how should we measure the cost of doing so? This introductory lecture surveys foundational models of computation that are especially relevant when studying complexity questions in linear algebra. We begin with models of arithmetic: exact arithmetic over the real numbers or the integers, and floating-point arithmetic as it arises in practice. These provide a natural entry point to the analysis of numerical error, where we introduce forward and backward error bounds as two complementary ways of quantifying stability.
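The distinction between forward and backward error can be made concrete with a small sketch (the example data here is ours, not from the lecture): naive left-to-right summation in double precision can produce a large forward error, yet the computed answer is the exact sum of inputs perturbed only at the level of machine precision, i.e. the backward error is tiny.

```python
import math
from fractions import Fraction

# Illustrative example (not from the lecture): forward vs. backward error
# for naive left-to-right floating-point summation.
xs = [1.0, 1e16, -1e16, 1.0]

naive = 0.0
for x in xs:
    naive += x                      # each += rounds to the nearest double

exact = math.fsum(xs)               # correctly rounded sum: 2.0

# Forward error: distance of the computed answer from the truth -- large
# here (the computed 1.0 is off by 50% in relative terms).
forward_error = abs(naive - exact)

# Backward error: the computed 1.0 is the *exact* (rational-arithmetic) sum
# of nearby inputs -- only the second entry moves, by a relative 1e-16.
perturbed = [Fraction(1), Fraction(10**16) - 1, Fraction(-10**16), Fraction(1)]
assert sum(perturbed) == Fraction(int(naive))
```

In the language of the lecture, the computation is backward stable (it solves a nearby problem exactly) even though the forward error is large; the gap between the two is governed by the conditioning of the problem.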

We then turn to models of *complexity*. Arithmetic complexity focuses on counting operations in an idealized exact-arithmetic world, while bit complexity refines this picture by accounting for the representation size of numbers. Beyond operations on numbers, communication itself is often the true bottleneck. We will discuss models of communication complexity: first, in the sequential setting, where data must be moved between fast and slow levels of memory, and then in the parallel setting, where multiple processors exchange information.
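The gap between arithmetic and bit complexity can be seen in a toy example of our own (not from the lecture): repeated squaring uses few arithmetic operations, but the operands themselves grow exponentially.

```python
# Illustrative example: arithmetic complexity counts operations, while bit
# complexity also accounts for how large the operands become.
x = 2
for _ in range(10):
    x = x * x          # 10 multiplications: arithmetic cost is O(n) for n steps

# After n squarings, x = 2**(2**n), whose binary representation has
# 2**n + 1 bits -- exponential bit complexity despite a linear operation count.
assert x == 2 ** 1024
assert x.bit_length() == 1025
```

In exact-arithmetic models each multiplication costs one unit; in the bit model its cost depends on the operand sizes, which is why the two models can diverge dramatically, for instance in exact Gaussian elimination over the integers.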

By contrasting these models, we will see how the cost of computation can depend as much on information movement as on arithmetic itself. This overview should prepare participants to engage with the deeper themes of the program, where complexity theory and linear algebra meet.