Abstract

The mathematical connections between graph theory and linear algebra are intimate and well known. The computational links between the two fields are also deep, extending all the way to the design of basic data structures and fundamental algorithms that are efficient in time, memory, and power. In the first 50 years of this computational relationship, graphs served numerical linear algebra by enabling efficient sparse matrix computation. Recently, matrix computation has been returning the favor, particularly in the domain of parallel and high-performance algorithms.
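
To give a flavor of the "returning the favor" direction, here is a minimal sketch, not taken from the talk, of a graph algorithm phrased as matrix computation: breadth-first search levels computed by repeated sparse matrix-vector products over the adjacency matrix. The function name, the use of SciPy, and the tiny example graph are all illustrative assumptions.

# Illustrative sketch (not from the talk): BFS expressed as linear algebra.
# Each step expands the frontier with one sparse matrix-vector product,
# the matrix analogue of visiting all neighbors of the frontier at once.
import numpy as np
import scipy.sparse as sp

def bfs_levels(A, source):
    """BFS distances from `source`; A is a sparse adjacency matrix."""
    n = A.shape[0]
    levels = np.full(n, -1)           # -1 marks unreached vertices
    frontier = np.zeros(n)
    frontier[source] = 1
    levels[source] = 0
    level = 0
    while frontier.any():
        level += 1
        reached = A.T @ frontier      # neighbors of the current frontier
        new = (reached > 0) & (levels < 0)
        levels[new] = level
        frontier = new.astype(float)  # unvisited neighbors form the next frontier
    return levels

# Example: a 4-vertex path graph 0-1-2-3
rows, cols = [0, 1, 1, 2, 2, 3], [1, 0, 2, 1, 3, 2]
A = sp.csr_matrix((np.ones(6), (rows, cols)), shape=(4, 4))
print(bfs_levels(A, 0))               # prints [0 1 2 3]

This is the same idea that underlies GraphBLAS-style formulations of graph algorithms, where traversal, relaxation, and aggregation are written as operations on sparse matrices and vectors so that parallel and high-performance matrix kernels can be reused.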

Video Recording