
Abstract
Graphons are powerful tools for modeling large-scale graphs, serving both as limit objects for dense graph sequences and as generative models for random graphs. This bootcamp introduces graphons from a machine learning (ML) perspective, with an emphasis on their applications in graph information processing and graph neural networks (GNNs). We begin with the mathematical foundations of graphon theory, including homomorphism densities, cut distance, sampling, dense graph convergence, and convergence of spectra. From there, we explore how graphons can be used to formalize the convergence of convolutional architectures on convergent sequences of graphs, and what this reveals about the transferability of GNNs trained on subsampled graph data. We also discuss recent advances in graphon-based ML, practical limitations of the graphon model in modern ML, and alternative approaches for capturing structure in sparser large graphs.
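As a concrete illustration of the generative role of graphons mentioned above, the following is a minimal sketch (not part of the bootcamp materials) of the standard sampling procedure: draw latent positions uniformly on [0, 1] and include each edge independently with probability given by the graphon. The function name `sample_graph` and the example graphon are illustrative choices.

```python
import numpy as np

def sample_graph(W, n, rng=None):
    """Sample an n-node simple graph from a graphon W: [0,1]^2 -> [0,1].

    Latent positions u_i ~ Uniform[0,1]; edge {i, j} (i != j) is included
    independently with probability W(u_i, u_j).
    """
    rng = np.random.default_rng(rng)
    u = rng.uniform(size=n)
    # Matrix of pairwise edge probabilities W(u_i, u_j)
    P = W(u[:, None], u[None, :])
    # Independent Bernoulli draws; symmetrize and remove self-loops
    draws = rng.uniform(size=(n, n)) < P
    A = np.triu(draws, k=1)
    A = (A | A.T).astype(int)
    return A

# Example: sample from the graphon W(x, y) = 1 - max(x, y)
A = sample_graph(lambda x, y: 1 - np.maximum(x, y), n=100, rng=0)
```

The same routine, applied with increasing `n`, produces the convergent dense graph sequences whose limits the bootcamp studies.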