Abstract

Modern deep generative models such as GANs, VAEs, invertible flows, and score-based models represent high-dimensional distributions remarkably well, especially for images. We will show how they can be used to solve inverse problems such as denoising, filling in missing data, and recovery from linear projections. We generalize compressed sensing theory beyond sparsity, extending restricted isometry properties to the sets created by deep generative models. Our recent results include theoretical guarantees for Langevin sampling from full-dimensional generative models and fairness guarantees for inverse problems.
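
As a rough illustration (not taken from the talk), the core idea of recovery from linear projections with a generative prior can be sketched as follows: given measurements y = A x and a pretrained generator G, search for a latent code z whose output G(z) fits the measurements. The PyTorch snippet below is a minimal sketch under these assumptions; the untrained generator, dimensions, and optimizer settings are placeholders, and in practice G would be a trained GAN or VAE decoder.

```python
# Minimal sketch of compressed sensing with a generative prior:
# recover x = G(z*) from y = A x by optimizing the latent z to fit
# the linear measurements. All names and sizes are illustrative.
import torch

torch.manual_seed(0)
k, n, m = 20, 100, 30            # latent dim, signal dim, number of measurements

G = torch.nn.Sequential(          # placeholder generator; in practice a trained decoder
    torch.nn.Linear(k, 64), torch.nn.ReLU(),
    torch.nn.Linear(64, n),
)
for p in G.parameters():
    p.requires_grad_(False)       # generator weights stay fixed; only z is optimized

A = torch.randn(m, n) / m**0.5    # random Gaussian measurement matrix
x_true = G(torch.randn(k))        # ground-truth signal in the range of G
y = A @ x_true                    # noiseless linear measurements

z = torch.zeros(k, requires_grad=True)
opt = torch.optim.Adam([z], lr=1e-2)
for _ in range(2000):
    opt.zero_grad()
    loss = torch.sum((A @ G(z) - y) ** 2)   # measurement misfit ||A G(z) - y||^2
    loss.backward()
    opt.step()

print("relative error:", (torch.norm(G(z) - x_true) / torch.norm(x_true)).item())
```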
