Talks
Fall 2020

Learning Ising Models from One, Ten or a Thousand Samples

Monday, December 14th, 2020, 12:45 pm–1:30 pm

Speaker: Costis Daskalakis (MIT)

Samples from high-dimensional distributions can be scarce or expensive. Can we meaningfully learn such distributions from one or just a few samples? We provide guarantees for single-sample estimation of Ising models, quantifying the estimation error in terms of the metric entropy of possible interaction matrices. As corollaries of our main result, we derive bounds when the model's interaction matrix is a (sparse) linear combination of known matrices, or it belongs to a finite set, or to a high-dimensional manifold. 
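The abstract does not specify an estimator, so as an illustration of the kind of single-sample estimation it discusses, here is a minimal sketch: it fits a single interaction-strength parameter beta by maximum pseudolikelihood from one sample of a grid Ising model. The grid graph, the scalar parameterization beta * A, the Gibbs sampler, and all names are assumptions of this sketch, not details from the talk.

```python
import numpy as np

def grid_adjacency(n):
    """Adjacency matrix of an n x n grid graph (N = n*n nodes)."""
    N = n * n
    A = np.zeros((N, N))
    for i in range(n):
        for j in range(n):
            v = i * n + j
            if i + 1 < n:            # vertical neighbor
                A[v, v + n] = A[v + n, v] = 1.0
            if j + 1 < n:            # horizontal neighbor
                A[v, v + 1] = A[v + 1, v] = 1.0
    return A

def gibbs_sample(A, beta, sweeps=300, rng=None):
    """One (approximate) sample x in {-1,+1}^N from p(x) ∝ exp(beta/2 · x^T A x)."""
    rng = np.random.default_rng(rng)
    N = A.shape[0]
    x = rng.choice([-1.0, 1.0], size=N)
    for _ in range(sweeps):
        for v in range(N):
            field = beta * (A[v] @ x)          # local field at node v (A[v, v] = 0)
            p_plus = 1.0 / (1.0 + np.exp(-2.0 * field))
            x[v] = 1.0 if rng.random() < p_plus else -1.0
    return x

def neg_pseudolikelihood(beta, A, x):
    """Negative log-pseudolikelihood of the single sample x at parameter beta."""
    fields = beta * (A @ x)
    # log P(x_v | x_-v) = x_v * field_v - log(2 cosh(field_v))
    return -np.sum(x * fields - np.logaddexp(fields, -fields))

def estimate_beta(A, x, grid=np.linspace(0.0, 1.0, 201)):
    """Maximum pseudolikelihood estimate of beta by grid search."""
    vals = [neg_pseudolikelihood(b, A, x) for b in grid]
    return grid[int(np.argmin(vals))]

A = grid_adjacency(15)                          # 225-node grid Ising model
x = gibbs_sample(A, beta=0.3, rng=0)            # ONE sample from the model
beta_hat = estimate_beta(A, x)                  # estimate beta from that one sample
```

The point the sketch mirrors is that a single high-dimensional sample can carry enough information to pin down a low-dimensional parameterization of the interaction matrix.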

Our result handles multiple independent samples by viewing them as one sample from a larger model, and it can be used to derive estimation bounds qualitatively similar to the state of the art in the multiple-sample literature. We thus unify two separate strands of work: (1) a renaissance of recent work in Computer Science on estimating Ising models/MRFs from multiple independent samples under minimal assumptions on the model's interaction matrix; and (2) a growing literature in Probability Theory on estimating them from a single sample in restrictive settings. On the technical front, we exploit novel concentration and anti-concentration inequalities for functions of the Ising model.
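The reduction of multiple samples to one can be written out explicitly (the notation below is illustrative, not taken from the talk): for an Ising model on n nodes with interaction matrix J, the joint density of k independent samples factorizes into a single Ising model on kn nodes with a block-diagonal interaction matrix.

```latex
% k independent samples x^{(1)}, \dots, x^{(k)} from an Ising model with
% interaction matrix J have joint density
p\bigl(x^{(1)},\dots,x^{(k)}\bigr)
  \;\propto\; \prod_{i=1}^{k} \exp\Bigl(\tfrac{1}{2}\, x^{(i)\top} J\, x^{(i)}\Bigr)
  \;=\; \exp\Bigl(\tfrac{1}{2}\, y^{\top} \bigl(I_k \otimes J\bigr)\, y\Bigr),
\qquad y = \bigl(x^{(1)},\dots,x^{(k)}\bigr) \in \{\pm 1\}^{kn}.
% So the k samples form one sample y from a larger Ising model whose
% interaction matrix is the block-diagonal I_k \otimes J.
```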