Abstract

We develop an information-theoretic view of the stochastic block model, a popular statistical model for the large-scale structure of complex networks. A graph $G$ from such a model is generated by first assigning vertex labels at random from a finite alphabet, and then connecting vertices with edge probabilities that depend on the labels of the endpoints. For the symmetric two-group model, in the regime where the vertex degree is large, we establish an explicit ‘single-letter’ characterization of the asymptotic per-vertex mutual information between the vertex labels and the graph.
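The generative process described above can be illustrated with a minimal sketch of the symmetric two-group case. Here the function name and the parameters `p_in` (within-group edge probability) and `p_out` (between-group edge probability) are our own illustrative choices, not notation from the work itself:

```python
import random

def sample_sbm(n, p_in, p_out, seed=None):
    """Sample a symmetric two-group stochastic block model.

    Each vertex independently receives a uniform label in {+1, -1};
    an edge is placed with probability p_in when the endpoints share
    a label, and p_out otherwise. (Illustrative sketch only.)
    """
    rng = random.Random(seed)
    labels = [rng.choice([+1, -1]) for _ in range(n)]
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            p = p_in if labels[i] == labels[j] else p_out
            if rng.random() < p:
                edges.add((i, j))
    return labels, edges
```

With `p_in > p_out` the groups are assortative; the estimation question is then how much the observed edge set reveals about the hidden labels.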
 
In this talk, I will discuss some of the estimation implications of this characterization. I will also discuss two key ingredients of the proof technique: (i) a universality principle from random matrix theory, and (ii) a rigorous verification of the replica-symmetric heuristic from spin glass theory.
 
This is joint work with Emmanuel Abbe and Andrea Montanari.