Abstract
We study mean-field variational Bayesian inference using the Thouless-Anderson-Palmer (TAP) approach, for Z2-synchronization as a prototypical example of a high-dimensional Bayesian model. We show that for any signal strength λ > 1 (the weak-recovery threshold), there exists a unique local minimizer of the TAP free energy functional near the mean of the Bayes posterior law. Furthermore, the TAP free energy is strongly convex in a local neighborhood of this minimizer. Consequently, a natural-gradient/mirror-descent algorithm achieves linear convergence to this minimizer from a local initialization, which may be obtained by a finite number of iterates of Approximate Message Passing (AMP). This provides a rigorous foundation for variational inference in high dimensions via minimization of the TAP free energy. We also analyze the finite-sample convergence of AMP, showing that AMP is asymptotically stable at the TAP minimizer for any λ > 1, and is linearly convergent to this minimizer from a spectral initialization for sufficiently large λ. Such a guarantee is stronger than results obtainable by state evolution analyses, which only describe a fixed number of AMP iterations in the infinite-sample limit. This is joint work with Michael Celentano and Zhou Fan.
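
As a rough illustration of the objects mentioned in the abstract, the following NumPy sketch (not taken from the talk) sets up a synthetic Z2-synchronization instance, writes down one common form of the TAP free energy, runs a few Bayes-AMP iterations from a spectral initialization, and then refines the estimate by mirror descent with the Bernoulli entropy as mirror map, a natural-gradient-type update. The normalization of the denoiser and Onsager term, the initialization scaling, the iteration counts, and the step size `eta` are illustrative conventions and need not match the authors' exact algorithm or guarantees.

```python
# Illustrative sketch (not the authors' code): Bayes-AMP followed by mirror descent
# on a TAP free energy for Z2-synchronization, under one common normalization.
import numpy as np

rng = np.random.default_rng(0)
n, lam = 2000, 1.5                       # dimension and signal strength (lambda > 1)

# Synthetic data: Y = (lam / n) * x x^T + W, with W drawn from GOE(n)
x = rng.choice([-1.0, 1.0], size=n)
G = rng.normal(size=(n, n)) / np.sqrt(n)
W = (G + G.T) / np.sqrt(2)
Y = (lam / n) * np.outer(x, x) + W

def tap_free_energy(m):
    """TAP free energy (up to constants), written in the convention whose
    stationary points satisfy m_i = tanh(lam*(Y m)_i - lam^2*(1 - q)*m_i)."""
    q = np.mean(m ** 2)
    h = -((1 + m) / 2) * np.log((1 + m) / 2) - ((1 - m) / 2) * np.log((1 - m) / 2)
    return -0.5 * lam * m @ Y @ m - h.sum() - 0.25 * n * lam ** 2 * (1 - q) ** 2

# Spectral initialization: top eigenvector of Y (the sqrt(n) scaling is an
# illustrative choice, not the theoretically prescribed one).
eigvals, eigvecs = np.linalg.eigh(Y)
z = np.sqrt(n) * eigvecs[:, -1]

# A few Bayes-AMP iterations: z^{t+1} = Y m^t - b_t m^{t-1}, with m^t = tanh(lam * z^t)
m_prev = np.zeros(n)                     # pragmatic choice for the first Onsager term
for _ in range(10):
    m = np.tanh(lam * z)
    b = lam * (1 - np.mean(m ** 2))      # Onsager coefficient: (1/n) sum_i d/dz tanh(lam*z_i)
    z, m_prev = Y @ m - b * m_prev, m
m = np.tanh(lam * z)

# Local refinement: mirror descent on the TAP free energy with the Bernoulli
# entropy as mirror map (a natural-gradient-type update).
eta = 0.5
for _ in range(200):
    q = np.mean(m ** 2)
    grad_field = lam * (Y @ m) - lam ** 2 * (1 - q) * m   # equals arctanh(m) - grad F_TAP(m)
    m = np.tanh((1 - eta) * np.arctanh(m) + eta * grad_field)

print("TAP free energy:", tap_free_energy(m))
print("overlap with truth:", abs(m @ x) / n)
```

In this convention, taking `eta = 1` in the refinement loop reduces the mirror-descent update to the classical TAP fixed-point iteration, whose fixed points coincide with those of the AMP recursion above; `eta < 1` corresponds to damped natural-gradient steps on the TAP free energy.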