Abstract

In this talk, I will discuss the relationship between mirror descent and the variable metric method. When the metric in mirror descent is induced by a convex function whose Hessian is close to the Hessian of the objective function, the method enjoys both the robustness of mirror descent and the superlinear convergence of Newton-type methods. When applied to a linearly constrained minimization problem, we prove global and local convergence, in both the continuous and discrete settings. As applications, we compute gradient flows with variable mobility, and show that the resulting mirror descent method offers fast convergence and preserves solution bounds.
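The idea of choosing the mirror map's Hessian close to the objective's Hessian can be illustrated on a quadratic objective, where the mirror descent update reduces to a preconditioned gradient step. The following is a minimal sketch under that assumption (the matrix `A`, vector `b`, and step sizes are illustrative choices, not from the talk):

```python
import numpy as np

# Illustrative quadratic objective f(x) = 0.5 x^T A x - b^T x,
# whose minimizer is x* = A^{-1} b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = np.linalg.solve(A, b)

def mirror_descent(H, eta, steps=200):
    """Mirror descent with mirror map phi(x) = 0.5 x^T H x.

    The update solves grad phi(x_new) = grad phi(x) - eta * grad f(x),
    which here gives x_new = x - eta * H^{-1} (A x - b):
    a gradient step preconditioned by the metric H.
    """
    x = np.zeros(2)
    for _ in range(steps):
        grad = A @ x - b          # gradient of the quadratic objective
        x = x - eta * np.linalg.solve(H, grad)
    return x

# Euclidean mirror map (H = I): plain gradient descent.
x_gd = mirror_descent(np.eye(2), eta=0.3)

# Mirror map matching the objective's Hessian (H = A):
# with eta = 1 this is exactly one Newton step.
x_newton = mirror_descent(A, eta=1.0, steps=1)

print(np.allclose(x_gd, x_star, atol=1e-6), np.allclose(x_newton, x_star))
```

With `H = A` the method lands on the minimizer in a single step, while the Euclidean choice needs many iterations; choosing the metric's Hessian close to the objective's Hessian is what drives the Newton-like fast convergence mentioned above.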

Video Recording