Abstract

The problem of recognizing nonnegativity of a multivariate polynomial has a celebrated history, tracing back to Hilbert's 17th problem. In recent years, there has been renewed interest in the topic because of a multitude of applications in applied and computational mathematics, and because of the observation that one can optimize over an interesting subset of nonnegative polynomials using "sum of squares (SOS) optimization". In this talk, we give a brief overview of the developments in this field and show how they can be applied to two problems at the interface of machine learning and polynomial optimization. In part (i), we study the problem of learning a monotone polynomial from data. This is motivated by regression problems where the underlying function to be learned is monotone (consider, e.g., the price of a car as a function of its fuel efficiency). In part (ii), we study the problem of optimally decomposing a multivariate polynomial as the difference of two convex polynomials. This is motivated by certain majorization-minimization algorithms used in nonconvex optimization that require such a decomposition.
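As a brief illustration of the SOS idea referenced above (not part of the original abstract): a polynomial p of degree 2d in x is a sum of squares exactly when

\[
p(x) \;=\; \sum_i q_i(x)^2
\quad\Longleftrightarrow\quad
p(x) \;=\; z(x)^\top Q\, z(x) \ \text{ for some symmetric } Q \succeq 0,
\]

where z(x) denotes the vector of monomials in x of degree at most d. Since Q \succeq 0 is a semidefinite constraint and the identity p = z^\top Q z is linear in the entries of Q, testing SOS membership, and optimizing over SOS polynomials, reduces to a semidefinite program; and any SOS polynomial is evidently nonnegative.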
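For part (i), one standard way such techniques can certify monotonicity (a sketch of a common formulation; the talk's exact approach may differ) is via a Putinar-type condition on a box B = {x : g_1(x) \ge 0, \dots, g_m(x) \ge 0}:

\[
\frac{\partial p}{\partial x_j} \;=\; \sigma_0 + \sum_{k=1}^{m} \sigma_k\, g_k, \quad \sigma_0, \dots, \sigma_m \ \text{SOS}
\quad\Longrightarrow\quad
\frac{\partial p}{\partial x_j}(x) \ge 0 \ \text{ for all } x \in B.
\]

Since the multipliers \sigma_k enter linearly, fitting a polynomial to data subject to monotonicity constraints of this form is again a semidefinite program.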
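For part (ii), a difference-of-convex (DC) decomposition writes f = g - h with g and h convex. A tractable sufficient condition for convexity of a polynomial is "sos-convexity", i.e., requiring the Hessian quadratic form to be SOS (shown here as a standard sketch; the optimality criterion used in the talk is not specified in the abstract):

\[
f \;=\; g - h, \qquad y^\top \nabla^2 g(x)\, y \ \text{ and } \ y^\top \nabla^2 h(x)\, y \ \text{ SOS in } (x, y),
\]

so a DC decomposition with sos-convex parts can be searched for by semidefinite programming.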