James Renegar (Cornell University)
First-order methods have dominated the literature in continuous optimization for well over a decade, a consequence of these algorithms being capable of solving far larger problems than are within the reach of second-order methods (e.g., interior-point methods), the critical difference being the amount of core memory (RAM) required per iteration. However, strong assumptions must be satisfied to ensure low memory requirements for first-order methods, perhaps the most notable being that the feasible region of the optimization problem is appropriately simple (e.g., a ball, a box, or a simplex), quite unlike the feasible region of a general hyperbolic program. Nevertheless, as we discuss, a hyperbolic program can be transformed into equivalent optimization problems to which standard first-order methods apply, the feasible regions of the transformed problems being affine spaces, hence most certainly "appropriately simple". Moreover, procedures for computing (sub)gradients of the transformed problems can be devised by first considering the case of semidefinite programming, and then making judicious use of the Helton-Vinnikov Theorem. The result is "optimal" first-order methods for hyperbolic programming.
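One reason an affine space counts as "appropriately simple" is that Euclidean projection onto it has a closed form, so a projected (sub)gradient step stays cheap, in contrast to projecting onto a general hyperbolicity cone. A minimal sketch (not from the talk; the matrix A and vector b are illustrative, with A assumed to have full row rank):

```python
import numpy as np

def project_affine(x, A, b):
    """Euclidean projection of x onto the affine space {y : A y = b}.

    Closed form: x - A^T (A A^T)^{-1} (A x - b), valid when A has
    full row rank.
    """
    correction = A.T @ np.linalg.solve(A @ A.T, A @ x - b)
    return x - correction

# Illustrative data only.
rng = np.random.default_rng(0)
A = rng.standard_normal((2, 5))
b = rng.standard_normal(2)
x = rng.standard_normal(5)

p = project_affine(x, A, b)
print(np.allclose(A @ p, b))   # the projection lands on the affine space
print(np.allclose(project_affine(p, A, b), p))  # projecting again changes nothing
```

Each iteration of a projected subgradient method on such a feasible region therefore costs one (sub)gradient evaluation plus one linear solve of fixed size, which is the kind of low per-iteration footprint the abstract alludes to.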