
Abstract
We examine the problem of real-time optimization of networked systems and develop online algorithms that steer the system towards the optimal trajectory. The problem is modeled as a dynamic optimization problem with time-varying performance objectives and engineering constraints. The design of the algorithms leverages the online primal-dual projected-gradient method. Both first-order and zero-order algorithms are considered. For zero-order algorithms, the primal step that involves the gradient of the objective function (and hence requires a model of the networked system) is replaced by a zero-order approximation based on two function evaluations along a deterministic perturbation signal. The evaluations are performed using measurements of the system output, giving rise to a feedback interconnection in which the optimization algorithm serves as a feedback controller. We provide insights into the stability and tracking properties of this interconnection. Finally, we apply this methodology to a real-time optimal power flow problem in power systems, for reference power tracking and voltage regulation.
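As a hedged illustration of the updates the abstract refers to (the notation below is ours, not taken from the paper): with a time-varying Lagrangian $\mathcal{L}_t(x,\lambda)$, step size $\alpha > 0$, primal feasible set $\mathcal{X}$, and nonnegative multipliers, one online primal-dual projected-gradient step could take the form
\begin{align}
x_{t+1} &= \mathrm{proj}_{\mathcal{X}}\big[\, x_t - \alpha \,\nabla_x \mathcal{L}_t(x_t,\lambda_t) \,\big], \\
\lambda_{t+1} &= \mathrm{proj}_{\mathbb{R}^m_{\ge 0}}\big[\, \lambda_t + \alpha \,\nabla_\lambda \mathcal{L}_t(x_t,\lambda_t) \,\big].
\end{align}
In a zero-order variant of the kind described above, the gradient of the objective $f_t$ may be replaced by a two-point estimate built from measured function values along a deterministic perturbation signal $d_t$ with amplitude $\epsilon > 0$, for example
\begin{equation}
\hat{\nabla} f_t(x_t) = \frac{f_t(x_t + \epsilon d_t) - f_t(x_t - \epsilon d_t)}{2\epsilon}\, d_t ,
\end{equation}
which requires only output measurements rather than an explicit system model; the specific perturbation design and analysis are those of the paper itself.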