Spring 2020

No Quantum Speedup Over Gradient Descent for Non-Smooth Convex Optimization

Monday, Jul. 12, 2021, 12:30 pm – 1:15 pm



Robin Kothari (Microsoft)



We study the first-order convex optimization problem, where we have black-box access to a (not necessarily smooth) function f:R^n→R and its (sub)gradient. Our goal is to find an ϵ-approximate minimum of f starting from a point that is distance at most R from the true minimum. If f is G-Lipschitz, then the classic gradient descent algorithm solves this problem with O((GR/ϵ)^2) queries. Importantly, the number of queries is independent of the dimension n, and gradient descent is optimal in this regard: no deterministic or randomized algorithm can achieve a better query complexity that remains independent of the dimension n.
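The O((GR/ϵ)^2) bound mentioned above is the standard guarantee for (sub)gradient descent with step size η = R/(G√T) over T = (GR/ϵ)^2 iterations. A minimal sketch of this scheme, illustrated on the non-smooth function f(x) = ‖x‖₂ (the function, step size, and stopping rule here are illustrative choices, not the paper's hard instances):

```python
import numpy as np

def subgradient_descent(subgrad, x0, G, R, eps):
    """Sketch of subgradient descent for a G-Lipschitz convex f.

    With T = ceil((G*R/eps)^2) iterations and step size
    eta = R / (G * sqrt(T)), the best iterate seen is an
    eps-approximate minimum, matching the O((GR/eps)^2) query bound.
    """
    T = max(1, int(np.ceil((G * R / eps) ** 2)))
    eta = R / (G * np.sqrt(T))
    x = np.array(x0, dtype=float)
    iterates = [x.copy()]
    for _ in range(T):
        g = subgrad(x)           # one (sub)gradient query to the oracle
        x = x - eta * g
        iterates.append(x.copy())
    return iterates

# Illustrative non-smooth example: f(x) = ||x||_2, which is 1-Lipschitz;
# a valid subgradient is x/||x|| away from 0, and 0 at the minimizer.
def subgrad_norm(x):
    n = np.linalg.norm(x)
    return x / n if n > 1e-12 else np.zeros_like(x)

x0 = np.array([0.6, 0.8])        # distance R = 1 from the minimum at 0
iterates = subgradient_descent(subgrad_norm, x0, G=1.0, R=1.0, eps=0.1)
best = min(np.linalg.norm(x) for x in iterates)
```

Note that only the *best* iterate carries the guarantee; for non-smooth f the final iterate need not be the best one.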
In this paper we reprove the randomized lower bound of Ω((GR/ϵ)^2) using a simpler argument than previous lower bounds. We then show that although the function family used in that lower bound is hard for randomized algorithms, it can be solved with only O(GR/ϵ) quantum queries. Using a different set of instances, we prove an improved lower bound against quantum algorithms and establish our main result: in general, even quantum algorithms need Ω((GR/ϵ)^2) queries to solve the problem. Hence there is no quantum speedup over gradient descent for black-box first-order convex optimization without further assumptions on the function family.
