

Poster

Robustness of Quantum Algorithms for Nonconvex Optimization

Weiyuan Gong · Chenyi Zhang · Tongyang Li

Hall 3 + Hall 2B #580
Wed 23 Apr 7 p.m. PDT — 9:30 p.m. PDT

Abstract: In this paper, we systematically study quantum algorithms for finding an ϵ-approximate second-order stationary point (ϵ-SOSP) of a d-dimensional nonconvex function, a fundamental problem in nonconvex optimization, with noisy zeroth- or first-order oracles as inputs. We first prove that, up to noise of O(ϵ^{10}/d^5), perturbed accelerated gradient descent equipped with quantum gradient estimation takes O(log d/ϵ^{1.75}) quantum queries to find an ϵ-SOSP. We then prove that standard perturbed gradient descent is robust to noise of O(ϵ^6/d^4) and O(ϵ/d^{0.5+ζ}) for any ζ>0 on the zeroth- and first-order oracles, respectively, which provides a quantum algorithm with poly-logarithmic query complexity. Furthermore, we propose a stochastic gradient descent algorithm using quantum mean estimation on the Gaussian smoothing of noisy oracles, which is robust to O(ϵ^{1.5}/d) and O(ϵ/d) noise on the zeroth- and first-order oracles, respectively. The quantum algorithm takes O(d^{2.5}/ϵ^{3.5}) and O(d^2/ϵ^3) queries to the two oracles, giving a polynomial speedup over the classical counterparts. As a complement, we characterize the domains where quantum algorithms can find an ϵ-SOSP with a poly-logarithmic, polynomial, or exponential number of queries in d, or where the problem is information-theoretically unsolvable even with an infinite number of queries. In addition, we prove an Ω(ϵ^{12/7}) lower bound on ϵ for any randomized classical and quantum algorithm to find an ϵ-SOSP using either noisy zeroth- or first-order oracles.
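To illustrate the Gaussian-smoothing idea the abstract builds on, below is a minimal classical sketch of a zeroth-order gradient estimator: the gradient of the smoothed function f_σ(x) = E[f(x + σu)] is estimated from noisy-free function values alone by averaging finite differences along random Gaussian directions. This is only an illustrative analogue; the paper's algorithm replaces this classical averaging with quantum mean estimation, and the function name, sample count, and smoothing radius here are arbitrary choices, not taken from the paper.

```python
import numpy as np

def smoothed_gradient(f, x, sigma=1e-2, num_samples=20000, rng=None):
    """Zeroth-order gradient estimate via Gaussian smoothing.

    Averages (f(x + sigma*u) - f(x)) / sigma * u over random Gaussian
    directions u ~ N(0, I_d), which is an unbiased estimator of the
    gradient of the smoothed function f_sigma(x) = E[f(x + sigma*u)].
    """
    rng = np.random.default_rng(0) if rng is None else rng
    d = x.shape[0]
    fx = f(x)  # one oracle call reused across all samples
    g = np.zeros(d)
    for _ in range(num_samples):
        u = rng.standard_normal(d)
        g += (f(x + sigma * u) - fx) / sigma * u
    return g / num_samples

# Usage: for f(x) = ||x||^2 the true gradient at x is 2x, and the
# estimate should be close for small sigma and enough samples.
f = lambda x: float(np.dot(x, x))
x = np.array([1.0, -2.0])
g = smoothed_gradient(f, x)
```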
