

Poster

Efficient Continual Finite-Sum Minimization

Ioannis Mavrothalassitis · Stratis Skoulakis · Leello Dadi · Volkan Cevher

Halle B #122

Abstract: Given a sequence of functions $f_1, \dots, f_n$ with $f_i : \mathcal{D} \mapsto \mathbb{R}$, finite-sum minimization seeks a point $x^\star \in \mathcal{D}$ minimizing $\sum_{j=1}^n f_j(x)/n$. In this work, we propose a key twist on finite-sum minimization, dubbed *continual finite-sum minimization*, which asks for a sequence of points $x_1^\star, \dots, x_n^\star \in \mathcal{D}$ such that each $x_i^\star \in \mathcal{D}$ minimizes the prefix-sum $\sum_{j=1}^i f_j(x)/i$. Assuming that each prefix-sum is strongly convex, we develop a first-order continual stochastic variance reduction gradient method (CSVRG) producing an $\epsilon$-optimal sequence with $\tilde{\mathcal{O}}(n/\epsilon^{1/3} + 1/\sqrt{\epsilon})$ overall *first-order oracles* (FO). An FO corresponds to the computation of a single gradient $\nabla f_j(x)$ at a given $x \in \mathcal{D}$ for some $j \in [n]$. Our approach significantly improves upon the $\mathcal{O}(n/\epsilon)$ FOs that Stochastic Gradient Descent requires and the $\mathcal{O}(n^2 \log(1/\epsilon))$ FOs that state-of-the-art variance reduction methods such as Katyusha require. We also prove that there is no natural first-order method with $\mathcal{O}(n/\epsilon^\alpha)$ gradient complexity for $\alpha < 1/4$, establishing that the first-order complexity of our method is nearly tight.
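
To make the problem setup concrete, the following is a minimal sketch of continual finite-sum minimization solved with a plain SGD baseline, not the paper's CSVRG method. The quadratic choice of $f_j$, the ridge parameter `mu` (added so every prefix-sum is strongly convex, matching the paper's assumption), the step size, and the per-prefix iteration budget are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, mu = 50, 5, 0.1
A = rng.normal(size=(n, d))  # f_j(x) = 0.5*(a_j^T x - b_j)^2 + 0.5*mu*||x||^2
b = rng.normal(size=n)

def grad_fj(x, j):
    """First-order oracle: the gradient of a single f_j at x (one FO call)."""
    return (A[j] @ x - b[j]) * A[j] + mu * x

fo_calls = 0
x = np.zeros(d)
solutions = []
for i in range(1, n + 1):    # i-th prefix problem: min_x sum_{j<=i} f_j(x)/i
    for _ in range(200):     # fixed SGD budget per prefix (the O(n/eps) baseline)
        j = rng.integers(i)  # sample a function uniformly from the prefix f_1..f_i
        x -= 0.05 * grad_fj(x, j)
        fo_calls += 1
    solutions.append(x.copy())  # x_i: approximate minimizer of the i-th prefix-sum

print(f"{len(solutions)} prefix solutions, {fo_calls} first-order oracle calls")
```

Note that the sketch warm-starts each prefix problem at the previous iterate, which is exactly the structure the continual formulation exposes; per the abstract, CSVRG combines this prefix structure with variance reduction to reach the $\tilde{\mathcal{O}}(n/\epsilon^{1/3} + 1/\sqrt{\epsilon})$ FO complexity stated above.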
