

Poster

Local Steps Speed Up Local GD for Heterogeneous Distributed Logistic Regression

Michael Crawshaw · Blake Woodworth · Mingrui Liu

Hall 3 + Hall 2B #393
Wed 23 Apr 7 p.m. PDT — 9:30 p.m. PDT

Abstract: We analyze two variants of Local Gradient Descent applied to distributed logistic regression with heterogeneous, separable data and show convergence at the rate O(1/KR) for K local steps and sufficiently large R communication rounds. In contrast, all existing convergence guarantees for Local GD applied to any problem are at least Ω(1/R), meaning they fail to show the benefit of local updates. The key to our improved guarantee is showing progress on the logistic regression objective when using a large stepsize η ≫ 1/K, whereas prior analysis depends on η ≤ 1/K.
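
To make the setting concrete, below is a minimal sketch of vanilla Local GD (each client takes K full-gradient steps, then the iterates are averaged over R communication rounds) on a synthetic heterogeneous, separable logistic regression problem. The data generator, client count, and the two stepsizes compared (η = 2 versus η = 1/K) are illustrative assumptions for exposition, not the paper's exact variants or experiments.

```python
import numpy as np
from scipy.special import expit  # numerically stable sigmoid

rng = np.random.default_rng(0)
M, K, R, d = 4, 32, 50, 10  # clients, local steps, communication rounds, dimension

def make_client_data(m, n=100):
    """Heterogeneous, linearly separable data: each client samples around its own mean."""
    w_star = np.ones(d) / np.sqrt(d)
    X = rng.normal(loc=0.5 * m, scale=1.0, size=(n, d))
    X /= np.linalg.norm(X, axis=1, keepdims=True)  # normalize features
    y = np.sign(X @ w_star)
    y[y == 0] = 1.0
    return X, y

clients = [make_client_data(m) for m in range(M)]

def logistic_grad(w, X, y):
    """Gradient of the average logistic loss (1/n) sum_i log(1 + exp(-y_i <w, x_i>))."""
    margins = y * (X @ w)
    return (X * (-y * expit(-margins))[:, None]).mean(axis=0)

def global_loss(w):
    """Average logistic loss over all clients' data."""
    return np.mean([np.logaddexp(0.0, -(y * (X @ w))).mean() for X, y in clients])

def local_gd(eta):
    """Local GD: K full-gradient steps per client, then average the local models."""
    w = np.zeros(d)
    for _ in range(R):
        local_iterates = []
        for X, y in clients:
            w_m = w.copy()
            for _ in range(K):
                w_m -= eta * logistic_grad(w_m, X, y)
            local_iterates.append(w_m)
        w = np.mean(local_iterates, axis=0)  # communication round: average the models
    return w

# Contrast the large-stepsize regime (eta >> 1/K) with the small stepsize
# eta <= 1/K that prior Local GD analyses rely on.
for eta in (2.0, 1.0 / K):
    print(f"eta = {eta:.4f}  final loss = {global_loss(local_gd(eta)):.6f}")
```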
