Poster
Local Steps Speed Up Local GD for Heterogeneous Distributed Logistic Regression
Michael Crawshaw · Blake Woodworth · Mingrui Liu
Hall 3 + Hall 2B #393
Wed 23 Apr 7 p.m. – 9:30 p.m. PDT
Abstract:
We analyze two variants of Local Gradient Descent applied to distributed logistic regression with heterogeneous, separable data and show convergence at the rate $O(1/KR)$ for $K$ local steps and sufficiently large $R$ communication rounds. In contrast, all existing convergence guarantees for Local GD applied to any problem are at least $\Omega(1/R)$, meaning they fail to show the benefit of local updates. The key to our improved guarantee is showing progress on the logistic regression objective when using a large stepsize $\eta \gg 1/K$, whereas prior analysis depends on $\eta \leq 1/K$.
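To make the setup concrete, here is a minimal sketch of vanilla Local GD on a toy two-client logistic regression problem with heterogeneous, separable data. The function names, the synthetic data, and the choices of $K$, $R$, and $\eta$ are illustrative assumptions; the paper analyzes two specific variants of Local GD, which this sketch does not reproduce.

```python
import numpy as np

def logistic_loss(w, X, y):
    # Average logistic loss: mean_i log(1 + exp(-y_i <x_i, w>)).
    return np.mean(np.logaddexp(0.0, -y * (X @ w)))

def logistic_grad(w, X, y):
    # Gradient of the average logistic loss; the stable form
    # exp(-logaddexp(0, m)) equals 1 / (1 + exp(m)).
    margins = y * (X @ w)
    coeff = -y * np.exp(-np.logaddexp(0.0, margins))
    return (X.T @ coeff) / len(y)

def local_gd(clients, dim, K, R, eta):
    # Local GD: each round, every client runs K local gradient steps
    # from the shared iterate, then the server averages the results.
    w = np.zeros(dim)
    for _ in range(R):
        local_iterates = []
        for X, y in clients:
            v = w.copy()
            for _ in range(K):
                v = v - eta * logistic_grad(v, X, y)
            local_iterates.append(v)
        w = np.mean(local_iterates, axis=0)  # communication step
    return w

# Toy heterogeneous, separable data (hypothetical): both clients share a
# separator w_star but draw features from shifted distributions.
rng = np.random.default_rng(0)
w_star = np.array([1.0, -0.5])
clients = []
for shift in ([2.0, 0.0], [-1.0, 3.0]):
    X = rng.normal(size=(50, 2)) + np.array(shift)
    clients.append((X, np.sign(X @ w_star)))

K, R = 32, 20
eta = 1.0  # "large" stepsize, well above the 1/K scale of classical analyses
w = local_gd(clients, dim=2, K=K, R=R, eta=eta)
avg_loss = np.mean([logistic_loss(w, X, y) for X, y in clients])
print(f"average loss after R={R} rounds of K={K} local steps: {avg_loss:.4f}")
```

The point of the abstract's claim in this setting: with $\eta \gg 1/K$ the loss can decrease at a rate scaling with the total number of gradient steps $KR$, whereas a stepsize capped at $1/K$ would cancel the benefit of taking more local steps per round.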