Poster

Momentum Benefits Non-iid Federated Learning Simply and Provably

Ziheng Cheng · Xinmeng Huang · Pengfei Wu · Kun Yuan

Halle B #150

Abstract:

Federated learning is a powerful paradigm for large-scale machine learning, but it faces significant challenges due to unreliable network connections, slow communication, and substantial data heterogeneity across clients. FedAvg and SCAFFOLD are two prominent algorithms that address these challenges. In particular, FedAvg employs multiple local updates before communicating with a central server, while SCAFFOLD maintains a control variable on each client to compensate for "client drift" in its local updates. Various methods have been proposed to enhance the convergence of these two algorithms, but they either make impractical adjustments to the algorithmic structure or rely on the assumption of bounded data heterogeneity. This paper explores the use of momentum to enhance the performance of FedAvg and SCAFFOLD. When all clients participate in the training process, we demonstrate that incorporating momentum allows FedAvg to converge without relying on the assumption of bounded data heterogeneity, even with a constant local learning rate. This is novel and fairly surprising, as existing analyses of FedAvg require bounded data heterogeneity even with diminishing local learning rates. Under partial client participation, we show that momentum enables SCAFFOLD to converge provably faster without imposing any additional assumptions. Furthermore, we use momentum to develop new variance-reduced extensions of FedAvg and SCAFFOLD, which exhibit state-of-the-art convergence rates. Our experimental results support all theoretical findings.
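Schematically, the momentum idea amounts to replacing each client's raw stochastic gradient with a momentum-averaged one inside the local update loop, while the server still aggregates by plain averaging as in standard FedAvg. Below is a minimal sketch of one communication round under full client participation; the function names and the `client_grad` hook are illustrative assumptions, not the paper's implementation, and details such as whether the momentum buffer persists across rounds follow the paper rather than this sketch.

```python
import numpy as np

def fedavg_momentum_round(w, client_grad, num_clients, local_steps,
                          lr=0.1, beta=0.9):
    """One round of FedAvg with local momentum (illustrative sketch).

    w           -- global model parameters (np.ndarray)
    client_grad -- client_grad(i, w): stochastic gradient of client i's
                   local objective at w (hypothetical user-supplied hook)
    """
    local_models = []
    for i in range(num_clients):
        w_i = w.copy()                # local iterate, initialized at global model
        m_i = np.zeros_like(w)        # local momentum buffer
        for _ in range(local_steps):
            g = client_grad(i, w_i)
            m_i = beta * m_i + (1 - beta) * g   # momentum-averaged gradient
            w_i = w_i - lr * m_i                # local step uses m_i, not raw g
        local_models.append(w_i)
    # Server aggregates by plain averaging, as in standard FedAvg.
    return np.mean(local_models, axis=0)
```

A momentum variant of SCAFFOLD would additionally correct each gradient `g` with the client's control variate before the momentum averaging. Per the abstract, it is the momentum averaging, rather than a diminishing local learning rate, that removes the bounded-heterogeneity requirement for FedAvg under full participation.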
