

Poster

Debiasing Federated Learning with Correlated Client Participation

Zhenyu Sun · Ziyang Zhang · Zheng Xu · Gauri Joshi · Pranay Sharma · Ermin Wei

Hall 3 + Hall 2B #369
Fri 25 Apr 7 p.m. PDT — 9:30 p.m. PDT

Abstract: In cross-device federated learning (FL) with millions of mobile clients, only a small subset of clients participate in training in every communication round, and Federated Averaging (FedAvg) is the most popular algorithm in practice. Existing analyses of FedAvg usually assume the participating clients are independently sampled in each round from a uniform distribution, which does not reflect real-world scenarios. This paper introduces a theoretical framework that models client participation in FL as a Markov chain to study optimization convergence when clients have non-uniform and correlated participation across rounds. We apply this framework to analyze a more practical pattern: every client must wait a minimum number of R rounds (minimum separation) before re-participating. We theoretically prove and empirically observe that increasing minimum separation reduces the bias induced by intrinsic non-uniformity of client availability in cross-device FL systems. Furthermore, we develop an effective debiasing algorithm for FedAvg that provably converges to the unbiased optimal solution under arbitrary minimum separation and unknown client availability distribution.
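To make the participation pattern in the abstract concrete, the following minimal sketch (not the paper's algorithm or code) simulates client selection under a minimum-separation constraint. All names and parameters here (num_clients, m, R, the 1/(i+1) availability weights) are illustrative assumptions; the sketch only shows how forcing each client to wait at least R rounds before re-participating pushes empirical participation frequencies toward uniform, which is the bias-reduction effect the abstract describes.

import numpy as np

def simulate_participation(num_clients=100, num_rounds=5000, m=10, R=0, seed=0):
    """Simulate m clients participating per round under a non-uniform
    availability distribution, with each client required to wait more than
    R rounds between participations. Returns the total deviation of the
    empirical participation frequencies from the uniform distribution."""
    rng = np.random.default_rng(seed)
    # Assumed non-uniform availability: client i has weight proportional to 1/(i+1).
    weights = 1.0 / np.arange(1, num_clients + 1)
    last_round = np.full(num_clients, -np.inf)  # last round each client participated
    counts = np.zeros(num_clients)

    for t in range(num_rounds):
        # Only clients whose last participation was more than R rounds ago are eligible.
        eligible = np.where(t - last_round > R)[0]
        p = weights[eligible] / weights[eligible].sum()
        chosen = rng.choice(eligible, size=min(m, len(eligible)), replace=False, p=p)
        counts[chosen] += 1
        last_round[chosen] = t

    freq = counts / counts.sum()
    # L1 distance from uniform participation (smaller = less biased).
    return np.abs(freq - 1.0 / num_clients).sum()

for R in [0, 2, 5, 9]:
    print(f"R={R}: deviation from uniform participation = {simulate_participation(R=R):.3f}")

Running this, the deviation shrinks as R grows (with R = 9, m = 10, and 100 clients the schedule becomes essentially round-robin), illustrating why larger minimum separation reduces participation bias even when the underlying availability distribution is unknown and non-uniform.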
