Unlocking the Potential of Weighting Methods in Federated Learning Through Communication Compression
Valerii Parfenov · Daniil Medyakov · Dmitry Bylinkin · Nail Bashirov · Aleksandr Beznosikov
Abstract
Modern machine learning problems are frequently formulated in the federated learning setting and involve inherently heterogeneous data. Weighting methods are efficient in terms of iteration complexity and represent a common direction in this setting. At the same time, they do not directly address the main obstacle in federated and distributed learning -- the communication bottleneck. We tackle this issue by incorporating compression into the weighting scheme. We establish convergence under a convexity assumption, considering both exact and stochastic oracles. Finally, we evaluate the practical performance of the proposed method on real-world problems.
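To make the high-level idea concrete, here is a minimal sketch, not the paper's algorithm, of one federated round in which client updates pass through a compressor before a weighted server-side average. The choice of top-k sparsification as the compressor, the fixed client weights, and all function names are illustrative assumptions.

```python
# Illustrative sketch only: compression combined with weighted aggregation.
# Top-k compression, the weights, and all names are assumptions, not the paper's method.
import numpy as np

def top_k(v: np.ndarray, k: int) -> np.ndarray:
    """Keep the k largest-magnitude coordinates of v, zero out the rest."""
    out = np.zeros_like(v)
    idx = np.argpartition(np.abs(v), -k)[-k:]
    out[idx] = v[idx]
    return out

def federated_round(x: np.ndarray, client_grads: list, weights: np.ndarray,
                    lr: float = 0.1, k: int = 10) -> np.ndarray:
    """One round: clients compress their gradients, the server averages with weights."""
    compressed = [top_k(g, k) for g in client_grads]      # communication compression
    avg = sum(w * g for w, g in zip(weights, compressed)) # weighting scheme
    return x - lr * avg                                   # server model update

# Toy usage: 3 heterogeneous clients, model dimension 50.
rng = np.random.default_rng(0)
x = rng.normal(size=50)
grads = [rng.normal(size=50) * (i + 1) for i in range(3)]
w = np.array([0.5, 0.3, 0.2])  # client weights summing to 1
x_new = federated_round(x, grads, w)
```

In this sketch each client transmits only k coordinates per round, which is the source of the communication savings, while the weights control how heterogeneous clients contribute to the aggregate.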