

Poster in Workshop: Workshop on Distributed and Private Machine Learning

Personalized Federated Learning: A Unified Framework and Universal Optimization Techniques

Filip Hanzely · Boxin Zhao · Mladen Kolar


Abstract:

We study the optimization aspects of personalized Federated Learning (FL). We develop a universal optimization theory applicable to all convex personalized FL models in the literature. In particular, we propose a general personalized objective capable of recovering essentially any existing personalized FL objective as a special case. We design several optimization techniques to minimize the general objective, namely a tailored variant of Local SGD and variants of accelerated coordinate descent/accelerated SVRCD. We demonstrate the practicality and, in some cases, optimality of our methods in terms of both communication and local computation. Lastly, we discuss the implications of our general optimization theory when applied to specific personalized FL objectives.
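The precise form of the general objective is given in the paper; as a hedged illustration of the kind of personalized formulation such a framework can recover as a special case, consider the widely used L2-penalized (mixture) objective, in which each of the n clients keeps a local model and a penalty pulls the local models toward their average (the symbols x_i, f_i, λ, and x̄ below are illustrative notation, not necessarily the paper's):

\[
\min_{x_1,\dots,x_n \in \mathbb{R}^d} \; \frac{1}{n}\sum_{i=1}^{n} f_i(x_i) \;+\; \frac{\lambda}{2n}\sum_{i=1}^{n} \left\| x_i - \bar{x} \right\|^2, \qquad \bar{x} := \frac{1}{n}\sum_{i=1}^{n} x_i .
\]

Here f_i is client i's local loss and λ ≥ 0 controls the degree of personalization: λ → ∞ forces all local models to coincide, recovering the standard non-personalized FL objective, while λ = 0 decouples the clients into purely local training.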
