

Poster in Workshop: Privacy Regulation and Protection in Machine Learning

Fed Up with Complexity: Simplifying Many-Task Federated Learning with NTKFedAvg

Aashiq Muhamed · Meher Mankikar · Virginia Smith


Abstract:

Recent work has introduced the challenging setting of many-task federated learning (MaT-FL), in which each client in a federated network may solve a different learning task. Unfortunately, existing MaT-FL methods, such as dynamic client grouping and split FL, increase privacy risks and computational demands by maintaining a separate model for each client or task on the server. We introduce a new baseline for MaT-FL, NTKFedAvg, that combines a single unified multi-task model on the server with Neural Tangent Kernel (NTK) linearization to accommodate task heterogeneity without any client- or task-specific model adjustments. This approach enhances privacy, reduces complexity, and improves resistance to various threats. Our evaluations on two MaT-FL benchmarks show that NTKFedAvg surpasses FedAvg in mIoU and accuracy, converges faster, is competitive with existing baselines, and unlearns tasks in fewer rounds. Beyond proposing a more efficient and potentially privacy-preserving baseline for MaT-FL, this work also contributes to the understanding of task composition and weight disentanglement in FL, offering insights into the design of FL algorithms for settings with significant task diversity.
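At its core, NTK linearization replaces the network f(theta, x) with its first-order Taylor expansion around the shared server weights theta0, f_lin(theta, x) = f(theta0, x) + J_f(theta0, x) (theta - theta0), so every client optimizes the same linear-in-weights model regardless of its task. The sketch below combines this idea with plain FedAvg aggregation; it is a minimal illustration of the general technique, and the toy model, data, hyperparameters, and all function names are assumptions, not the authors' implementation.

# Minimal sketch: FedAvg over an NTK-linearized model (illustrative only).
import jax
import jax.numpy as jnp

def init_params(key, d_in=4, d_out=2):
    k1, _ = jax.random.split(key)
    return {"W": jax.random.normal(k1, (d_in, d_out)) / jnp.sqrt(d_in),
            "b": jnp.zeros(d_out)}

def f(theta, x):
    # Toy network; stands in for the shared multi-task server model.
    return jnp.tanh(x @ theta["W"] + theta["b"])

def linearize(f, theta0):
    # First-order (NTK) linearization around the server weights theta0:
    # f_lin(theta, x) = f(theta0, x) + J_f(theta0, x) . (theta - theta0)
    def f_lin(theta, x):
        delta = jax.tree_util.tree_map(jnp.subtract, theta, theta0)
        y0, tangent = jax.jvp(lambda t: f(t, x), (theta0,), (delta,))
        return y0 + tangent
    return f_lin

def client_update(theta0, x, y, lr=0.1, steps=10):
    # Each client trains the *linearized* model on its own task data,
    # so the server never needs a per-client or per-task copy.
    f_lin = linearize(f, theta0)
    loss = lambda th: jnp.mean((f_lin(th, x) - y) ** 2)
    theta = theta0
    for _ in range(steps):
        grads = jax.grad(loss)(theta)
        theta = jax.tree_util.tree_map(lambda t, g: t - lr * g, theta, grads)
    return theta

def server_round(theta0, client_data):
    # Plain FedAvg aggregation: average the clients' updated weights.
    updates = [client_update(theta0, x, y) for x, y in client_data]
    return jax.tree_util.tree_map(
        lambda *ts: jnp.mean(jnp.stack(ts), axis=0), *updates)

theta = init_params(jax.random.PRNGKey(0))
# Two clients with different (toy) tasks.
clients = [(jax.random.normal(jax.random.PRNGKey(i), (8, 4)),
            jax.random.normal(jax.random.PRNGKey(10 + i), (8, 2)))
           for i in range(2)]
for _ in range(3):
    theta = server_round(theta, clients)
print("rounds complete; W norm:", jnp.linalg.norm(theta["W"]))

Because every client optimizes the same linearized model, local updates compose additively at the server, which is consistent with the abstract's emphasis on task composition, weight disentanglement, and fast task unlearning.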
