Large Sparse Kernels for Federated Learning
Feilong Zhang
2023 Virtual oral in Affinity Event: Tiny Papers Showcase Day (a DEI initiative)
Abstract
Existing approaches to non-IID data in federated learning are often tailored to specific types of heterogeneity and may not generalize to all scenarios. In this paper, we present empirical evidence that employing large sparse convolution kernels improves robustness to distribution shifts in federated learning across various non-IID problems, including imbalanced data volumes, differing feature spaces, and differing label distributions. Our experimental results demonstrate that substituting standard convolutional kernels with large sparse kernels yields substantial improvements in robustness to non-IID data across multiple methods.
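The core substitution described above — replacing a dense convolutional kernel with a large, mostly-zero one — can be illustrated with a minimal NumPy sketch. The helper names (`make_large_sparse_kernel`, `conv2d_valid`), the kernel size, and the sparsity level are illustrative assumptions, not the paper's actual configuration:

```python
import numpy as np

def make_large_sparse_kernel(size=9, density=0.2, seed=0):
    # Hypothetical helper: a large kernel whose weights are mostly zero,
    # giving a wide receptive field with few active taps.
    rng = np.random.default_rng(seed)
    weights = rng.standard_normal((size, size))
    mask = rng.random((size, size)) < density  # keep roughly `density` of the taps
    return weights * mask

def conv2d_valid(image, kernel):
    # Naive "valid"-mode 2-D cross-correlation, for illustration only.
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

kernel = make_large_sparse_kernel(size=9, density=0.2)
image = np.ones((16, 16))
feat = conv2d_valid(image, kernel)
print(kernel.shape)                      # large spatial extent
print(float((kernel != 0).mean()))       # small fraction of nonzero weights
print(feat.shape)
```

In a federated setting, each client would apply such a kernel in place of a standard small dense one (e.g., 3×3); the paper's claim is that this substitution makes the learned features more robust to the distribution shifts between clients.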