Poster

Adaptive Gradient Clipping for Robust Federated Learning

Youssef Allouah · Rachid Guerraoui · Nirupam Gupta · Ahmed Jellouli · Geovani Rizk · John Stephan

Hall 3 + Hall 2B #377
Fri 25 Apr midnight PDT — 2:30 a.m. PDT

Abstract:

Robust federated learning aims to maintain reliable performance despite the presence of adversarial or misbehaving workers. While state-of-the-art (SOTA) robust distributed gradient descent (Robust-DGD) methods were proven theoretically optimal, their empirical success has often relied on pre-aggregation gradient clipping. However, existing static clipping strategies yield inconsistent results: enhancing robustness against some attacks while being ineffective or even detrimental against others. To address this limitation, we propose a principled adaptive clipping strategy, Adaptive Robust Clipping (ARC), which dynamically adjusts clipping thresholds based on the input gradients. We prove that ARC not only preserves the theoretical robustness guarantees of SOTA Robust-DGD methods but also provably improves asymptotic convergence when the model is well-initialized. Extensive experiments on benchmark image classification tasks confirm these theoretical insights, demonstrating that ARC significantly enhances robustness, particularly in highly heterogeneous and adversarial settings.
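To make the idea of pre-aggregation clipping with an input-dependent threshold concrete, here is a minimal sketch. The threshold rule below (the median of the submitted gradient norms) is a hypothetical illustration of adaptive clipping, not necessarily the ARC rule proposed in the paper; the function name and quantile parameter are assumptions for the example.

```python
import math


def adaptive_clip(gradients, quantile=0.5):
    """Clip each worker gradient to a threshold derived from the input gradients.

    Hypothetical rule for illustration (not the paper's ARC rule): the
    threshold is a quantile (default: median) of the submitted gradient norms,
    so it adapts to each round's gradients instead of being a fixed constant.
    """
    norms = sorted(math.sqrt(sum(x * x for x in g)) for g in gradients)
    tau = norms[int(quantile * (len(norms) - 1))]  # adaptive threshold
    clipped = []
    for g in gradients:
        n = math.sqrt(sum(x * x for x in g))
        s = min(1.0, tau / n) if n > 0 else 1.0  # shrink only oversized gradients
        clipped.append([x * s for x in g])
    return clipped


# Two honest workers send unit-norm gradients; one adversary sends a huge one.
grads = [[1.0, 0.0], [0.0, 1.0], [100.0, 100.0]]
clipped = adaptive_clip(grads)
```

After clipping, the adversarial gradient is scaled down to the median norm of the batch, while honest gradients pass through unchanged; the clipped set would then be fed to a robust aggregator (e.g., coordinate-wise median or trimmed mean) in a Robust-DGD pipeline.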
