

Poster

Simple Minimax Optimal Byzantine Robust Algorithm for Nonconvex Objectives with Uniform Gradient Heterogeneity

Tomoya Murata · Kenta Niwa · Takumi Fukami · Iifan Tyou

Halle B #166

Abstract: In this study, we consider nonconvex federated learning problems in the presence of Byzantine workers. We propose a new, simple Byzantine robust algorithm called Momentum Screening. The algorithm is adaptive to the Byzantine fraction, i.e., none of its hyperparameters depends on the number of Byzantine workers. We show that our method achieves the best optimization error of $O(\delta^2\zeta_{\max}^2)$ for nonconvex smooth local objectives satisfying the $\zeta_{\max}$-uniform gradient heterogeneity condition under a $\delta$-Byzantine fraction, which can be better than the best known error rate of $O(\delta\zeta_{\mathrm{mean}}^2)$ for local objectives satisfying the $\zeta_{\mathrm{mean}}$-mean heterogeneity condition when $\delta \le (\zeta_{\max}/\zeta_{\mathrm{mean}})^{-2}$. Furthermore, we derive an algorithm-independent lower bound for local objectives satisfying the $\zeta_{\max}$-uniform gradient heterogeneity condition and show the minimax optimality of our proposed method on this class. In numerical experiments, we validate the superiority of our method over existing robust aggregation algorithms and verify our theoretical results.
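To see where the claimed regime comes from, one can compare the two error bounds directly; the condition in the abstract is exactly the point where the proposed rate becomes the smaller one (up to constants):

```latex
% Proposed bound:  O(\delta^2 \zeta_{\max}^2)
% Known bound:     O(\delta \zeta_{\mathrm{mean}}^2)
\[
\delta^2 \zeta_{\max}^2 \;\le\; \delta\,\zeta_{\mathrm{mean}}^2
\iff
\delta \;\le\; \frac{\zeta_{\mathrm{mean}}^2}{\zeta_{\max}^2}
\;=\; \left(\frac{\zeta_{\max}}{\zeta_{\mathrm{mean}}}\right)^{-2},
\]
% i.e., the proposed rate wins in the small-\delta regime,
% since \zeta_{\max} \ge \zeta_{\mathrm{mean}} makes the right-hand side \le 1.
```

The following is a minimal, hypothetical sketch of a momentum-screening-style aggregation step, not the paper's exact procedure: the majority-vote screening rule, the function name `momentum_screening_step`, and the radius parameter `tau` are illustrative assumptions. It only conveys the general idea that honest workers' momentums cluster under uniform gradient heterogeneity, so far-away momentums can be screened out without knowing the Byzantine fraction.

```python
import numpy as np

def momentum_screening_step(momentums: np.ndarray, tau: float) -> np.ndarray:
    """One hypothetical screening/aggregation step (illustrative only).

    momentums: (n, d) array; row i is worker i's reported local momentum.
    tau:       screening radius, a fixed hyperparameter that does not
               depend on the (unknown) Byzantine fraction delta.
    """
    n = momentums.shape[0]
    # Pairwise distances between all reported momentums: (n, n).
    diffs = momentums[:, None, :] - momentums[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    # Keep workers whose momentum is tau-close to a majority of workers;
    # honest momentums cluster, so outliers fail this majority test.
    accepted = (dists <= tau).sum(axis=1) > n / 2
    # Average the surviving momentums as the aggregated update direction.
    return momentums[accepted].mean(axis=0)

# Toy usage: 8 honest workers clustered near 1, plus 2 far-away outliers.
rng = np.random.default_rng(0)
honest = rng.normal(0.0, 0.1, size=(8, 5)) + 1.0
byzantine = rng.normal(10.0, 1.0, size=(2, 5))
agg = momentum_screening_step(np.vstack([honest, byzantine]), tau=1.0)
print(agg)  # close to the honest mean; the outliers are screened out
```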
