

In-Person Oral presentation / top 5% paper

SAM as an Optimal Relaxation of Bayes

Thomas Möllenhoff · Mohammad Emtiyaz Khan

AD1

Abstract:

Sharpness-aware minimization (SAM) and related adversarial deep-learning methods can drastically improve generalization, but their underlying mechanisms are not yet fully understood. Here, we establish SAM as a relaxation of the Bayes objective where the expected negative loss is replaced by the optimal convex lower bound, obtained by using the so-called Fenchel biconjugate. The connection enables a new Adam-like extension of SAM to automatically obtain reasonable uncertainty estimates, while sometimes also improving its accuracy. By connecting adversarial and Bayesian methods, our work opens a new path to robustness.
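For context, here is a brief sketch of the two objectives the abstract relates, written in standard notation for SAM (Foret et al., 2021) and for variational Bayes; the symbols (loss \(\ell\), mean parameter \(m\), perturbation radius \(\rho\), candidate posterior \(q\), prior \(p\)) are illustrative, and the precise correspondence, including how the radius relates to the posterior scale, is established in the paper rather than here.

\[
\text{SAM:}\quad \min_{m}\ \max_{\|\epsilon\|_2 \le \rho} \ell(m + \epsilon)
\qquad\qquad
\text{Bayes (variational):}\quad \max_{q}\ \mathbb{E}_{\theta \sim q}\!\left[-\ell(\theta)\right] \;-\; \mathrm{KL}(q \,\|\, p)
\]

Per the abstract, the relaxation replaces the expected negative-loss term \(\mathbb{E}_{\theta \sim q}[-\ell(\theta)]\) with its Fenchel biconjugate, its tightest convex lower bound, and it is this replacement that turns the Bayes objective into the SAM objective.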
