In-Person Oral presentation / top 5% paper

SAM as an Optimal Relaxation of Bayes

Thomas Möllenhoff · Mohammad Emtiyaz Khan

Oral 4 Track 2: Probabilistic Methods · AD1
Tue 2 May 6:10 a.m. – 6:20 a.m. PDT

Sharpness-aware minimization (SAM) and related adversarial deep-learning methods can drastically improve generalization, but their underlying mechanisms are not yet fully understood. Here, we establish SAM as a relaxation of the Bayes objective where the expected negative-loss is replaced by the optimal convex lower bound, obtained by using the so-called Fenchel biconjugate. The connection enables a new Adam-like extension of SAM to automatically obtain reasonable uncertainty estimates, while sometimes also improving its accuracy. By connecting adversarial and Bayesian methods, our work opens a new path to robustness.
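As a rough illustration of the adversarial objective the abstract refers to, the sketch below implements the standard SAM update (Foret et al., 2021) on a toy quadratic loss. It is not the Adam-like Bayesian extension proposed in this paper; the loss function, learning rate, and perturbation radius rho are illustrative assumptions.

import numpy as np

def toy_loss(w):
    # Hypothetical stand-in for a training loss (illustrative assumption).
    return 0.5 * np.sum((w - 1.0) ** 2)

def toy_grad(w):
    # Gradient of the toy loss above.
    return w - 1.0

def sam_step(w, lr=0.1, rho=0.05):
    # One standard SAM step: ascend to a worst-case nearby point, then descend.
    g = toy_grad(w)                               # gradient at the current weights
    eps = rho * g / (np.linalg.norm(g) + 1e-12)   # adversarial perturbation of norm rho
    g_adv = toy_grad(w + eps)                     # gradient at the perturbed weights
    return w - lr * g_adv                         # gradient-descent update with the perturbed gradient

w = np.zeros(3)
for _ in range(200):
    w = sam_step(w)
print(toy_loss(w))  # approaches 0 as w approaches the minimizer [1, 1, 1]

The inner perturbation step is the adversarial component that, according to the abstract, arises when the expected negative loss in the Bayes objective is replaced by its Fenchel-biconjugate lower bound; the precise statement and the uncertainty-aware extension are given in the paper.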
