Poster

Mode Normalization

Lucas Deecke · Iain Murray · Hakan Bilen

Great Hall BC #50

Keywords: [ deep learning ] [ computer vision ] [ expert models ] [ normalization ]


Abstract:

Normalization methods are a central building block in the deep learning toolbox. They accelerate and stabilize training, while decreasing the dependence on manually tuned learning rate schedules. When learning from multi-modal distributions, the effectiveness of batch normalization (BN), arguably the most prominent normalization method, is reduced. As a remedy, we propose a more flexible approach: by extending the normalization to more than a single mean and variance, we detect modes of data on-the-fly, jointly normalizing samples that share common features. We demonstrate that our method outperforms BN and other widely used normalization techniques in several experiments, including single and multi-task datasets.
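The idea described in the abstract can be sketched as follows: a small gating network softly assigns each sample in the batch to one of K modes, and each sample is then normalized with the weighted mean and variance of the mode(s) it belongs to. This is a minimal NumPy illustration under assumed simplifications (a linear-softmax gate acting directly on the features, no learned scale/shift, no running statistics), not the paper's exact architecture.

```python
import numpy as np

def mode_norm(x, gate_w, gate_b, eps=1e-5):
    """Sketch of mode normalization on a batch x of shape (N, D).

    gate_w (K, D) and gate_b (K,) parameterize an assumed softmax gating
    network that softly assigns each of the N samples to one of K modes.
    Each mode's mean and variance are computed as gate-weighted batch
    statistics, and samples are normalized by a gate-weighted mixture of
    the per-mode normalizations.
    """
    # Soft mode assignments g[n, k] via a numerically stable softmax.
    logits = x @ gate_w.T + gate_b                      # (N, K)
    logits -= logits.max(axis=1, keepdims=True)
    g = np.exp(logits)
    g /= g.sum(axis=1, keepdims=True)

    out = np.zeros_like(x)
    for k in range(g.shape[1]):
        w = g[:, k:k + 1]                               # (N, 1) weights
        n_k = w.sum()                                   # effective mode size
        mu_k = (w * x).sum(axis=0) / n_k                # weighted mean
        var_k = (w * (x - mu_k) ** 2).sum(axis=0) / n_k # weighted variance
        out += w * (x - mu_k) / np.sqrt(var_k + eps)
    return out
```

With K = 1 the gate assigns every sample to the same mode and the sketch reduces to standard batch normalization; with K > 1, samples that the gate groups together are normalized with shared statistics, which is the "jointly normalizing samples that share common features" behavior the abstract describes.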
