Poster

Equi-normalization of Neural Networks

Pierre Stock · Benjamin Graham · Rémi Gribonval · Hervé Jégou

Great Hall BC #24

Keywords: [ normalization ] [ regularization ] [ convolutional neural networks ] [ Sinkhorn ]


Abstract:

Modern neural networks are over-parametrized. In particular, each rectified linear hidden unit can be modified by a multiplicative factor by adjusting input and output weights, without changing the rest of the network. Inspired by the Sinkhorn-Knopp algorithm, we introduce a fast iterative method for minimizing the ℓ2 norm of the weights, equivalently the weight decay regularizer. It provably converges to a unique solution. Interleaving our algorithm with SGD during training improves the test accuracy. For small batches, our approach offers an alternative to batch- and group-normalization on CIFAR-10 and ImageNet with a ResNet-18.
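
The invariance the abstract exploits is easy to check numerically: because ReLU is positively homogeneous, scaling a hidden unit's input weights by d > 0 and its output weights by 1/d leaves the network function unchanged, so the per-unit scale can be chosen to minimize the ℓ2 penalty. The sketch below is an illustrative NumPy toy for a one-hidden-layer network, not the authors' released code; the helper name `enorm_step` is ours.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def enorm_step(W1, W2):
    # For hidden unit i, scale row i of W1 by d_i and column i of W2 by
    # 1/d_i. Since relu(d * z) = d * relu(z) for d > 0, the network
    # function is unchanged. Choosing d_i = sqrt(||W2[:, i]|| / ||W1[i, :]||)
    # minimizes d_i^2 * ||W1[i, :]||^2 + ||W2[:, i]||^2 / d_i^2, i.e. the
    # unit's contribution to the weight decay regularizer.
    row = np.linalg.norm(W1, axis=1)  # input-weight norm per hidden unit
    col = np.linalg.norm(W2, axis=0)  # output-weight norm per hidden unit
    d = np.sqrt(col / row)
    return W1 * d[:, None], W2 / d[None, :]

rng = np.random.default_rng(0)
W1 = rng.standard_normal((64, 32))   # hidden x input
W2 = rng.standard_normal((10, 64))   # output x hidden
x = rng.standard_normal(32)

y_before = W2 @ relu(W1 @ x)
norm_before = (W1 ** 2).sum() + (W2 ** 2).sum()

W1, W2 = enorm_step(W1, W2)

y_after = W2 @ relu(W1 @ x)
norm_after = (W1 ** 2).sum() + (W2 ** 2).sum()

assert np.allclose(y_before, y_after)  # same function computed
assert norm_after <= norm_before       # smaller weight decay penalty
```

In this two-layer toy the units decouple, so a single sweep reaches the optimum; with more hidden layers the per-layer rescalings interact, which is why the paper iterates such sweeps in a Sinkhorn-Knopp fashion until convergence.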
