Virtual presentation / poster accept

Mind the Pool: Convolutional Neural Networks Can Overfit Input Size

Bilal Alsallakh · David Yan · Narine Kokhlikyan · Vivek Miglani · Orion Reblitz-Richardson · Pamela Bhattacharya

MH1-2-3-4 #168

Keywords: [ Deep Learning and Representational Learning ] [ Overfitting ] [ Input Size ] [ Pooling ] [ Convolutional Neural Networks ]


Abstract:

We demonstrate how convolutional neural networks can overfit the input size: accuracy drops significantly at certain sizes compared with favorable ones. This issue is inherent to pooling arithmetic, with standard downsampling layers playing a major role in favoring certain input sizes and skewing the weights accordingly. We present a solution to this problem by depriving these layers of the arithmetic cues they use to overfit the input size. Through various examples, we show how our proposed spatially-balanced pooling improves the generalization of the network to arbitrary input sizes and its robustness to translational shifts.
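The pooling arithmetic in question can be seen in the standard output-size formula for a pooling layer. A minimal sketch (not the authors' code, and `pool_out` is a hypothetical helper) showing that a stride-2 pool with no padding covers an even-sized input exactly but leaves a leftover border pixel on an odd-sized input, which the layer silently discards:

```python
def pool_out(n, k=2, s=2, p=0):
    # Standard pooling arithmetic: floor((n + 2p - k) / s) + 1
    return (n + 2 * p - k) // s + 1

def leftover(n, k=2, s=2, p=0):
    # Input positions not covered by any pooling window (for p = 0)
    out = pool_out(n, k, s, p)
    return n + 2 * p - ((out - 1) * s + k)

for n in (64, 65):
    print(n, "->", pool_out(n), "leftover:", leftover(n))
# 64 -> 32 leftover: 0
# 65 -> 32 leftover: 1
```

Both input sizes map to the same output size, but the odd-sized input loses its last row and column, an asymmetry that accumulates across the downsampling stack and gives the network a cue about the input size.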
