

Poster
in
Workshop: Neural Network Weights as a New Data Modality

Mimetic Initialization of MLPs

Asher Trockman · Zico Kolter

Keywords: [ weight analysis ] [ multilayer perceptrons ] [ mimetic initialization ] [ weight space ] [ mlps ] [ convnext ] [ initialization ]


Abstract:

Mimetic initialization uses pre-trained models as case studies of good initialization, drawing on observed structure in trained weights to inspire new, simple initialization techniques. So far, it has been applied only to spatial mixing layers, such as convolutional, self-attention, and state space layers. In this work, we present the first attempt to apply the method to channel mixing layers, namely multilayer perceptrons (MLPs). Our extremely simple technique for MLPs---giving the first layer a nonzero mean---speeds up training on small-scale vision tasks like CIFAR-10 and ImageNet-1k. Though its effect is much smaller than that of spatial mixing initializations, it can be used in conjunction with them for an additional positive effect.
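The core idea from the abstract---drawing the first MLP layer's weights from a distribution with a nonzero mean---can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the function name `nonzero_mean_init`, the choice of mean value, and the use of He-style scaling for the standard deviation are all assumptions, since the abstract does not specify the exact constants.

```python
import numpy as np

def nonzero_mean_init(fan_in, fan_out, mean=0.05, seed=None):
    """Sketch of a nonzero-mean initialization for a first MLP layer.

    Standard initializations (He, Glorot) are zero-mean; the only change
    here is shifting the sampling distribution by `mean`. The value 0.05
    is a hypothetical hyperparameter, not taken from the paper.
    """
    rng = np.random.default_rng(seed)
    std = np.sqrt(2.0 / fan_in)  # He-style scale, assumed for a ReLU MLP
    return rng.normal(loc=mean, scale=std, size=(fan_out, fan_in))

# Example: first layer mapping flattened 32x32x3 CIFAR-10 images to 512 units.
W1 = nonzero_mean_init(fan_in=3072, fan_out=512, seed=0)
```

Later layers would presumably keep a conventional zero-mean initialization; only the first (input-facing) layer is shifted, per the abstract.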
