

Poster
in
Workshop: Neural Network Weights as a New Data Modality

Flow to Learn: Flow Matching on Neural Network Parameters

Daniel G. Saragih · Deyu Cao · Tejas Balaji · Ashwin Santhosh

Keywords: [ meta-learning ] [ few-shot learning ] [ conditional flow matching ] [ generative hyper-representation learning ] [ neural network weights generation ]


Abstract:

Foundational language models show a remarkable ability to learn new concepts during inference via context data. However, similar work for images lags behind. To address this challenge, we introduce FLoWN, a flow matching model that learns to generate neural network parameters for different tasks. Our approach models the flow in a latent space while conditioning the process on context data. Experiments verify that FLoWN attains various desiderata for a meta-learning model. In addition, it matches or exceeds baselines on in-distribution tasks, provides better initializations for classifier training, and is performant on out-of-distribution few-shot tasks while having a fine-tuning mechanism to improve performance.
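The core training step of a conditional flow matching model can be sketched as follows. This is a minimal, illustrative sketch assuming the common linear (rectified-flow style) probability path and a squared-error velocity regression target; the function names, the toy dimensions, and the use of the latent vector as a stand-in for encoded network parameters are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def cfm_training_pair(x1, context, rng):
    """Build one conditional flow-matching training example.

    x1: a target latent vector (standing in for encoded network
        parameters); context: a conditioning vector (standing in for
        encoded context/task data). Returns the interpolated point x_t,
    the time t, and the regression target velocity u = x1 - x0 for the
    linear interpolation path x_t = (1 - t) * x0 + t * x1.
    """
    x0 = rng.standard_normal(x1.shape)   # noise sample from the prior
    t = rng.uniform()                    # time drawn uniformly in [0, 1]
    x_t = (1.0 - t) * x0 + t * x1        # point on the linear path
    u = x1 - x0                          # target velocity along the path
    return x_t, t, u, context

# Toy "latent parameters" and "context" to exercise the function.
x1 = rng.standard_normal(4)
ctx = rng.standard_normal(2)
x_t, t, u, _ = cfm_training_pair(x1, ctx, rng)
# A model v_theta(x_t, t, context) would then be trained to minimize
# ||v_theta(x_t, t, context) - u||^2 averaged over such samples.
```

At generation time, the trained velocity field would be integrated from t = 0 (noise) to t = 1 while holding the context fixed, yielding latents that decode to task-specific parameters.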
