Poster in Workshop: Neural Network Weights as a New Data Modality
Integrating Meta-Trained Hypernetworks with GBDTs and Retrieval for Tabular Data
David Bonet · Marçal Comajoan Cara · Alvaro Calafell · Daniel Mas Montserrat · Alexander Ioannidis
Keywords: meta-learning, tabular data, weight space, boosted trees, tabular classification, hypernetworks
Recent progress in deep learning has not fully carried over to tabular data, where gradient-boosted decision trees (GBDTs) still dominate real-world applications. We introduce iLTM, an integrated Large Tabular Model that unifies GBDT embeddings, dimensionality-agnostic representations, meta-trained hypernetworks, strong multilayer perceptrons (MLPs), and retrieval within a single architecture. Leveraging tree-based inductive biases and neural scalability, iLTM is pre-trained on over 1,800 heterogeneous datasets and achieves consistently superior performance on a wide range of classification tasks disjoint from the pre-training data, from small datasets to large and high-dimensional real-world data. Extensive experiments show that iLTM is competitive with GBDTs and state-of-the-art deep tabular models while requiring less task-specific tuning. By bridging the gap between tree-based and neural methods, and by offering a new perspective on the structure of neural weight spaces through meta-trained hypernetworks, iLTM provides a framework for robust, adaptable, and scalable tabular foundation models.
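To make the hypernetwork component concrete, the sketch below illustrates the general idea the abstract invokes: a meta-trained hypernetwork maps a fixed-size task embedding to the flat weight vector of a small per-task MLP. This is a minimal PyTorch illustration, not the authors' implementation; all module names, dimensions, and the choice of a random vector standing in for a GBDT-derived task embedding are assumptions made for the example.

```python
# Minimal sketch (not the iLTM code): a hypernetwork that maps a
# dataset/task embedding to the weights of a small target MLP.
# All names and dimensions are illustrative assumptions.
import torch
import torch.nn as nn

class HyperMLP(nn.Module):
    def __init__(self, task_dim=64, in_dim=32, hidden=16, n_classes=2):
        super().__init__()
        self.in_dim, self.hidden, self.n_classes = in_dim, hidden, n_classes
        # Total parameter count of the target MLP: one hidden layer + head.
        n_params = (in_dim * hidden + hidden) + (hidden * n_classes + n_classes)
        # The hypernetwork itself: task embedding -> flat weight vector.
        self.hyper = nn.Sequential(
            nn.Linear(task_dim, 128), nn.ReLU(), nn.Linear(128, n_params)
        )

    def forward(self, task_emb, x):
        flat = self.hyper(task_emb)  # shape: (n_params,)
        i = 0
        def take(n):
            # Slice the next n entries off the flat weight vector.
            nonlocal i
            out = flat[i:i + n]
            i += n
            return out
        w1 = take(self.in_dim * self.hidden).view(self.hidden, self.in_dim)
        b1 = take(self.hidden)
        w2 = take(self.hidden * self.n_classes).view(self.n_classes, self.hidden)
        b2 = take(self.n_classes)
        # Run the generated MLP on the (embedded) tabular features x.
        h = torch.relu(x @ w1.T + b1)
        return h @ w2.T + b2  # class logits

# Usage: a single task embedding generates the per-task MLP weights.
model = HyperMLP()
task_emb = torch.randn(64)   # stand-in for a pooled dataset embedding
x = torch.randn(8, 32)       # batch of 8 samples with 32 features
logits = model(task_emb, x)  # shape: (8, 2)
```

Because the target weights are an output of the hypernetwork rather than free parameters, meta-training across many datasets shapes the distribution of generated weights, which is what makes the weight space itself an object of study in this setting.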