

In-Person Poster presentation / top 25% paper

TabPFN: A Transformer That Solves Small Tabular Classification Problems in a Second

Noah Hollmann · Samuel Müller · Katharina Eggensperger · Frank Hutter

MH1-2-3-4 #25

Keywords: [ Deep Learning and Representation Learning ] [ Bayesian Prediction ] [ Causal Reasoning ] [ Tabular Data ] [ Real-Time Machine Learning ] [ AutoML ] [ Green AI ]


Abstract: We present TabPFN, a trained Transformer that performs supervised classification for small tabular datasets in less than a second, needs no hyperparameter tuning, and is competitive with state-of-the-art classification methods. TabPFN is entirely contained in the weights of our network, which accepts training and test samples as a set-valued input and yields predictions for the entire test set in a single forward pass. TabPFN is a Prior-Data Fitted Network (PFN), trained offline once to approximate Bayesian inference on synthetic datasets drawn from our prior. This prior incorporates ideas from causal reasoning: it entails a large space of structural causal models with a preference for simple structures. On the $18$ datasets in the OpenML-CC18 suite that contain up to 1000 training data points, up to 100 purely numerical features without missing values, and up to 10 classes, we show that our method clearly outperforms boosted trees and performs on par with complex state-of-the-art AutoML systems with up to a $230\times$ speedup. This increases to a $5\,700\times$ speedup when using a GPU. We also validate these results on an additional 67 small numerical datasets from OpenML. We provide all our code, the trained TabPFN, an interactive browser demo, and a Colab notebook at https://github.com/automl/TabPFN.
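
Since the abstract describes a fit-then-predict workflow in which the pretrained network classifies the whole test set in one forward pass with no hyperparameter tuning, here is a minimal usage sketch assuming the scikit-learn-compatible `TabPFNClassifier` interface from the linked repository; the dataset choice and exact parameter names are illustrative assumptions, not from the paper.

```python
# Minimal sketch (not from the paper): assumes the scikit-learn-style
# TabPFNClassifier exposed by the repository at github.com/automl/TabPFN.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from tabpfn import TabPFNClassifier

# A small, purely numerical dataset, matching the regime the paper targets.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# No hyperparameter tuning: the offline-pretrained Transformer is used as-is.
clf = TabPFNClassifier(device="cpu")
clf.fit(X_train, y_train)      # no gradient steps; the training set becomes input
y_pred = clf.predict(X_test)   # predictions for the entire test set at once
print(accuracy_score(y_test, y_pred))
```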
