Learning meta-features for AutoML

Herilalaina Rakotoarison · Louisot Milijaona · Andry RASOANAIVO · Michele Sebag · Marc Schoenauer


Keywords: [ Optimal Transport ] [ Hyper-Parameter Optimization ] [ AutoML ]

[ Abstract ]
Spotlight presentation: Tue 26 Apr, 2:30 a.m. to 4:30 a.m. PDT


This paper tackles the AutoML problem: automatically selecting the ML algorithm and the hyper-parameter configuration most appropriate to the dataset at hand. The proposed approach, MetaBu, learns new meta-features via an Optimal Transport procedure, aligning the manually designed meta-features with the space of distributions on the hyper-parameter configurations. MetaBu meta-features, learned once and for all, induce a topology on the set of datasets that is exploited to define a distribution of promising hyper-parameter configurations amenable to AutoML. Experiments on the OpenML CC-18 benchmark demonstrate that using MetaBu meta-features boosts the performance of state-of-the-art AutoML systems, AutoSklearn (Feurer et al., 2015) and Probabilistic Matrix Factorization (Fusi et al., 2018). Furthermore, inspecting MetaBu meta-features gives some hints about when an ML algorithm performs well. Finally, the topology based on MetaBu meta-features makes it possible to estimate the intrinsic dimensionality of the OpenML benchmark w.r.t. a given ML algorithm or pipeline.
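To give a concrete feel for the Optimal Transport machinery the abstract refers to, here is a minimal, self-contained sketch. It is not the paper's MetaBu algorithm: it only illustrates computing an entropic-regularized transport plan (Sinkhorn iterations) between two small point clouds, standing in for datasets described by hand-crafted meta-features on one side and hyper-parameter configurations on the other. All array names and sizes are hypothetical.

```python
import numpy as np

def sinkhorn(a, b, C, reg=0.1, n_iters=200):
    """Entropic-regularized optimal transport via Sinkhorn iterations.

    a, b : marginal weights of the two distributions (each sums to 1)
    C    : cost matrix between the two point clouds
    Returns a transport plan P whose row sums equal a and whose
    column sums approximate b.
    """
    K = np.exp(-C / reg)               # Gibbs kernel of the cost
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)              # scale columns toward marginal b
        u = a / (K @ v)                # scale rows toward marginal a
    return u[:, None] * K * v[None, :]

rng = np.random.default_rng(0)
# Hypothetical stand-ins: 5 datasets described by 3 meta-features,
# and 7 hyper-parameter configurations embedded in the same space.
meta_feats = rng.normal(size=(5, 3))
hp_configs = rng.normal(size=(7, 3))

# Squared Euclidean cost, normalized for numerical stability.
C = ((meta_feats[:, None, :] - hp_configs[None, :, :]) ** 2).sum(-1)
C = C / C.max()
a = np.full(5, 1 / 5)                  # uniform weight per dataset
b = np.full(7, 1 / 7)                  # uniform weight per configuration

P = sinkhorn(a, b, C)
print(P.shape)                         # (5, 7)
```

A plan like `P` says how much mass each dataset sends to each configuration; MetaBu's actual procedure learns meta-features from such an alignment rather than merely computing one plan.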
