
In-Person Poster presentation / poster accept

Towards Better Selective Classification

Leo Feng · Mohamed Osama Ahmed · Hossein Hajimirsadeghi · Amir Abdi

MH1-2-3-4 #59

Keywords: [ selective classification ] [ semi-supervised learning ] [ Deep Learning and representational learning ]


We tackle the problem of Selective Classification, where the objective is to achieve the best performance on a predetermined ratio (coverage) of the dataset. Recent state-of-the-art selective methods introduce architectural changes, either via a separate selection head or an extra abstention logit. In this paper, we challenge the aforementioned methods. The results suggest that the superior performance of state-of-the-art methods is owed to training a more generalizable classifier rather than to their proposed selection mechanisms. We argue that the best-performing selection mechanism should instead be rooted in the classifier itself. Our proposed selection strategy uses the classification scores and consistently achieves better results by a significant margin across all coverages and all datasets, without any added compute cost. Furthermore, inspired by semi-supervised learning, we propose an entropy-based regularizer that improves the performance of selective classification methods. Our proposed selection mechanism combined with the proposed entropy-based regularizer achieves new state-of-the-art results.
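The two ingredients described above can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: it assumes maximum-softmax-probability as the classifier-based confidence score for selection at a target coverage, and an entropy-minimization term in the spirit of semi-supervised learning as the regularizer. Function names and the exact form of the regularizer are hypothetical.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def select_by_confidence(logits, coverage):
    """Classifier-score selection (assumed max-softmax confidence):
    accept the `coverage` fraction of samples with the highest
    confidence and abstain on the rest. Returns a boolean mask."""
    confidence = softmax(logits).max(axis=-1)
    n_keep = int(np.ceil(coverage * len(confidence)))
    threshold = np.sort(confidence)[::-1][n_keep - 1]
    return confidence >= threshold

def entropy_regularizer(logits):
    """Mean predictive entropy. Adding a weighted multiple of this term
    to the training loss penalizes diffuse predictive distributions
    (entropy minimization); the weight is a hyperparameter."""
    probs = softmax(logits)
    return float(-(probs * np.log(probs + 1e-12)).sum(axis=-1).mean())
```

At a coverage of 0.5, for example, `select_by_confidence` keeps exactly the half of the batch on which the classifier is most confident, so selective accuracy is evaluated only on the accepted samples.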
