

Querying Easily Flip-flopped Samples for Deep Active Learning

Seong Jin Cho · Gwangsu Kim · Junghyun Lee · Jinwoo Shin · Chang Yoo

Halle B #299
Wed 8 May 1:45 a.m. PDT — 3:45 a.m. PDT


Active learning, a paradigm within machine learning, aims to strategically select and query unlabeled data to enhance model performance. A crucial selection strategy leverages the model's predictive uncertainty, which reflects the informativeness of a data point. While a sample's distance to the decision boundary is an intuitive measure of predictive uncertainty, it becomes intractable to compute for the complex decision boundaries formed in multiclass classification tasks. This paper introduces the least disagree metric (LDM), the smallest probability of predicted-label disagreement. We propose an asymptotically consistent estimator for LDM under mild assumptions; the estimator is computationally efficient and straightforward to implement for deep learning models via parameter perturbation. The LDM-based active learning algorithm queries the unlabeled data with the smallest LDM and, as the experimental results demonstrate, achieves state-of-the-art overall performance across various datasets and deep architectures.
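The perturbation-based idea behind the estimator can be sketched on a toy linear model: draw perturbed weights at increasing noise scales, and among perturbed models whose prediction on the query sample flips, record the smallest empirical disagreement with the base model over a reference pool. A sample near the decision boundary flips under hypotheses nearly identical to the current one and so receives a small LDM. Everything below (the linear softmax model, the specific noise scales, and the function names) is an illustrative assumption, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def predict(W, X):
    # Predicted labels of a linear softmax model for the rows of X.
    return np.argmax(X @ W.T, axis=1)

def estimate_ldm(W, x, pool, sigmas=(0.05, 0.1, 0.2, 0.4, 0.8), n_per_sigma=100):
    """Rough LDM estimate for one unlabeled sample x (illustrative sketch).

    For each noise scale sigma, sample perturbed weights W + sigma * N(0, I).
    Whenever a perturbed model flips the predicted label of x, measure its
    empirical disagreement with the base model on the reference pool; the
    LDM estimate is the smallest such disagreement observed.
    """
    base_x = predict(W, x[None])[0]
    base_pool = predict(W, pool)
    ldm = 1.0
    for sigma in sigmas:
        for _ in range(n_per_sigma):
            Wp = W + sigma * rng.standard_normal(W.shape)
            if predict(Wp, x[None])[0] != base_x:
                disagreement = np.mean(predict(Wp, pool) != base_pool)
                ldm = min(ldm, disagreement)
    return ldm

# Toy 3-class linear model; the query rule would label pool points with
# the smallest LDM first.
W = np.eye(3, 4)
pool = rng.standard_normal((500, 4))
x_near = np.array([0.05, 0.04, 0.0, 0.0])  # near the class-0/1 boundary
x_far = np.array([5.0, 0.0, 0.0, 0.0])     # deep inside class 0
ldm_near = estimate_ldm(W, x_near, pool)
ldm_far = estimate_ldm(W, x_far, pool)
print(ldm_near < ldm_far)
```

In this sketch the near-boundary sample flips under tiny perturbations that barely change predictions elsewhere, yielding a smaller LDM than the confidently classified sample, which is the ordering the query rule exploits.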
