Poster

Leave-One-Out Stable Conformal Prediction

Kiljae Lee · Yuan Zhang

Hall 3 + Hall 2B #456
Thu 24 Apr 7 p.m. PDT — 9:30 p.m. PDT

Abstract:

Conformal prediction (CP) is an important tool for distribution-free predictive uncertainty quantification. A major challenge is balancing computational efficiency and prediction accuracy, particularly when handling multiple prediction requests. We propose Leave-One-Out Stable Conformal Prediction (LOO-StabCP), a novel method that speeds up full conformal prediction using algorithmic stability, without sample splitting. By leveraging leave-one-out stability, our method handles a large number of prediction requests much faster than the existing method RO-StabCP, which is based on replace-one stability. We derive stability bounds for several popular machine learning tools: regularized loss minimization (RLM) and stochastic gradient descent (SGD), as well as kernel methods, neural networks, and bagging. Our method is theoretically justified and demonstrates superior numerical performance on synthetic and real-world data. Applied to a screening problem, its effective exploitation of the training data yields improved test power compared to a state-of-the-art method based on split conformal prediction.
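For readers unfamiliar with the baseline the abstract contrasts against, below is a minimal sketch of the standard split conformal procedure (the sample-splitting approach LOO-StabCP avoids). This is not the paper's method; the function name, residual-based score, and quantile rule are generic textbook choices written here for illustration.

```python
import numpy as np

def split_conformal_interval(resid_cal, y_pred_test, alpha=0.1):
    """Split conformal prediction interval (generic baseline sketch).

    resid_cal   : absolute residuals |y_i - yhat_i| on a held-out calibration set
    y_pred_test : point prediction(s) for the test input(s)
    alpha       : target miscoverage level (e.g. 0.1 for 90% coverage)
    """
    n = len(resid_cal)
    # Finite-sample-corrected conformal quantile of the calibration scores.
    level = np.ceil((n + 1) * (1 - alpha)) / n
    q = np.quantile(resid_cal, level, method="higher")
    # Symmetric interval around the point prediction.
    return y_pred_test - q, y_pred_test + q
```

The interval is marginally valid at level 1 − alpha under exchangeability, but the model is fit on only part of the data; full conformal (and stability-based accelerations of it, like the one proposed here) instead reuses all training data at extra computational cost.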
