

Poster

Wasserstein-Regularized Conformal Prediction under General Distribution Shift

Rui Xu · Chao Chen · Yue Sun · Parvathinathan Venkitasubramaniam · Sihong Xie

Hall 3 + Hall 2B #432
[ Project Page ]
Wed 23 Apr 7 p.m. PDT — 9:30 p.m. PDT

Abstract: Conformal prediction yields a prediction set with guaranteed 1−α coverage of the true target under the i.i.d. assumption, which can fail in practice and lead to a gap between 1−α and the actual coverage. Prior studies bound this gap using the total variation distance, which cannot capture how the gap changes with α under distribution shift and thus serves as a weak indicator of prediction-set validity. Moreover, existing methods are mostly limited to covariate shift, while general joint distribution shifts are more common in practice but less studied. In response, we first propose a Wasserstein-distance-based upper bound on the coverage gap and analyze the bound using probability measure pushforwards between the shifted joint data and conformal score distributions, enabling a separation of the effects of covariate and concept shifts on the coverage gap. We exploit this separation to design algorithms based on importance weighting and regularized representation learning (WR-CP) that reduce the Wasserstein bound with a finite-sample error bound. WR-CP achieves a controllable balance between conformal prediction accuracy and efficiency. Experiments on six datasets show that WR-CP reduces coverage gaps to 3.2% across different confidence levels and outputs prediction sets 38% smaller on average than the worst-case approach.
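For readers unfamiliar with the baseline the abstract builds on, the following is a minimal sketch of split conformal prediction for regression under the i.i.d. assumption, using absolute residuals as the conformal score. The toy data, the stand-in predictor `f`, and all variable names are illustrative assumptions, not the paper's WR-CP method; under distribution shift between calibration and test data, the empirical coverage of this procedure can deviate from 1−α, which is the gap the paper bounds and reduces.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy i.i.d. data: y = 2x + noise (illustrative assumption).
n_cal, n_test = 500, 200
x_cal = rng.uniform(0, 1, n_cal)
y_cal = 2 * x_cal + rng.normal(0, 0.1, n_cal)
x_test = rng.uniform(0, 1, n_test)
y_test = 2 * x_test + rng.normal(0, 0.1, n_test)

def f(x):
    # Stand-in pretrained point predictor (hypothetical).
    return 2 * x

alpha = 0.1

# Conformal scores on the calibration set: absolute residuals.
scores = np.abs(y_cal - f(x_cal))

# (1 - alpha)-quantile with the standard finite-sample correction.
q = np.quantile(scores, np.ceil((n_cal + 1) * (1 - alpha)) / n_cal)

# Prediction set for each test point: [f(x) - q, f(x) + q].
covered = np.abs(y_test - f(x_test)) <= q
print(f"empirical coverage: {covered.mean():.3f} (target {1 - alpha})")
```

When calibration and test samples are exchangeable, the empirical coverage concentrates near 1−α; injecting a covariate or concept shift into the test data breaks this guarantee, motivating the importance weighting and Wasserstein regularization described above.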
