Poster in Workshop: Bridging the Gap Between Practice and Theory in Deep Learning

Conformal Prediction Sets Improve Human Decision Making

Jesse Cresswell · Yi Sui · Bhargava Kumar · Noël Vouitsis


Abstract:

Theoretical work on conformal prediction shows that prediction sets can satisfy useful risk control guarantees. The same works, however, often neglect the fact that prediction sets are not directly actionable. Most automated decision pipelines require a single option to act on, not a set, which makes conformal prediction theoretically interesting but of limited practical use. In this work, we bridge the gap between conformal prediction and actionable decisions by treating humans as the mechanism for converting prediction sets into decisions. We conduct a pre-registered randomized controlled trial in which human subjects are shown conformal prediction sets, and we find, with statistical significance, that their task accuracy improves relative to fixed-size prediction sets with the same coverage guarantee. These results show that quantifying model uncertainty with conformal prediction is helpful for human-in-the-loop decision making and human-AI teams.
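For context on how the prediction sets discussed in the abstract are constructed, the following is a minimal sketch of the standard split conformal procedure for classification, not the authors' implementation. It assumes a held-out calibration set with softmax scores; the function and variable names are illustrative, and a recent NumPy is assumed for the method="higher" quantile option.

import numpy as np

def conformal_prediction_sets(cal_probs, cal_labels, test_probs, alpha=0.1):
    """Split conformal prediction for classification (illustrative sketch).

    cal_probs:  (n, K) softmax probabilities on a held-out calibration set
    cal_labels: (n,)   true labels for the calibration set
    test_probs: (m, K) softmax probabilities on test inputs
    alpha:      target miscoverage rate (e.g. 0.1 for 90% coverage)
    """
    n = len(cal_labels)
    # Nonconformity score: one minus the probability assigned to the true class.
    cal_scores = 1.0 - cal_probs[np.arange(n), cal_labels]
    # Conformal quantile with the finite-sample correction ceil((n+1)(1-alpha))/n.
    q_level = np.ceil((n + 1) * (1 - alpha)) / n
    q_hat = np.quantile(cal_scores, min(q_level, 1.0), method="higher")
    # Include every class whose nonconformity score is within the threshold;
    # the returned (m, K) boolean matrix marks set membership per test input.
    return 1.0 - test_probs <= q_hat

By construction, the sets produced this way contain the true label with probability at least 1 - alpha on average, which is the coverage guarantee the abstract compares against fixed-size baselines. Unlike fixed-size sets, their size adapts to the model's uncertainty on each input.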
