

Oral
in
Affinity Workshop: Tiny Papers Oral Session 1

Towards Fairness Constrained Restless Multi-Armed Bandits: A Case Study of Maternal and Child Care Domain

Gargi Singh · Milind Tambe · Aparna Taneja


Abstract:

Restless multi-armed bandits (RMABs) are widely used for resource allocation in dynamic environments, but they typically do not account for fairness. This paper introduces a fairness-aware approach for offline RMABs. We propose a Kullback-Leibler (KL) divergence-based fairness metric that quantifies the discrepancy between the distribution of the selected population and that of the overall population, and incorporate it as a regularizer into the soft Whittle index optimization. We evaluate our fairness-aware algorithm on a real-world RMAB dataset, where initial results suggest that our approach can improve fairness while preserving solution quality.
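The abstract's core idea, a KL-divergence penalty steering arm selection toward the population's group distribution, can be illustrated with a minimal sketch. This is not the authors' implementation: the function names, the greedy budgeted selection, and the use of precomputed per-arm index values (standing in for the soft Whittle indices) are all assumptions for illustration.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    # KL(p || q) between two discrete distributions over groups.
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

def fairness_regularized_selection(indices, groups, pop_dist, budget, lam=1.0):
    """Greedily pick `budget` arms, trading off each arm's (assumed
    precomputed) index value against a KL penalty between the group
    distribution of the selected arms and the overall population.

    indices  : per-arm index values (hypothetical stand-in for soft Whittle indices)
    groups   : group label of each arm (0 .. n_groups-1)
    pop_dist : group distribution of the overall population
    lam      : regularization weight on the fairness penalty
    """
    selected = []
    remaining = set(range(len(indices)))
    counts = np.zeros(len(pop_dist))  # group counts among selected arms
    for _ in range(budget):
        best, best_score = None, -np.inf
        for i in remaining:
            trial = counts.copy()
            trial[groups[i]] += 1  # group counts if arm i were added
            score = indices[i] - lam * kl_divergence(trial, pop_dist)
            if score > best_score:
                best, best_score = i, score
        selected.append(best)
        counts[groups[best]] += 1
        remaining.remove(best)
    return selected
```

With a large `lam` and equal index values, the selection mirrors the population's group proportions; with `lam=0` it reduces to picking the top-`budget` arms by index, which is the unfair baseline the paper regularizes.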
