

Poster
in
Workshop on Spurious Correlation and Shortcut Learning: Foundations and Solutions

Structured Robustness for Distribution Shifts

Erfan Darzi · Alexander Marx

Keywords: [ Mirror Descent ] [ Distributionally Robust Optimization ] [ Transformation-Invariance ]


Abstract:

Out-of-distribution (OOD) data often undermines reliable model deployment in high-stakes domains such as financial markets, where overlooked correlations and unexpected shifts can render predictive systems ineffective. We propose STAR (Structured Transformations and Adversarial Reweighting), a framework that leverages the geometry of distribution shifts by combining transformation-based invariances with divergence-based robust optimization. Specifically, STAR places an f-divergence ball around each label-preserving transformation of the training sample, empowering an adversary to apply known transformations and reweight the resulting data within a specified divergence radius. This design captures both large, structured shifts and subtle, unmodeled perturbations, a critical step toward mitigating shortcuts and spurious correlations. Notably, STAR recovers standard distributionally robust optimization if no structured transformations are assumed. We establish a uniform-convergence analysis showing that minimizing STAR's empirical nested min–max objective achieves low worst-case error over all admissible shifts with high probability. Our results quantify the additional samples needed to handle the adversary's flexibility, providing theoretical guidance for selecting the divergence radius based on problem complexity. Empirical studies on synthetic and image benchmarks confirm that STAR outperforms baselines, consistent with our theoretical findings.
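To make the nested min–max structure concrete, the sketch below evaluates a STAR-style objective for one fixed model: for each label-preserving transformation, the per-sample losses are reweighted adversarially within a divergence ball, and the worst transformation is taken. This is an illustration only, not the paper's implementation: the abstract does not specify which f-divergence is used, so the sketch picks the KL divergence (via its standard dual, `sup_{KL(Q||P) <= rho} E_Q[loss] = min_{tau > 0} tau * log E_P[exp(loss / tau)] + tau * rho`, solved here on a coarse grid); the function names, the grid, and the interface are all assumptions.

```python
import math


def kl_dro_objective(losses, rho, taus=None):
    """Worst-case reweighted loss within a KL ball of radius rho around the
    empirical distribution of `losses`, computed through the dual form
    min_{tau>0} tau * log E[exp(loss/tau)] + tau * rho.
    Hypothetical helper: STAR may use a different f-divergence and solver."""
    if taus is None:
        # coarse log-spaced grid for the dual variable tau, from 1e-2 to 1e2
        taus = [10 ** (k / 25 - 2) for k in range(101)]
    best = float("inf")
    for tau in taus:
        # numerically stable log-mean-exp of losses / tau
        m = max(l / tau for l in losses)
        lme = m + math.log(sum(math.exp(l / tau - m) for l in losses) / len(losses))
        best = min(best, tau * lme + tau * rho)
    return best


def star_objective(losses_by_transform, rho):
    """Inner max of a STAR-style nested objective (sketch): the adversary picks
    a known transformation t, then reweights the losses on t(training sample)
    inside the divergence ball; the outer training loop (not shown) would
    minimize this value over model parameters."""
    return max(kl_dro_objective(losses, rho) for losses in losses_by_transform)
```

With `rho = 0` the inner reweighting collapses toward the empirical mean loss, and with a single identity transformation the objective reduces to plain divergence-ball DRO, mirroring the abstract's remark that STAR recovers standard distributionally robust optimization when no structured transformations are assumed.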
