

Poster in Workshop: Workshop on Spurious Correlation and Shortcut Learning: Foundations and Solutions

Exploiting What Trained Models Learn for Making Them Robust to Spurious Correlations without Group Annotations

Mahdi Ghaznavi · Hesam Asadollahzadeh · Fahimeh Hosseini Noohdani · Soroush Vafaie Tabar · Hosein Hasani · Taha Alvanagh · Mohammad Hossein Rohban · Mahdieh Baghshah

Keywords: [ Spurious Correlation ] [ Zero Group Annotation ] [ Distribution Shift ] [ Out-of-Distribution Generalization ] [ Group Robustness ]


Abstract:

Classifiers trained with Empirical Risk Minimization (ERM) often rely on spurious correlations, degrading performance on underrepresented groups and challenging out-of-distribution generalization and fairness. While prior methods aim to address this, many require group annotations for training or validation, limiting their applicability when spurious correlations or group labels are unknown. We demonstrate that what has been learned during ERM training can be utilized to *fully* remove group supervision for both training and model selection. To show this, we design Environment-based Validation and Loss-based Sampling (EVaLS), which uses losses from an ERM-trained model to construct datasets with mitigated group imbalance. EVaLS leverages environment inference to create diverse environments with correlation shifts, enabling model selection without group-annotated validation data. By using worst environment accuracy as a tuning surrogate, EVaLS achieves robust performance across groups through simple last-layer retraining. This fast and effective approach eliminates the need for group annotations, achieving competitive worst-group accuracy and improving robustness to known and unknown spurious correlations.
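The two ingredients the abstract describes can be sketched in a few lines: pairing high-loss examples (likely minority-group) with low-loss ones per class to mitigate group imbalance, and using worst accuracy over inferred environments as a model-selection surrogate. The function names, the per-class top/bottom-k selection, and the environment interface below are illustrative assumptions, not the authors' implementation.

```python
def loss_based_sampling(losses, labels, k):
    """For each class, pair the k highest-loss examples (likely
    minority-group) with the k lowest-loss ones (likely majority-group),
    returning indices of a retraining set with mitigated group imbalance.
    Illustrative sketch, not the paper's exact procedure."""
    selected = []
    for c in set(labels):
        cls_idx = [i for i, y in enumerate(labels) if y == c]
        cls_idx.sort(key=lambda i: losses[i])
        selected += cls_idx[:k] + cls_idx[-k:]  # low-loss + high-loss
    return selected

def worst_environment_accuracy(predict, environments):
    """Model-selection surrogate: the minimum accuracy over inferred
    environments stands in for worst-group accuracy when group
    annotations are unavailable."""
    accs = []
    for xs, ys in environments:
        preds = [predict(x) for x in xs]
        accs.append(sum(p == y for p, y in zip(preds, ys)) / len(ys))
    return min(accs)
```

A last-layer retraining step would then fit only the classifier head on the indices returned by `loss_based_sampling`, and hyperparameters would be chosen to maximize `worst_environment_accuracy`.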
