Poster in Workshop: Advances in Financial AI: Opportunities, Innovations, and Responsible AI
Bridging Efficiency, Stability, and Fairness: Self-Supervised GNN-to-MLP Knowledge Distillation for Financial Networks
Vipul Kumar Singh · Jyotismita Barman · Sandeep Kumar · Jayadeva Jayadeva
In recent years, financial data has increasingly been modeled as networks, capturing both the individual attributes of financial entities and the complex relationships between them. Graph Neural Networks (GNNs) have become dominant tools for analyzing such data. However, GNNs face two major challenges in critical financial applications: 1) high computational cost at inference time and 2) biased predictions that can disproportionately affect underrepresented groups, undermining the reliability of financial decision-making. To address these issues, we propose a novel self-supervised knowledge distillation framework that transfers knowledge from GNNs to Multi-Layer Perceptrons (MLPs), reducing computational cost while improving fairness, stability, and robustness in financial contexts. Specifically, we introduce feature augmentation that adds random noise and generates counterfactual versions of the input data. Extensive experiments on real-world financial datasets show that our approach surpasses existing GNN-to-MLP distillation methods, achieving the best balance among utility, stability, and fairness.
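The core idea in the abstract — training a cheap MLP student to match a GNN teacher's outputs, with noise-based and counterfactual feature augmentation — can be sketched in a few lines. This is a minimal illustrative sketch, not the authors' implementation: the teacher logits are a random stand-in for precomputed GNN outputs, the sensitive-attribute column, network sizes, and hyperparameters are all assumptions, and the distillation loss is plain squared error.

```python
# Sketch of GNN-to-MLP distillation with noise / counterfactual
# feature augmentation. All names and hyperparameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def distill_step(params, X, teacher_logits, sens_col, lr=0.05, noise=0.1):
    """One distillation step on two augmented views of the features."""
    W1, b1, W2, b2 = params
    # 1) noise augmentation: perturb features with Gaussian noise
    X_noisy = X + noise * rng.standard_normal(X.shape)
    # 2) counterfactual augmentation: flip the binary sensitive attribute
    X_cf = X.copy()
    X_cf[:, sens_col] = 1.0 - X_cf[:, sens_col]
    loss_total = 0.0
    for Xa in (X_noisy, X_cf):
        # student forward pass (one-hidden-layer MLP, ReLU)
        H = np.maximum(Xa @ W1 + b1, 0.0)
        Z = H @ W2 + b2
        # distillation loss: match the frozen GNN teacher's logits
        diff = Z - teacher_logits
        loss_total += np.mean(diff ** 2)
        # manual backprop of the mean-squared distillation loss
        dZ = 2.0 * diff / diff.size
        dW2, db2 = H.T @ dZ, dZ.sum(0)
        dH = dZ @ W2.T
        dH[H <= 0] = 0.0                      # ReLU gradient mask
        dW1, db1 = Xa.T @ dH, dH.sum(0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
    return (W1, b1, W2, b2), loss_total

# Toy data: 32 nodes, 4 features; column 0 plays the sensitive attribute.
X = rng.standard_normal((32, 4))
X[:, 0] = (X[:, 0] > 0).astype(float)
teacher_logits = rng.standard_normal((32, 2))  # stand-in for GNN outputs
params = (0.1 * rng.standard_normal((4, 8)), np.zeros(8),
          0.1 * rng.standard_normal((8, 2)), np.zeros(2))

losses = []
for _ in range(200):
    params, loss = distill_step(params, X, teacher_logits, sens_col=0)
    losses.append(loss)
```

At inference time only the MLP forward pass is needed, which is where the computational saving over message-passing GNNs comes from; the counterfactual view pushes the student toward predictions that are invariant to the sensitive attribute.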