Poster in Workshop: Learning Meaningful Representations of Life (LMRL) Workshop @ ICLR 2025
DiffGraphTrans: A Differential Attention-Based Approach for Extracting Meaningful Features of Drug Combinations
Bingzheng Wu · Qi Wang
Predicting synergistic drug combinations is critical for treating complex diseases, yet existing graph-based methods struggle to balance noise suppression and interpretability in molecular representations. Specifically, the heterogeneity of molecular graphs causes Transformer-based models to amplify high-frequency noise while masking low-frequency signals linked to functional groups. To address this, we propose the Differential Graph Transformer (DiffGraphTrans), which integrates a learnable differential filter into multi-head attention. Our model dynamically suppresses irrelevant atomic interactions and amplifies key functional groups. Experiments on lung cancer drug combinations show that DiffGraphTrans outperforms baseline models and significantly improves biochemical interpretability through attention weight analysis. Our framework provides a principled approach to learning noise-robust and biologically meaningful embeddings, advancing interpretable AI for drug discovery.
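A minimal sketch of how such a differential attention head over a molecular graph might look, assuming the learnable differential filter is realized as the difference of two softmax attention maps scaled by a learnable per-head weight; the poster does not specify the exact formulation, and all module and parameter names below are hypothetical.

```python
# Hypothetical sketch of a differential graph attention head (assumption: the
# "learnable differential filter" subtracts a second attention map scaled by a
# learnable per-head weight lambda, so shared noisy attention mass cancels out).
import torch
import torch.nn as nn
import torch.nn.functional as F


class DiffGraphAttention(nn.Module):
    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        # Two query/key projections whose attention maps are differenced.
        self.q1, self.k1 = nn.Linear(dim, dim), nn.Linear(dim, dim)
        self.q2, self.k2 = nn.Linear(dim, dim), nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        self.out = nn.Linear(dim, dim)
        # Learnable per-head differential weight (lambda).
        self.lam = nn.Parameter(torch.full((num_heads, 1, 1), 0.5))

    def forward(self, x: torch.Tensor, adj_mask: torch.Tensor) -> torch.Tensor:
        # x: (N, dim) atom features; adj_mask: (N, N) bool, True where atoms
        # may interact (assumes self-loops so every row has at least one True).
        N, dim = x.shape

        def split(t: torch.Tensor) -> torch.Tensor:
            # (N, dim) -> (num_heads, N, head_dim)
            return t.view(N, self.num_heads, self.head_dim).transpose(0, 1)

        q1, k1 = split(self.q1(x)), split(self.k1(x))
        q2, k2 = split(self.q2(x)), split(self.k2(x))
        v = split(self.v(x))
        scale = self.head_dim ** -0.5
        # Restrict attention to graph edges.
        bias = torch.zeros_like(adj_mask, dtype=x.dtype)
        bias = bias.masked_fill(~adj_mask, float("-inf"))
        a1 = F.softmax(q1 @ k1.transpose(-2, -1) * scale + bias, dim=-1)
        a2 = F.softmax(q2 @ k2.transpose(-2, -1) * scale + bias, dim=-1)
        attn = a1 - self.lam * a2  # differential filter: cancel shared noise
        out = (attn @ v).transpose(0, 1).reshape(N, dim)
        return self.out(out)
```

In this reading, atom pairs that receive similar weight in both attention maps (background interactions) are attenuated, while pairs emphasized by only the first map (e.g., within a functional group) are retained, which is one plausible way to obtain the noise suppression and interpretability the abstract describes.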