Causal Subgraphs and Information Bottlenecks: Redefining OOD Robustness in Graph Neural Networks

ECCV 2024

Abstract

Graph Neural Networks (GNNs) are increasingly popular for processing graph-structured data, yet they face significant challenges when training and testing distributions diverge, a situation common in real-world scenarios. This divergence often leads to substantial performance drops in GNN models. To address this, our study introduces a novel approach that effectively enhances GNN performance in Out-of-Distribution (OOD) scenarios. We propose a method, CSIB, guided by causal modeling principles to generate causal subgraphs, while concurrently considering both Fully Informative Invariant Features (FIIF) and Partially Informative Invariant Features (PIIF) situations. Our approach uniquely combines the principles of invariant risk minimization and the graph information bottleneck. This integration not only guides the generation of causal subgraphs but also underscores the necessity of balancing invariance principles with information compression in the face of various distribution shifts. We validate our model through extensive experiments across diverse shift types, demonstrating its effectiveness in maintaining robust performance under OOD conditions.
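The abstract describes combining an invariance principle with information-bottleneck compression over causal subgraphs. Below is a minimal pure-Python sketch of what such a combined objective could look like; it is not the paper's actual CSIB implementation. The function names (`csib_style_objective`, `irm_penalty`, `subgraph_size_penalty`) and weights (`lam_inv`, `lam_ib`, `ratio`) are hypothetical, the invariance term is a REx-style variance-of-risks simplification of invariant risk minimization, and the compression term is a common Bernoulli-prior relaxation of the graph information bottleneck.

```python
import math

def irm_penalty(env_risks):
    # Invariance penalty: variance of per-environment risks (a REx-style
    # simplification of IRM; the paper's exact penalty may differ).
    mean = sum(env_risks) / len(env_risks)
    return sum((r - mean) ** 2 for r in env_risks) / len(env_risks)

def subgraph_size_penalty(edge_probs, ratio):
    # Information-bottleneck surrogate: KL divergence between learned
    # edge-keep probabilities and a target Bernoulli(ratio) prior,
    # encouraging a compressed causal subgraph.
    eps = 1e-12
    kl = 0.0
    for p in edge_probs:
        p = min(max(p, eps), 1.0 - eps)
        kl += p * math.log(p / ratio) + (1 - p) * math.log((1 - p) / (1 - ratio))
    return kl / len(edge_probs)

def csib_style_objective(env_risks, edge_probs,
                         lam_inv=1.0, lam_ib=0.1, ratio=0.5):
    # Combined objective: average task risk across environments,
    # plus the invariance penalty, plus the compression penalty.
    avg_risk = sum(env_risks) / len(env_risks)
    return (avg_risk
            + lam_inv * irm_penalty(env_risks)
            + lam_ib * subgraph_size_penalty(edge_probs, ratio))
```

In practice the per-environment risks and edge probabilities would come from a differentiable GNN subgraph extractor, and the two penalty weights trade off invariance against compression under different kinds of distribution shift.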
