Simple and Efficient Heterogeneous Temporal Graph Neural Network

Citations: 0
Rank: #2028 of 5858 papers in NeurIPS 2025
Authors: 5
Data Points: 3

Abstract

Heterogeneous temporal graphs (HTGs) are ubiquitous data structures in the real world. Recently, numerous attention-based neural networks have been proposed to enhance representation learning on HTGs. Despite these successes, existing methods rely on a decoupled temporal and spatial learning paradigm, which weakens the interaction between spatial and temporal information and leads to high model complexity. To bridge this gap, we propose a novel learning paradigm for HTGs called the Simple and Efficient Heterogeneous Temporal Graph Neural Network (SE-HTGNN). Specifically, we integrate temporal modeling into spatial learning via a novel dynamic attention mechanism, which substantially reduces model complexity while enhancing discriminative representation learning on HTGs. Additionally, to understand HTGs comprehensively and adaptively, we leverage large language models to prompt SE-HTGNN, enabling the model to capture the implicit properties of node types as prior knowledge. Extensive experiments demonstrate that SE-HTGNN achieves up to a 10× speed-up over the latest state-of-the-art baselines while maintaining the best forecasting accuracy.
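To make the idea of folding temporal modeling into spatial attention concrete, below is a minimal, illustrative sketch in PyTorch: a toy single-head attention layer whose scores depend on the time gaps to each neighbor, so temporal and spatial signals are learned jointly rather than in separate modules. All names, shapes, and the sinusoidal time encoding are assumptions for illustration; this is not the SE-HTGNN architecture described in the paper.

```python
# Illustrative sketch only: attention whose scores are modulated by a time
# encoding, so temporal information enters the spatial aggregation step.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TimeAwareAttention(nn.Module):
    """Single-head attention over a node's neighbors with time-dependent keys (toy example)."""

    def __init__(self, dim: int, time_dim: int = 16):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        # Learnable frequencies for a simple sinusoidal time encoding (assumed design).
        self.freq = nn.Parameter(torch.randn(time_dim))
        self.time_proj = nn.Linear(time_dim, dim)

    def forward(self, center: torch.Tensor, neighbors: torch.Tensor,
                delta_t: torch.Tensor) -> torch.Tensor:
        # center: (dim,), neighbors: (n, dim), delta_t: (n,) time gaps to each neighbor.
        t_enc = torch.cos(delta_t.unsqueeze(-1) * self.freq)      # (n, time_dim)
        keys = self.k(neighbors) + self.time_proj(t_enc)          # temporal signal folded into keys
        scores = (self.q(center) * keys).sum(-1) / keys.size(-1) ** 0.5
        attn = F.softmax(scores, dim=0)                           # attention over the n neighbors
        return attn @ self.v(neighbors)                           # aggregated message, shape (dim,)


if __name__ == "__main__":
    layer = TimeAwareAttention(dim=32)
    out = layer(torch.randn(32), torch.randn(5, 32), torch.rand(5))
    print(out.shape)  # torch.Size([32])
```

Because the time encoding enters the keys directly, a single attention pass accounts for both neighbor features and recency, which is the kind of coupling a decoupled temporal-then-spatial pipeline would require an extra module to achieve.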

Citation History

Jan 26, 2026: 0 citations
Jan 27, 2026: 0 citations