Multi-order Orchestrated Curriculum Distillation for Model-Heterogeneous Federated Graph Learning

Abstract

Federated Graph Learning (FGL) has been shown to be particularly effective in enabling collaborative training of Graph Neural Networks (GNNs) in decentralized settings. Model-heterogeneous FGL further enhances practical applicability by accommodating client preferences for diverse model architectures. However, existing model-heterogeneous approaches primarily target Euclidean data and fail to account for a crucial aspect of graph-structured data: topological relationships. To address this limitation, we propose **TRUST**, a novel knowledge distillation-based **model-heterogeneous FGL** framework. Specifically, a Progressive Curriculum Node Scheduler progressively introduces challenging nodes based on their learning difficulty, while an Adaptive Curriculum Distillation Modulator dynamically adjusts the knowledge distillation temperature to accommodate varying client capabilities and graph complexity. Moreover, we leverage Wasserstein-Driven Affinity Distillation to enable models to capture cross-class structural relationships through optimal transport. Extensive experiments on multiple graph benchmarks and model-heterogeneous settings show that **TRUST** outperforms existing methods, achieving an average performance gain of 3.6%, particularly under moderate heterogeneity conditions. The code is available for anonymous access at https://anonymous.4open.science/r/TRUST-NeurIPS2025.
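To make the three components concrete, below is a minimal, self-contained sketch, not the authors' implementation. The pacing rule, the temperature modulation formula, the cosine-based class affinity, the (1 - teacher affinity) ground cost, and all function names (`adaptive_temperature`, `curriculum_mask`, `class_affinity`, `affinity_distillation`) are illustrative assumptions; entropic Sinkhorn transport stands in for the Wasserstein-driven term described in the abstract.

```python
import torch
import torch.nn.functional as F


def adaptive_temperature(base_t, client_capacity, graph_complexity):
    """Hypothetical modulation rule: weaker clients and harder graphs get a
    softer (higher) temperature so the teacher distribution is easier to match."""
    return base_t * (1.0 + graph_complexity) / max(client_capacity, 1e-6)


def curriculum_mask(difficulty, round_idx, total_rounds, start_frac=0.3):
    """Progressively admit harder nodes: keep the easiest fraction of nodes,
    growing linearly with the communication round (a hypothetical pacing rule)."""
    frac = min(1.0, start_frac + (1.0 - start_frac) * round_idx / max(total_rounds - 1, 1))
    k = max(1, int(frac * difficulty.numel()))
    threshold = difficulty.kthvalue(k).values
    return difficulty <= threshold


def kd_loss(student_logits, teacher_logits, t):
    """Standard temperature-scaled KL distillation loss."""
    p_teacher = F.softmax(teacher_logits / t, dim=-1)
    log_p_student = F.log_softmax(student_logits / t, dim=-1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (t * t)


def class_affinity(logits, labels, num_classes):
    """Cross-class relationship matrix: cosine similarity between per-class
    mean predictions (assumes every class appears in the batch)."""
    probs = F.softmax(logits, dim=-1)
    protos = torch.stack([probs[labels == c].mean(dim=0) for c in range(num_classes)])
    protos = F.normalize(protos, dim=-1)
    return protos @ protos.T


def sinkhorn_cost(a, b, cost, eps=0.1, iters=100):
    """Entropy-regularised optimal transport cost between histograms a and b."""
    ker = torch.exp(-cost / eps)
    u = torch.ones_like(a)
    for _ in range(iters):
        v = b / (ker.T @ u + 1e-12)
        u = a / (ker @ v + 1e-12)
    plan = u.unsqueeze(1) * ker * v.unsqueeze(0)
    return (plan * cost).sum()


def affinity_distillation(student_logits, teacher_logits, labels, num_classes):
    """Transport each teacher affinity row onto the student's, using
    (1 - teacher affinity) as a hypothetical ground cost between classes."""
    aff_s = class_affinity(student_logits, labels, num_classes)
    aff_t = class_affinity(teacher_logits, labels, num_classes)
    cost = 1.0 - aff_t.detach()
    loss = 0.0
    for c in range(num_classes):
        loss = loss + sinkhorn_cost(F.softmax(aff_s[c], dim=0),
                                    F.softmax(aff_t[c], dim=0), cost)
    return loss / num_classes


# Toy usage on random node logits.
torch.manual_seed(0)
num_nodes, num_classes = 64, 4
labels = torch.randint(0, num_classes, (num_nodes,))
teacher_logits = torch.randn(num_nodes, num_classes)
student_logits = torch.randn(num_nodes, num_classes, requires_grad=True)

difficulty = F.cross_entropy(teacher_logits, labels, reduction="none")
mask = curriculum_mask(difficulty, round_idx=2, total_rounds=10)
t = adaptive_temperature(base_t=2.0, client_capacity=0.5, graph_complexity=0.3)

loss = kd_loss(student_logits[mask], teacher_logits[mask], t) \
       + 0.1 * affinity_distillation(student_logits, teacher_logits, labels, num_classes)
loss.backward()
```

In this sketch, only the nodes admitted by the curriculum mask contribute to the temperature-scaled distillation term, while the affinity term compares whole-graph class relationships; the 0.1 weighting between the two losses is likewise an arbitrary illustrative choice.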
