FedLPA: Local Prior Alignment for Heterogeneous Federated Generalized Category Discovery

Citations: 0
Rank: #2219 of 5858 papers in NeurIPS 2025
Authors: 3

Abstract

Federated Generalized Category Discovery (Fed-GCD) requires a global model to classify seen classes and discover novel classes when data are siloed across heterogeneous clients. Existing GCD work often rests on unrealistic assumptions, such as prior knowledge of the number of novel classes or a uniform class distribution. We present Federated Local Prior Alignment (FedLPA), which removes these assumptions by grounding learning in client-local structure and aligning predictions to client-local priors. Each client builds a similarity graph refined with reliable seen-class signals and discovers client-specific concepts and prototypes via Infomap. Leveraging the discovered concept structures, we introduce Local Prior Alignment (LPA): a self-distillation loss that matches the batch-mean prediction to an empirical prior computed from the current concept assignments. Iterating between local structure discovery and dynamic prior adaptation enables robust generalized category discovery under severe data heterogeneity. Extensive experiments show that our framework significantly outperforms existing Fed-GCD approaches on both standard and fine-grained benchmarks.
