2025 Poster "logit distillation" Papers
3 papers found
InfiGFusion: Graph-on-Logits Distillation via Efficient Gromov-Wasserstein for Model Fusion
Yuanyi Wang, Zhaoyi Yan, Yiming Zhang et al.
NeurIPS 2025 (poster), arXiv:2505.13893
2 citations
Local Dense Logit Relations for Enhanced Knowledge Distillation
Liuchi Xu, Kang Liu, Jinshuai Liu et al.
ICCV 2025 (poster), arXiv:2507.15911
Temporal Separation with Entropy Regularization for Knowledge Distillation in Spiking Neural Networks
Kairong Yu, Chengting Yu, Tianqing Zhang et al.
CVPR 2025 (poster), arXiv:2503.03144
10 citations