Poster "teacher-student models" Papers
4 papers found
DistillHGNN: A Knowledge Distillation Approach for High-Speed Hypergraph Neural Networks
Saman Forouzandeh, Parham Moradi Dowlatabadi, Mahdi Jalili
ICLR 2025 poster
1 citation
Single-Teacher View Augmentation: Boosting Knowledge Distillation via Angular Diversity
Seonghoon Yu, Dongjun Nam, Dina Katabi et al.
NeurIPS 2025 poster
arXiv:2510.22480
Adversarially Robust Distillation by Reducing the Student-Teacher Variance Gap
Junhao Dong, Piotr Koniusz, Junxi Chen et al.
ECCV 2024 poster
10 citations
Revisit the Essence of Distilling Knowledge through Calibration
Wen-Shu Fan, Su Lu, Xin-Chun Li et al.
ICML 2024 poster