ECCV 2024 by Linfeng Ye Papers
2 papers found
How to Train the Teacher Model for Effective Knowledge Distillation
Shayan Mohajer Hamidi, Xizhen Deng, Renhao Tan et al.
ECCV 2024 (poster)
11 citations
Markov Knowledge Distillation: Make Nasty Teachers trained by Self-undermining Knowledge Distillation Fully Distillable
En-Hui Yang, Linfeng Ye
ECCV 2024 (poster)