Most Cited ECCV Highlight Papers by Linfeng Ye
2 papers found
#1
How to Train the Teacher Model for Effective Knowledge Distillation
Shayan Mohajer Hamidi, Xizhen Deng, Renhao Tan et al.
ECCV 2024 · arXiv:2407.18041
13 citations
#2
Markov Knowledge Distillation: Make Nasty Teachers trained by Self-undermining Knowledge Distillation Fully Distillable
En-Hui Yang, Linfeng Ye
ECCV 2024
8 citations