2025 Oral "knowledge distillation" Papers
2 papers found
KINDLE: Knowledge-Guided Distillation for Prior-Free Gene Regulatory Network Inference
Rui Peng, Yuchen Lu, Qichen Sun et al.
NeurIPS 2025 (oral) · arXiv:2505.09664
TAID: Temporally Adaptive Interpolated Distillation for Efficient Knowledge Transfer in Language Models
Makoto Shing, Kou Misaki, Han Bao et al.
ICLR 2025 (oral) · arXiv:2501.16937
12 citations