"knowledge distillation" Papers

113 papers found • Page 3 of 3

Recurrent Early Exits for Federated Learning with Heterogeneous Clients

Royson Lee, Javier Fernandez-Marques, Xu Hu et al.

ICML 2024 · poster

Rethinking Momentum Knowledge Distillation in Online Continual Learning

Nicolas Michel, Maorong Wang, Ling Xiao et al.

ICML 2024 · poster

Revisit the Essence of Distilling Knowledge through Calibration

Wen-Shu Fan, Su Lu, Xin-Chun Li et al.

ICML 2024 · poster

Select and Distill: Selective Dual-Teacher Knowledge Transfer for Continual Learning on Vision-Language Models

Yu-Chu Yu, Chi-Pin Huang, Jr-Jen Chen et al.

ECCV 2024 · poster · arXiv:2403.09296
16 citations

Self-Cooperation Knowledge Distillation for Novel Class Discovery

Yuzheng Wang, Zhaoyu Chen, Dingkang Yang et al.

ECCV 2024 · poster · arXiv:2407.01930
5 citations

SimDistill: Simulated Multi-Modal Distillation for BEV 3D Object Detection

Haimei Zhao, Qiming Zhang, Shanshan Zhao et al.

AAAI 2024 · paper · arXiv:2303.16818
24 citations

Simple Image-Level Classification Improves Open-Vocabulary Object Detection

Ruohuan Fang, Guansong Pang, Xiao Bai

AAAI 2024 · paper · arXiv:2312.10439
22 citations

SpikingBERT: Distilling BERT to Train Spiking Language Models Using Implicit Differentiation

Malyaban Bal, Abhronil Sengupta

AAAI 2024 · paper · arXiv:2308.10873
70 citations

Summarizing Stream Data for Memory-Constrained Online Continual Learning

Jianyang Gu, Kai Wang, Wei Jiang et al.

AAAI 2024 · paper · arXiv:2305.16645
22 citations

Sunshine to Rainstorm: Cross-Weather Knowledge Distillation for Robust 3D Object Detection

Xun Huang, Hai Wu, Xin Li et al.

AAAI 2024 · paper · arXiv:2402.18493

Synchronization is All You Need: Exocentric-to-Egocentric Transfer for Temporal Action Segmentation with Unlabeled Synchronized Video Pairs

Camillo Quattrocchi, Antonino Furnari, Daniele Di Mauro et al.

ECCV 2024 · poster · arXiv:2312.02638
17 citations

Teacher as a Lenient Expert: Teacher-Agnostic Data-Free Knowledge Distillation

Hyunjune Shin, Dong-Wan Choi

AAAI 2024 · paper · arXiv:2402.12406
7 citations

UNIC: Universal Classification Models via Multi-teacher Distillation

Yannis Kalantidis, Diane Larlus, Mert Bülent Sarıyıldız et al.

ECCV 2024 · poster · arXiv:2408.05088
18 citations