ICML 2024 "knowledge distillation" Papers
18 papers found
Bayesian Knowledge Distillation: A Bayesian Perspective of Distillation with Uncertainty Quantification
Luyang Fang, Yongkai Chen, Wenxuan Zhong et al.
Data-free Distillation of Diffusion Models with Bootstrapping
Jiatao Gu, Chen Wang, Shuangfei Zhai et al.
DetKDS: Knowledge Distillation Search for Object Detectors
Lujun Li, Yufan Bao, Peijie Dong et al.
DFD: Distilling the Feature Disparity Differently for Detectors
Kang Liu, Yingyi Zhang, Jingyun Zhang et al.
DistiLLM: Towards Streamlined Distillation for Large Language Models
Jongwoo Ko, Sungnyun Kim, Tianyi Chen et al.
Do Topological Characteristics Help in Knowledge Distillation?
Jungeun Kim, Junwon You, Dongjin Lee et al.
DSD-DA: Distillation-based Source Debiasing for Domain Adaptive Object Detection
Yongchao Feng, Shiwei Li, Yingjie Gao et al.
Embodied CoT Distillation From LLM To Off-the-shelf Agents
Wonje Choi, Woo Kyung Kim, Minjong Yoo et al.
Enhancing Class-Imbalanced Learning with Pre-Trained Guidance through Class-Conditional Knowledge Distillation
Lan Li, Xin-Chun Li, Han-Jia Ye et al.
From Coarse to Fine: Enable Comprehensive Graph Self-supervised Learning with Multi-granular Semantic Ensemble
Qianlong Wen, Mingxuan Ju, Zhongyu Ouyang et al.
Keypoint-based Progressive Chain-of-Thought Distillation for LLMs
Kaituo Feng, Changsheng Li, Xiaolu Zhang et al.
Knowledge Distillation with Auxiliary Variable
Bo Peng, Zhen Fang, Guangquan Zhang et al.
MH-pFLID: Model Heterogeneous personalized Federated Learning via Injection and Distillation for Medical Data Analysis
Luyuan Xie, Manqing Lin, Tianyu Luan et al.
Online Speculative Decoding
Xiaoxuan Liu, Lanxiang Hu, Peter Bailis et al.
Overcoming Data and Model Heterogeneities in Decentralized Federated Learning via Synthetic Anchors
Chun-Yin Huang, Kartik Srinivas, Xin Zhang et al.
Recurrent Early Exits for Federated Learning with Heterogeneous Clients
Royson Lee, Javier Fernandez-Marques, Xu Hu et al.
Rethinking Momentum Knowledge Distillation in Online Continual Learning
Nicolas Michel, Maorong Wang, Ling Xiao et al.
Revisit the Essence of Distilling Knowledge through Calibration
Wen-Shu Fan, Su Lu, Xin-Chun Li et al.