ICML "knowledge distillation" Papers

18 papers found

Bayesian Knowledge Distillation: A Bayesian Perspective of Distillation with Uncertainty Quantification

Luyang Fang, Yongkai Chen, Wenxuan Zhong et al.

ICML 2024 · Poster

Data-free Distillation of Diffusion Models with Bootstrapping

Jiatao Gu, Chen Wang, Shuangfei Zhai et al.

ICML 2024 · Poster

DetKDS: Knowledge Distillation Search for Object Detectors

Lujun Li, Yufan Bao, Peijie Dong et al.

ICML 2024 · Poster

DFD: Distilling the Feature Disparity Differently for Detectors

Kang Liu, Yingyi Zhang, Jingyun Zhang et al.

ICML 2024 · Poster

DistiLLM: Towards Streamlined Distillation for Large Language Models

Jongwoo Ko, Sungnyun Kim, Tianyi Chen et al.

ICML 2024 · Poster · arXiv:2402.03898

Do Topological Characteristics Help in Knowledge Distillation?

Jungeun Kim, Junwon You, Dongjin Lee et al.

ICML 2024 · Poster

DSD-DA: Distillation-based Source Debiasing for Domain Adaptive Object Detection

Yongchao Feng, Shiwei Li, Yingjie Gao et al.

ICML 2024 · Poster · arXiv:2311.10437

Embodied CoT Distillation From LLM To Off-the-shelf Agents

Wonje Choi, Woo Kyung Kim, Minjong Yoo et al.

ICML 2024 · Poster · arXiv:2412.11499

Enhancing Class-Imbalanced Learning with Pre-Trained Guidance through Class-Conditional Knowledge Distillation

Lan Li, Xin-Chun Li, Han-Jia Ye et al.

ICML 2024 · Poster

From Coarse to Fine: Enable Comprehensive Graph Self-supervised Learning with Multi-granular Semantic Ensemble

Qianlong Wen, Mingxuan Ju, Zhongyu Ouyang et al.

ICML 2024 · Poster

Keypoint-based Progressive Chain-of-Thought Distillation for LLMs

Kaituo Feng, Changsheng Li, Xiaolu Zhang et al.

ICML 2024 · Poster · arXiv:2405.16064

Knowledge Distillation with Auxiliary Variable

Bo Peng, Zhen Fang, Guangquan Zhang et al.

ICML 2024 · Poster

MH-pFLID: Model Heterogeneous personalized Federated Learning via Injection and Distillation for Medical Data Analysis

Luyuan Xie, Manqing Lin, Tianyu Luan et al.

ICML 2024 · Poster · arXiv:2405.06822

Online Speculative Decoding

Xiaoxuan Liu, Lanxiang Hu, Peter Bailis et al.

ICML 2024 · Poster · arXiv:2310.07177

Overcoming Data and Model Heterogeneities in Decentralized Federated Learning via Synthetic Anchors

Chun-Yin Huang, Kartik Srinivas, Xin Zhang et al.

ICML 2024 · Poster · arXiv:2405.11525

Recurrent Early Exits for Federated Learning with Heterogeneous Clients

Royson Lee, Javier Fernandez-Marques, Xu Hu et al.

ICML 2024 · Poster · arXiv:2405.14791

Rethinking Momentum Knowledge Distillation in Online Continual Learning

Nicolas Michel, Maorong Wang, Ling Xiao et al.

ICML 2024 · Poster · arXiv:2309.02870

Revisit the Essence of Distilling Knowledge through Calibration

Wen-Shu Fan, Su Lu, Xin-Chun Li et al.

ICML 2024 · Poster