ECCV "knowledge distillation" Papers

22 papers found

Adaptive Multi-task Learning for Few-shot Object Detection

Yan Ren, Yanling Li, Wai-Kin Adams Kong

ECCV 2024 (poster)

Adversarially Robust Distillation by Reducing the Student-Teacher Variance Gap

Junhao Dong, Piotr Koniusz, Junxi Chen et al.

ECCV 2024 (poster)
10 citations

AMD: Automatic Multi-step Distillation of Large-scale Vision Models

Cheng Han, Qifan Wang, Sohail A Dianat et al.

ECCV 2024 (poster) · arXiv:2407.04208
14 citations

BKDSNN: Enhancing the Performance of Learning-based Spiking Neural Networks Training with Blurred Knowledge Distillation

Zekai Xu, Kang You, Qinghai Guo et al.

ECCV 2024 (poster) · arXiv:2407.09083
13 citations

Bridge Past and Future: Overcoming Information Asymmetry in Incremental Object Detection

Qijie Mo, Yipeng Gao, Shenghao Fu et al.

ECCV 2024 (poster) · arXiv:2407.11499
14 citations

Direct Distillation between Different Domains

Jialiang Tang, Shuo Chen, Gang Niu et al.

ECCV 2024 (poster) · arXiv:2401.06826
6 citations

Distilling Knowledge from Large-Scale Image Models for Object Detection

Gang Li, Wenhai Wang, Xiang Li et al.

ECCV 2024 (poster)
3 citations

DSMix: Distortion-Induced Saliency Map Based Pre-training for No-Reference Image Quality Assessment

Jinsong Shi, Xiaojiang Peng et al.

ECCV 2024 (poster)
5 citations

DεpS: Delayed ε-Shrinking for Faster Once-For-All Training

Aditya Annavajjala, Alind Khare, Animesh Agrawal et al.

ECCV 2024 (poster) · arXiv:2407.06167
1 citation

Good Teachers Explain: Explanation-Enhanced Knowledge Distillation

Amin Parchami, Moritz Böhle, Sukrut Rao et al.

ECCV 2024 (poster) · arXiv:2402.03119
18 citations

Harmonizing Knowledge Transfer in Neural Network with Unified Distillation

Yaomin Huang, Faming Fang, Zaoming Yan et al.

ECCV 2024 (poster) · arXiv:2409.18565
1 citation

Human Motion Forecasting in Dynamic Domain Shifts: A Homeostatic Continual Test-time Adaptation Framework

Qiongjie Cui, Huaijiang Sun, Bin Li et al.

ECCV 2024 (poster)
1 citation

Is Retain Set All You Need in Machine Unlearning? Restoring Performance of Unlearned Models with Out-Of-Distribution Images

Jacopo Bonato, Marco Cotogni, Luigi Sabetta

ECCV 2024 (poster) · arXiv:2404.12922
19 citations

LASS3D: Language-Assisted Semi-Supervised 3D Semantic Segmentation with Progressive Unreliable Data Exploitation

Jianan Li, Qiulei Dong

ECCV 2024 (poster)
1 citation

MobileDiffusion: Instant Text-to-Image Generation on Mobile Devices

Yang Zhao, Zhisheng Xiao, Yanwu Xu et al.

ECCV 2024 (poster) · arXiv:2311.16567
35 citations

Multi-scale Cross Distillation for Object Detection in Aerial Images

Kun Wang, Zi Wang, Zhang Li et al.

ECCV 2024 (poster)
2 citations

Open Vocabulary 3D Scene Understanding via Geometry Guided Self-Distillation

Pengfei Wang, Yuxi Wang, Shuai Li et al.

ECCV 2024 (poster) · arXiv:2407.13362
10 citations

PEA-Diffusion: Parameter-Efficient Adapter with Knowledge Distillation in non-English Text-to-Image Generation

Jian Ma, Chen Chen, Qingsong Xie et al.

ECCV 2024 (poster) · arXiv:2311.17086
8 citations

Progressive Pretext Task Learning for Human Trajectory Prediction

Xiaotong Lin, Tianming Liang, Jian-Huang Lai et al.

ECCV 2024 (poster) · arXiv:2407.11588
27 citations

Select and Distill: Selective Dual-Teacher Knowledge Transfer for Continual Learning on Vision-Language Models

Yu-Chu Yu, Chi-Pin Huang, Jr-Jen Chen et al.

ECCV 2024 (poster) · arXiv:2403.09296
16 citations

Self-Cooperation Knowledge Distillation for Novel Class Discovery

Yuzheng Wang, Zhaoyu Chen, Dingkang Yang et al.

ECCV 2024 (poster) · arXiv:2407.01930
5 citations

UNIC: Universal Classification Models via Multi-teacher Distillation

Yannis Kalantidis, Diane Larlus, Mert Bulent Sariyildiz et al.

ECCV 2024 (poster) · arXiv:2408.05088
18 citations