2024 "model distillation" papers
5 papers found
AMD: Automatic Multi-step Distillation of Large-scale Vision Models
Cheng Han, Qifan Wang, Sohail A Dianat et al.
ECCV 2024 (poster) · arXiv:2407.04208
14 citations
Knowledge Transfer from Vision Foundation Models for Efficient Training of Small Task-specific Models
Raviteja Vemulapalli, Hadi Pouransari, Fartash Faghri et al.
ICML 2024 (poster)
MGit: A Model Versioning and Management System
Wei Hao, Daniel Mendoza, Rafael Mendes et al.
ICML 2024 (poster)
MobileNetV4: Universal Models for the Mobile Ecosystem
Danfeng Qin, Chas Leichner, Manolis Delakis et al.
ECCV 2024 (poster) · arXiv:2404.10518
407 citations
USTAD: Unified Single-model Training Achieving Diverse Scores for Information Retrieval
Seungyeon Kim, Ankit Singh Rawat, Manzil Zaheer et al.
ICML 2024 (poster)