2024 "dataset distillation" Papers
8 papers found
Data-to-Model Distillation: Data-Efficient Learning Framework
Ahmad Sajedi, Samir Khaki, Lucy Z. Liu et al.
ECCV 2024 · poster · arXiv:2411.12841
3 citations
Distill Gold from Massive Ores: Bi-level Data Pruning towards Efficient Dataset Distillation
Yue Xu, Yong-Lu Li, Kaitong Cui et al.
ECCV 2024 · poster · arXiv:2305.18381
8 citations
Large Scale Dataset Distillation with Domain Shift
Noel Loo, Alaa Maalouf, Ramin Hasani et al.
ICML 2024 · poster
Low-Rank Similarity Mining for Multimodal Dataset Distillation
Yue Xu, Zhilin Lin, Yusong Qiu et al.
ICML 2024 · poster · arXiv:2406.03793
SelMatch: Effectively Scaling Up Dataset Distillation via Selection-Based Initialization and Partial Updates by Trajectory Matching
Yongmin Lee, Hye Won Chung
ICML 2024 · poster · arXiv:2406.18561
Teddy: Efficient Large-Scale Dataset Distillation via Taylor-Approximated Matching
Ruonan Yu, Songhua Liu, Jingwen Ye et al.
ECCV 2024 · poster · arXiv:2410.07579
13 citations
Unlocking the Potential of Federated Learning: The Symphony of Dataset Distillation via Deep Generative Latents
Yuqi Jia, Saeed Vahidian, Jingwei Sun et al.
ECCV 2024 · poster · arXiv:2312.01537
17 citations
What is Dataset Distillation Learning?
William Yang, Ye Zhu, Zhiwei Deng et al.
ICML 2024 · poster · arXiv:2406.04284