2025 "dataset distillation" Papers

19 papers found

Beyond Modality Collapse: Representation Blending for Multimodal Dataset Distillation

Xin Zhang, Ziruo Zhang, Jiawei Du et al.

NeurIPS 2025 (poster) · arXiv:2505.14705 · 3 citations

Beyond Random: Automatic Inner-loop Optimization in Dataset Distillation

Muquan Li, Hang Gou, Dongyang Zhang et al.

NeurIPS 2025 (poster) · arXiv:2510.04838 · 1 citation

Boost Self-Supervised Dataset Distillation via Parameterization, Predefined Augmentation, and Approximation

Sheng-Feng Yu, Jia-Jiun Yao, Wei-Chen Chiu

ICLR 2025 (poster) · arXiv:2507.21455 · 1 citation

Dataset Distillation for Pre-Trained Self-Supervised Vision Models

George Cazenavette, Antonio Torralba, Vincent Sitzmann

NeurIPS 2025 (poster) · arXiv:2511.16674

Dataset Distillation via Knowledge Distillation: Towards Efficient Self-Supervised Pre-training of Deep Networks

Siddharth Joshi, Jiayi Ni, Baharan Mirzasoleiman

ICLR 2025 (poster) · arXiv:2410.02116 · 4 citations

Dataset Distillation via Vision-Language Category Prototype

Yawen Zou, Guang Li, Duo Su et al.

ICCV 2025 (highlight) · arXiv:2506.23580 · 3 citations

DELT: A Simple Diversity-driven EarlyLate Training for Dataset Distillation

Zhiqiang Shen, Ammar Sherif, Zeyuan Yin et al.

CVPR 2025 (poster) · arXiv:2411.19946 · 10 citations

Distilling Dataset into Neural Field

Donghyeok Shin, HeeSun Bae, Gyuwon Sim et al.

ICLR 2025 (poster) · arXiv:2503.04835 · 4 citations

Does Training with Synthetic Data Truly Protect Privacy?

Yunpeng Zhao, Jie Zhang

ICLR 2025 (poster) · arXiv:2502.12976

Efficient Multimodal Dataset Distillation via Generative Models

Zhenghao Zhao, Haoxuan Wang, Junyi Wu et al.

NeurIPS 2025 (poster) · arXiv:2509.15472 · 2 citations

Enhancing Dataset Distillation via Non-Critical Region Refinement

Minh-Tuan Tran, Trung Le, Xuan-May Le et al.

CVPR 2025 (poster) · arXiv:2503.18267 · 4 citations

GIFT: Unlocking Full Potential of Labels in Distilled Dataset at Near-zero Cost

Xinyi Shang, Peng Sun, Tao Lin

ICLR 2025 (poster) · arXiv:2405.14736 · 9 citations

Group Distributionally Robust Dataset Distillation with Risk Minimization

Saeed Vahidian, Mingyu Wang, Jianyang Gu et al.

ICLR 2025 (poster) · arXiv:2402.04676 · 9 citations

Heavy Labels Out! Dataset Distillation with Label Space Lightening

Ruonan Yu, Songhua Liu, Zigeng Chen et al.

ICCV 2025 (poster) · arXiv:2408.08201 · 3 citations

Hierarchical Features Matter: A Deep Exploration of Progressive Parameterization Method for Dataset Distillation

Xinhao Zhong, Hao Fang, Bin Chen et al.

CVPR 2025 (poster) · arXiv:2406.05704 · 3 citations

Hyperbolic Dataset Distillation

Wenyuan Li, Guang Li, Keisuke Maeda et al.

NeurIPS 2025 (poster) · arXiv:2505.24623 · 7 citations

Influence-Guided Diffusion for Dataset Distillation

Mingyang Chen, Jiawei Du, Bo Huang et al.

ICLR 2025 (poster) · 19 citations

Towards Adversarially Robust Dataset Distillation by Curvature Regularization

Eric Xue, Yijiang Li, Haoyang Liu et al.

AAAI 2025 (paper) · arXiv:2403.10045 · 18 citations

Towards Stable and Storage-efficient Dataset Distillation: Matching Convexified Trajectory

Wenliang Zhong, Haoyu Tang, Qinghai Zheng et al.

CVPR 2025 (poster) · arXiv:2406.19827 · 7 citations