DeiT-LT: Distillation Strikes Back for Vision Transformer Training on Long-Tailed Datasets

17 citations · #672 of 2716 papers in CVPR 2024 · 5 authors

Citation history: 17 citations as of Jan 27, 2026