"large-scale pre-training" Papers
6 papers found
Revisiting MAE Pre-training for 3D Medical Image Segmentation
Tassilo Wald, Constantin Ulrich, Stanislav Lukyanenko et al.
CVPR 2025 (highlight) · arXiv:2410.23132
16 citations
THD-BAR: Topology Hierarchical Derived Brain Autoregressive Modeling for EEG Generic Representations
Wenchao Yang, Weidong Yan, Wenkang Liu et al.
NeurIPS 2025 (oral) · arXiv:2511.13733
This Time is Different: An Observability Perspective on Time Series Foundation Models
Ben Cohen, Emaad Khwaja, Youssef Doubli et al.
NeurIPS 2025 (poster) · arXiv:2505.14766
11 citations
Scalable Pre-training of Large Autoregressive Image Models
Alaaeldin Ali, Michal Klein, Shuangfei Zhai et al.
ICML 2024 (poster)
Timer: Generative Pre-trained Transformers Are Large Time Series Models
Yong Liu, Haoran Zhang, Chenyu Li et al.
ICML 2024 (poster)
Unified Training of Universal Time Series Forecasting Transformers
Gerald Woo, Chenghao Liu, Akshat Kumar et al.
ICML 2024 (poster)