NeurIPS 2025 "representation learning" Papers
16 papers found
AmorLIP: Efficient Language-Image Pretraining via Amortization
Haotian Sun, Yitong Li, Yuchen Zhuang et al.
NeurIPS 2025 · poster · arXiv:2505.18983
2 citations
A Statistical Theory of Contrastive Learning via Approximate Sufficient Statistics
Licong Lin, Song Mei
NeurIPS 2025 · poster · arXiv:2503.17538
3 citations
Can LLMs Reason Over Non-Text Modalities in a Training-Free Manner? A Case Study with In-Context Representation Learning
Tianle Zhang, Wanlong Fang, Jonathan Woo et al.
NeurIPS 2025 · poster · arXiv:2509.17552
1 citation
Enhancing Training Data Attribution with Representational Optimization
Weiwei Sun, Haokun Liu, Nikhil Kandpal et al.
NeurIPS 2025 · spotlight · arXiv:2505.18513
Harnessing Feature Resonance under Arbitrary Target Alignment for Out-of-Distribution Node Detection
Shenzhi Yang, Junbo Zhao, Sharon Li et al.
NeurIPS 2025 · poster · arXiv:2502.16076
How Classifier Features Transfer to Downstream: An Asymptotic Analysis in a Two-Layer Model
Hee Bin Yoo, Sungyoon Lee, Cheongjae Jang et al.
NeurIPS 2025 · poster
Minimal Semantic Sufficiency Meets Unsupervised Domain Generalization
Tan Pan, Kaiyu Guo, Dongli Xu et al.
NeurIPS 2025 · poster · arXiv:2509.15791
Neural Thermodynamics: Entropic Forces in Deep and Universal Representation Learning
Liu Ziyin, Yizhou Xu, Isaac Chuang
NeurIPS 2025 · poster · arXiv:2505.12387
4 citations
OLinear: A Linear Model for Time Series Forecasting in Orthogonally Transformed Domain
Wenzhen Yue, Yong Liu, Hao Wang et al.
NeurIPS 2025 · oral · arXiv:2505.08550
9 citations
On the creation of narrow AI: hierarchy and nonlocality of neural network skills
Eric Michaud, Asher Parker-Sartori, Max Tegmark
NeurIPS 2025 · poster · arXiv:2505.15811
2 citations
Provable Meta-Learning with Low-Rank Adaptations
Jacob Block, Sundararajan Srinivasan, Liam Collins et al.
NeurIPS 2025 · poster · arXiv:2410.22264
Rotary Masked Autoencoders are Versatile Learners
Uros Zivanovic, Serafina Di Gioia, Andre Scaffidi et al.
NeurIPS 2025 · poster · arXiv:2505.20535
1 citation
Self-Supervised Contrastive Learning is Approximately Supervised Contrastive Learning
Achleshwar Luthra, Tianbao Yang, Tomer Galanti
NeurIPS 2025 · poster · arXiv:2506.04411
1 citation
T-REGS: Minimum Spanning Tree Regularization for Self-Supervised Learning
Julie Mordacq, David Loiseaux, Vicky Kalogeiton et al.
NeurIPS 2025 · spotlight · arXiv:2510.23484
Understanding Representation Dynamics of Diffusion Models via Low-Dimensional Modeling
Xiao Li, Zekai Zhang, Xiang Li et al.
NeurIPS 2025 · poster · arXiv:2502.05743
6 citations
Vision-Language-Vision Auto-Encoder: Scalable Knowledge Distillation from Diffusion Models
Tiezheng Zhang, Yitong Li, Yu-Cheng Chou et al.
NeurIPS 2025 · poster · arXiv:2507.07104
2 citations