"mixture-of-experts" Papers
8 papers found
CL-MoE: Enhancing Multimodal Large Language Model with Dual Momentum Mixture-of-Experts for Continual Visual Question Answering
Tianyu Huai, Jie Zhou, Xingjiao Wu et al.
CVPR 2025 (highlight) · arXiv:2503.00413
10 citations
MIRA: Medical Time Series Foundation Model for Real-World Health Data
Hao Li, Bowen Deng, Chang Xu et al.
NeurIPS 2025 (oral) · arXiv:2506.07584
4 citations
Probabilistic Learning to Defer: Handling Missing Expert Annotations and Controlling Workload Distribution
Cuong Nguyen, Thanh-Toan Do, Gustavo Carneiro
ICLR 2025 (poster)
6 citations
Robust Ego-Exo Correspondence with Long-Term Memory
Yijun Hu, Bing Fan, Xin Gu et al.
NeurIPS 2025 (poster) · arXiv:2510.11417
1 citation
Stretching Each Dollar: Diffusion Training from Scratch on a Micro-Budget
Vikash Sehwag, Xianghao Kong, Jingtao Li et al.
CVPR 2025 (poster) · arXiv:2407.15811
26 citations
Theory on Mixture-of-Experts in Continual Learning
Hongbo Li, Sen Lin, Lingjie Duan et al.
ICLR 2025 (poster) · arXiv:2406.16437
40 citations
MoE-RBench: Towards Building Reliable Language Models with Sparse Mixture-of-Experts
Guanjie Chen, Xinyu Zhao, Tianlong Chen et al.
ICML 2024 (poster)
MVMoE: Multi-Task Vehicle Routing Solver with Mixture-of-Experts
Jianan Zhou, Zhiguang Cao, Yaoxin Wu et al.
ICML 2024 (poster)