NeurIPS 2025 "mixture of experts" Papers
5 papers found
HMVLM: Human Motion-Vision-Language Model via MoE LoRA
Lei Hu, Yongjing Ye, Shihong Xia
NeurIPS 2025 · poster
MoBA: Mixture of Block Attention for Long-Context LLMs
Enzhe Lu, Zhejun Jiang, Jingyuan Liu et al.
NeurIPS 2025 · spotlight · arXiv:2502.13189 · 94 citations
MoORE: SVD-based Model MoE-ization for Conflict- and Oblivion-Resistant Multi-Task Adaptation
Shen Yuan, Yin Zheng, Taifeng Wang et al.
NeurIPS 2025 · poster · arXiv:2506.14436 · 1 citation
Multi-Task Vehicle Routing Solver via Mixture of Specialized Experts under State-Decomposable MDP
Yuxin Pan, Zhiguang Cao, Chengyang GU et al.
NeurIPS 2025 · poster · arXiv:2510.21453
Towards Interpretability Without Sacrifice: Faithful Dense Layer Decomposition with Mixture of Decoders
James Oldfield, Shawn Im, Sharon Li et al.
NeurIPS 2025 · poster · arXiv:2505.21364