NeurIPS "mixture of experts" Papers
8 papers found
CryptoMoE: Privacy-Preserving and Scalable Mixture of Experts Inference via Balanced Expert Routing
Yifan Zhou, Tianshi Xu, Jue Hong et al.
NeurIPS 2025 (poster) · arXiv:2511.01197 · 1 citation
HMVLM: Human Motion-Vision-Language Model via MoE LoRA
Lei Hu, Yongjing Ye, Shihong Xia
NeurIPS 2025 (poster)
JanusDNA: A Powerful Bi-directional Hybrid DNA Foundation Model
Qihao Duan, Bingding Huang, Zhenqiao Song et al.
NeurIPS 2025 (poster) · arXiv:2505.17257 · 1 citation
Learning to Specialize: Joint Gating-Expert Training for Adaptive MoEs in Decentralized Settings
Yehya Farhat, Hamza ElMokhtar Shili, Fangshuo Liao et al.
NeurIPS 2025 (poster) · arXiv:2306.08586 · 3 citations
MoBA: Mixture of Block Attention for Long-Context LLMs
Enzhe Lu, Zhejun Jiang, Jingyuan Liu et al.
NeurIPS 2025 (spotlight) · arXiv:2502.13189 · 94 citations
MoORE: SVD-based Model MoE-ization for Conflict- and Oblivion-Resistant Multi-Task Adaptation
Shen Yuan, Yin Zheng, Taifeng Wang et al.
NeurIPS 2025 (poster) · arXiv:2506.14436 · 1 citation
Multi-Task Vehicle Routing Solver via Mixture of Specialized Experts under State-Decomposable MDP
Yuxin Pan, Zhiguang Cao, Chengyang GU et al.
NeurIPS 2025 (poster) · arXiv:2510.21453
Towards Interpretability Without Sacrifice: Faithful Dense Layer Decomposition with Mixture of Decoders
James Oldfield, Shawn Im, Sharon Li et al.
NeurIPS 2025 (poster) · arXiv:2505.21364