NeurIPS Poster Papers on "mixture of experts"

9 papers found

CryptoMoE: Privacy-Preserving and Scalable Mixture of Experts Inference via Balanced Expert Routing

Yifan Zhou, Tianshi Xu, Jue Hong et al.

NeurIPS 2025 poster · arXiv:2511.01197
1 citation

GRAVER: Generative Graph Vocabularies for Robust Graph Foundation Models Fine-tuning

Haonan Yuan, Qingyun Sun, Junhua Shi et al.

NeurIPS 2025 poster · arXiv:2511.05592
3 citations

HMVLM: Human Motion-Vision-Language Model via MoE LoRA

Lei Hu, Yongjing Ye, Shihong Xia

NeurIPS 2025 poster

JanusDNA: A Powerful Bi-directional Hybrid DNA Foundation Model

Qihao Duan, Bingding Huang, Zhenqiao Song et al.

NeurIPS 2025 poster · arXiv:2505.17257
1 citation

Learning to Specialize: Joint Gating-Expert Training for Adaptive MoEs in Decentralized Settings

Yehya Farhat, Hamza ElMokhtar Shili, Fangshuo Liao et al.

NeurIPS 2025 poster · arXiv:2306.08586
3 citations

MoE-Gyro: Self-Supervised Over-Range Reconstruction and Denoising for MEMS Gyroscopes

Feiyang Pan, Shenghe Zheng, Chunyan Yin et al.

NeurIPS 2025 poster · arXiv:2506.06318
2 citations

MoORE: SVD-based Model MoE-ization for Conflict- and Oblivion-Resistant Multi-Task Adaptation

Shen Yuan, Yin Zheng, Taifeng Wang et al.

NeurIPS 2025 poster · arXiv:2506.14436
1 citation

Multi-Task Vehicle Routing Solver via Mixture of Specialized Experts under State-Decomposable MDP

Yuxin Pan, Zhiguang Cao, Chengyang GU et al.

NeurIPS 2025 poster · arXiv:2510.21453

Towards Interpretability Without Sacrifice: Faithful Dense Layer Decomposition with Mixture of Decoders

James Oldfield, Shawn Im, Sharon Li et al.

NeurIPS 2025 poster · arXiv:2505.21364