Poster papers matching "mixture-of-experts"
8 papers found
Controllable-LPMoE: Adapting to Challenging Object Segmentation via Dynamic Local Priors from Mixture-of-Experts
Yanguang Sun, Jiawei Lian, Jian Yang et al.
ICCV 2025 (poster) · arXiv:2510.21114 · 1 citation
Probabilistic Learning to Defer: Handling Missing Expert Annotations and Controlling Workload Distribution
Cuong Nguyen, Thanh-Toan Do, Gustavo Carneiro
ICLR 2025 (poster) · 6 citations
Robust Ego-Exo Correspondence with Long-Term Memory
Yijun Hu, Bing Fan, Xin Gu et al.
NeurIPS 2025 (poster) · arXiv:2510.11417 · 1 citation
S'MoRE: Structural Mixture of Residual Experts for Parameter-Efficient LLM Fine-tuning
Hanqing Zeng, Yinglong Xia, Zhuokai Zhao et al.
NeurIPS 2025 (poster) · arXiv:2504.06426 · 2 citations
Stretching Each Dollar: Diffusion Training from Scratch on a Micro-Budget
Vikash Sehwag, Xianghao Kong, Jingtao Li et al.
CVPR 2025 (poster) · arXiv:2407.15811 · 26 citations
Theory on Mixture-of-Experts in Continual Learning
Hongbo Li, Sen Lin, Lingjie Duan et al.
ICLR 2025 (poster) · arXiv:2406.16437 · 40 citations
MoE-RBench: Towards Building Reliable Language Models with Sparse Mixture-of-Experts
Guanjie Chen, Xinyu Zhao, Tianlong Chen et al.
ICML 2024 (poster)
MVMoE: Multi-Task Vehicle Routing Solver with Mixture-of-Experts
Jianan Zhou, Zhiguang Cao, Yaoxin Wu et al.
ICML 2024 (poster)