Poster "mixture-of-experts" Papers

12 papers found

Complexity Experts are Task-Discriminative Learners for Any Image Restoration

Eduard Zamfir, Zongwei Wu, Nancy Mehta et al.

CVPR 2025 · arXiv:2411.18466
32 citations

Controllable-LPMoE: Adapting to Challenging Object Segmentation via Dynamic Local Priors from Mixture-of-Experts

Yanguang Sun, Jiawei Lian, Jian Yang et al.

ICCV 2025 · arXiv:2510.21114
1 citation

FedVLA: Federated Vision-Language-Action Learning with Dual Gating Mixture-of-Experts for Robotic Manipulation

Cui Miao, Tao Chang, Meihan Wu et al.

ICCV 2025 · arXiv:2508.02190
5 citations

Localist Topographic Expert Routing: A Barrel Cortex-Inspired Modular Network for Sensorimotor Processing

Tianfang Zhu, Dongli Hu, Jiandong Zhou et al.

NeurIPS 2025

Probabilistic Learning to Defer: Handling Missing Expert Annotations and Controlling Workload Distribution

Cuong Nguyen, Thanh-Toan Do, Gustavo Carneiro

ICLR 2025
6 citations

Robust Ego-Exo Correspondence with Long-Term Memory

Yijun Hu, Bing Fan, Xin Gu et al.

NeurIPS 2025 · arXiv:2510.11417
1 citation

S'MoRE: Structural Mixture of Residual Experts for Parameter-Efficient LLM Fine-tuning

Hanqing Zeng, Yinglong Xia, Zhuokai Zhao et al.

NeurIPS 2025 · arXiv:2504.06426
2 citations

Stretching Each Dollar: Diffusion Training from Scratch on a Micro-Budget

Vikash Sehwag, Xianghao Kong, Jingtao Li et al.

CVPR 2025 · arXiv:2407.15811
26 citations

Theory on Mixture-of-Experts in Continual Learning

Hongbo Li, Sen Lin, Lingjie Duan et al.

ICLR 2025 · arXiv:2406.16437
44 citations

Towards Efficient Foundation Model for Zero-shot Amodal Segmentation

Zhaochen Liu, Limeng Qiao, Xiangxiang Chu et al.

CVPR 2025
3 citations

MoE-RBench: Towards Building Reliable Language Models with Sparse Mixture-of-Experts

Guanjie Chen, Xinyu Zhao, Tianlong Chen et al.

ICML 2024 · arXiv:2406.11353
6 citations

MVMoE: Multi-Task Vehicle Routing Solver with Mixture-of-Experts

Jianan Zhou, Zhiguang Cao, Yaoxin Wu et al.

ICML 2024 · arXiv:2405.01029
59 citations