MoE-LPR: Multilingual Extension of Large Language Models Through Mixture-of-Experts with Language Priors Routing
AAAI 2025 paper
Citations: 0
Rank: #1168 of 3028 papers in AAAI 2025
Authors: 9
Data points: 1
Authors
Hao Zhou
Zhijun Wang
Shujian Huang
Xin Huang
Xue Han
Junlan Feng
Chao Deng
Weihua Luo
Jiajun Chen
Citation history: 0 citations as of Jan 28, 2026