Papers by Xiangxiang Dai
2 papers found
Demystifying Online Clustering of Bandits: Enhanced Exploration Under Stochastic and Smoothed Adversarial Contexts
Zhuohua Li, Maoli Liu, Xiangxiang Dai et al.
ICLR 2025 · arXiv:2501.00891
3 citations
Offline Learning for Combinatorial Multi-armed Bandits
Xutong Liu, Xiangxiang Dai, Jinhang Zuo et al.
ICML 2025 · arXiv:2501.19300
3 citations