"multi-task learning" Papers
72 papers found • Page 2 of 2
Efficient Pareto Manifold Learning with Low-Rank Structure
Weiyu Chen, James Kwok
Every Node Is Different: Dynamically Fusing Self-Supervised Tasks for Attributed Graph Clustering
Pengfei Zhu, Qian Wang, Yu Wang et al.
Exploring Correlations of Self-Supervised Tasks for Graphs
Taoran Fang, Wei Chow, Yifei Sun et al.
Exploring Training on Heterogeneous Data with Mixture of Low-rank Adapters
Yuhang Zhou, Zihua Zhao, Siyuan Du et al.
Fair Resource Allocation in Multi-Task Learning
Hao Ban, Kaiyi Ji
Fast and Sample Efficient Multi-Task Representation Learning in Stochastic Contextual Bandits
Jiabin Lin, Shana Moothedath, Namrata Vaswani
Guarantees for Nonlinear Representation Learning: Non-identical Covariates, Dependent Data, Fewer Samples
Thomas T. Zhang, Bruce Lee, Ingvar Ziemann et al.
Learning with Adaptive Resource Allocation
Jing Wang, Miao Yu, Peng Zhao et al.
Localizing Task Information for Improved Model Merging and Compression
Ke Wang, Nikolaos Dimitriadis, Guillermo Ortiz-Jimenez et al.
Merging Multi-Task Models via Weight-Ensembling Mixture of Experts
Anke Tang, Li Shen, Yong Luo et al.
Multi-Task Domain Adaptation for Language Grounding with 3D Objects
Penglei Sun, Yaoxian Song, Xinglin Pan et al.
MVMoE: Multi-Task Vehicle Routing Solver with Mixture-of-Experts
Jianan Zhou, Zhiguang Cao, Yaoxin Wu et al.
Provable Multi-Task Representation Learning by Two-Layer ReLU Neural Networks
Liam Collins, Hamed Hassani, Mahdi Soltanolkotabi et al.
Quality-Diversity with Limited Resources
Ren-Jian Wang, Ke Xue, Cong Guan et al.
Representation Surgery for Multi-Task Model Merging
Enneng Yang, Li Shen, Zhenyi Wang et al.
Robust Multi-Task Learning with Excess Risks
Yifei He, Shiji Zhou, Guojun Zhang et al.
Sparse-to-dense Multimodal Image Registration via Multi-Task Learning
Kaining Zhang, Jiayi Ma
STEM: Unleashing the Power of Embeddings for Multi-Task Recommendation
Liangcai Su, Junwei Pan, Ximei Wang et al.
Switch Diffusion Transformer: Synergizing Denoising Tasks with Sparse Mixture-of-Experts
Byeongjun Park, Hyojun Go, Jin-Young Kim et al.
Thermometer: Towards Universal Calibration for Large Language Models
Maohao Shen, Subhro Das, Kristjan Greenewald et al.
Towards Modular LLMs by Building and Reusing a Library of LoRAs
Oleksiy Ostapenko, Zhan Su, Edoardo Ponti et al.
VersatileGaussian: Real-time Neural Rendering for Versatile Tasks using Gaussian Splatting
Renjie Li, Zhiwen Fan, Bohua Wang et al.