ICLR Poster "model merging" Papers
4 papers found
Layer Swapping for Zero-Shot Cross-Lingual Transfer in Large Language Models
Lucas Bandarkar, Benjamin Muller, Pritish Yuvraj et al.
ICLR 2025 (poster) · arXiv:2410.01335
13 citations
Leveraging Submodule Linearity Enhances Task Arithmetic Performance in LLMs
Rui Dai, Sile Hu, Xu Shen et al.
ICLR 2025 (poster) · arXiv:2504.10902
6 citations
Merging LoRAs like Playing LEGO: Pushing the Modularity of LoRA to Extremes Through Rank-Wise Clustering
Ziyu Zhao, Tao Shen, Didi Zhu et al.
ICLR 2025 (poster) · arXiv:2409.16167
33 citations
Multimodal Lego: Model Merging and Fine-Tuning Across Topologies and Modalities in Biomedicine
Konstantin Hemker, Nikola Simidjievski, Mateja Jamnik
ICLR 2025 (poster) · arXiv:2405.19950
2 citations