NeurIPS Poster "in-context learning" Papers
19 papers found
Attention-based clustering
Rodrigo Maulen Soto, Pierre Marion, Claire Boyer
Can LLMs Reason Over Non-Text Modalities in a Training-Free Manner? A Case Study with In-Context Representation Learning
Tianle Zhang, Wanlong Fang, Jonathan Woo et al.
Explore In-Context Message Passing Operator for Graph Neural Networks in A Mean Field Game
Tingting Dan, Xinwei Huang, Won Hwa Kim et al.
In-Context Learning Strategies Emerge Rationally
Daniel Wurgaft, Ekdeep S Lubana, Core Francisco Park et al.
Knowledge Starts with Practice: Knowledge-Aware Exercise Generative Recommendation with Adaptive Multi-Agent Cooperation
Yangtao Zhou, Hua Chu, Chen et al.
Learning to Rank for In-Context Example Retrieval
Yuwen Ji, Luodan Zhang, Ambyer Han et al.
Linear Transformers Implicitly Discover Unified Numerical Algorithms
Patrick Lutz, Aditya Gangrade, Hadi Daneshmand et al.
On the Robustness of Transformers against Context Hijacking for Linear Classification
Tianle Li, Chenyang Zhang, Xingwu Chen et al.
Optimal Dynamic Regret by Transformers for Non-Stationary Reinforcement Learning
Baiyuan Chen, Shinji Ito, Masaaki Imaizumi
Optimality and NP-Hardness of Transformers in Learning Markovian Dynamical Functions
Yanna Ding, Songtao Lu, Yingdong Lu et al.
Reasoning Models Better Express Their Confidence
Dongkeun Yoon, Seungone Kim, Sohee Yang et al.
RelationAdapter: Learning and Transferring Visual Relation with Diffusion Transformers
Yan Gong, Yiren Song, Yicheng Li et al.
Self-Generated In-Context Examples Improve LLM Agents for Sequential Decision-Making Tasks
Vishnu Sarukkai, Zhiqiang Xie, Kayvon Fatahalian
Technical Debt in In-Context Learning: Diminishing Efficiency in Long Context
Taejong Joo, Diego Klabjan
Theoretical Insights into In-context Learning with Unlabeled Data
Yingcong Li, Xiangyu Chang, Muti Kara et al.
TiRex: Zero-Shot Forecasting Across Long and Short Horizons with Enhanced In-Context Learning
Andreas Auer, Patrick Podest, Daniel Klotz et al.
Transformers are almost optimal metalearners for linear classification
Roey Magen, Gal Vardi
Unlabeled Data Can Provably Enhance In-Context Learning of Transformers
Renpu Liu, Jing Yang
Vocabulary In-Context Learning in Transformers: Benefits of Positional Encoding
Qian Ma, Ruoxiang Xu, Yongqiang Cai