Spotlight papers matching "self-attention mechanism"
3 papers found
Bipolar Self-attention for Spiking Transformers
Shuai Wang, Malu Zhang, Jingya Wang et al.
NeurIPS 2025 (Spotlight)
LLM Maybe LongLM: SelfExtend LLM Context Window Without Tuning
Hongye Jin, Xiaotian Han, Jingfeng Yang et al.
ICML 2024 (Spotlight)
One Meta-tuned Transformer is What You Need for Few-shot Learning
Xu Yang, Huaxiu Yao, Ying Wei
ICML 2024 (Spotlight)