ICML 2024 "positional encoding" Papers
7 papers found
How do Transformers Perform In-Context Autoregressive Learning?
Michael Sander, Raja Giryes, Taiji Suzuki et al.
ICML 2024 Poster
Learning High-Frequency Functions Made Easy with Sinusoidal Positional Encoding
Chuanhao Sun, Zhihang Yuan, Kai Xu et al.
ICML 2024 Poster
Mol-AE: Auto-Encoder Based Molecular Representation Learning With 3D Cloze Test Objective
Junwei Yang, Kangjie Zheng, Siyu Long et al.
ICML 2024 Poster
Recurrent Distance Filtering for Graph Representation Learning
Yuhui Ding, Antonio Orvieto, Bobby He et al.
ICML 2024 Poster
Subgraphormer: Unifying Subgraph GNNs and Graph Transformers via Graph Products
Guy Bar Shalom, Beatrice Bevilacqua, Haggai Maron
ICML 2024 Poster
Two Stones Hit One Bird: Bilevel Positional Encoding for Better Length Extrapolation
Zhenyu He, Guhao Feng, Shengjie Luo et al.
ICML 2024 Poster
What Improves the Generalization of Graph Transformers? A Theoretical Dive into the Self-attention and Positional Encoding
Hongkang Li, Meng Wang, Tengfei Ma et al.
ICML 2024 Poster