Naoki Nishikawa
5 Papers · 8 Total Citations
Papers (5)
State Space Models are Provably Comparable to Transformers in Dynamic Token Selection
ICLR 2025 · arXiv
6 citations
From Shortcut to Induction Head: How Data Diversity Shapes Algorithm Selection in Transformers
NeurIPS 2025 · arXiv
1 citation
Degrees of Freedom for Linear Attention: Distilling Softmax Attention with Optimal Feature Efficiency
NeurIPS 2025 · arXiv
1 citation
Two-layer neural network on infinite dimensional data: global optimization guarantee in the mean-field regime
NeurIPS 2022
0 citations
Adaptive Topological Feature via Persistent Homology: Filtration Learning for Point Clouds
NeurIPS 2023 · arXiv
0 citations