Papers by Jack Cai
3 papers found
Attention-Level Speculation
Jack Cai, Ammar Vora, Randolph Zhang et al.
ICML 2025 (poster)
Everything Everywhere All at Once: LLMs can In-Context Learn Multiple Tasks in Superposition
Zheyang Xiong, Jack Cai, John Cooper et al.
ICML 2025 (spotlight)
Self-Improving Transformers Overcome Easy-to-Hard and Length Generalization Challenges
Nayoung Lee, Jack Cai, Avi Schwarzschild et al.
ICML 2025 (poster)