"inference-time computation" Papers
5 papers found
Bag of Tricks for Inference-time Computation of LLM Reasoning
Fan LIU, Wen-Shuo Chao, Naiqiang Tan et al.
NeurIPS 2025 (poster) · arXiv:2502.07191 · 12 citations
Let Me Think! A Long Chain of Thought Can Be Worth Exponentially Many Short Ones
Parsa Mirtaheri, Ezra Edelman, Samy Jelassi et al.
NeurIPS 2025 (poster) · arXiv:2505.21825 · 4 citations
Multi-LLM-Agents Debate - Performance, Efficiency, and Scaling Challenges
Hangfan Zhang, Zhiyao Cui, Qiaosheng Zhang et al.
ICLR 2025 (poster)
Temporal Chain of Thought: Long-Video Understanding by Thinking in Frames
Anurag Arnab, Ahmet Iscen, Mathilde Caron et al.
NeurIPS 2025 (oral) · arXiv:2507.02001 · 8 citations
Wider or Deeper? Scaling LLM Inference-Time Compute with Adaptive Branching Tree Search
Yuichi Inoue, Kou Misaki, Yuki Imajuku et al.
NeurIPS 2025 (spotlight) · arXiv:2503.04412 · 18 citations