Shiyu Chang
7 papers, 40 total citations

Papers (7)
KVLink: Accelerating Large Language Models via Efficient KV Cache Reuse. NeurIPS 2025 (arXiv). 24 citations.
Correcting Diffusion Generation through Resampling. CVPR 2024. 12 citations.
Fictitious Synthetic Data Can Improve LLM Factuality via Prerequisite Learning. ICLR 2025. 4 citations.
VSP: Diagnosing the Dual Challenges of Perception and Reasoning in Spatial Planning Tasks for MLLMs. ICCV 2025. 0 citations.
Sparse Cocktail: Every Sparse Pattern Every Sparse Ratio All At Once. ICML 2024. 0 citations.
Speech Self-Supervised Learning Using Diffusion Model Synthetic Data. ICML 2024. 0 citations.
Decomposing Uncertainty for Large Language Models through Input Clarification Ensembling. ICML 2024. 0 citations.