NeurIPS 2025 "post-training quantization" Papers
3 papers found
GPLQ: A General, Practical, and Lightning QAT Method for Vision Transformers
Guang Liang, Xinyao Liu, Jianxin Wu
NeurIPS 2025 (poster) · arXiv:2506.11784
4 citations
Quantization Error Propagation: Revisiting Layer-Wise Post-Training Quantization
Yamato Arai, Yuma Ichikawa
NeurIPS 2025 (poster) · arXiv:2504.09629
7 citations
VETA-DiT: Variance-Equalized and Temporally Adaptive Quantization for Efficient 4-bit Diffusion Transformers
Qinkai Xu, Yijin Liu, Yang Chen et al.
NeurIPS 2025 (oral)