2025 "rate-based backpropagation" Papers
2 papers found
Efficient ANN-Guided Distillation: Aligning Rate-based Features of Spiking Neural Networks through Hybrid Block-wise Replacement
Shu Yang, Chengting Yu, Lei Liu et al.
CVPR 2025 (poster) · arXiv:2503.16572
5 citations
Enhanced Self-Distillation Framework for Efficient Spiking Neural Network Training
Xiaochen Zhao, Chengting Yu, Kairong Yu et al.
NeurIPS 2025 (oral) · arXiv:2510.06254