Chengting Yu
3 Papers · 14 Total Citations
Papers (3)
Temporal Separation with Entropy Regularization for Knowledge Distillation in Spiking Neural Networks
CVPR 2025 · arXiv · 10 citations
Efficient Logit-based Knowledge Distillation of Deep Spiking Neural Networks for Full-Range Timestep Deployment
ICML 2025 · 4 citations
Efficient ANN-Guided Distillation: Aligning Rate-based Features of Spiking Neural Networks through Hybrid Block-wise Replacement
CVPR 2025 · 0 citations