Aili Wang
Papers: 3
Total Citations: 9
Papers (3)
Efficient ANN-Guided Distillation: Aligning Rate-based Features of Spiking Neural Networks through Hybrid Block-wise Replacement
CVPR 2025 · arXiv
5 citations
Efficient Logit-based Knowledge Distillation of Deep Spiking Neural Networks for Full-Range Timestep Deployment
ICML 2025
4 citations
Enhanced Self-Distillation Framework for Efficient Spiking Neural Network Training
NeurIPS 2025 · arXiv
0 citations