Shu Yang
8 Papers · 21 Total Citations
Papers (8)
Temporal Separation with Entropy Regularization for Knowledge Distillation in Spiking Neural Networks
CVPR 2025 (arXiv) · 10 citations
Enhancing Statistical Validity and Power in Hybrid Controlled Trials: A Randomization Inference Approach with Conformal Selective Borrowing
ICML 2025 · 6 citations
Efficient Logit-based Knowledge Distillation of Deep Spiking Neural Networks for Full-Range Timestep Deployment
ICML 2025 · 4 citations
Efficient Causal Decision Making with One-sided Feedback
ICLR 2025 · 1 citation
Causal Customer Churn Analysis with Low-rank Tensor Block Hazard Model
ICML 2024 · 0 citations
Efficient ANN-Guided Distillation: Aligning Rate-based Features of Spiking Neural Networks through Hybrid Block-wise Replacement
CVPR 2025 · 0 citations
Evaluating and Learning Optimal Dynamic Treatment Regimes under Truncation by Death
NeurIPS 2025 · 0 citations
ABQ-LLM: Arbitrary-Bit Quantized Inference Acceleration for Large Language Models
AAAI 2025 · 0 citations