"task-aware distillation" Papers
2 papers found
Few-Shot Knowledge Distillation of LLMs With Counterfactual Explanations
Faisal Hamman, Pasan Dissanayake, Yanjun Fu et al.
NeurIPS 2025 (poster) · arXiv:2510.21631 · 1 citation
Filling Memory Gaps: Enhancing Continual Semantic Parsing via SQL Syntax Variance-Guided LLMs Without Real Data Replay
Ruiheng Liu, Jinyu Zhang, Yanqi Song et al.
AAAI 2025 (paper) · arXiv:2412.07246 · 4 citations