AAAI 2024 paper
Complementary Knowledge Distillation for Robust and Privacy-Preserving Model Serving in Vertical Federated Learning

Citations: 0
Rank: #878 of 2289 papers in AAAI 2024
Authors: 5
Data Points: 1
Authors
Dashan Gao
Sheng Wan
Lixin Fan
Xin Yao
Qiang Yang
Citation History
Jan 28, 2026: 0 citations