One Stone, Two Birds: Enhancing Adversarial Defense Through the Lens of Distributional Discrepancy

ICML 2025

Abstract

Statistical adversarial data detection (SADD) detects whether an upcoming batch contains adversarial examples (AEs) by measuring the distributional discrepancies between clean examples (CEs) and AEs. In this paper, we explore the strength of SADD-based methods by theoretically showing that minimizing distributional discrepancy can help reduce the expected loss on AEs. Despite these advantages, SADD-based methods have a potential limitation: they discard inputs that are detected as AEs, leading to the loss of clean information within those inputs. To address this limitation, we propose a two-pronged adversarial defense method, named Distributional-discrepancy-based Adversarial Defense (DAD). In the training phase, DAD first optimizes the test power of the maximum mean discrepancy (MMD) to derive MMD-OPT, which is a stone that kills two birds: MMD-OPT first serves as a guiding signal to minimize the distributional discrepancy between CEs and AEs to train a denoiser, and then serves as a discriminator to differentiate CEs and AEs during inference. Overall, the inference stage of DAD consists of a two-pronged process: (1) directly feeding the detected CEs into the classifier, and (2) removing noise from the detected AEs with the distributional-discrepancy-based denoiser. Extensive experiments show that DAD outperforms current state-of-the-art (SOTA) defense methods by simultaneously improving clean and robust accuracy on CIFAR-10 and ImageNet-1K against adaptive white-box attacks. Code is publicly available at: https://github.com/tmlr-group/DAD.
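To make the discrepancy measure concrete: the quantity at the heart of the abstract is the maximum mean discrepancy (MMD) between two samples. The sketch below is a minimal, illustrative estimate of squared MMD with a Gaussian kernel in plain Python; it is not the paper's MMD-OPT (which additionally optimizes the kernel for test power), and the sample data, function names, and bandwidth choice are all assumptions made for illustration.

```python
import math
import random

def gaussian_kernel(u, v, sigma=1.0):
    """Gaussian (RBF) kernel between two feature vectors."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(u, v))
    return math.exp(-sq_dist / (2.0 * sigma ** 2))

def mmd2(xs, ys, sigma=1.0):
    """Biased estimate of squared MMD between two samples of vectors."""
    kxx = sum(gaussian_kernel(a, b, sigma) for a in xs for b in xs) / (len(xs) ** 2)
    kyy = sum(gaussian_kernel(a, b, sigma) for a in ys for b in ys) / (len(ys) ** 2)
    kxy = sum(gaussian_kernel(a, b, sigma) for a in xs for b in ys) / (len(xs) * len(ys))
    return kxx + kyy - 2.0 * kxy

random.seed(0)
# Toy stand-ins: "clean" and "shifted" mimic CEs vs. perturbed AEs;
# "clean2" is a fresh draw from the same distribution as "clean".
clean = [[random.gauss(0.0, 1.0) for _ in range(4)] for _ in range(100)]
shifted = [[x + 0.8 for x in v] for v in clean]
clean2 = [[random.gauss(0.0, 1.0) for _ in range(4)] for _ in range(100)]

print(mmd2(clean, shifted))  # noticeably positive: distributions differ
print(mmd2(clean, clean2))   # close to zero: same distribution
```

A detector in the SADD style would threshold such an estimate on an incoming batch; DAD's twist is to also use the (optimized) discrepancy as a training signal for the denoiser.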
