In:
Journal of Computers (電腦學刊), Angle Publishing Co., Ltd., Vol. 34, No. 1 (2023-02), pp. 29-43
Abstract:
Recent studies have shown that robust overfitting and the robust generalization gap are major problems in the adversarial training of deep neural networks. These problems motivate the search for further solutions. Inspired by recent research on smoothness, this paper introduces the Adversarial Model Perturbation (AMP) method, which finds flatter minima of the weight loss landscape, into the adversarial training (AT) framework of deep neural networks to alleviate robust overfitting and the robust generalization gap; the combined approach is called AT-AMP. The validity of flat minima is explained from the perspective of statistical generalization theory. Although the idea is simple, the approach is surprisingly effective. Experiments demonstrate that incorporating the AMP method into the adversarial training framework boosts robust accuracy by 1.14% to 5.73% on three benchmark datasets (SVHN, CIFAR-10, and CIFAR-100), under two threat models (L∞ norm constraint and L2 norm constraint), across diverse adversarial training frameworks such as AT, TRADES, MART, AT with pre-training, and RST, and against diverse white-box and black-box attacks, achieving state-of-the-art performance within the adversarial training framework. In addition, we compare the AMP method with several classical regularization and modern data augmentation techniques for robust overfitting and robust generalization, and the experimental results consistently indicate that introducing the AMP method achieves advanced adversarial robustness in the adversarial training framework.
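For orientation, the sketch below illustrates the kind of training step the abstract describes: PGD adversarial examples (the inner maximization of AT) combined with an adversarial perturbation of the model weights before the gradient update, which biases training toward flat minima. This is a minimal PyTorch sketch assuming a one-step, SAM-style weight ascent; the function names and hyperparameters (eps, alpha, gamma) are illustrative assumptions, not the paper's exact AT-AMP implementation.

import torch
import torch.nn.functional as F

def pgd_attack(model, x, y, eps=8/255, alpha=2/255, steps=10):
    """Craft L-infinity PGD adversarial examples (AT inner maximization)."""
    x_adv = (x + torch.empty_like(x).uniform_(-eps, eps)).clamp(0, 1).detach()
    for _ in range(steps):
        x_adv.requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), y)
        grad = torch.autograd.grad(loss, x_adv)[0]
        x_adv = x_adv.detach() + alpha * grad.sign()
        # Project back into the eps-ball around x and the valid pixel range.
        x_adv = torch.min(torch.max(x_adv, x - eps), x + eps).clamp(0, 1)
    return x_adv.detach()

def at_amp_step(model, optimizer, x, y, gamma=0.01):
    """One AT-AMP-style step: PGD on the inputs, then an adversarial
    weight perturbation of length gamma before the gradient update."""
    x_adv = pgd_attack(model, x, y)
    params = [p for p in model.parameters() if p.requires_grad]

    # Ascent in weight space: move weights toward higher loss (normalized
    # gradient direction, scaled to gamma), approximating the worst-case
    # nearby weights whose loss AMP minimizes.
    loss = F.cross_entropy(model(x_adv), y)
    grads = torch.autograd.grad(loss, params)
    grad_norm = torch.sqrt(sum((g ** 2).sum() for g in grads)) + 1e-12
    deltas = [gamma * g / grad_norm for g in grads]
    with torch.no_grad():
        for p, d in zip(params, deltas):
            p.add_(d)

    # Descent: take the gradient at the perturbed weights, undo the
    # perturbation, then apply the update to the original weights.
    optimizer.zero_grad()
    F.cross_entropy(model(x_adv), y).backward()
    with torch.no_grad():
        for p, d in zip(params, deltas):
            p.sub_(d)
    optimizer.step()

A training loop would simply call at_amp_step(model, optimizer, x, y) once per mini-batch in place of a standard AT update.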
Type of Medium:
Online Resource
ISSN:
1991-1599
Uniform Title:
Improving Adversarial Robustness via Finding Flat Minimum of the Weight Loss Landscape
DOI:
10.53106/199115992023023401003
Language:
Unknown
Publisher:
Angle Publishing Co., Ltd.
Publication Date:
2023