- Title
Imperceptible adversarial attack via spectral sensitivity of human visual system.
- Authors
Chiang, Chen-Kuo; Lin, Ying-Dar; Hwang, Ren-Hung; Lin, Po-Ching; Chang, Shih-Ya; Li, Hao-Ting
- Abstract
Adversarial attacks reveal that deep neural networks are vulnerable to adversarial examples. Intuitively, adversarial examples with larger perturbations yield a stronger attack and thus lower recognition accuracy; however, increasing the perturbation also causes visually noticeable changes in the images. To address the problem of how to improve attack strength while maintaining visual quality, an imperceptible adversarial attack based on the spectral sensitivity of the human visual system is proposed. Guided by an analysis of the human visual system, the proposed method admits more perturbation as attack information and redistributes it into pixels where the changes are imperceptible to human eyes. As a result, it achieves a lower Accuracy under Attack (AuA) than existing attack methods while maintaining image quality at a level similar to those methods. Experimental results demonstrate that the method improves the attack strength of existing adversarial attack methods by 3% to 23% while mostly keeping the SSIM change below 0.05.
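The redistribution idea in the abstract can be illustrated with a minimal sketch. This is not the authors' algorithm: the function name, the use of local variance as a stand-in for human spectral sensitivity, and the weighting scheme are all illustrative assumptions; the paper derives its mask from the visual system's spectral sensitivity instead.

```python
import numpy as np

def redistribute_perturbation(image, delta, eps):
    """Illustrative sketch: scale a perturbation by a per-pixel mask so
    more of it lands in textured regions, where changes are harder to see.
    Local variance is used here as a crude proxy for perceptual masking
    (an assumption, not the paper's spectral-sensitivity model)."""
    # Local variance over 3x3 neighborhoods as a texture/masking proxy.
    pad = np.pad(image, 1, mode="edge")
    windows = np.lib.stride_tricks.sliding_window_view(pad, (3, 3))
    mask = windows.var(axis=(-1, -2))
    mask = mask / (mask.max() + 1e-12)      # normalize to [0, 1]
    # Allow up to 1.5x perturbation in busy areas, half in flat areas.
    weighted = delta * (0.5 + mask)
    # Clip back into the overall L-infinity budget eps.
    return np.clip(weighted, -eps, eps)

rng = np.random.default_rng(0)
img = rng.random((8, 8))
delta = rng.uniform(-0.1, 0.1, (8, 8))
adv_delta = redistribute_perturbation(img, delta, eps=0.1)
print(adv_delta.shape)
print(bool(np.abs(adv_delta).max() <= 0.1))
```

The key design point the sketch mirrors is that the total perturbation budget is preserved (via the final clip) while its spatial distribution follows a perceptual mask, so the attack gains strength without a proportional loss in perceived image quality.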
- Subjects
SPECTRAL sensitivity; ARTIFICIAL neural networks; VISUAL perception
- Publication
Multimedia Tools & Applications, 2024, Vol 83, Issue 20, p59291
- ISSN
1380-7501
- Publication type
Article
- DOI
10.1007/s11042-023-17750-3