- Title
Square-Based Black-Box Adversarial Attack on Time Series Classification Using Simulated Annealing and Post-Processing-Based Defense.
- Authors
Liu, Sichen; Luo, Yuan
- Abstract
While deep neural networks (DNNs) have been widely and successfully used for time series classification (TSC) over the past decade, their vulnerability to adversarial attacks has received little attention. Most existing attack methods focus on white-box setups, which are unrealistic because attackers typically have access only to the model's probability outputs. Defensive methods also have limitations, relying primarily on adversarial retraining, which degrades classification accuracy and requires excessive training time. To address these gaps, this paper proposes two new approaches: (1) a simulated annealing-based random search attack that finds adversarial examples without gradient estimation, searching only on the ℓ∞-norm hypersphere of allowable perturbations; and (2) a post-processing defense technique that periodically reverses the trend of the corresponding loss values while maintaining the overall trend, using only the classifier's confidence scores as input. Experiments applying these methods to InceptionNet models trained on the UCR dataset benchmarks demonstrate the effectiveness of the attack, achieving up to 100% success rates. The defense method provided protection against up to 91.24% of attacks while preserving prediction quality. Overall, this work addresses important gaps in adversarial TSC by introducing a novel black-box attack and a lightweight defense technique.
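The abstract's attack idea (a gradient-free random search with simulated annealing, restricted to the surface of the ℓ∞-norm ball and driven only by the classifier's probability outputs) can be illustrated with a minimal sketch. This is not the authors' algorithm; the proposal move (flipping the sign of a few coordinates of the perturbation), the cooling schedule, and all parameter names here are assumptions chosen to show the general technique:

```python
import numpy as np

def sa_linf_attack(predict_proba, x, y_true, eps=0.1, n_iters=500,
                   t0=1.0, cooling=0.99, seed=0):
    """Hypothetical sketch of a simulated-annealing black-box attack.

    Searches on the surface of the l_inf ball of radius eps around x
    (every coordinate of the perturbation is exactly +eps or -eps),
    using only the classifier's probability outputs, no gradients.
    """
    rng = np.random.default_rng(seed)
    # Start at a random vertex of the l_inf hypersphere.
    delta = eps * rng.choice([-1.0, 1.0], size=x.shape)
    # Objective to minimize: the classifier's confidence in the true class.
    loss = predict_proba(x + delta)[y_true]
    temp = t0
    for _ in range(n_iters):
        # Proposal: flip the sign of a few random coordinates, which
        # keeps the perturbation on the hypersphere surface.
        cand = delta.copy()
        idx = rng.integers(0, x.size, size=max(1, x.size // 20))
        cand.flat[idx] *= -1.0
        cand_loss = predict_proba(x + cand)[y_true]
        # Metropolis acceptance: always take improvements; accept worse
        # moves with probability exp(-(increase in loss) / temperature).
        if cand_loss < loss or rng.random() < np.exp((loss - cand_loss) / temp):
            delta, loss = cand, cand_loss
        temp *= cooling  # geometric cooling schedule
        if np.argmax(predict_proba(x + delta)) != y_true:
            break  # misclassification achieved
    return x + delta
```

Because proposals only flip signs, every intermediate candidate respects the ℓ∞ budget by construction, so no projection step is needed; the annealing temperature lets the search escape flat regions of the confidence landscape early on, then behave greedily as it cools.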
- Subjects
TIME series analysis; ARTIFICIAL neural networks; SIMULATED annealing; ANNEALING of metals
- Publication
Electronics (2079-9292), 2024, Vol 13, Issue 3, p650
- ISSN
2079-9292
- Publication type
Article
- DOI
10.3390/electronics13030650