A taxonomy of system-level attacks on deep learning models in autonomous vehicles
-
Tehrani, Masoud Jamshidiyan
Faculty of Informatics, Università della Svizzera italiana, Switzerland
-
Kim, Jinhan
Faculty of Informatics, Università della Svizzera italiana, Switzerland
-
Foulefack, Rosmael Zidane Lekeufack
University of Trento, Italy
-
Marchetto, Alessandro
University of Trento, Italy
-
Tonella, Paolo
Faculty of Informatics, Università della Svizzera italiana, Switzerland
Published in:
- ACM Transactions on Software Engineering and Methodology. - 2025, art. no. 3769009
English
The advent of deep learning and its astonishing performance have enabled its use in complex systems, including autonomous vehicles. On the other hand, deep learning models are susceptible to mispredictions when small, adversarial changes are introduced into their input. Such mispredictions can be triggered in the real world and can result in a failure of the entire system. In recent years, a growing number of research works have investigated ways to mount attacks against autonomous vehicles that exploit deep learning components. Such attacks are directed at elements of the environment in which these systems operate, and their effectiveness is assessed in terms of the system-level failures they trigger. However, there has been no systematic attempt to analyze and categorize such attacks. In this paper, we present the first taxonomy of system-level attacks against autonomous vehicles. We constructed our taxonomy by selecting 21 highly relevant papers and tagging them with 12 top-level taxonomy categories and several sub-categories. The taxonomy allowed us to investigate the attack features, the most frequently attacked components and systems, the underlying threat models, and the failure chains from input perturbation to system-level failure. We distilled several lessons for practitioners and identified possible directions for future work for researchers.
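The abstract refers to small input perturbations that flip a deep learning model's predictions. As a purely illustrative aid (not a method prescribed by the paper), the following minimal Python/PyTorch sketch shows one classic way such a perturbation can be computed, the fast gradient sign method; the model, tensor shapes, and epsilon value are assumptions made only for this example.

    import torch
    import torch.nn.functional as F

    def fgsm_perturb(model, image, label, epsilon=0.01):
        # model:   any torch.nn.Module classifier (hypothetical placeholder)
        # image:   tensor of shape (1, C, H, W) with values in [0, 1]
        # label:   ground-truth class index, tensor of shape (1,)
        # epsilon: small perturbation budget, kept low so the change is hard to notice
        image = image.clone().detach().requires_grad_(True)
        loss = F.cross_entropy(model(image), label)
        loss.backward()
        # Move each pixel one small step in the direction that increases the loss.
        perturbed = image + epsilon * image.grad.sign()
        return perturbed.clamp(0.0, 1.0).detach()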
-
Classification
-
Computer science and technology
-
Open access status
-
hybrid
-
Identifiers
-
-
ISSN (print)
1049-331X
-
ISSN (electronic)
1557-7392
-
DOI
10.1145/3769009
-
RICERCO
37542
-
ARK
ark:/12658/srd1334159
-
Persistent URL
-
https://n2t.net/ark:/12658/srd1334159