Echo: Reverberation-based Fast Black-Box Adversarial Attacks on Intelligent Audio Systems

Meng Xue, Kuang Peng, Xueluan Gong, Qian Zhang*, Yanjiao Chen*, Routing Li

*Corresponding author for this work

Research output: Contribution to journal › Journal Article › peer-review

1 Citation (Scopus)

Abstract

Intelligent audio systems, such as speech command recognition and speaker recognition, are ubiquitous in our lives. However, deep learning-based intelligent audio systems have been shown to be vulnerable to adversarial attacks. In this paper, we propose a physical adversarial attack that exploits reverberation, a natural indoor acoustic effect, to realize imperceptible, fast, and targeted black-box attacks. Unlike existing attacks that constrain the magnitude of adversarial perturbations within a fixed radius, we generate reverberation-like perturbations that blend naturally with the original voice sample. Additionally, by accounting for distortions in the physical environment, we generate adversarial examples that remain robust even under over-the-air propagation. Extensive experiments are conducted on two popular intelligent audio systems in various settings, including different room sizes, distances, and ambient noises. The results show that Echo can invade intelligent audio systems in both digital and physical over-the-air environments.

Original language: English
Article number: 137
Journal: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies
Volume: 7
Issue number: 3
DOIs
Publication status: Published - 27 Sept 2023

Bibliographical note

Publisher Copyright:
© 2023 ACM.

Keywords

  • Adversarial example attacks
  • Inconspicuous attack
  • Intelligent audio systems
