Natl Sci Open
Volume 2, Number 1, 2023
Number of pages: 29
Published online: 26 August 2022
Adversarial attacks and defenses in physiological computing: a systematic review
1 Ministry of Education Key Laboratory of Image Processing and Intelligent Control, School of Artificial Intelligence and Automation, Huazhong University of Science and Technology, Wuhan 430074, China
2 Zhejiang Lab, Hangzhou 311121, China
3 School of Civil and Hydraulic Engineering, Huazhong University of Science and Technology, Wuhan 430074, China
4 College of Public Administration, Huazhong University of Science and Technology, Wuhan 430074, China
5 Electrical Engineering and Computer Science Department, University of Michigan, Ann Arbor MI 48109, USA
6 School of Management and Sino-European Institute for Intellectual Property, Huazhong University of Science and Technology, Wuhan 430074, China
* Corresponding authors (emails: firstname.lastname@example.org (Xiaodong Xu); email@example.com (Hanbin Luo); firstname.lastname@example.org (Xiang Yu))
Revised: 10 May 2022
Accepted: 26 May 2022
Physiological computing uses human physiological data as system inputs in real time. It includes, or significantly overlaps with, brain-computer interfaces, affective computing, adaptive automation, health informatics, and physiological signal based biometrics. Physiological computing increases the communication bandwidth from the user to the computer, but it is also subject to various types of adversarial attacks, in which the attacker deliberately manipulates the training and/or test examples to hijack the machine learning algorithm's output, leading to possible user confusion, frustration, injury, or even death. However, the vulnerability of physiological computing systems has not received adequate attention, and no comprehensive review of adversarial attacks against them exists. This study fills that gap by providing a systematic review of the main research areas of physiological computing, the different types of adversarial attacks and their applications to physiological computing, and the corresponding defense strategies. We hope this review will attract more research interest in the vulnerability of physiological computing systems and, more importantly, in defense strategies to make them more secure.
Key words: physiological computing / brain-computer interfaces / health informatics / biometrics / machine learning / adversarial attack
© The Author(s) 2023. Published by China Science Publishing & Media Ltd. and EDP Sciences.
This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.