Abstract
EEG-based biometric systems verify the identity of a user by comparing the probe to a reference EEG template of the claimed user enrolled in the system, or by classifying the probe against a user verification model stored in the system. These approaches are often referred to as template-based and model-based methods, respectively. Compared with template-based methods, model-based methods, especially those based on deep learning models, tend to provide better performance and greater application flexibility. However, no public study has examined the security and cancellability issues of model-based approaches. This is a critical gap given the growing popularity of deep learning in EEG biometric applications. In this study, we investigate the security of deep learning model-based EEG biometric systems and demonstrate that model inversion attacks pose a threat to such systems. That is to say, an adversary can produce synthetic data based on the output and parameters of the user verification model to gain unauthorized access to the system. We propose a cancellable deep learning framework to defend against such attacks and protect system security. The framework utilizes a generative adversarial network to approximate a non-invertible transformation whose parameters can be changed to produce different data distributions. A user verification model is then trained on the output of the generator model, while information about the transformation is discarded. The proposed framework can revoke compromised models to defend against hill climbing attacks and model inversion attacks. Evaluation results show that the proposed method, while being cancellable, achieves better verification performance than template-based methods and state-of-the-art non-cancellable deep learning methods.
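To illustrate the cancellability idea described above, the sketch below uses a keyed random projection with a ReLU as a stand-in for the paper's GAN-approximated non-invertible transformation (the key plays the role of the changeable transformation parameters). All names, dimensions, and the synthetic "EEG features" are hypothetical; this is not the authors' implementation, only a minimal demonstration that changing the key yields a different data distribution and thus revokes an enrolled representation.

```python
import numpy as np

def keyed_transform(x, key, out_dim=32):
    # Key-dependent, non-invertible mapping: a stand-in for the paper's
    # GAN generator. Re-seeding with a new key revokes old representations.
    rng = np.random.default_rng(key)
    W = rng.standard_normal((out_dim, x.shape[-1]))
    return np.maximum(W @ x, 0.0)  # ReLU discards sign, so the map is non-invertible

def cosine(a, b):
    # Cosine similarity between two transformed feature vectors.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

rng = np.random.default_rng(0)
probe = rng.standard_normal(64)                     # hypothetical EEG feature vector
enrolled = probe + 0.05 * rng.standard_normal(64)   # same user, small session noise
impostor = rng.standard_normal(64)                  # unrelated user

t_enrolled = keyed_transform(enrolled, key=1)
same = cosine(keyed_transform(probe, key=1), t_enrolled)     # genuine, same key
diff = cosine(keyed_transform(impostor, key=1), t_enrolled)  # impostor, same key
revoked = cosine(keyed_transform(probe, key=2), t_enrolled)  # genuine, new key
print(same > diff, same > revoked)
```

A genuine probe under the original key stays close to the enrolled representation, while both an impostor and a genuine probe under a fresh key do not, mirroring how re-parameterizing the transformation invalidates a compromised model.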
| Original language | English |
| --- | --- |
| Article number | 10443934 |
| Pages (from-to) | 3745-3757 |
| Number of pages | 13 |
| Journal | IEEE Transactions on Information Forensics and Security |
| Volume | 19 |
| Issue number | 99 |
| DOIs | |
| Publication status | Published - Feb 2024 |
| Externally published | Yes |