vUBM: A variational universal background model for EEG-based person authentication

Huyen Tran, Dat Tran, Wanli Ma, Phuoc Nguyen

Research output: Conference contribution (peer-reviewed)

Abstract

EEG-based person authentication is an important means for modern biometrics. However, EEG signals are well known for their low signal-to-noise ratio and many factors of variation. These variations arise from intrinsic factors, e.g. mental activity, mood, and health conditions, as well as extrinsic factors, e.g. sensor errors, electrode displacement, and user movement. Together they create complex variations in the source signals travelling from inside the brain to the recording devices. We propose vUBM, a variational inference framework that learns a simple latent representation for complex data, facilitating authentication algorithms in the latent space. A variational universal background model is created to normalize scores and further improve performance. Extensive experiments show the advantages of our proposed framework.
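The UBM-style score normalization described in the abstract can be illustrated with a minimal sketch: a claimed identity is scored by the log-likelihood ratio between a client model and a universal background model, here with simple diagonal Gaussians fitted in a latent space. This is an illustrative assumption, not the paper's actual vUBM model; the latent vectors, distributions, and model forms below are toy stand-ins (in vUBM the representation would come from a variational encoder).

```python
import numpy as np

def diag_gauss_loglik(z, mean, var):
    # Log-density of a diagonal Gaussian, summed over latent dimensions.
    return -0.5 * np.sum(np.log(2 * np.pi * var) + (z - mean) ** 2 / var, axis=-1)

# Toy latent vectors standing in for encoder outputs.
rng = np.random.default_rng(0)
client_z = rng.normal(loc=1.0, scale=0.5, size=(200, 8))   # enrolment samples
ubm_z = rng.normal(loc=0.0, scale=1.0, size=(2000, 8))     # background samples

# Fit simple diagonal Gaussians as the client model and the background model.
client_mean, client_var = client_z.mean(0), client_z.var(0)
ubm_mean, ubm_var = ubm_z.mean(0), ubm_z.var(0)

def normalized_score(z):
    # UBM score normalization: log-likelihood ratio between the claimed
    # client's model and the universal background model.
    return (diag_gauss_loglik(z, client_mean, client_var)
            - diag_gauss_loglik(z, ubm_mean, ubm_var))

genuine = normalized_score(rng.normal(1.0, 0.5, size=(1, 8)))[0]
impostor = normalized_score(rng.normal(0.0, 1.0, size=(1, 8)))[0]
```

A genuine trial (latent vector drawn near the client model) should receive a higher normalized score than an impostor trial drawn from the background distribution; thresholding this ratio gives the accept/reject decision.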

Original language: English
Title of host publication: Neural Information Processing - 26th International Conference, ICONIP 2019, Proceedings
Editors: Tom Gedeon, Kok Wai Wong, Minho Lee
Place of publication: Netherlands
Publisher: Springer
Pages: 478-485
Number of pages: 8
Volume: 4
ISBN (Electronic): 9783030368081
ISBN (Print): 9783030368074
DOIs
Publication status: Published - 1 Jan 2019
Event: 26th International Conference on Neural Information Processing, ICONIP 2019 - Sydney, Australia
Duration: 12 Dec 2019 - 15 Dec 2019

Publication series

Name: Communications in Computer and Information Science
Volume: 1142 CCIS
ISSN (Print): 1865-0929
ISSN (Electronic): 1865-0937

Conference

Conference: 26th International Conference on Neural Information Processing, ICONIP 2019
Country/Territory: Australia
City: Sydney
Period: 12/12/19 - 15/12/19

