Maximal Margin Approach to Kernel Generalised Learning Vector Quantisation for Brain-Computer Interface

Trung Le, Dat Tran, Tuan Hoang, Dharmendra Sharma

Research output: A Conference Proceeding or a Chapter in Book › Conference contribution › peer-review


Abstract

Kernel Generalised Learning Vector Quantisation (KGLVQ) was proposed to extend Generalised Learning Vector Quantisation into the kernel feature space in order to deal with complex class boundaries, and it yields promising performance on complex classification tasks in pattern recognition. However, KGLVQ does not follow the maximal margin principle, which is crucial for kernel-based learning methods. In this paper we propose a maximal margin approach to the Kernel Generalised Learning Vector Quantisation algorithm which inherits the merits of KGLVQ and follows the maximal margin principle to improve generalisation capability. Experiments performed on the well-known data set III of BCI competition II show promising classification results for the proposed method.
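As background for the abstract above, the sketch below shows a single stochastic update of standard Generalised Learning Vector Quantisation (the Sato and Yamada formulation), the base learner that KGLVQ kernelises and that the proposed method extends with a maximal margin objective. This is a minimal illustration in the input space only, not the kernelised or maximal margin algorithm of the paper; the function name, the logistic cost, and the learning rate are assumptions made for the example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def glvq_update(x, y, prototypes, proto_labels, lr=0.05):
    """One stochastic GLVQ step on sample (x, y) -- illustrative sketch only.

    Finds the nearest prototype with the same label (index i_plus) and the
    nearest with a different label (index i_minus), then moves the former
    towards x and the latter away from x, scaled by the derivative of the
    relative-distance cost mu = (d_plus - d_minus) / (d_plus + d_minus).
    """
    d = np.sum((prototypes - x) ** 2, axis=1)           # squared Euclidean distances
    same = proto_labels == y
    i_plus = np.where(same)[0][np.argmin(d[same])]       # closest correct prototype
    i_minus = np.where(~same)[0][np.argmin(d[~same])]    # closest incorrect prototype
    d_plus, d_minus = d[i_plus], d[i_minus]

    mu = (d_plus - d_minus) / (d_plus + d_minus)
    f_prime = sigmoid(mu) * (1.0 - sigmoid(mu))          # derivative of the logistic cost
    denom = (d_plus + d_minus) ** 2

    prototypes[i_plus] += lr * f_prime * (4 * d_minus / denom) * (x - prototypes[i_plus])
    prototypes[i_minus] -= lr * f_prime * (4 * d_plus / denom) * (x - prototypes[i_minus])
    return prototypes
```

KGLVQ performs an update of this kind on prototypes expressed in the kernel feature space, and the paper's contribution is to augment that learning rule with a maximal margin objective.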
Original language: English
Title of host publication: International Conference on Neural Information Processing (ICONIP 2012)
Subtitle of host publication: Lecture Notes in Computer Science
Editors: Tingwen Huang, Zhigang Zeng, Chuandong Li, Chi Sing Leung
Place of publication: Germany
Publisher: Springer
Pages: 191-198
Number of pages: 8
Volume: 7665
ISBN (Electronic): 9783642344879
ISBN (Print): 9783642344800
Publication status: Published - 2012
Event: 19th International Conference on Neural Information Processing 2012, Doha, Qatar
Duration: 12 Nov 2012 – 15 Nov 2012

Conference

Conference: 19th International Conference on Neural Information Processing 2012
Country/Territory: Qatar
City: Doha
Period: 12/11/12 – 15/11/12
