Maximal Margin Learning Vector Quantisation

Dat Tran, Van Nguyen, Wanli Ma

Research output: Conference contribution in conference proceedings (peer-reviewed)

Abstract

Kernel Generalised Learning Vector Quantisation (KGLVQ) was proposed to extend Generalised Learning Vector Quantisation (GLVQ) into the kernel feature space to deal with complex class boundaries, and it thus yielded promising performance on complex classification tasks in pattern recognition. However, KGLVQ does not follow the maximal margin principle, which is crucial for kernel-based learning methods. In this paper we propose a maximal margin extension (MLVQ) of the KGLVQ algorithm. MLVQ inherits the merits of KGLVQ and also follows the maximal margin principle to improve generalisation capability. Experiments performed on well-known data sets from the UCI repository show promising classification results for the proposed method.
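For context, GLVQ (the base method this paper extends) adapts class prototypes by gradient descent on a relative-distance cost mu = (d1 - d2)/(d1 + d2), where d1 is the squared distance to the nearest prototype of the sample's class and d2 the squared distance to the nearest prototype of any other class. The following minimal Python sketch shows one stochastic GLVQ update step; it illustrates the standard Sato-Yamada update that KGLVQ kernelises, not the authors' kernelised or maximal-margin variant, and the function name, learning rate, and variable names are illustrative assumptions, not from the paper.

import numpy as np

def glvq_update(x, y, prototypes, proto_labels, lr=0.05):
    """One stochastic GLVQ update (Sato & Yamada style sketch).

    x: sample vector (d,); y: its class label;
    prototypes: float array (k, d); proto_labels: array (k,).
    """
    dists = np.sum((prototypes - x) ** 2, axis=1)  # squared distances
    same = np.flatnonzero(proto_labels == y)
    other = np.flatnonzero(proto_labels != y)
    i = same[np.argmin(dists[same])]    # nearest same-class prototype
    j = other[np.argmin(dists[other])]  # nearest other-class prototype
    d1, d2 = dists[i], dists[j]
    denom = (d1 + d2) ** 2
    # Gradient of mu = (d1 - d2) / (d1 + d2) attracts prototype i
    # toward x and repels prototype j away from x.
    prototypes[i] += lr * (d2 / denom) * (x - prototypes[i])
    prototypes[j] -= lr * (d1 / denom) * (x - prototypes[j])
    return prototypes

Iterating this update over shuffled training samples, with prototypes initialised (for example at per-class means), gives the basic GLVQ classifier that KGLVQ lifts into a kernel feature space and that MLVQ further constrains with the maximal margin principle.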
Original language: English
Title of host publication: The 2013 International Joint Conference on Neural Networks (IJCNN)
Editors: Plamen Angelov, Daniel Levine
Place of publication: USA
Publisher: IEEE, Institute of Electrical and Electronics Engineers
Pages: 1668-1673
Number of pages: 6
Volume: 1
ISBN (Print): 9781467361293
DOIs
Publication status: Published - 2013
Event: 2013 International Joint Conference on Neural Networks (IJCNN), Dallas, Texas, United States
Duration: 4 Aug 2013 - 9 Aug 2013

Conference

Conference: 2013 International Joint Conference on Neural Networks (IJCNN)
Country/Territory: United States
City: Dallas, Texas
Period: 4/08/13 - 9/08/13
