Maximal Margin Approach to Kernel Generalised Learning Vector Quantisation for Brain-Computer Interface

Trung Le, Dat Tran, Tuan Hoang, Dharmendra Sharma

    Research output: A Conference proceeding or a Chapter in Book › Conference contribution

    Abstract

    Kernel Generalised Learning Vector Quantisation (KGLVQ) was proposed to extend Generalised Learning Vector Quantisation into the kernel feature space to deal with complex class boundaries, and it yields promising performance for complex classification tasks in pattern recognition. However, KGLVQ does not follow the maximal margin principle, which is crucial for kernel-based learning methods. In this paper we propose a maximal margin approach to the Kernel Generalised Learning Vector Quantisation algorithm that inherits the merits of KGLVQ and follows the maximal margin principle to favour generalisation capability. Experiments performed on the well-known data set III of BCI competition II show promising classification results for the proposed method.
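
    For context on the GLVQ family the abstract builds on, the sketch below illustrates plain Generalised Learning Vector Quantisation training (nearest correct versus nearest incorrect prototype, relative-distance cost). It is background only, assuming squared Euclidean distances, a sigmoid loss, and illustrative names such as glvq_train; it is not the kernelised, maximal-margin algorithm proposed in the paper.

    import numpy as np

    def glvq_train(X, y, prototypes, proto_labels, lr=0.01, epochs=30):
        # Minimal GLVQ sketch (background illustration, not the paper's method).
        # X: (n_samples, dim), y: (n_samples,), prototypes: (n_protos, dim),
        # proto_labels: (n_protos,) class label of each prototype.
        W = prototypes.astype(float).copy()
        for _ in range(epochs):
            for x, c in zip(X, y):
                d = np.sum((W - x) ** 2, axis=1)                # squared distances to all prototypes
                same, diff = proto_labels == c, proto_labels != c
                i_p = np.where(same)[0][np.argmin(d[same])]     # nearest prototype with the correct label
                i_m = np.where(diff)[0][np.argmin(d[diff])]     # nearest prototype with a wrong label
                d_p, d_m = d[i_p], d[i_m]
                mu = (d_p - d_m) / (d_p + d_m)                  # relative distance, in [-1, 1]
                g = 1.0 / (1.0 + np.exp(-mu))                   # sigmoid cost f(mu)
                grad = g * (1.0 - g)                            # f'(mu)
                denom = (d_p + d_m) ** 2
                W[i_p] += lr * grad * (2 * d_m / denom) * (x - W[i_p])  # attract correct prototype
                W[i_m] -= lr * grad * (2 * d_p / denom) * (x - W[i_m])  # repel wrong prototype
        return W

    Classification then assigns a test point the label of its nearest prototype. KGLVQ performs the same comparison with kernel-induced distances in feature space, and the paper's contribution adds a maximal-margin criterion on top of that scheme.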
    Original language: English
    Title of host publication: International Conference on Neural Information Processing (ICONIP 2012)
    Subtitle of host publication: Lecture Notes in Computer Science
    Editors: Tingwen Huang, Zhigang Zeng, Chuandong Li, Chi Sing Leung
    Place of Publication: Germany
    Publisher: Springer
    Pages: 191-198
    Number of pages: 8
    Volume: 7665
    ISBN (Electronic): 9783642344879
    ISBN (Print): 9783642344800
    DOIs: 10.1007/978-3-642-34487-9_24
    Publication status: Published - 2012
    Event: 19th International Conference on Neural Information Processing 2012 - Doha, Qatar
    Duration: 12 Nov 2012 - 15 Nov 2012

    Conference

    Conference: 19th International Conference on Neural Information Processing 2012
    Country: Qatar
    City: Doha
    Period: 12/11/12 - 15/11/12

    Cite this

    Le, T., Tran, D., Hoang, T., & Sharma, D. (2012). Maximal Margin Approach to Kernel Generalised Learning Vector Quantisation for Brain-Computer Interface. In T. Huang, Z. Zeng, C. Li, & C. S. Leung (Eds.), International Conference on Neural Information Processing (ICONIP 2012): Lecture Notes in Computer Science (Vol. 7665, pp. 191-198). Germany: Springer. https://doi.org/10.1007/978-3-642-34487-9_24
    @inproceedings{2c3e4195798d48d19ad9f8e7a33912e0,
    title = "Maximal Margin Approach to Kernel Generalised Learning Vector Quantisation for Brain-Computer Interface",
    abstract = "Kernel Generalised Learning Vector Quantisation (KGLVQ) was proposed to extend Generalised Learning Vector Quantisation into the kernel feature space to deal with complex class boundaries and thus yield promising performance for complex classification tasks in pattern recognition. However KGLVQ does not follow the maximal margin principle which is crucial for kernel-based learning methods. In this paper we propose a maximal margin approach to Kernel Generalised Learning Vector Quantisation algorithm which inherits the merits of KGLVQ and follows the maximal margin principle to favour the generalisation capability. Experiments performed on the well-known data set III of BCI competition II show promising classification results for the proposed method.",
    keywords = "Generalised Learning Vector Quantisation, Brain-Computer Interface, Learning Vector Quantisation, Kernel Method, Maximising Margin",
    author = "Trung Le and Dat Tran and Tuan Hoang and Dharmendra Sharma",
    year = "2012",
    doi = "10.1007/978-3-642-34487-9_24",
    language = "English",
    isbn = "9783642344800",
    volume = "7665",
    pages = "191--198",
    editor = "Tingwen Huang and Zhigang Zeng and Chuandong Li and Leung, {Chi Sing}",
    booktitle = "International Conference on Neural Information Processing (ICONIP 2012)",
    publisher = "Springer",
    address = "Germany",

    }
