Least square support vector machine for large scale dataset

Khanh Nguyen, Trung Le, Vinh Lai, Duy Nguyen, Dat Tran, Wanli Ma

    Research output: A Conference proceeding or a Chapter in Book › Conference contribution

    3 Citations (Scopus)
    1 Download (Pure)

    Abstract

    Support Vector Machine (SVM) is a well-known tool for classification and regression problems. Many applications require SVMs with non-linear kernels for accurate classification, but the training time complexity of SVMs with non-linear kernels is typically quadratic in the size of the training dataset. In this paper, we start from a well-known variant of the SVM, the Least Square Support Vector Machine, and apply the Steepest Sub-gradient Descent method to propose the Steepest Sub-gradient Descent Least Square Support Vector Machine (SGD-LSSVM). It is proven theoretically that the convergence rate of the proposed method to reach an ε-precision solution is O(log(1/ε)). Experiments on large-scale datasets indicate that the proposed method offers comparable classification accuracy while being faster than the baselines.
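    The O(log(1/ε)) rate quoted in the abstract is the linear convergence one expects for this kind of problem: the LSSVM primal objective 0.5‖w‖² + (C/2) Σᵢ (1 − yᵢ(w·xᵢ + b))² is smooth and strongly convex, so gradient descent with a suitable step size shrinks the suboptimality geometrically, and reaching ε-precision takes O(log(1/ε)) iterations. As a rough illustration only, not the authors' SGD-LSSVM (which uses a steepest sub-gradient step rather than a fixed learning rate, and also handles non-linear kernels), the Python sketch below trains a linear LSSVM by plain full-batch gradient descent on this primal; the function name lssvm_gd, the toy data, and all hyperparameters are assumptions made for the example.

    import numpy as np

    def lssvm_gd(X, y, C=1.0, lr=1e-3, n_iters=500):
        """Linear LSSVM fit by full-batch gradient descent (illustrative
        sketch, not the paper's SGD-LSSVM).
        X: (n, d) array, y: labels in {-1, +1}."""
        _, d = X.shape
        w, b = np.zeros(d), 0.0
        for _ in range(n_iters):
            # LSSVM slack: e_i = 1 - y_i * (w . x_i + b), penalized quadratically
            residual = 1.0 - y * (X @ w + b)
            grad_w = w - C * (X.T @ (y * residual))  # gradient of the primal in w
            grad_b = -C * np.sum(y * residual)       # gradient of the primal in b
            w -= lr * grad_w
            b -= lr * grad_b
        return w, b

    # Toy usage on two well-separated Gaussian blobs (hypothetical data)
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-2.0, 1.0, (50, 2)), rng.normal(2.0, 1.0, (50, 2))])
    y = np.hstack([-np.ones(50), np.ones(50)])
    w, b = lssvm_gd(X, y)
    print("training accuracy:", np.mean(np.sign(X @ w + b) == y))

    Note that because the LSSVM penalizes squared slack rather than the hinge loss, the objective is differentiable everywhere and strongly convex; that smoothness is what makes the fast linear rate attainable, in contrast to the O(1/ε) rates typical of sub-gradient methods on the non-smooth hinge-loss SVM.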
    Original language: English
    Title of host publication: 2015 International Joint Conference on Neural Networks (IJCNN)
    Editors: Amir Hussain
    Place of publication: USA
    Publisher: IEEE
    Pages: 2057-2065
    Number of pages: 9
    Volume: 1
    ISBN (Electronic): 9781479919604
    ISBN (Print): 9781479919611
    DOIs: 10.1109/ijcnn.2015.7280575
    Publication status: Published - 2015
    Event: International Joint Conference on Neural Networks IJCNN 2015 - Killarney, Ireland
    Duration: 12 Jul 2015 – 17 Jul 2015

    Conference

    Conference: International Joint Conference on Neural Networks IJCNN 2015
    Abbreviated title: IJCNN 2015
    Country: Ireland
    City: Killarney
    Period: 12/07/15 – 17/07/15


    Cite this

    Nguyen, K., Le, T., Lai, V., Nguyen, D., Tran, D., & Ma, W. (2015). Least square support vector machine for large scale dataset. In A. Hussain (Ed.), 2015 International joint conference on neural networks (IJCNN) (Vol. 1, pp. 2057-2065). USA: IEEE. https://doi.org/10.1109/ijcnn.2015.7280575
    @inproceedings{3830fa7c55d44ad5bb75a2bc69c7f673,
    title = "Least square support vector machine for large scale dataset",
    abstract = "Support Vector Machine (SVM) is a very well-known tool for classification and regression problems. Many applications require SVMs with non-linear kernels for accurate classification. Training time complexity for SVMs with non-linear kernels is typically quadratic in the size of the training dataset. In this paper, we depart from the very well-known variation of SVM, the so-called Least Square Support Vector Machine, and apply Steepest Sub-gradient Descent method to propose Steepest Sub-gradient Descent Least Square Support Vector Machine (SGD-LSSVM). It is theoretically proven that the convergent rate of the proposed method to gain ε - precision solution is O (log (1/ε)). The experiments established on the large-scale datasets indicate that the proposed method offers the comparable classification accuracies while being faster than the baselines",
    keywords = "Support Vector Machine, least square",
    author = "Khanh Nguyen and Trung Le and Vinh Lai and Duy Nguyen and Dat TRAN and Wanli MA",
    year = "2015",
    doi = "10.1109/ijcnn.2015.7280575",
    language = "English",
    isbn = "9781479919611",
    volume = "1",
    pages = "2057--2065",
    editor = "Amir Hussain",
    booktitle = "2015 International joint conference on neural networks (IJCNN)",
    publisher = "IEEE",

    }
