Least square support vector machine for large scale dataset

Khanh Nguyen, Trung Le, Vinh Lai, Duy Nguyen, Dat TRAN, Wanli MA

Research output: A Conference proceeding or a Chapter in Book › Conference contribution › peer-review

5 Citations (Scopus)
1 Downloads (Pure)


Support Vector Machine (SVM) is a very well-known tool for classification and regression problems. Many applications require SVMs with non-linear kernels for accurate classification. Training time complexity for SVMs with non-linear kernels is typically quadratic in the size of the training dataset. In this paper, we depart from the well-known variation of SVM, the so-called Least Square Support Vector Machine, and apply the Steepest Sub-gradient Descent method to propose the Steepest Sub-gradient Descent Least Square Support Vector Machine (SGD-LSSVM). It is theoretically proven that the convergence rate of the proposed method to an ε-precision solution is O(log(1/ε)). Experiments conducted on large-scale datasets indicate that the proposed method offers comparable classification accuracy while being faster than the baselines.
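To illustrate the idea behind the abstract, the sketch below runs plain gradient descent on the primal Least Square SVM objective, (1/2)‖w‖² + (C/2) Σᵢ (1 − yᵢ(w·xᵢ + b))². This is a minimal linear-kernel illustration under assumed hyperparameters (`C`, `lr`, `epochs`), not the paper's SGD-LSSVM algorithm or its convergence-rate construction.

```python
import numpy as np

def lssvm_gd(X, y, C=1.0, lr=0.01, epochs=200):
    """Gradient descent on the primal LSSVM objective
    (1/2)||w||^2 + (C/2) * sum_i (1 - y_i (w.x_i + b))^2.
    Illustrative sketch only; hyperparameters are assumptions."""
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        # residuals of the LSSVM equality targets y_i (w.x_i + b) = 1
        err = 1.0 - y * (X @ w + b)
        # gradient of the objective w.r.t. w and b
        grad_w = w - C * (X.T @ (y * err))
        grad_b = -C * np.sum(y * err)
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# toy linearly separable data with labels in {-1, +1}
X = np.array([[2.0, 2.0], [1.5, 2.5], [-2.0, -2.0], [-1.0, -2.5]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w, b = lssvm_gd(X, y)
preds = np.sign(X @ w + b)
```

Because the LSSVM loss is a strongly convex quadratic, full-batch gradient descent with a small enough step size converges linearly, which is the same flavor of O(log(1/ε)) guarantee the abstract claims for the proposed sub-gradient scheme.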
Original language: English
Title of host publication: 2015 International Joint Conference on Neural Networks, IJCNN 2015
Editors: Amir Hussain
Place of Publication: USA
Number of pages: 9
ISBN (Electronic): 9781479919604
ISBN (Print): 9781479919611
Publication status: Published - 2015
Event: 2015 International Joint Conference on Neural Networks - Killarney, Ireland
Duration: 12 Jul 2015 - 17 Jul 2015

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks


Conference: 2015 International Joint Conference on Neural Networks
Abbreviated title: IJCNN 2015


