Fuzzy Kernel Stochastic Gradient Descent Machines

Tuan Nguyen, Phuong Duong, Trung Le, Anh Le, Viet Ngo, Dat Tran, Wanli Ma

Research output: A Conference proceeding or a Chapter in Book › Conference contribution


Abstract

Stochastic Gradient Descent (SGD)-based methods offer a viable solution for training on large-scale datasets. However, traditional SGD-based methods cannot benefit from the distribution or geometry information carried in the data, because they sample the next data point for updating the model from a uniform distribution over the entire training set. We address this issue by incorporating the distribution or geometry information carried in the data into the sampling procedure. In particular, we utilize fuzzy-membership evaluation methods, which transfer the distribution or geometry information carried in the data into fuzzy memberships. The fuzzy memberships are then normalized to a discrete distribution from which the next data point is sampled. This keeps the training focused on the important data points and tends to ignore the less influential ones, e.g., noise and outliers. We validate the proposed methods on 8 benchmark datasets. The experimental results show that the proposed methods are comparable with the standard SGD-based method in training time while offering a significant improvement in classification accuracy.
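
To make the sampling idea concrete, here is a minimal sketch (not the authors' implementation): fuzzy memberships are computed with a common distance-to-class-mean heuristic, normalized into a discrete distribution, and then used in place of uniform sampling inside a kernelized, Pegasos-style SGD loop with an RBF kernel. The membership function, kernel, base learner, and all names below are illustrative assumptions; the paper's fuzzy-membership evaluation methods may differ.

import numpy as np

def fuzzy_memberships(X, y, eps=1e-3):
    # Membership in (0, 1]: points near their class mean score higher,
    # so noise and outliers receive smaller sampling weight.
    # (One common heuristic; the paper may use other evaluation methods.)
    m = np.empty(len(X))
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        center = X[idx].mean(axis=0)
        d = np.linalg.norm(X[idx] - center, axis=1)
        m[idx] = 1.0 - d / (d.max() + eps)   # linear decay with distance
    return np.clip(m, eps, 1.0)

def rbf(x, X, gamma=0.5):
    # RBF kernel values K(x, x_j) for every training point x_j.
    return np.exp(-gamma * ((X - x) ** 2).sum(axis=1))

def fuzzy_kernel_sgd(X, y, lam=0.01, T=2000, gamma=0.5, seed=0):
    # Kernelized Pegasos-style SGD for a hinge-loss SVM; y holds labels
    # in {-1, +1}. The only change from the standard algorithm is the
    # non-uniform sampling distribution p.
    rng = np.random.default_rng(seed)
    p = fuzzy_memberships(X, y)
    p = p / p.sum()                  # normalize memberships to a discrete distribution
    alpha = np.zeros(len(X))         # per-example update counters (dual form)
    for t in range(1, T + 1):
        i = rng.choice(len(X), p=p)  # fuzzy-weighted sampling, not uniform
        f_i = (alpha * y) @ rbf(X[i], X, gamma) / (lam * t)
        if y[i] * f_i < 1.0:         # hinge loss active: take a step
            alpha[i] += 1.0
    return alpha

def predict(alpha, X_train, y_train, x, lam=0.01, T=2000, gamma=0.5):
    # Sign of the kernel expansion at a new point x.
    return np.sign((alpha * y_train) @ rbf(x, X_train, gamma) / (lam * T))

In this sketch, alpha = fuzzy_kernel_sgd(X_train, y_train) trains the model and predict scores a new point; examples far from their class mean get low membership and therefore contribute few updates, which is how the method tends to ignore noise and outliers.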
Original language: English
Title of host publication: 2016 International Joint Conference on Neural Networks (IJCNN)
Editors: Hussein A. Abbass, Huanhuan Chen
Place of Publication: United States of America
Publisher: IEEE, Institute of Electrical and Electronics Engineers
Pages: 3226-3232
Number of pages: 7
Volume: 1
ISBN (Electronic): 9781509006205
ISBN (Print): 9781509006212
DOI: 10.1109/IJCNN.2016.7727611
Keywords: fuzzy-support-vector-machine, stochastic-gradient-descent, fuzzy-membership
Publication status: Published - 2016
Event: 2016 International Joint Conference on Neural Networks (IJCNN), Vancouver, Canada
Duration: 24 Jul 2016 to 29 Jul 2016

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks
Volume: 2016-October

Conference

Conference: 2016 International Joint Conference on Neural Networks (IJCNN)
Abbreviated title: IJCNN 2016
Country: Canada
City: Vancouver
Period: 24/07/16 to 29/07/16


Cite this

Nguyen, T., Duong, P., Le, T., Le, A., Ngo, V., Tran, D., & Ma, W. (2016). Fuzzy Kernel Stochastic Gradient Descent Machines. In H. A. Abbass, & H. Chen (Eds.), 2016 International Joint Conference on Neural Networks (IJCNN) (Vol. 1, pp. 3226-3232). [7727611] (Proceedings of the International Joint Conference on Neural Networks; Vol. 2016-October). United States of America: IEEE, Institute of Electrical and Electronics Engineers. https://doi.org/10.1109/IJCNN.2016.7727611