Fast online algorithm for nonlinear support vector machines and other alike models



Abstract

The paper presents a novel online learning algorithm for eight popular nonlinear (i.e., kernel) classifiers, based on classic stochastic gradient descent in the primal domain. In particular, the online learning algorithm is derived for the following classifiers: L1 and L2 support vector machines with both the quadratic regularizer wᵀw and the l1 regularizer ‖w‖₁; the regularized huberized hinge loss; regularized kernel logistic regression; the regularized exponential loss with the l1 regularizer ‖w‖₁; and least squares support vector machines. The online learning algorithm is aimed primarily at designing classifiers for large datasets. The novel learning model is accurate, fast, and extremely simple (i.e., it comprises only a few lines of code). Comparisons of the performance of the proposed algorithm with a state-of-the-art support vector machine algorithm on several real datasets are presented.
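To make the idea concrete, the following is a minimal sketch of primal stochastic gradient descent for one of the listed models, the L1 (hinge-loss) SVM with the quadratic regularizer wᵀw, written over a kernel expansion. It is not the paper's exact algorithm: the Gaussian kernel, the learning-rate schedule eta_t = 1/(lambda*t), and the function names online_kernel_svm and predict are illustrative assumptions in the spirit of Pegasos/NORMA-style online kernel learning.

# Hedged sketch: primal SGD for a hinge-loss SVM with quadratic regularization,
# maintained as a kernel expansion f(x) = sum_j alpha_j * y_j * k(x_j, x).
# This is an illustration, not the paper's exact OL SVM algorithm; the Gaussian
# kernel and the step size eta_t = 1/(lambda*t) are assumptions.
import numpy as np

def gaussian_kernel(x, z, gamma=1.0):
    """RBF kernel k(x, z) = exp(-gamma * ||x - z||^2)."""
    return np.exp(-gamma * np.sum((x - z) ** 2))

def online_kernel_svm(X, y, lam=0.01, gamma=1.0, epochs=1):
    """Online primal SGD over (X, y) with labels y in {-1, +1}."""
    n = X.shape[0]
    alpha = np.zeros(n)   # one expansion coefficient per training point
    t = 0
    for _ in range(epochs):
        for i in np.random.permutation(n):
            t += 1
            # Decision value f(x_i) from the current (sparse) expansion.
            f_i = sum(alpha[j] * y[j] * gaussian_kernel(X[j], X[i], gamma)
                      for j in np.nonzero(alpha)[0])
            # Regularizer shrinks all coefficients; with eta_t = 1/(lam*t)
            # the shrinkage factor (1 - eta_t*lam) equals (1 - 1/t).
            alpha *= (1.0 - 1.0 / t)
            # A margin violation (y_i * f(x_i) < 1) grows the coefficient
            # of the current example by the step size eta_t.
            if y[i] * f_i < 1.0:
                alpha[i] += 1.0 / (lam * t)
    return alpha

def predict(X_train, y_train, alpha, X_test, gamma=1.0):
    """Sign of the learned kernel expansion on new points."""
    scores = np.array([
        sum(alpha[j] * y_train[j] * gaussian_kernel(X_train[j], x, gamma)
            for j in np.nonzero(alpha)[0])
        for x in X_test
    ])
    return np.sign(scores)

Because the model is kept as a kernel expansion over the training points, each step touches only a uniform shrinkage of the coefficients plus, on a margin violation, the coefficient of the current example, which is what keeps such an online update to a few lines of code.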

About the authors

Vojislav Kecman

Computer Science Department

Author for correspondence.
Email: vkecman@vcu.edu
Richmond, VA, United States


Copyright (c) 2016 Allerton Press, Inc.