Large-scale robust transductive support vector machines


NEUROCOMPUTING, vol.235, pp.199-209, 2017 (SCI-Expanded)

  • Publication Type: Article
  • Volume: 235
  • Publication Date: 2017
  • Doi Number: 10.1016/j.neucom.2017.01.012
  • Journal Name: NEUROCOMPUTING
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus
  • Page Numbers: pp.199-209
  • Keywords: Transductive support vector machines, Classification, Large-margin classifier, Optimization, SEMISUPERVISED CLASSIFICATION, OPTIMIZATION TECHNIQUES, HYPERSPECTRAL IMAGERY, FRAMEWORK, SVMS
  • Eskisehir Osmangazi University Affiliated: Yes


In this paper, we propose a robust and fast transductive support vector machine (RTSVM) classifier that can be applied to large-scale data. To this end, we use the robust Ramp loss instead of the Hinge loss for labeled data samples. The resulting optimization problem is non-convex, but it can be decomposed into convex and concave parts. The optimization is therefore accomplished iteratively by solving a sequence of convex problems via the concave-convex procedure (CCCP). Stochastic gradient (SG) descent is used to solve the convex problem at each iteration, so the proposed method scales well with large training set sizes in the linear case (to the best of our knowledge, it is only the second transductive classification method that is practical for more than a million data samples). To extend the proposed method to the nonlinear case, we propose two alternatives, one based on the primal optimization problem and the other on the dual. In contrast to the linear case, however, neither alternative scales well to large-scale data. Experimental results show that the proposed method achieves accuracy comparable to related transductive SVM methods, yet it is faster than other transductive learning methods and more robust to noisy data.
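The core ideas in the abstract, the Ramp loss as a difference of two Hinge losses and a CCCP outer loop whose convex subproblems are solved by stochastic (sub)gradient descent, can be sketched as follows for the supervised linear case. This is a minimal illustrative sketch, not the paper's implementation: the function names, parameter values (`C`, `s`, learning rate, iteration counts), and the simple per-sample regularizer update are all assumptions, and the transductive handling of unlabeled samples is omitted.

```python
import numpy as np

def hinge(z, a=1.0):
    """Hinge loss H_a(z) = max(0, a - z)."""
    return np.maximum(0.0, a - z)

def ramp(z, s=-1.0):
    """Ramp loss R_s(z) = H_1(z) - H_s(z), s < 1: a hinge clipped at 1 - s,
    which bounds the penalty on badly misclassified (noisy) samples."""
    return hinge(z, 1.0) - hinge(z, s)

def rtsvm_linear_sketch(X, y, C=1.0, s=-1.0, outer_iters=5, epochs=10,
                        lr=0.01, seed=0):
    """CCCP outer loop + SGD inner solver for a linear Ramp-loss SVM.
    Illustrative sketch only; hyperparameter choices are assumptions."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(outer_iters):
        # CCCP step: linearize the concave part -C * H_s at the current
        # solution; delta_i = C where the margin falls below s, else 0.
        margins = y * (X @ w + b)
        delta = np.where(margins < s, C, 0.0)
        for _ in range(epochs):
            for i in rng.permutation(n):
                m = y[i] * (X[i] @ w + b)
                g_w, g_b = w.copy(), 0.0          # regularizer subgradient
                if m < 1.0:                        # convex hinge part active
                    g_w -= C * y[i] * X[i]
                    g_b -= C * y[i]
                g_w += delta[i] * y[i] * X[i]      # linearized concave part
                g_b += delta[i] * y[i]
                w -= lr * g_w
                b -= lr * g_b
    return w, b
```

Because each inner update touches a single sample, the cost per epoch is linear in the training set size, which is what makes the SG-based linear variant practical at large scale.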