Large-scale robust transductive support vector machines


ÇEVİKALP H., Franc V.

NEUROCOMPUTING, vol.235, pp.199-209, 2017 (SCI-Expanded)

  • Publication Type: Article / Full Article
  • Volume: 235
  • Publication Date: 2017
  • DOI: 10.1016/j.neucom.2017.01.012
  • Journal Name: NEUROCOMPUTING
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus
  • Page Numbers: pp.199-209
  • Keywords: Transductive support vector machines, Classification, Large-margin classifier, Optimization, SEMISUPERVISED CLASSIFICATION, OPTIMIZATION TECHNIQUES, HYPERSPECTRAL IMAGERY, FRAMEWORK, SVMS
  • Affiliated with Eskişehir Osmangazi University: Yes

Abstract

In this paper, we propose a robust and fast transductive support vector machine (RTSVM) classifier that can be applied to large-scale data. To this end, we use the robust Ramp loss instead of the Hinge loss for labeled data samples. The resulting optimization problem is non-convex, but it can be decomposed into convex and concave parts. The optimization is therefore accomplished iteratively by solving a sequence of convex problems, known as the concave-convex procedure. Stochastic gradient (SG) is used to solve the convex problem at each iteration, so the proposed method scales well with large training set sizes in the linear case (to the best of our knowledge, it is the second transductive classification method that is practical for more than a million data samples). To extend the proposed method to the nonlinear case, we propose two alternatives: one uses the primal optimization problem and the other uses the dual. In contrast to the linear case, however, neither alternative scales well to large-scale data. Experimental results show that the proposed method achieves results comparable to other related transductive SVM methods, while being faster than other transductive learning methods and more robust to noisy data.
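
The abstract outlines the key mechanism: the Ramp loss is split into a convex Hinge term and a concave term, the concave term is linearized at the current solution (concave-convex procedure), and each resulting convex subproblem is handled with stochastic gradient steps. The following is a minimal sketch of that idea for the linear case with labeled data only; it is not the authors' implementation, it omits the transductive treatment of unlabeled samples, and all names and hyper-parameter values (ramp_svm_sgd, s, lam, lr, etc.) are illustrative assumptions.

```python
"""Illustrative sketch (not the paper's code): CCCP + SGD for a linear
classifier with the Ramp loss R_s(z) = H_1(z) - H_s(z), H_a(z) = max(0, a - z),
on labeled data. The concave term -H_s is linearized at the current solution,
and the remaining convex hinge-type problem is solved by stochastic gradient."""
import numpy as np

def ramp_svm_sgd(X, y, s=-0.3, lam=1e-4, cccp_iters=5, epochs=5, lr=0.05, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(cccp_iters):
        # Outer CCCP step: fix the linearization of the concave part.
        # beta[i] = 1 flags samples whose margin is already below s; their
        # hinge subgradient is cancelled below, which flattens the loss (ramp).
        beta = (y * (X @ w + b) < s).astype(float)
        for _ in range(epochs):                      # inner convex problem via SGD
            for i in rng.permutation(n):
                z = y[i] * (X[i] @ w + b)
                hinge = -y[i] if z < 1.0 else 0.0    # subgradient of H_1
                gw = lam * w + (hinge + beta[i] * y[i]) * X[i]
                gb = hinge + beta[i] * y[i]
                w -= lr * gw
                b -= lr * gb
    return w, b

# Toy usage: two Gaussian blobs with a few flipped labels (label noise).
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(-2, 1, (200, 2)), rng.normal(2, 1, (200, 2))])
    y = np.hstack([-np.ones(200), np.ones(200)])
    y[rng.choice(400, 20, replace=False)] *= -1      # inject noisy labels
    w, b = ramp_svm_sgd(X, y)
    print("train accuracy:", np.mean(np.sign(X @ w + b) == y))
```

Note that for samples flagged by beta the hinge subgradient and the linearized concave term cancel, so badly mislabeled points stop pulling on the decision boundary; this is the source of the robustness to label noise that the abstract claims for the Ramp loss.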