Best Fitting Hyperplanes for Classification


IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, vol. 39, no. 6, pp. 1076-1088, 2017 (SCI-Expanded)

  • Publication Type: Article
  • Volume: 39 Issue: 6
  • Publication Date: 2017
  • DOI: 10.1109/TPAMI.2016.2587647
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus
  • Page Numbers: pp.1076-1088
  • Keywords: Best fitting hyperplane classifier, open set recognition, large margin classifier, kernel methods, support vector machines
  • Eskisehir Osmangazi University Affiliated: Yes


In this paper, we propose novel methods that are more suitable than classical large-margin classifiers for open set recognition and object detection tasks. The proposed methods use the best fitting hyperplanes approach, and the main idea is to find the best fitting hyperplanes such that each hyperplane is close to the samples of one of the classes and as far as possible from the samples of the other classes. To this end, we propose two different classifiers: the first classifier solves a convex quadratic optimization problem, but it restricts the negative samples to one side of the best fitting hyperplane. The second classifier, however, allows the negative samples to lie on both sides of the fitting hyperplane by using the concave-convex procedure. Both methods are extended to the nonlinear case by using the kernel trick. In contrast to the existing hyperplane fitting classifiers in the literature, our proposed methods are suitable for large-scale problems, and they return sparse solutions. Experiments on several databases show that the proposed methods typically outperform other hyperplane fitting classifiers, and they perform as well as the SVM classifier in classical recognition tasks. However, the proposed methods significantly outperform SVM in open set recognition and object detection tasks.
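To make the general idea concrete, here is a toy 2-D sketch of hyperplane fitting, assuming a simple squared-distance loss on the positive class and a hinge-style margin penalty on the negatives; this illustrative formulation and the gradient-descent solver are our own simplification, not the convex QP or concave-convex procedure proposed in the paper. Note that, like the paper's second classifier, it lets negatives fall on either side of the plane.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: positives lie near the line y = 0 (the "best fitting" hyperplane),
# negatives are scattered on BOTH sides of it, at distance 1-2.
X_pos = np.column_stack([rng.uniform(-2, 2, 60), rng.normal(0.0, 0.1, 60)])
X_neg = np.column_stack([rng.uniform(-2, 2, 60),
                         rng.choice([-1, 1], 60) * rng.uniform(1.0, 2.0, 60)])

def fit_hyperplane(X_pos, X_neg, C=1.0, lr=0.05, epochs=500):
    """Illustrative loss: pull the plane toward the positives (squared signed
    distance) while pushing negatives at least unit distance away on either
    side (squared hinge on 1 - |distance|)."""
    w = rng.normal(size=X_pos.shape[1])
    b = 0.0
    for _ in range(epochs):
        w /= np.linalg.norm(w)            # keep ||w|| = 1 so w.x + b is a distance
        sp = X_pos @ w + b                # signed distances of positives
        sn = X_neg @ w + b                # signed distances of negatives
        viol = np.maximum(0.0, 1.0 - np.abs(sn))   # negatives inside the margin
        gw = 2 * sp @ X_pos / len(sp) \
             - 2 * C * (viol * np.sign(sn)) @ X_neg / len(sn)
        gb = 2 * sp.mean() - 2 * C * (viol * np.sign(sn)).mean()
        w -= lr * gw
        b -= lr * gb
    n = np.linalg.norm(w)
    return w / n, b / n

w, b = fit_hyperplane(X_pos, X_neg)
# Positives should end up much closer to the fitted plane than negatives.
print(np.abs(X_pos @ w + b).mean(), np.abs(X_neg @ w + b).mean())
```

An open-set flavor falls out naturally: a test point is accepted as the positive class only if its distance |w.x + b| to the fitted hyperplane is below a threshold, so samples from unseen classes far from every class's hyperplane can be rejected.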