Discriminative common vector method with kernels


Çevikalp H., Neamtu M., Wilkes M.

IEEE TRANSACTIONS ON NEURAL NETWORKS, vol.17, no.6, pp.1550-1565, 2006 (SCI-Expanded)

  • Publication Type: Article / Full Article
  • Volume: 17 Issue: 6
  • Publication Date: 2006
  • DOI: 10.1109/tnn.2006.881485
  • Journal Name: IEEE TRANSACTIONS ON NEURAL NETWORKS
  • Indexed in: Science Citation Index Expanded (SCI-EXPANDED), Scopus
  • Pages: pp.1550-1565
  • Keywords: discriminative common vectors, feature extraction, Fisher's linear discriminant analysis, kernel methods, small sample size, recognition
  • Affiliated with Eskişehir Osmangazi University: No

Abstract

In some pattern recognition tasks, the dimension of the sample space is larger than the number of samples in the training set. This is known as the "small sample size problem." Linear discriminant analysis (LDA) techniques cannot be applied directly in the small sample size case, and the same problem arises when kernel approaches are used for recognition. In this paper, we attempt to answer the question of how one should choose the optimal projection vectors for feature extraction in the small sample size case. Based on our findings, we propose a new method called the kernel discriminative common vector method. In this method, we first nonlinearly map the original input space to an implicit higher-dimensional feature space, in which the data are hoped to be linearly separable. The optimal projection vectors are then computed in this transformed space. The proposed method yields an optimal solution for maximizing the modified Fisher's linear discriminant criterion discussed in the paper; thus, under certain conditions, a 100% recognition rate is guaranteed for the training set samples. Experiments on test data also show that, in many situations, the generalization performance of the proposed method compares favorably with that of other kernel approaches.
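
Although the paper's contribution is the kernelized algorithm, the underlying "common vector" step is easy to illustrate in its linear form. The Python sketch below is a minimal, hypothetical reconstruction of that step under a small-sample-size assumption; the function names (dcv_fit, dcv_classify) and the eigenvalue tolerance are illustrative choices of this sketch, not the authors' implementation. The kernel version of the paper performs the same construction implicitly in the kernel-induced feature space.

```python
import numpy as np

def dcv_fit(X, y, tol=1e-8):
    """Illustrative sketch: project training samples onto the null space
    of the within-class scatter S_w. There, all samples of a class
    collapse to one point, the class's "common vector"."""
    classes = np.unique(y)
    d = X.shape[1]
    Sw = np.zeros((d, d))                      # within-class scatter S_w
    for c in classes:
        Xc = X[y == c]
        centered = Xc - Xc.mean(axis=0)
        Sw += centered.T @ centered
    eigvals, eigvecs = np.linalg.eigh(Sw)      # S_w is symmetric PSD
    U = eigvecs[:, eigvals < tol]              # orthonormal basis of null(S_w);
                                               # nontrivial when n < d
    # Common vector of each class: project any one class sample onto null(S_w)
    common = {c: U @ (U.T @ X[y == c][0]) for c in classes}
    return U, common

def dcv_classify(x, U, common):
    """Assign x to the class whose common vector is nearest after projection."""
    p = U @ (U.T @ x)
    return min(common, key=lambda c: np.linalg.norm(p - common[c]))

# Toy usage: 6 samples in a 100-dimensional space (n << d, so null(S_w) != {0})
rng = np.random.default_rng(0)
X = rng.standard_normal((6, 100))
y = np.array([0, 0, 0, 1, 1, 1])
U, common = dcv_fit(X, y)
print([dcv_classify(x, U, common) for x in X])  # recovers [0, 0, 0, 1, 1, 1]
```

In this setup each training sample projects exactly onto its own class's common vector, so the nearest-common-vector rule classifies the training set perfectly, mirroring the abstract's claim of a guaranteed 100% recognition rate on the training set under the stated conditions.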