Semi-supervised robust deep neural networks for multi-label image classification


Çevikalp H., Benligiray B., Gerek O. N.

PATTERN RECOGNITION, vol.100, 2020 (SCI-Expanded)

  • Publication Type: Article / Full Article
  • Volume: 100
  • Publication Date: 2020
  • DOI: 10.1016/j.patcog.2019.107164
  • Journal Name: PATTERN RECOGNITION
  • Indexed In: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Academic Search Premier, PASCAL, Applied Science & Technology Source, BIOSIS, Compendex, Computer & Applied Sciences, INSPEC, MLA - Modern Language Association Database, zbMATH
  • Keywords: Multi-label classification, Semi-supervised learning, Ramp loss, Image classification, Deep learning
  • Eskişehir Osmangazi University Affiliated: Yes

Abstract

This paper introduces a robust method for semi-supervised training of deep neural networks for multi-label image classification. To this end, a ramp loss is utilized, since it is more robust against noisy and incomplete image labels than the classic hinge loss. The proposed method allows for learning from both labeled and unlabeled data in a semi-supervised setting. This is achieved by propagating labels from the labeled images to their unlabeled neighbors in the feature space. Using a robust loss function becomes crucial here, as the initial label propagations may include many errors, which degrade the performance of non-robust loss functions. In contrast, the proposed robust ramp loss caps the extreme penalties incurred by samples with incorrect labels, so the label assignments improve in each iteration and contribute to the learning process. The proposed method achieves state-of-the-art results in semi-supervised learning experiments on the CIFAR-10 and STL-10 datasets, and results comparable to the state-of-the-art in supervised learning experiments on the NUS-WIDE and MS-COCO datasets. Experimental results also verify that the proposed method is more robust against noisy image labels, as expected. © 2019 Elsevier Ltd. All rights reserved.
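The abstract's central idea is that the hinge loss grows without bound for badly misclassified samples, so a single mislabeled image can dominate the gradient, whereas the ramp loss clips that penalty at a constant. The sketch below is a minimal illustration in PyTorch, assuming per-class labels in {-1, +1} and the common clipped-hinge formulation R_s(z) = min(1 - s, max(0, 1 - z)); the function name, signature, and the clipping parameter s are hypothetical choices, not the paper's exact formulation.

    import torch

    def ramp_loss(scores, labels, s=-1.0):
        # scores: raw classifier outputs, labels: {-1, +1} per class.
        # Both the default s and the mean reduction are illustrative assumptions.
        margins = labels * scores
        hinge = torch.clamp(1.0 - margins, min=0.0)    # standard hinge loss
        return torch.clamp(hinge, max=1.0 - s).mean()  # cap the penalty at 1 - s

Because the penalty saturates at 1 - s, a sample whose label was propagated incorrectly contributes at most a bounded term to the loss, which is what makes the iterative label assignment tolerable.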
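The propagation step itself, assigning tentative labels to unlabeled images from their labeled neighbors in the feature space, could look roughly like the following NumPy sketch. The neighborhood size k, the Euclidean metric, and the per-class majority vote are all assumptions made for illustration; the paper's actual propagation rule may differ.

    import numpy as np

    def propagate_labels(feats_l, y_l, feats_u, k=5):
        # feats_l: (n_l, d) labeled features, y_l: (n_l, n_classes) in {-1, +1},
        # feats_u: (n_u, d) unlabeled features. All names are hypothetical.
        d = np.linalg.norm(feats_u[:, None, :] - feats_l[None, :, :], axis=-1)
        nn = np.argsort(d, axis=1)[:, :k]         # k nearest labeled samples
        votes = y_l[nn].mean(axis=1)              # average label vector over neighbors
        return np.where(votes >= 0.0, 1.0, -1.0)  # majority vote per class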