ExpTamed: An exponential tamed optimizer based on Langevin SDEs


Erdoğan U., Işık Ş., Anagün Y., Lord G.

Neurocomputing, vol. 651, 2025 (SCI-Expanded)

  • Publication Type: Article / Full Article
  • Volume: 651
  • Publication Date: 2025
  • DOI: 10.1016/j.neucom.2025.130949
  • Journal Name: Neurocomputing
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Academic Search Premier, PASCAL, Applied Science & Technology Source, Biotechnology Research Abstracts, Compendex, Computer & Applied Sciences, INSPEC, zbMATH
  • Keywords: Convolutional neural network, Deep learning, Langevin dynamics, Stochastic optimizers
  • Affiliated with Eskişehir Osmangazi University: Yes

Abstract

This study presents a new optimization method for deep learning that regularizes gradients via a novel taming strategy, originally devised to control the growth of numerical solutions of stochastic differential equations. The resulting optimizer, ExpTamed, improves stability and reduces the mean-square error over a short time horizon compared with existing techniques. Its practical effectiveness is rigorously evaluated on CIFAR-10, Tiny-ImageNet, and Caltech256 across diverse architectures. In direct comparisons with prominent optimizers such as Adam, ExpTamed demonstrates significant performance gains: it achieved increases in best top-1 test accuracy ranging from 0.86 to 2.76 percentage points on CIFAR-10 and up to 4.46 percentage points on Tiny-ImageNet (without a learning-rate schedule). On Caltech256, ExpTamed also yielded superior accuracy, precision, and Kappa metrics. These results quantify ExpTamed's capability to deliver enhanced performance in practical deep learning applications.
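The exact ExpTamed update rule is given in the paper itself; as a rough, hedged illustration of the underlying idea, the sketch below implements a generic tamed-gradient step in the style of tamed Euler-Maruyama discretizations of Langevin SDEs. The exponential taming factor exp(-lr * |g|), the temperature parameter, and the class name ExpTamedSketch are assumptions made for illustration, not the authors' formulation.

```python
# A minimal, hypothetical sketch of an exponentially tamed Langevin-style
# optimizer step. The taming factor and noise term below are assumptions
# illustrating the general idea, NOT the exact ExpTamed update from the paper.
import math
import torch
from torch.optim import Optimizer


class ExpTamedSketch(Optimizer):
    """Tamed Euler-Maruyama-style SGD step with an exponential taming factor.

    Hypothetical illustration: each gradient is damped by exp(-lr * |g|),
    which leaves small gradients nearly unchanged while exponentially
    suppressing large ones, then an optional Langevin diffusion term is added.
    """

    def __init__(self, params, lr=1e-3, temperature=0.0):
        defaults = dict(lr=lr, temperature=temperature)
        super().__init__(params, defaults)

    @torch.no_grad()
    def step(self, closure=None):
        loss = closure() if closure is not None else None
        for group in self.param_groups:
            lr, temp = group["lr"], group["temperature"]
            for p in group["params"]:
                if p.grad is None:
                    continue
                g = p.grad
                # Exponential taming (assumed form): bounds the effective
                # step so that large gradients cannot blow up the iterate.
                tamed = g * torch.exp(-lr * g.abs())
                p.add_(tamed, alpha=-lr)
                if temp > 0:
                    # Langevin diffusion term: sqrt(2 * lr * T) * N(0, I).
                    noise = torch.randn_like(p) * math.sqrt(2.0 * lr * temp)
                    p.add_(noise)
        return loss
```

Usage mirrors any PyTorch optimizer: construct it with model.parameters(), call loss.backward(), then opt.step(). Setting temperature to zero reduces the sketch to a plain tamed-gradient descent step.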