ExpTamed: An exponential tamed optimizer based on Langevin SDEs


Erdoğan U., Işık Ş., Anagün Y., Lord G.

Neurocomputing, vol.651, 2025 (SCI-Expanded, Scopus)

  • Publication Type: Article
  • Volume: 651
  • Publication Date: 2025
  • DOI: 10.1016/j.neucom.2025.130949
  • Journal Name: Neurocomputing
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Academic Search Premier, PASCAL, Applied Science & Technology Source, Biotechnology Research Abstracts, Compendex, Computer & Applied Sciences, INSPEC, zbMATH
  • Keywords: Convolutional neural network, Deep learning, Langevin dynamics, Stochastic optimizers
  • Eskisehir Osmangazi University Affiliated: Yes

Abstract

This study presents a new method for improving optimization in deep learning by regularizing gradients, based on a novel taming strategy that regulates the growth of numerical solutions of stochastic differential equations. The resulting optimizer, ExpTamed, enhances stability and reduces the mean-square error over a short time horizon compared with existing techniques. Its practical effectiveness is rigorously evaluated on CIFAR-10, Tiny-ImageNet, and Caltech256 across diverse architectures. In direct comparisons with prominent optimizers such as Adam, ExpTamed achieves significant performance gains: increases in best top-1 test accuracy ranging from 0.86 to 2.76 percentage points on CIFAR-10, and up to 4.46 percentage points on Tiny-ImageNet (without a learning-rate schedule). On Caltech256, ExpTamed also yields superior accuracy, precision, and Kappa metrics. These results quantify ExpTamed's capability to deliver enhanced performance in practical deep learning applications.
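To illustrate the general idea of taming in a Langevin-type update, the sketch below applies an exponential damping factor to the gradient before a noisy gradient step. This is a minimal illustration under stated assumptions: the specific taming function `exp(-lr * ||grad||)`, the step-size and noise parameters, and the function name are all hypothetical and are not taken from the paper; ExpTamed's actual update rule may differ.

```python
import numpy as np

def exp_tamed_langevin_step(theta, grad, lr=1e-2, beta=1e4, rng=None):
    """One illustrative Langevin-type update with exponential gradient taming.

    Assumed taming factor exp(-lr * ||grad||): it shrinks large gradients,
    so the drift increment lr * ||grad|| * exp(-lr * ||grad||) is bounded
    by 1/e no matter how large the raw gradient is. This bounds the step
    and prevents the blow-up that an untamed Euler-Maruyama scheme can
    exhibit when the drift grows superlinearly.
    """
    rng = np.random.default_rng() if rng is None else rng
    tamed_grad = grad * np.exp(-lr * np.linalg.norm(grad))
    # Langevin diffusion term: Gaussian noise scaled by sqrt(2*lr/beta),
    # where beta plays the role of an inverse temperature.
    noise = np.sqrt(2.0 * lr / beta) * rng.standard_normal(theta.shape)
    return theta - lr * tamed_grad + noise
```

Even for an extreme gradient (say norm 1e8), the drift part of the update stays bounded, whereas a plain step `theta - lr * grad` would move by 1e6 at the same learning rate; that boundedness is the essential property a taming strategy provides.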