A DNN Optimizer that Improves over AdaBelief by Suppression of the Adaptive Stepsize Range
Zhang, G; Niwa, K; Kleijn, WB
Date: 5 September 2023
Article
Journal
Transactions on Machine Learning Research (TMLR)
Publisher
Journal of Machine Learning Research Inc.
Abstract
We make contributions towards improving adaptive-optimizer performance. Our improvements are
based on suppression of the range of adaptive stepsizes in the AdaBelief optimizer. Firstly, we
show that the particular placement of the parameter ϵ within the update expressions of AdaBelief
reduces the range of the adaptive stepsizes, making AdaBelief closer to SGD with momentum.
Secondly, we extend AdaBelief by further suppressing the range of the adaptive stepsizes. To achieve
the above goal, we perform mutual layerwise vector projections between the gradient gt and its
first momentum mt before using them to estimate the second momentum. The new optimization
method is referred to as Aida. Thirdly, extensive experimental results show that Aida outperforms
nine optimizers when training transformers and LSTMs for NLP, and VGG and ResNet for image
classification on CIFAR10 and CIFAR100, while matching the best performance of the nine methods
when training WGAN-GP models for image generation tasks. Furthermore, Aida produces higher
validation accuracies than AdaBelief for training ResNet18 over ImageNet. Our implementation is
available at https://github.com/guoqiang-zhang-x/Aida-Optimizer
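The mutual layerwise projection described in the abstract can be read as follows: within each layer, the first momentum mt is projected onto the direction of the gradient gt and vice versa, so that only the components the two vectors share survive before the second momentum is estimated from their difference. The sketch below is an illustrative reading of that idea only; the function names, the projection count `k`, and `eps` are assumptions, not the authors' code (see the linked repository for the actual implementation).

```python
def dot(a, b):
    """Inner product of two flat parameter vectors."""
    return sum(x * y for x, y in zip(a, b))

def mutual_project(g, m, k=1, eps=1e-20):
    """Hypothetical sketch of mutual layerwise vector projection.

    Alternately projects m onto the direction of g and g onto the
    (updated) direction of m, k times per layer. The projected vectors
    become collinear, so the difference (g - m) used for the AdaBelief-
    style second momentum shrinks, suppressing the range of the
    adaptive stepsizes.
    """
    for _ in range(k):
        # m <- (<m, g> / <g, g>) g : component of m along g
        scale_m = dot(m, g) / (dot(g, g) + eps)
        m = [scale_m * x for x in g]
        # g <- (<g, m> / <m, m>) m : component of g along updated m
        scale_g = dot(g, m) / (dot(m, m) + eps)
        g = [scale_g * x for x in m]
    return g, m

# The projected pair would then feed an AdaBelief-style update, e.g.
# v_t = beta2 * v_{t-1} + (1 - beta2) * (g_proj - m_proj)**2,
# with the adaptive stepsize proportional to 1 / sqrt(v_t).
```

For example, with g = [1, 0] and m = [1, 1], one round of mutual projection discards the component of m orthogonal to g, leaving both vectors equal to [1, 0] and their difference zero.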
Computer Science
Faculty of Environment, Science and Economy
Except where otherwise noted, this item's licence is described as © 2024. Open access under the Creative Commons Attribution 4.0 International (CC BY 4.0) licence