A DNN Optimizer that Improves over AdaBelief by Suppression of the Adaptive Stepsize Range
dc.contributor.author | Zhang, G | |
dc.contributor.author | Niwa, K | |
dc.contributor.author | Kleijn, WB | |
dc.date.accessioned | 2024-08-29T11:16:27Z | |
dc.date.issued | 2023-09-05 | |
dc.date.updated | 2024-08-29T09:38:19Z | |
dc.description.abstract | We make contributions towards improving adaptive-optimizer performance. Our improvements are based on suppression of the range of adaptive stepsizes in the AdaBelief optimizer. Firstly, we show that the particular placement of the parameter ϵ within the update expressions of AdaBelief reduces the range of the adaptive stepsizes, making AdaBelief closer to SGD with momentum. Secondly, we extend AdaBelief by further suppressing the range of the adaptive stepsizes. To achieve this, we perform mutual layerwise vector projections between the gradient gt and its first momentum mt before using them to estimate the second momentum. The new optimization method is referred to as Aida. Thirdly, extensive experimental results show that Aida outperforms nine optimizers when training transformers and LSTMs for NLP, and VGG and ResNet for image classification over CIFAR10 and CIFAR100, while matching the best performance of the nine methods when training WGAN-GP models for image generation. Furthermore, Aida produces higher validation accuracy than AdaBelief when training ResNet18 over ImageNet. Our implementation is available at https://github.com/guoqiang-zhang-x/Aida-Optimizer | en_GB |
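Below is a minimal, illustrative sketch of the mutual layerwise vector projection described in the abstract, written in plain NumPy. The function name, the single-iteration default, and the eps safeguard are assumptions made for illustration only; the authors' reference implementation is available at the GitHub link given above.

    import numpy as np

    def mutual_projection(g, m, n_iters=1, eps=1e-20):
        # Alternately project the layerwise first momentum m and gradient g
        # onto each other, shrinking both toward their shared direction.
        # n_iters and eps are illustrative choices, not values from the paper.
        for _ in range(n_iters):
            m_proj = (np.dot(m, g) / (np.dot(g, g) + eps)) * g               # project m onto g
            g_proj = (np.dot(g, m_proj) / (np.dot(m_proj, m_proj) + eps)) * m_proj  # project g onto the new m
            m, g = m_proj, g_proj
        return g, m

Under this reading, the projected pair (g, m) would replace the raw (g_t, m_t) when forming the AdaBelief-style second-momentum estimate based on (m_t - g_t)^2, which is how the suppression of the adaptive-stepsize range described in the abstract would be realised; consult the linked repository for the exact update rule.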
dc.description.sponsorship | Nippon Telegraph and Telephone Corporation | en_GB |
dc.identifier.uri | http://hdl.handle.net/10871/137281 | |
dc.language.iso | en | en_GB |
dc.publisher | Journal of Machine Learning Research Inc. | en_GB |
dc.relation.url | https://jmlr.org/tmlr/papers/ | en_GB |
dc.relation.url | https://openreview.net/forum?id=VI2JjIfU37 | en_GB |
dc.relation.url | https://github.com/guoqiang-zhang-x/Aida-Optimizer | en_GB |
dc.rights | © 2024. Open access under the Creative Commons Attribution 4.0 International (CC BY 4.0) licence | en_GB |
dc.title | A DNN Optimizer that Improves over AdaBelief by Suppression of the Adaptive Stepsize Range | en_GB |
dc.type | Article | en_GB |
dc.date.available | 2024-08-29T11:16:27Z | |
dc.identifier.issn | 2835-8856 | |
dc.description | This is the final version. Available from Transactions on Machine Learning Research via the link in this record | en_GB |
dc.identifier.journal | Transactions on Machine Learning Research (TMLR) | en_GB |
dc.rights.uri | https://creativecommons.org/licenses/by/4.0/ | en_GB |
dcterms.dateSubmitted | 2023-03-13 | |
rioxxterms.version | VoR | en_GB |
rioxxterms.licenseref.startdate | 2023-09-05 | |
rioxxterms.type | Journal Article/Review | en_GB |
refterms.dateFCD | 2024-08-29T09:38:22Z | |
refterms.versionFCD | AM | |
refterms.panel | B | en_GB |