Whitened Gradient Descent, a New Updating Method for Optimizers in Deep Neural Networks
   
Authors: Hossein Gholamalinejad, Hossein Khosravi
Source: Journal of AI and Data Mining, 2022, Vol. 10, No. 4, pp. 467-477
Abstract: Optimizers are vital components of deep neural networks that perform the weight updates. This paper introduces a new updating method for gradient-descent-based optimizers, called Whitened Gradient Descent (WGD). The method is easy to implement, can be used in any optimizer based on the gradient descent algorithm, and does not significantly increase the network's training time. It smooths the training curve and improves the classification metrics. To evaluate the proposed algorithm, we performed 48 different tests on two datasets, CIFAR-100 and Animals-10, using three network structures: DenseNet121, ResNet18, and ResNet50. The experiments show that using WGD in gradient-descent-based optimizers significantly improves the classification results. For example, integrating WGD into the RAdam optimizer increases the accuracy of DenseNet from 87.69% to 90.02% on the Animals-10 dataset.
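The abstract presents WGD as a drop-in modification for any gradient-descent-based optimizer but does not reproduce the update rule itself. The sketch below illustrates, hypothetically, what a gradient-whitening step plugged into a PyTorch optimizer could look like: each gradient tensor is standardized to zero mean and unit variance before a vanilla SGD step. The class name WhitenedSGD, the eps parameter, and the per-tensor standardization are assumptions for illustration only, not the authors' published formula.

```python
import torch

class WhitenedSGD(torch.optim.Optimizer):
    """Hypothetical sketch: SGD whose gradients are standardized
    ("whitened") per tensor before each update. Not the paper's WGD rule."""

    def __init__(self, params, lr=0.01, eps=1e-8):
        defaults = dict(lr=lr, eps=eps)
        super().__init__(params, defaults)

    @torch.no_grad()
    def step(self, closure=None):
        loss = None
        if closure is not None:
            # Re-enable autograd so the closure can recompute the loss.
            with torch.enable_grad():
                loss = closure()
        for group in self.param_groups:
            for p in group["params"]:
                if p.grad is None:
                    continue
                g = p.grad
                # Standardize the gradient tensor to zero mean and unit
                # variance; eps guards against division by zero.
                g = (g - g.mean()) / (g.std(unbiased=False) + group["eps"])
                # Plain SGD update with the whitened gradient.
                p.add_(g, alpha=-group["lr"])
        return loss

# Usage (hypothetical): optimizer = WhitenedSGD(model.parameters(), lr=0.01)
```

The paper reports integrating WGD into optimizers such as RAdam; this sketch wraps vanilla SGD only to keep the example short.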
Keywords: deep learning, optimizer, whitened gradient descent, momentum
Affiliations: Bozorgmehr University of Qaenat, Faculty of Engineering, Department of Computer Engineering, Iran; Shahrood University of Technology, Faculty of Electrical Engineering, Shahrood, Iran
Email: hosseinkhosravi@shahroodut.ac.ir