Paper Information

Journal: IRANIAN JOURNAL OF ELECTRICAL AND COMPUTER ENGINEERING (IJECE), Summer-Fall 2010, Volume 9, Number 2, Pages 127-133.
 
Paper: 

A COMPLEMENTARY METHOD FOR PREVENTING HIDDEN NEURONS’ SATURATION IN FEED FORWARD NEURAL NETWORKS TRAINING

 
 
Author(s):  MOALEM P., AYOUGHI S.A.
 
 
Abstract: 

In feed-forward neural networks, the saturation of hidden-layer neurons, which causes flat spots on the error surface, is one of the main disadvantages of any conventional gradient descent learning algorithm. In this paper, we propose a novel complementary learning scheme based on a suitable combination of an anti-saturation learning process for hidden neurons and accelerating methods such as the momentum term and the parallel tangent technique. In the proposed method, a normalized saturation criterion (NSC) of the hidden neurons, introduced in this paper, is monitored during the learning process. When the NSC exceeds a specified threshold, the algorithm is moving toward a flat spot because the hidden neurons are falling into saturation. In this case, to suppress the saturation of hidden neurons, a conventional gradient descent learning method can be accompanied by the proposed complementary gradient descent saturation-prevention scheme. When the NSC takes small values, no saturation is detected and the network operates in its normal condition, so applying the saturation-prevention scheme is not recommended. We have evaluated the proposed complementary method in combination with gradient descent plus momentum and with the parallel tangent technique, two conventional improvements on learning methods. We have recorded remarkable improvements in convergence success as well as generalization on some well-known benchmarks.
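The monitoring loop described above can be sketched in a few lines. The abstract does not give the NSC formula, so the definition below is only an illustrative assumption: the mean distance of the sigmoid hidden outputs from their midpoint 0.5, rescaled to [0, 1] so that 1.0 means fully saturated neurons. The threshold value is likewise a placeholder.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def normalized_saturation_criterion(hidden_outputs):
    """Illustrative NSC (assumed form, not the paper's exact formula):
    mean distance of sigmoid outputs from 0.5, scaled to [0, 1].
    Values near 1.0 indicate heavily saturated hidden neurons."""
    return float(np.mean(2.0 * np.abs(hidden_outputs - 0.5)))

# Toy hidden layer: large-magnitude net inputs drive sigmoid units
# into their flat saturated regions.
net_inputs = np.array([-8.0, 7.5, 9.0, -6.5])
nsc = normalized_saturation_criterion(sigmoid(net_inputs))

NSC_THRESHOLD = 0.9  # assumed value; in practice this is tuned per problem
if nsc > NSC_THRESHOLD:
    # saturation detected: apply the complementary saturation-prevention
    # update alongside the conventional gradient descent step
    pass
else:
    # normal operation: plain gradient descent (plus momentum / parallel
    # tangent acceleration) continues unchanged
    pass
```

With net inputs near zero the sigmoid outputs stay near 0.5, the NSC stays small, and the complementary scheme remains inactive, matching the behavior described in the abstract.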

 
Keyword(s): BACK PROPAGATION, HIDDEN NEURONS’ SATURATION, NORMALIZED SATURATION CRITERION, MOMENTUM TERM, PARALLEL TANGENT GRADIENT
 
References: None
 