|
Application Research of Computers, 2011
Back-propagation algorithm with magnified error signals
|
Abstract:
The standard back-propagation (BP) neural network converges slowly and easily falls into local minima. This paper presents a fast-converging BP algorithm that magnifies the error signals by modifying the derivative of the activation function. A convergence proof of the algorithm is given, and the magnified algorithm is compared experimentally with the standard BP algorithm and the improved algorithm of Ng et al. Simulation results show that the magnified algorithm is more effective than the other two: it speeds up the convergence rate and enhances the global convergence capability.
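The abstract does not give the exact form of the modified derivative, so the sketch below illustrates the general idea with one common variant: replacing the sigmoid derivative y(1-y) with y(1-y) + c, which magnifies the error signal and prevents it from vanishing when a unit saturates. The network size, learning rate, XOR task, and the constant c are all illustrative assumptions, not the paper's experimental setup.

```python
# Sketch of BP with a magnified error signal on the XOR problem.
# Assumption: the magnification adds a constant c to the sigmoid
# derivative y*(1-y); the paper's exact modification may differ.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_xor(magnify=0.0, epochs=3000, lr=0.5, seed=0):
    rng = np.random.default_rng(seed)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)
    # 2-4-1 network with small random weights (illustrative sizes).
    W1 = rng.normal(0.0, 1.0, (2, 4)); b1 = np.zeros(4)
    W2 = rng.normal(0.0, 1.0, (4, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        H = sigmoid(X @ W1 + b1)          # hidden layer activations
        Y = sigmoid(H @ W2 + b2)          # network outputs
        # Magnified derivative: y*(1-y) + magnify keeps the error
        # signal alive when y saturates near 0 or 1 (flat spot).
        dY = (Y - T) * (Y * (1 - Y) + magnify)
        dH = (dY @ W2.T) * (H * (1 - H) + magnify)
        W2 -= lr * (H.T @ dY); b2 -= lr * dY.sum(axis=0)
        W1 -= lr * (X.T @ dH); b1 -= lr * dH.sum(axis=0)
    return float(np.mean((Y - T) ** 2))

mse_std = train_xor(magnify=0.0)   # standard BP
mse_mag = train_xor(magnify=0.1)   # magnified error signal
print(mse_std, mse_mag)
```

With the magnified derivative, the weight updates stay non-negligible even when a unit's output saturates, which is the mechanism behind the faster convergence the abstract reports.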