%0 Journal Article %T A Hybrid Algorithm Based on Attention Model
基于注意力模型的混合学习算法 %A YANG Bo %A SU Xiao-Hong %A WANG Ya-Dong %A
杨博 %A 苏小红 %A 王亚东 %J 软件学报 %D 2005 %I %X A hybrid algorithm based on an attention model (HAAM) is proposed to speed up the training of back-propagation (BP) neural networks and to improve their performance. The algorithm combines a genetic algorithm (GA) with a BP algorithm driven by a magnified error signal. Its key idea is to partition the BP training process into many chips, each trained by the BP algorithm; the chips within one iteration are optimized by GA operators, and the chips across iterations make up the whole training process. These operations give HAAM the ability to search for the globally optimal solution and make it easy to parallelize. Simulation experiments show that the algorithm effectively avoids training failures caused by random initialization of the weights and thresholds, and alleviates the slow convergence that results from flat spots, where the error signal becomes too small. Moreover, the algorithm improves the generalization of a BP network by raising the training precision rather than by adding hidden neurons. %K back-propagation algorithm %K artificial neural network %K attention model %K genetic algorithm %K Flat-Spots %K local optimum
%U http://www.alljournals.cn/get_abstract_url.aspx?pcid=5B3AB970F71A803DEACDC0559115BFCF0A068CD97DD29835&cid=8240383F08CE46C8B05036380D75B607&jid=7735F413D429542E610B3D6AC0D5EC59&aid=B831A75E54A94A7C&yid=2DD7160C83D0ACED&vid=7801E6FC5AE9020C&iid=B31275AF3241DB2D&sid=F79A45851FD04E0C&eid=4AB97D697AC3192E&journal_id=1000-9825&journal_name=软件学报&referenced_num=0&reference_num=35
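
The abstract describes the HAAM training loop only at a high level, so the following is a minimal illustrative sketch of a GA-over-BP hybrid of that general shape, not the paper's actual method: the magnified error signal, the exact chip partitioning, and the specific GA operators are not given in this record, so plain gradient-descent BP, fitness-sorted selection, uniform crossover, Gaussian mutation, and all names (hybrid_train, init_chip, the XOR example) are assumptions made here for illustration.

# Illustrative sketch only: a generic GA-over-BP hybrid in the spirit of the
# abstract above (a population of "chips", each trained locally by BP, then
# recombined by GA operators within each outer iteration). All operator
# choices below are assumptions, not the paper's HAAM.
import numpy as np

rng = np.random.default_rng(0)

def init_chip(n_in, n_hidden, n_out):
    # One "chip" = one complete set of network weights and thresholds.
    return {"W1": rng.normal(0, 0.5, (n_in, n_hidden)), "b1": np.zeros(n_hidden),
            "W2": rng.normal(0, 0.5, (n_hidden, n_out)), "b2": np.zeros(n_out)}

def forward(c, X):
    h = np.tanh(X @ c["W1"] + c["b1"])
    return h, np.tanh(h @ c["W2"] + c["b2"])

def mse(c, X, Y):
    return float(np.mean((forward(c, X)[1] - Y) ** 2))

def bp_step(c, X, Y, lr=0.1):
    # One plain BP (gradient-descent) update; each chip is trained locally with this.
    h, y = forward(c, X)
    d2 = (y - Y) * (1 - y ** 2)            # output-layer delta
    d1 = (d2 @ c["W2"].T) * (1 - h ** 2)   # hidden-layer delta
    c["W2"] -= lr * h.T @ d2 / len(X);  c["b2"] -= lr * d2.mean(axis=0)
    c["W1"] -= lr * X.T @ d1 / len(X);  c["b1"] -= lr * d1.mean(axis=0)

def crossover_mutate(a, b, sigma=0.02):
    # Uniform crossover of two parent chips plus Gaussian mutation (assumed operators).
    child = {}
    for k in a:
        mask = rng.random(a[k].shape) < 0.5
        child[k] = np.where(mask, a[k], b[k]) + rng.normal(0, sigma, a[k].shape)
    return child

def hybrid_train(X, Y, pop=12, outer_iters=50, bp_epochs=20):
    chips = [init_chip(X.shape[1], 5, Y.shape[1]) for _ in range(pop)]
    for _ in range(outer_iters):
        # 1) BP phase: every chip of this iteration is trained by BP for a few epochs.
        for c in chips:
            for _ in range(bp_epochs):
                bp_step(c, X, Y)
        # 2) GA phase: the chips of the same iteration are selected, crossed, and mutated.
        chips.sort(key=lambda c: mse(c, X, Y))
        elite = chips[: pop // 2]
        children = []
        while len(elite) + len(children) < pop:
            i, j = rng.choice(len(elite), size=2, replace=False)
            children.append(crossover_mutate(elite[i], elite[j]))
        chips = elite + children
    return min(chips, key=lambda c: mse(c, X, Y))

# Example: XOR, a task where unlucky random initialization often stalls plain BP.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)
best = hybrid_train(X, Y)
print("final MSE:", mse(best, X, Y))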