%0 Journal Article
%T BP神经网络子批量学习方法研究 (Research on the subbatch learning method for BP neural networks)
%A 刘威
%A 刘尚
%A 周璇
%J 智能系统学报 (CAAI Transactions on Intelligent Systems)
%D 2016
%R 10.11992/tis.201509015
%X To address the slow convergence of full-batch learning and the susceptibility of single-batch learning to random perturbations in shallow neural networks, this paper draws on the subbatch training methods used for deep neural networks and proposes a subbatch learning method, together with a method for optimally configuring the subbatch learning parameters, for shallow neural networks. Numerical experiments indicate that subbatch learning in shallow neural networks converges quickly and stably. The batch size and learning rate have significant effects on network convergence, convergence time, and generalization ability; optimized learning parameters can dramatically reduce the number of iterations and the training time needed for convergence, as well as improve the classification accuracy.
%K subbatch learning
%K neural network
%K backpropagation algorithms
%K batch size
%K training methods and evaluation
%K classification
%U http://tis.hrbeu.edu.cn/oa/darticle.aspx?type=view&id=20160210
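The subbatch (mini-batch) idea the abstract describes — updating weights once per small batch, between full-batch and single-sample learning — can be sketched as below. This is a minimal NumPy illustration on a hypothetical toy task, not the paper's actual network, loss, or parameter configuration; the layer sizes, learning rate, and squared-error loss are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_subbatch(X, y, hidden=8, batch_size=4, lr=0.5, epochs=300, seed=0):
    """Train a one-hidden-layer sigmoid network by subbatch (mini-batch) BP.

    Each epoch shuffles the samples and performs one gradient update per
    subbatch -- a compromise between full-batch learning (one slow update
    per epoch) and single-sample learning (noisy, perturbation-prone updates).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W1 = rng.normal(0.0, 0.5, (d, hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 0.5, (hidden, 1))
    b2 = np.zeros(1)
    for _ in range(epochs):
        order = rng.permutation(n)
        for start in range(0, n, batch_size):
            idx = order[start:start + batch_size]
            xb, yb = X[idx], y[idx].reshape(-1, 1)
            # Forward pass through the hidden and output layers.
            h = sigmoid(xb @ W1 + b1)
            out = sigmoid(h @ W2 + b2)
            # Backward pass for squared-error loss with sigmoid units.
            d_out = (out - yb) * out * (1.0 - out)
            d_h = (d_out @ W2.T) * h * (1.0 - h)
            # One update per subbatch, gradients averaged over the batch.
            m = len(idx)
            W2 -= lr * (h.T @ d_out) / m
            b2 -= lr * d_out.mean(axis=0)
            W1 -= lr * (xb.T @ d_h) / m
            b1 -= lr * d_h.mean(axis=0)

    def predict(Xq):
        return (sigmoid(sigmoid(Xq @ W1 + b1) @ W2 + b2) > 0.5).ravel()

    return predict

# Toy, linearly separable task (assumed for illustration): label = 1 iff x0 + x1 > 1.
rng = np.random.default_rng(42)
X = rng.uniform(0.0, 1.0, (200, 2))
y = (X[:, 0] + X[:, 1] > 1.0).astype(float)
predict = train_subbatch(X, y, batch_size=8, lr=0.5, epochs=300)
accuracy = float((predict(X) == y.astype(bool)).mean())
```

Varying `batch_size` here exposes the trade-off the paper studies: `batch_size=n` recovers slow full-batch learning, `batch_size=1` recovers noisy single-sample learning, and intermediate values change both convergence speed and stability.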