%0 Journal Article %T Research on Learning Convergence of General Memory Neural Network
%A PENG Hong-Jing %A CHEN Song-Can %J Computer Science (计算机科学) %D 2004 %X This paper concentrates on the learning convergence of a class of neural network architectures named general memory neural networks (GMNN), which consist of input-space quantization, a memory address generator, and an output formed by combining memory lookup operations. If the number of generated addresses is fixed and the network output is given by a summation operator, the convergence of GMNN learning to the least-squares error can be proved. Both CMAC (Cerebellar Model Articulation Controller) and SLLUP (Single-Layer Lookup Perceptrons) are examples of GMNN. The main purpose of this paper is to provide theoretical guidance on how to construct new neural network models with local learning. Finally, two constructive examples, the generalized SDM (Sparse Distributed Memory) and generalized SLLUP models, are given. %K Memory network %K CMAC %K n-tuple %K SLLUP %K SDM %K Learning convergence
%K General memory neural network %K Learning convergence %K GMNN %K Network architecture %K Learning ability %U http://www.alljournals.cn/get_abstract_url.aspx?pcid=5B3AB970F71A803DEACDC0559115BFCF0A068CD97DD29835&cid=8240383F08CE46C8B05036380D75B607&jid=64A12D73428C8B8DBFB978D04DFEB3C1&aid=78A6149364C7702D&yid=D0E58B75BFD8E51C&vid=4AD960B5AD2D111A&iid=CA4FD0336C81A37A&sid=7F5DDA4924737DF5&eid=CDEBD1ACE0A4C1C1&journal_id=1002-137X&journal_name=计算机科学&referenced_num=0&reference_num=14