Computer Science (计算机科学), 2004
Research on Learning Convergence of General Memory Neural Network
Abstract:
In this paper, we will concentrate on learning convergence of a class of neural network architectures named general memory neural network (GMNN) that consists of: input space quantization, memory address generator, combination output by memory lookup operations. If the number of generated addresses is fixed the output of network is given by summation operator, the learning convergence of GMNN to the least square error can be proved. Both CMAC(Cerebellar Model Articulation Controller) and SLLUP( Single-Layer Lookup Perccptrons)are examples of GMNN. The main purpose of this paper is that it can provide a theoretical instruction about how to construct a new neural network model with local learning. Finally two constructive examples, generalized SDM (Sparse Distributed Memory)and generalized SLLUP models ,are given.