Mathematics 2015
Large deviations for spatially extended random neural networks

Abstract: We investigate the asymptotic mesoscopic behavior of spatially extended stochastic neural network dynamics in a random environment with highly random connectivity weights. These systems model the spatiotemporal activity of the brain and thus feature (i) communication delays depending on the distance between cells and (ii) heterogeneous synapses: the connectivity coefficients are random variables whose law depends on the neurons' positions and whose variance scales as the inverse of the network size. When the weights are independent Gaussian random variables, we show that the empirical measure satisfies a large-deviation principle. This holds under a technical condition on the time horizon, noise, and heterogeneity in the general case, and with no restriction when the delays do not depend on space. The associated good rate function achieves its minimum at a unique spatially extended probability measure, implying convergence of the empirical measure and propagation of chaos. The limit is characterized through a complex non-Markovian implicit equation in which the network interaction term is replaced by a non-local Gaussian process whose statistics depend on the solution over the whole neural field. We further demonstrate the universality of this limit: neural networks with non-Gaussian interconnection weights converge to it provided the synaptic weights have sufficiently fast decay.