Exponential Stability of Stochastic Delayed Neural Networks with Inverse Hölder Activation Functions and Markovian Jump Parameters

DOI: 10.1155/2014/784107

Abstract: The exponential stability issue for a class of stochastic neural networks (SNNs) with Markovian jump parameters, mixed time delays, and α-inverse Hölder activation functions is investigated. The jumping parameters are modeled as a continuous-time finite-state Markov chain. Firstly, based on properties of the Brouwer degree, the existence and uniqueness of the equilibrium point of the SNNs without noise perturbations are proved. Secondly, by applying the Lyapunov-Krasovskii functional approach, stochastic analysis theory, and the linear matrix inequality (LMI) technique, new delay-dependent sufficient criteria are derived in terms of LMIs which ensure that the SNNs with noise perturbations are globally exponentially stable in the mean square. Finally, two simulation examples are provided to demonstrate the validity of the theoretical results.

1. Introduction

In the past few decades, there has been increasing interest in different classes of neural networks, such as Hopfield, cellular, Cohen-Grossberg, and bidirectional associative neural networks, owing to their potential applications in many areas such as classification, signal and image processing, parallel computing, associative memories, optimization, and cryptography [1–6]. In the design of practical neural networks, the qualitative analysis of neural network dynamics plays an important role. To solve problems of optimization, neural control, signal processing, and so forth, neural networks have to be designed so that, for a given external input, they exhibit only one globally asymptotically/exponentially stable equilibrium point. Hence, much effort has been devoted to the stability analysis of neural networks, and a number of sufficient conditions have been proposed to guarantee the global asymptotic/exponential stability of neural networks with or without delays; see, for example, [7–19] and the references therein.

As is well known, a real system is usually affected by external perturbations which in many cases are of great uncertainty and hence may be treated as random. As pointed out in [20], in real nervous systems, and in the implementation of artificial neural networks, synaptic transmission is a noisy process brought on by random fluctuations in the release of neurotransmitters and other probabilistic causes; hence noise is unavoidable and should be taken into account in modeling. Moreover, it has been shown in [21, 22] that a neural network can be stabilized or destabilized by certain stochastic inputs. Therefore, the stochastic stability of various neural networks, with or without delays, under noise perturbations has received considerable attention.
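To make the setting concrete, the following is a generic sketch of the kind of SNN with Markovian jump parameters and mixed time delays studied here; the notation (the matrices A, B, C, D, the noise intensity g, the delays τ(t) and σ, and the chain r(t)) is assumed for illustration and is not taken verbatim from the paper:

\[
dx(t) = \Big[-A(r(t))x(t) + B(r(t))f\big(x(t)\big) + C(r(t))f\big(x(t-\tau(t))\big) + D(r(t))\int_{t-\sigma}^{t} f\big(x(s)\big)\,ds\Big]\,dt + g\big(t, x(t), x(t-\tau(t)), r(t)\big)\,d\omega(t),
\]

where r(t) is the continuous-time finite-state Markov chain mentioned in the abstract, τ(t) is a discrete time-varying delay, the integral term models the distributed delay (together they form the "mixed" delays), and ω(t) is a Brownian motion. In this setting, global exponential stability in the mean square of the equilibrium point x* = 0 means that there exist constants M ≥ 1 and λ > 0 such that

\[
\mathbb{E}\,\|x(t)\|^{2} \le M e^{-\lambda t} \sup_{-\bar{\tau} \le s \le 0} \mathbb{E}\,\|\phi(s)\|^{2}, \qquad t \ge 0,
\]

for every initial function φ, where \bar{\tau} is an upper bound on the delays.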
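The delay-dependent criteria mentioned in the abstract are expressed as LMIs, which in practice are verified numerically with a semidefinite programming solver. As a minimal sketch of this workflow, the Python fragment below checks the simplest possible Lyapunov-type LMI for a hypothetical state matrix; it illustrates the feasibility-checking pattern only and is not the paper's actual criterion.

import numpy as np
import cvxpy as cp

# Hypothetical state matrix of a linearized 2-neuron network (illustrative only).
A = np.array([[-3.0, 0.5],
              [0.2, -2.0]])
n = A.shape[0]
eps = 1e-6  # margin that turns strict inequalities into solver-friendly ones

# Seek P = P^T > 0 with A^T P + P A < 0 (the classical Lyapunov LMI).
P = cp.Variable((n, n), symmetric=True)
constraints = [P >> eps * np.eye(n),
               A.T @ P + P @ A << -eps * np.eye(n)]
problem = cp.Problem(cp.Minimize(0), constraints)
problem.solve(solver=cp.SCS)

print("LMI feasible:", problem.status == cp.OPTIMAL)

The LMIs derived in the paper involve more decision variables (mode-dependent matrices for each state of the Markov chain and delay-related terms), but they are checked in the same way: feasibility of the LMI certifies mean-square exponential stability.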