oalib
Search Results: 1 - 10 of 100 matches
All listed articles are free for downloading (OA Articles)
Complexity and Competition in Appetitive and Aversive Neural Circuits
Crista L. Barberini, Sara E. Morrison, Alex Saez, Brian Lau, C. Daniel Salzman
Frontiers in Neuroscience , 2012, DOI: 10.3389/fnins.2012.00170
Abstract: Decision-making often involves using sensory cues to predict possible rewarding or punishing reinforcement outcomes before selecting a course of action. Recent work has revealed complexity in how the brain learns to predict rewards and punishments. Analysis of neural signaling during and after learning in the amygdala and orbitofrontal cortex, two brain areas that process appetitive and aversive stimuli, reveals a dynamic relationship between appetitive and aversive circuits. Specifically, the relationship between signaling in appetitive and aversive circuits in these areas shifts as a function of learning. Furthermore, although appetitive and aversive circuits may often drive opposite behaviors – approaching or avoiding reinforcement depending upon its valence – these circuits can also drive similar behaviors, such as enhanced arousal or attention; these processes also may influence choice behavior. These data highlight the formidable challenges ahead in dissecting how appetitive and aversive neural circuits interact to produce a complex and nuanced range of behaviors.
An Attractor-Based Complexity Measurement for Boolean Recurrent Neural Networks
Jérémie Cabessa, Alessandro E. P. Villa
PLOS ONE , 2014, DOI: 10.1371/journal.pone.0094204
Abstract: We provide a novel refined attractor-based complexity measurement for Boolean recurrent neural networks that represents an assessment of their computational power in terms of the significance of their attractor dynamics. This complexity measurement is achieved by first proving a computational equivalence between Boolean recurrent neural networks and a specific class of ω-automata, and then translating the most refined classification of ω-automata to the Boolean neural network context. As a result, a hierarchical classification of Boolean neural networks based on their attractive dynamics is obtained, thus providing a novel refined attractor-based complexity measurement for Boolean recurrent neural networks. These results provide new theoretical insights into the computational and dynamical capabilities of neural networks according to their attractive potentialities. An application of our findings is illustrated by the analysis of the dynamics of a simplified model of the basal ganglia-thalamocortical network simulated by a Boolean recurrent neural network. This example shows the significance of measuring network complexity, and how our results offer new foundational elements for understanding the complexity of real brain circuits.
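As a rough illustration of the attractor dynamics being classified, the following minimal Python sketch simulates a small Boolean recurrent network of threshold units and detects the length of the attractor cycle its trajectory enters; the random wiring, thresholds, and synchronous update rule are illustrative assumptions, not the authors' construction.

```python
import numpy as np

def step(state, W, theta):
    # Synchronous Boolean update: unit i fires iff its weighted
    # input meets or exceeds its threshold (an assumed update rule).
    return (W @ state >= theta).astype(int)

def attractor_cycle_length(state, W, theta, max_steps=1000):
    # Iterate the network; the finite state space guarantees the
    # trajectory eventually revisits a state, closing an attractor cycle.
    seen = {}
    for t in range(max_steps):
        key = tuple(state)
        if key in seen:
            return t - seen[key]  # length of the attractor cycle
        seen[key] = t
        state = step(state, W, theta)
    return None

rng = np.random.default_rng(0)
n = 6
W = rng.choice([-1, 0, 1], size=(n, n))  # illustrative random wiring
theta = np.ones(n)
print("attractor cycle length:",
      attractor_cycle_length(rng.integers(0, 2, n), W, theta))
```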
Dynamical Complexity in Cognitive Neural Networks
Eric Goles, Adrián G. Palacios
Biological Research , 2007, DOI: 10.4067/S0716-97602007000500009
Abstract: In the last twenty years an important effort in brain sciences, especially in cognitive science, has been the development of mathematical tools that can deal with the complexity of extensive recordings of neuronal activity obtained from hundreds of neurons. We discuss here, along with some historical issues, the advantages and limitations of artificial neural networks (ANN), which can help to understand how simple brain circuits work and whether ANN can be helpful for understanding brain neural complexity.
Attractor dynamics in local neuronal networks
Jean-Philippe Thivierge, André Longtin
Frontiers in Neural Circuits , 2014, DOI: 10.3389/fncir.2014.00022
Abstract: Patterns of synaptic connectivity in various regions of the brain are characterized by the presence of synaptic motifs, defined as unidirectional and bidirectional synaptic contacts that follow a particular configuration and link together small groups of neurons. Recent computational work proposes that a relay network (two populations communicating via a third, relay population of neurons) can generate precise patterns of neural synchronization. Here, we employ two distinct models of neuronal dynamics and show that simulated neural circuits designed in this way are caught in a global attractor of activity that prevents neurons from modulating their response on the basis of incoming stimuli. To circumvent the emergence of a fixed global attractor, we propose a mechanism of selective gain inhibition that promotes flexible responses to external stimuli. We suggest that local neuronal circuits may employ this mechanism to generate precise patterns of neural synchronization whose transient nature delimits the occurrence of a brief stimulus.
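To make the notion of a stimulus-insensitive global attractor concrete, here is a hedged toy sketch of the relay motif (two populations A and B coupled only through a relay population R) as a saturating rate model; all weights, gains, and time constants are assumptions for illustration, not the paper's models. With strong positive coupling, two different stimuli drive the network to essentially the same steady state, echoing the global-attractor behavior described in the abstract.

```python
import numpy as np

def steady_state(stim, g=5.0, steps=2000, dt=0.05):
    # Three-population rate model: A and B interact only through relay R.
    # The relay architecture follows the abstract; the specific weights
    # and gain are illustrative assumptions.
    W = np.array([[1.0, 0.0, 1.0],    # A <- A, B, R
                  [0.0, 1.0, 1.0],    # B <- A, B, R
                  [1.0, 1.0, 1.0]])   # R <- A, B, R
    r = np.zeros(3)
    for _ in range(steps):
        r += dt * (-r + np.tanh(g * (W @ r) + stim))
    return r

# Probe for a stimulus-insensitive global attractor: two different
# stimuli yield (nearly) the same steady state.
s1 = steady_state(np.array([0.5, 0.0, 0.0]))  # stimulus to A
s2 = steady_state(np.array([0.0, 0.5, 0.0]))  # stimulus to B
print(s1, s2, np.allclose(s1, s2, atol=1e-2))
```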
Fractal Complexity in Spontaneous EEG Metastable-State Transitions: New Vistas on Integrated Neural Dynamics
Paolo Allegrini, Paolo Paradisi, Danilo Menicucci, Angelo Gemignani
Frontiers in Physiology , 2010, DOI: 10.3389/fphys.2010.00128
Abstract: Resting-state EEG signals undergo rapid transition processes (RTPs) that glue otherwise stationary epochs. We study the fractal properties of RTPs in space and time, supporting the hypothesis that the brain works at a critical state. We discuss how the global intermittent dynamics of collective excitations is linked to mentation, namely non-constrained non-task-oriented mental activity.
A topological approach to neural complexity
M. De Lucia, M. Bottaccio, M. Montuori, L. Pietronero
Physics , 2004, DOI: 10.1103/PhysRevE.71.016114
Abstract: Considerable effort in modern statistical physics is devoted to the study of networked systems. One of the most important examples is the brain, which creates and continuously develops complex networks of correlated dynamics. An important quantity that captures fundamental aspects of brain network organization is the neural complexity C(X) introduced by Tononi et al. This work addresses the dependence of this measure on the topological features of a network in the case of a Gaussian stationary process. Both analytical and numerical results show that the degree of complexity has a clear and simple meaning from a topological point of view. Moreover, the analytical result offers a more straightforward algorithm for computing the complexity than the standard one.
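For context, the neural complexity of Tononi, Sporns, and Edelman can be written (as best recalled from the original definition, not from this paper) as $C_N(X) = \sum_{k=1}^{n} [(k/n)\, I(X) - \langle I(X_k^j)\rangle_j]$, where $I(X_S) = \sum_{i\in S} H(x_i) - H(X_S)$ is the integration of subset $S$ and the average runs over subsets of size $k$. For a Gaussian stationary process the entropies follow from the covariance matrix, so a minimal Python sketch might look like the following; the brute-force subset enumeration is an assumption for small n, not the paper's faster algorithm.

```python
import numpy as np
from itertools import combinations

def gaussian_entropy(cov):
    # Differential entropy of a zero-mean Gaussian with covariance `cov`.
    k = cov.shape[0]
    return 0.5 * np.log((2 * np.pi * np.e) ** k * np.linalg.det(cov))

def integration(cov, idx):
    # I(X_S) = sum of marginal entropies minus joint entropy of subset S.
    sub = cov[np.ix_(idx, idx)]
    return sum(gaussian_entropy(cov[np.ix_([i], [i])]) for i in idx) \
        - gaussian_entropy(sub)

def neural_complexity(cov):
    # C_N(X) = sum over k of [(k/n) I(X) - <I(X_k)>], averaging the
    # integration over all subsets of size k (feasible only for small n).
    n = cov.shape[0]
    total = integration(cov, list(range(n)))
    c = 0.0
    for k in range(1, n + 1):
        subs = combinations(range(n), k)
        mean_i = np.mean([integration(cov, list(s)) for s in subs])
        c += (k / n) * total - mean_i
    return c

# Toy covariance: uniform weak correlations among 5 units (an assumption).
n = 5
cov = 0.3 * np.ones((n, n)) + 0.7 * np.eye(n)
print("neural complexity:", neural_complexity(cov))
```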
Explicit Logic Circuits Discriminate Neural States
Lane Yoder
PLOS ONE , 2012, DOI: 10.1371/journal.pone.0004154
Abstract: The magnitude and apparent complexity of the brain's connectivity have left explicit networks largely unexplored. As a result, the relationship between the organization of synaptic connections and how the brain processes information is poorly understood. A recently proposed retinal network that produces neural correlates of color vision is refined and extended here to a family of general logic circuits. For any combination of high and low activity in any set of neurons, one of the logic circuits can receive input from the neurons and activate a single output neuron whenever the input neurons have the given activity state. The strength of the output neuron's response is a measure of the difference between the smallest of the high inputs and the largest of the low inputs. The networks generate correlates of known psychophysical phenomena. These results follow directly from the most cost-effective architectures for specific logic circuits and the minimal cellular capabilities of excitation and inhibition. The networks function dynamically, making their operation consistent with the speed of most brain functions. The networks show that well-known psychophysical phenomena do not require extraordinarily complex brain structures, and that a single network architecture can produce apparently disparate phenomena in different sensory systems.
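A hedged reading of the response rule described above (output strength tracks the difference between the smallest of the high inputs and the largest of the low inputs) can be captured in a few lines of Python; the rectification at zero is an interpretive assumption, not a detail stated in the abstract.

```python
def discriminator_response(inputs, high_idx):
    # Discriminates a neural state as described above: the output is
    # strong only when every designated "high" input exceeds every
    # designated "low" input.
    highs = [inputs[i] for i in high_idx]
    lows = [x for i, x in enumerate(inputs) if i not in high_idx]
    # Rectify at zero (an assumption): no response when some "low"
    # input exceeds a "high" one.
    return max(0.0, min(highs) - max(lows))

# Inputs 0 and 2 expected high: strong response for a matching state...
print(discriminator_response([0.9, 0.1, 0.8, 0.2], {0, 2}))   # 0.6
# ...and none when the pattern is violated.
print(discriminator_response([0.9, 0.85, 0.8, 0.2], {0, 2}))  # 0.0
```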
Electronic circuits modeling using artificial neural networks
Andrejević Miona V., Litovski Vančo B.
Journal of Automatic Control , 2003, DOI: 10.2298/jac0301031a
Abstract: In this paper artificial neural networks (ANNs) are applied to the modeling of electronic circuits. ANNs are used to apply the black-box modeling concept in the time domain. The modeling process is described, covering the topology of the ANN, the testing signal used for excitation, and the complexity of the ANN. The procedure is first exemplified by modeling resistive circuits; a MOS transistor is modeled as a four-terminal device. A nonlinear negative-resistance characteristic is then modeled for use as the piecewise-linear resistor in Chua's circuit. Examples of modeling nonlinear dynamic circuits are given, encompassing a variety of modeling problems; among them, a nonlinear circuit containing a quartz oscillator is considered. The concept is verified by checking the model's ability to generalize, i.e., to produce acceptable responses to excitations not used during training. Implementation of these models within a behavioral simulator is exemplified: every model is placed in realistic surroundings in order to show its interaction with other circuitry, as well as its usage and purpose.
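As a hedged sketch of black-box time-domain modeling with an ANN, the following Python example trains a small time-delay network (one hidden layer, plain gradient descent) to reproduce a toy memoryless nonlinearity standing in for a measured circuit; the topology, training signal, and the toy device y = tanh(2x) are all assumptions, not the paper's setups.

```python
import numpy as np

rng = np.random.default_rng(1)

# Excitation and response records, windowed into (delayed inputs -> output).
x = rng.uniform(-1, 1, 2000)
y = np.tanh(2 * x)                      # toy device under test (assumed)
D = 3                                   # input delay-line length
X = np.stack([x[i:len(x) - D + i + 1] for i in range(D)], axis=1)
T = y[D - 1:]                           # output aligned with current sample

# One hidden layer, trained with full-batch gradient descent on MSE.
H = 8
W1 = rng.normal(0, 0.5, (D, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.5, H);      b2 = 0.0
lr = 0.05
for epoch in range(500):
    h = np.tanh(X @ W1 + b1)            # hidden activations
    p = h @ W2 + b2                     # predicted output sample
    e = p - T
    # Backpropagation of the mean-squared error.
    gW2 = h.T @ e / len(T); gb2 = e.mean()
    gh = np.outer(e, W2) * (1 - h ** 2)
    gW1 = X.T @ gh / len(T); gb1 = gh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Generalization check: respond to an excitation not used during training.
x_test = np.linspace(-1, 1, 5)
X_test = np.stack([x_test] * D, axis=1)  # constant delay-line window
pred = np.tanh(X_test @ W1 + b1) @ W2 + b2
print(np.round(pred, 3), np.round(np.tanh(2 * x_test), 3))
```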
Super-Linear Gate and Super-Quadratic Wire Lower Bounds for Depth-Two and Depth-Three Threshold Circuits
Daniel M. Kane, Ryan Williams
Computer Science , 2015,
Abstract: In order to formally understand the power of neural computing, we first need to crack the frontier of threshold circuits with two and three layers, a regime that has been surprisingly intractable to analyze. We prove the first super-linear gate lower bounds and the first super-quadratic wire lower bounds for depth-two linear threshold circuits with arbitrary weights, and depth-three majority circuits computing an explicit function.
$\bullet$ We prove that for all $\epsilon \gg \sqrt{\log(n)/n}$, the linear-time computable Andreev's function cannot be computed on a $(1/2+\epsilon)$-fraction of $n$-bit inputs by depth-two linear threshold circuits of $o(\epsilon^3 n^{3/2}/\log^3 n)$ gates, nor can it be computed with $o(\epsilon^{3} n^{5/2}/\log^{7/2} n)$ wires. This establishes an average-case "size hierarchy" for threshold circuits, as Andreev's function is computable by uniform depth-two circuits of $o(n^3)$ linear threshold gates, and by uniform depth-three circuits of $O(n)$ majority gates.
$\bullet$ We present a new function in $P$ based on small-biased sets, which we prove cannot be computed by a majority vote of depth-two linear threshold circuits with $o(n^{3/2}/\log^3 n)$ gates, nor with $o(n^{5/2}/\log^{7/2}n)$ wires.
$\bullet$ We give tight average-case (gate and wire) complexity results for computing PARITY with depth-two threshold circuits; the answer turns out to be the same as for depth-two majority circuits.
The key is a new random restriction lemma for linear threshold functions. Our main analytical tool is the Littlewood-Offord Lemma from additive combinatorics.
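For a concrete sense of what a depth-two threshold circuit is, here is a small Python sketch of the classical construction computing PARITY with first-layer threshold gates $g_k(x) = [\sum_i x_i \ge k]$ and alternating-sign weights at the top gate. This textbook circuit is illustrative background only; the paper proves lower bounds, and this is not its construction.

```python
from itertools import product

def threshold_gate(weights, theta, inputs):
    # Linear threshold gate: fires iff the weighted sum reaches theta.
    return int(sum(w * x for w, x in zip(weights, inputs)) >= theta)

def parity_depth2(x):
    # First layer: g_k(x) = [sum(x) >= k] for k = 1..n. Top gate weighs
    # them +1, -1, +1, ... and fires iff the alternating sum is >= 1,
    # which happens exactly when sum(x) is odd.
    n = len(x)
    g = [threshold_gate([1] * n, k, x) for k in range(1, n + 1)]
    top_w = [(-1) ** k for k in range(n)]   # +1, -1, +1, ...
    return threshold_gate(top_w, 1, g)

# Exhaustive check against PARITY on all 6-bit inputs.
n = 6
assert all(parity_depth2(x) == sum(x) % 2
           for x in product([0, 1], repeat=n))
print("depth-2 threshold circuit computes PARITY on all", 2 ** n, "inputs")
```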