Search Results: 1 - 10 of 233092 matches for "Simon R. Schultz"
All listed articles are free for downloading (OA Articles)
Temporal correlations and neural spike train entropy
Simon R. Schultz, Stefano Panzeri
Physics, 2000, DOI: 10.1103/PhysRevLett.86.5823
Abstract: Sampling considerations limit the experimental conditions under which information-theoretic analyses of neurophysiological data yield reliable results. We develop a procedure for computing the full temporal entropy and information of ensembles of neural spike trains which performs reliably for limited samples of data. This approach also yields insight into the role of correlations between spikes in temporal coding mechanisms. When applied to recordings from complex cells of the monkey primary visual cortex, the method yields information estimates with lower RMS error than a 'brute force' approach.
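For context, the 'brute force' baseline mentioned here is simply the plug-in entropy of binary spike words. A minimal sketch, assuming spike trains already binarized into time bins; the function and parameter names are illustrative, not taken from the paper:

import numpy as np

def word_entropy(spike_trains, word_len):
    """Plug-in ('brute force') entropy of binary spike words, in bits.
    spike_trains: (n_trials, n_bins) binary array; each trial is cut
    into consecutive words of word_len bins."""
    words = spike_trains.reshape(-1, word_len)          # all observed words
    _, counts = np.unique(words, axis=0, return_counts=True)
    p = counts / counts.sum()                           # empirical word probabilities
    return -np.sum(p * np.log2(p))

# toy data: 200 trials of 12 bins, ~0.2 spike probability per bin
rng = np.random.default_rng(0)
trains = (rng.random((200, 12)) < 0.2).astype(int)
print(word_entropy(trains, word_len=4))

For realistic word lengths the number of possible words (2^word_len) quickly outstrips the number of samples, which is exactly the sampling problem the paper's procedure is designed to mitigate.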
The Ising decoder: reading out the activity of large neural ensembles
Michael T. Schaub, Simon R. Schultz
Quantitative Biology, 2010
Abstract: The Ising model has recently received much attention for the statistical description of neural spike train data. In this paper, we propose and demonstrate its use for building decoders capable of predicting, on a millisecond timescale, the stimulus represented by a pattern of neural activity. After fitting to a training dataset, the Ising decoder can be applied "online" for instantaneous decoding of test data. While such models can be fit exactly using Boltzmann learning, this approach rapidly becomes computationally intractable as neural ensemble size increases. We show that several approaches, including the Thouless-Anderson-Palmer (TAP) mean field approach from statistical physics and the recently developed Minimum Probability Flow Learning (MPFL) algorithm, can be used for rapid inference of model parameters in large-scale neural ensembles. Use of the Ising model for decoding, unlike other problems such as functional connectivity estimation, requires estimation of the partition function. As this involves summation over all possible responses, this step can be limiting. Mean field approaches avoid this problem by providing an analytical expression for the partition function. We demonstrate these decoding techniques by applying them to simulated neural ensemble responses from a mouse visual cortex model, finding an improvement in decoder performance for a model with heterogeneous as opposed to homogeneous neural tuning and response properties. Our results demonstrate the practicality of using the Ising model to read out, or decode, spatial patterns of activity comprising many hundreds of neurons.
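To see why the partition function is the bottleneck, here is a minimal sketch of maximum-likelihood decoding under a pairwise (Ising) model with the partition function summed exactly over all 2^N patterns; this brute-force sum is precisely the step that becomes infeasible as N grows and that mean-field approximations replace. The parameters (h, J) for each stimulus are assumed to have been fit beforehand, and all names and values below are illustrative:

import numpy as np
from itertools import product

def log_partition(h, J):
    """Exact log partition function by summing over all 2^N binary
    response patterns (feasible only for small N)."""
    N = len(h)
    patterns = np.array(list(product([0, 1], repeat=N)))
    energies = patterns @ h + 0.5 * np.einsum('ki,ij,kj->k', patterns, J, patterns)
    return np.log(np.sum(np.exp(energies)))

def decode(r, models):
    """Return the index of the stimulus whose fitted Ising model
    assigns the highest log-likelihood to binary response vector r."""
    scores = [r @ h + 0.5 * r @ J @ r - log_partition(h, J)
              for (h, J) in models]
    return int(np.argmax(scores))

# toy example: two stimuli, 5 neurons, random symmetric couplings
rng = np.random.default_rng(1)
models = []
for _ in range(2):
    h = rng.normal(size=5)
    J = rng.normal(scale=0.1, size=(5, 5))
    J = (J + J.T) / 2
    np.fill_diagonal(J, 0.0)
    models.append((h, J))
print(decode(np.array([1, 0, 1, 1, 0]), models))

With equal stimulus priors, picking the highest log-likelihood is the Bayes-optimal choice; the mean-field approaches mentioned in the abstract substitute an analytical approximation for log_partition.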
Synchronisation, binding and the role of correlated firing in fast information transmission
Simon R. Schultz, Huw D. R. Golledge, Stefano Panzeri
Physics, 2000
Abstract: Does synchronization between action potentials from different neurons in the visual system play a substantial role in solving the binding problem? The binding problem can be studied quantitatively in the broader framework of the information contained in neural spike trains about some external correlate, which in this case is object configurations in the visual field. We approach this problem using a mathematical formalism that quantifies the impact of correlated firing on short time scales. Using a power series expansion, the mutual information an ensemble of neurons conveys about external stimuli is broken down into firing rate and correlation components. This leads to a new quantification procedure directly applicable to simultaneous multiple-neuron recordings. It also theoretically constrains the neural code, showing that correlations contribute less significantly than firing rates to rapid information processing. By using this approach to study the limits on the amount of information that an ideal observer can extract from a synchrony code, it may be possible to determine whether the available information is sufficient to support computational processes such as feature binding.
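A common numerical counterpart of this decomposition is to compare the information carried by the recorded joint responses with that of trial-shuffled responses, which preserve each cell's stimulus-conditional firing rates but destroy noise correlations. A minimal sketch with two binary neurons; the toy generative model and all names are illustrative, not the paper's method:

import numpy as np

def mutual_info(s, r):
    """Plug-in mutual information (bits) between discrete stimulus
    labels s and discrete response codewords r."""
    n = len(s)
    joint, ps, pr = {}, {}, {}
    for si, ri in zip(s, r):
        joint[(si, ri)] = joint.get((si, ri), 0) + 1
    for (si, ri), c in joint.items():
        ps[si] = ps.get(si, 0) + c
        pr[ri] = pr.get(ri, 0) + c
    return sum((c / n) * np.log2(c * n / (ps[si] * pr[ri]))
               for (si, ri), c in joint.items())

rng = np.random.default_rng(2)
n_trials, n_stim = 2000, 2
s = rng.integers(n_stim, size=n_trials)
common = rng.random(n_trials) < 0.3              # shared noise -> correlations
r1 = ((rng.random(n_trials) < 0.2 + 0.3 * s) | common).astype(int)
r2 = ((rng.random(n_trials) < 0.2 + 0.3 * s) | common).astype(int)

i_joint = mutual_info(s, list(zip(r1, r2)))

# shuffling r2 within each stimulus keeps rates, removes noise correlations
r2_sh = r2.copy()
for k in range(n_stim):
    idx = np.where(s == k)[0]
    r2_sh[idx] = r2[rng.permutation(idx)]
i_shuffled = mutual_info(s, list(zip(r1, r2_sh)))
print(i_joint, i_shuffled)

The difference between the two estimates gives a rough picture of how much of the transmitted information depends on correlated firing rather than on firing rates alone.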
Statistical modelling of higher-order correlations in pools of neural activity
Fernando Montani, Elena Phoka, Mariela Portesi, Simon R. Schultz
Quantitative Biology, 2012, DOI: 10.1016/j.physa.2013.03.012
Abstract: Simultaneous recordings from multiple neural units allow us to investigate the activity of very large neural ensembles. To understand how large ensembles of neurons process sensory information, it is necessary to develop suitable statistical models to describe the response variability of the recorded spike trains. Using the information geometry framework, it is possible to estimate higher-order correlations by assigning one interaction parameter to each degree of correlation, leading to a $(2^N-1)$-dimensional model for a population of $N$ neurons. However, this model suffers from combinatorial explosion: the number of parameters to be estimated from the available sample size is the main reason the approach becomes intractable. To quantify the extent of higher-than-pairwise spike correlations in pools of multiunit activity, we use an information-geometric approach within the framework of the extended central limit theorem, considering all possible contributions from high-order spike correlations. The identification of a deformation parameter allows us to provide a statistical characterisation of the amount of high-order correlation in a very large neural ensemble, significantly reducing the number of parameters, avoiding the sampling problem, and inferring the underlying dynamical properties of the network within pools of multiunit neural activity.
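The combinatorial explosion is easy to make concrete; a quick back-of-the-envelope calculation of the full model's parameter count for a few illustrative ensemble sizes:

# the full information-geometric model has 2^N - 1 interaction parameters
for N in (10, 20, 50, 100):
    print(f"N = {N:3d} neurons -> {2**N - 1:.3e} parameters")

For N = 100 neurons this is already about 1.3 x 10^30 parameters, far beyond any attainable sample size, which is what motivates collapsing the description onto a single deformation parameter.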
Applications of Information Theory to Analysis of Neural Data
Simon R. Schultz, Robin A. A. Ince, Stefano Panzeri
Quantitative Biology, 2015, DOI: 10.1007/978-1-4614-7320-6_280-1
Abstract: Information theory is a practical and theoretical framework developed for the study of communication over noisy channels. Its probabilistic basis and capacity to relate statistical structure to function make it ideally suited for studying information flow in the nervous system. It has a number of useful properties: it is a general measure sensitive to any relationship, not only linear effects; it has meaningful units which in many cases allow direct comparison between different experiments; and it can be used to study how much information can be gained by observing neural responses in single trials, rather than in averages over multiple trials. A variety of information-theoretic quantities are commonly used in neuroscience (see the entry "Definitions of Information-Theoretic Quantities"). In this entry we review some applications of information theory in neuroscience to the study of encoding of information in both single neurons and neuronal populations.
Estimating Information-Theoretic Quantities
Robin A. A. Ince, Simon R. Schultz, Stefano Panzeri
Quantitative Biology, 2015, DOI: 10.1007/978-1-4614-7320-6_140-1
Abstract: Information theory is a practical and theoretical framework developed for the study of communication over noisy channels. Its probabilistic basis and capacity to relate statistical structure to function make it ideally suited for studying information flow in the nervous system. It has a number of useful properties: it is a general measure sensitive to any relationship, not only linear effects; it has meaningful units which in many cases allow direct comparison between different experiments; and it can be used to study how much information can be gained by observing neural responses in single trials, rather than in averages over multiple trials. A variety of information-theoretic quantities are in common use in neuroscience (see the entry "Summary of Information-Theoretic Quantities"). Estimating these quantities accurately and without bias from real neurophysiological data frequently presents challenges, which are explained in this entry.
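The central difficulty is the downward limited-sampling bias of plug-in estimators. As one concrete illustration (our choice of example, not necessarily a corrector discussed in the entry), the classical Miller-Madow correction adds a first-order bias term to the plug-in entropy:

import numpy as np

def entropy_mm(counts):
    """Plug-in entropy (bits) with the Miller-Madow bias correction:
    H_MM = H_plugin + (m - 1) / (2 n ln 2), where m is the number of
    occupied bins and n the total number of samples."""
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    p = counts[counts > 0] / n
    h_plugin = -np.sum(p * np.log2(p))
    m = np.count_nonzero(counts)
    return h_plugin + (m - 1) / (2 * n * np.log(2))

print(entropy_mm([40, 30, 20, 10]))

More sophisticated estimators exist, but the example shows the common pattern: estimate naively, then correct for the bias induced by finite sampling.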
Summary of Information Theoretic Quantities
Robin A. A. Ince, Stefano Panzeri, Simon R. Schultz
Quantitative Biology, 2015, DOI: 10.1007/978-1-4614-7320-6_306-1
Abstract: Information theory is a practical and theoretical framework developed for the study of communication over noisy channels. Its probabilistic basis and capacity to relate statistical structure to function make it ideally suited for studying information flow in the nervous system. As a framework it has a number of useful properties: it provides a general measure sensitive to any relationship, not only linear effects; its quantities have meaningful units which in many cases allow direct comparison between different experiments; and it can be used to study how much information can be gained by observing neural responses in single experimental trials, rather than in averages over multiple trials. A variety of information-theoretic quantities are in common use in neuroscience, including the Shannon entropy, the Kullback-Leibler divergence, and the mutual information. In this entry, we introduce and define these quantities. Further details on how these quantities can be estimated in practice are provided in the entry "Estimation of Information-Theoretic Quantities", and examples of the application of these techniques in neuroscience can be found in the entry "Applications of Information-Theoretic Quantities in Neuroscience".
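For reference, the standard discrete-case definitions of the three quantities named above are:

$H(X) = -\sum_x p(x) \log_2 p(x)$

$D_{\mathrm{KL}}(p \,\|\, q) = \sum_x p(x) \log_2 \frac{p(x)}{q(x)}$

$I(X;Y) = \sum_{x,y} p(x,y) \log_2 \frac{p(x,y)}{p(x)\,p(y)} = D_{\mathrm{KL}}\big(p(x,y) \,\|\, p(x)\,p(y)\big)$

so that mutual information is the KL divergence between the joint distribution and the product of its marginals, measured here in bits.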
Stability of the replica symmetric solution for the information conveyed by a neural network
Simon Schultz, Alessandro Treves
Quantitative Biology, 1997, DOI: 10.1103/PhysRevE.57.3302
Abstract: The information that a pattern of firing in the output layer of a feedforward network of threshold-linear neurons conveys about the network's inputs is considered. A replica-symmetric solution is found to be stable for all but small amounts of noise. The region of instability depends on the contribution of the threshold and on the sparseness: for distributed pattern distributions, the unstable region extends to higher noise variances than for very sparse distributions, for which it is almost nonexistent.
Algorithms for Mean-Risk Stochastic Integer Programs in Energy
Rüdiger Schultz, Frederike Neise
Revista Investigación Operacional, 2007
Abstract: We introduce models and algorithms suitable for including risk aversion into stochastic programming problems in energy. For a system with dispersed generation of power and heat we present computational results showing the superiority of our decomposition algorithm over a standard mixed-integer linear programming solver.
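As a minimal sketch of what a mean-risk objective looks like, here is a scenario-based evaluation using expected excess over a preselected cost target as the risk measure; the weighting rho, the data, and all names are illustrative, and the paper's decomposition algorithm itself is not reproduced:

import numpy as np

def mean_risk(costs, probs, target, rho):
    """Mean-risk objective: E[cost] + rho * expected excess over target.
    costs: scenario costs of one candidate first-stage decision
    probs: scenario probabilities"""
    costs = np.asarray(costs, dtype=float)
    probs = np.asarray(probs, dtype=float)
    mean = probs @ costs
    excess = probs @ np.maximum(costs - target, 0.0)   # risk term
    return mean + rho * excess

# compare two candidate decisions over three scenarios
probs = [0.5, 0.3, 0.2]
print(mean_risk([90, 110, 200], probs, target=130, rho=2.0))
print(mean_risk([125, 125, 125], probs, target=130, rho=2.0))

Here the first decision has the lower expected cost but the heavier tail, so for rho = 2.0 the risk-averse objective prefers the second; larger rho penalises overshooting the target more heavily.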
A unified approach to the study of temporal, correlational and rate coding
S. Panzeri, S. R. Schultz
Physics, 1999
Abstract: We demonstrate that the information contained in the spike occurrence times of a population of neurons can be broken up into a series of terms, each of which reflects something about potential coding mechanisms. This is possible in the coding régime in which few spikes are emitted in the relevant time window. This approach allows us to study the additional information contributed by spike timing beyond that present in the spike counts; to examine the contributions to the whole information of different statistical properties of spike trains, such as firing rates and correlation functions; and it forms the basis for a new quantitative procedure for the analysis of simultaneous multiple-neuron recordings. It also provides theoretical constraints upon neural coding strategies. We find a transition between two coding régimes, depending upon the size of the relevant observation timescale. For time windows shorter than the timescale of the stimulus-induced response fluctuations, there exists a spike count coding phase, where the purely temporal information is of third order in time. For time windows much longer than the characteristic timescale, there can be additional timing information of first order, leading to a temporal coding phase in which timing information may affect the instantaneous information rate. We study the relative contributions of the dynamic firing rate and correlation variables to the full temporal information; the interaction of signal and noise correlations in temporal coding; synergy between spikes and between cells; and the effect of refractoriness. We illustrate the utility of the technique by analysing a few cells from the rat barrel cortex.
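To make the short-window, count-coding limit concrete, a minimal sketch estimating the mutual information between a binary stimulus and the spike count of a Poisson neuron in a 20 ms window; the rates, window length, and names are illustrative, and the plug-in estimator ignores the sampling-bias issues discussed in the entries above:

import numpy as np

def mi_plugin(s, r):
    """Plug-in mutual information (bits) between two discrete arrays."""
    n = len(s)
    _, si = np.unique(s, return_inverse=True)
    _, ri = np.unique(r, return_inverse=True)
    joint = np.zeros((si.max() + 1, ri.max() + 1))
    np.add.at(joint, (si, ri), 1)                 # joint histogram
    p = joint / n
    ps = p.sum(axis=1, keepdims=True)
    pr = p.sum(axis=0, keepdims=True)
    nz = p > 0
    return np.sum(p[nz] * np.log2(p[nz] / (ps @ pr)[nz]))

rng = np.random.default_rng(3)
T = 0.02                                  # 20 ms observation window
rates = np.array([10.0, 40.0])            # Hz, one firing rate per stimulus
s = rng.integers(2, size=5000)
counts = rng.poisson(rates[s] * T)
print(mi_plugin(s, counts))

In such short windows most trials carry zero or one spike, so nearly all of the information is in the spike count; lengthening T relative to the response fluctuations is what opens the door to the first-order timing contributions described above.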