State-Space Algorithms for Estimating Spike Rate Functions

DOI: 10.1155/2010/426539

Abstract:

The accurate characterization of spike firing rates, including the determination of when changes in activity occur, is a fundamental issue in the analysis of neurophysiological data. Here we describe a state-space model for estimating the spike rate function that provides a maximum likelihood estimate of the spike rate, assessments of model goodness of fit, and confidence intervals for the spike rate function and any other associated quantities of interest. Using simulated spike data, we first compare the performance of the state-space approach with that of Bayesian adaptive regression splines (BARS) and a simple cubic spline smoothing algorithm. We show that the state-space model is computationally efficient and comparable in performance with these spline approaches. Our results suggest a theoretically sound and practical approach for estimating spike rate functions that is applicable to a wide range of neurophysiological data.

1. Introduction

When does a neuron respond to an external sensory stimulus or to a motor movement? When does its response to that stimulus reach its maximum? Does that response change over time with experience? Neurophysiologists and statisticians have been trying to develop approaches to address these questions ever since such experiments first became possible. One of the most widely used methods for determining whether and when a neuron fired in response to the stimulus is the peristimulus time histogram (PSTH), which simply averages the responses within each time bin across all the trials collected. However, because there is no principled way of choosing the bin size for the PSTH, its interpretation is difficult. An even more challenging problem is characterizing neural responses to a stimulus when they change over time, as is the case during learning. Again, averaging techniques are typically used to characterize changes across trials, but averaging across 5 or 10 trials severely limits the temporal resolution of this kind of analysis. Beyond averaging techniques, a range of more sophisticated statistical methods have been applied to characterize neural activity, including regression or reverse correlation techniques [1], maximum likelihood fitting of parametric statistical models [2–9], and Bayesian approaches [10–13]. Recently, models have been proposed for the analysis of spike train data using the state-space approach [4, 14, 15]. The state-space model is a standard approach in engineering, statistics, and computer science for analyzing dynamic hidden or unobservable processes [15–18, 23]. It is defined by two equations: the state equation, which describes the evolution of the unobserved state over time, and the observation equation, which relates the observed data, here the spike train, to that state.
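
For concreteness, one common instance of such a point-process state-space model pairs a Gaussian random-walk state equation with a Poisson observation equation. The sketch below uses illustrative notation in the spirit of [15]; the specific random-walk and Poisson choices are assumptions for illustration, not necessarily the exact model developed in this paper.

```latex
% Illustrative point-process state-space model (notation and the
% random-walk/Poisson choices are assumptions in the spirit of [15]).
\begin{align}
  x_k &= x_{k-1} + \varepsilon_k, \qquad
        \varepsilon_k \sim \mathcal{N}\!\left(0, \sigma_{\varepsilon}^{2}\right)
        && \text{(state equation)} \\
  n_k \mid x_k &\sim \operatorname{Poisson}\!\left(\lambda_k \Delta\right), \qquad
        \lambda_k = \exp(x_k)
        && \text{(observation equation)}
\end{align}
% n_k: spike count in time bin k of width \Delta;  x_k: unobserved state;
% \lambda_k: spike rate function to be estimated.
```

In such a formulation, the state variance $\sigma_{\varepsilon}^{2}$ is typically estimated by maximum likelihood, for example with an EM algorithm [29], and the spike rate estimate with its confidence intervals then follows from the smoothed state estimate.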

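As a concrete illustration of the PSTH computation discussed in the introduction, the following is a minimal sketch; the function name psth, the simulated spike times, and the 50 ms bin width are hypothetical, and only NumPy is assumed.

```python
# Minimal PSTH sketch: trial-averaged spike counts per time bin,
# converted to a firing rate in spikes per second.
import numpy as np

def psth(spike_times_per_trial, t_start, t_stop, bin_width):
    """Return bin centers and the trial-averaged firing rate (spikes/s)."""
    edges = np.arange(t_start, t_stop + bin_width, bin_width)
    counts = np.zeros(len(edges) - 1)
    for spikes in spike_times_per_trial:        # accumulate counts over trials
        counts += np.histogram(spikes, bins=edges)[0]
    rate = counts / (len(spike_times_per_trial) * bin_width)
    centers = edges[:-1] + bin_width / 2.0
    return centers, rate

# Hypothetical data: 5 trials of spike times (seconds) after stimulus onset.
trials = [np.sort(np.random.uniform(0.0, 1.0, size=20)) for _ in range(5)]
centers, rate = psth(trials, t_start=0.0, t_stop=1.0, bin_width=0.05)
```

Re-running the sketch with different values of bin_width shows how strongly the resulting rate estimate depends on this choice, which is the limitation the state-space and spline-based methods are intended to address.
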
References

[1]  M. E. Koelling and D. Q. Nykamp, “Computing linear approximations to nonlinear neuronal response,” Network: Computation in Neural Systems, vol. 19, no. 4, pp. 286–313, 2008.
[2]  M. B. Ahrens, L. Paninski, and M. Sahani, “Inferring input nonlinearities in neural encoding models,” Network: Computation in Neural Systems, vol. 19, no. 1, pp. 35–67, 2008.
[3]  G. Czanner, U. T. Eden, S. Wirth, M. Yanike, W. A. Suzuki, and E. N. Brown, “Analysis of between-trial and within-trial neural spiking dynamics,” Journal of Neurophysiology, vol. 99, no. 5, pp. 2672–2693, 2008.
[4]  Q. J. M. Huys, M. B. Ahrens, and L. Paninski, “Efficient estimation of detailed single-neuron models,” Journal of Neurophysiology, vol. 96, no. 2, pp. 872–890, 2006.
[5]  P. Mullowney and S. Iyengar, “Parameter estimation for a leaky integrate-and-fire neuronal model from ISI data,” Journal of Computational Neuroscience, vol. 24, no. 2, pp. 179–194, 2008.
[6]  H. Nalatore, M. Ding, and G. Rangarajan, “Denoising neural data with state-space smoothing: method and application,” Journal of Neuroscience Methods, vol. 179, no. 1, pp. 131–141, 2009.
[7]  L. Paninski, “The most likely voltage path and large deviations approximations for integrate-and-fire neurons,” Journal of Computational Neuroscience, vol. 21, no. 1, pp. 71–87, 2006.
[8]  L. Paninski, M. R. Fellows, N. G. Hatsopoulos, and J. P. Donoghue, “Spatiotemporal tuning of motor cortical neurons for hand position and velocity,” Journal of Neurophysiology, vol. 91, no. 1, pp. 515–532, 2004.
[9]  W. Truccolo, U. T. Eden, M. R. Fellows, J. P. Donoghue, and E. N. Brown, “A point process framework for relating neural spiking activity to spiking history, neural ensemble, and extrinsic covariate effects,” Journal of Neurophysiology, vol. 93, no. 2, pp. 1074–1089, 2005.
[10]  S. Behseta and R. E. Kass, “Testing equality of two functions using BARS,” Statistics in Medicine, vol. 24, no. 22, pp. 3523–3534, 2005.
[11]  S. Behseta, R. E. Kass, D. E. Moorman, and C. R. Olson, “Testing equality of several functions: analysis of single-unit firing-rate curves across multiple experimental conditions,” Statistics in Medicine, vol. 26, no. 21, pp. 3958–3975, 2007.
[12]  C. G. Kaufman, V. Ventura, and R. E. Kass, “Spline-based non-parametric regression for periodic functions and its applications to directional tuning of neurons,” Statistics in Medicine, vol. 24, no. 14, pp. 2255–2265, 2005.
[13]  G. Wallstrom, J. Liebner, and R. E. Kass, “An implementation of Bayesian adaptive regression splines (BARS) in C with S and R wrappers,” Journal of Statistical Software, vol. 26, no. 1, pp. 1–21, 2008.
[14]  J. E. Kulkarni and L. Paninski, “State-space decoding of goal-directed movements,” IEEE Signal Processing Magazine, vol. 25, no. 1, pp. 78–86, 2008.
[15]  A. C. Smith and E. N. Brown, “Estimating a state-space model from point process observations,” Neural Computation, vol. 15, pp. 965–991, 2003.
[16]  J. Durbin and S. J. Koopman, Time Series Analysis by State Space Methods, Oxford University Press, Oxford, UK, 2001.
[17]  N. Kashiwagi and T. Yanagimoto, “Smoothing serial count data through a state-space model,” Biometrics, vol. 48, no. 4, pp. 1187–1194, 1992.
[18]  G. Kitagawa and W. Gersch, Smoothness Priors Analysis of Time Series, Springer, New York, NY, USA, 1996.
[19]  E. N. Brown, D. P. Nguyen, L. M. Frank, M. A. Wilson, and V. Solo, “An analysis of neural receptive field plasticity by point process adaptive filtering,” Proceedings of the National Academy of Sciences of the United States of America, vol. 98, no. 21, pp. 12261–12266, 2001.
[20]  S. Wirth, M. Yanike, L. M. Frank, A. C. Smith, E. N. Brown, and W. A. Suzuki, “Single neurons in the monkey hippocampus and learning of new associations,” Science, vol. 300, no. 5625, pp. 1578–1581, 2003.
[21]  I. DiMatteo, C. R. Genovese, and R. E. Kass, “Bayesian curve-fitting with free-knot splines,” Biometrika, vol. 88, no. 4, pp. 1055–1071, 2001.
[22]  C. R. Olson, S. N. Gettner, V. Ventura, R. Carta, and R. E. Kass, “Neuronal activity in macaque supplementary eye field during planning of saccades in response to pattern and spatial cues,” Journal of Neurophysiology, vol. 84, no. 3, pp. 1369–1384, 2000.
[23]  G. Kitagawa, “Non-Gaussian state-space modeling of nonstationary time series,” Journal of the American Statistical Association, vol. 82, pp. 1032–1041, 1987.
[24]  L. Fahrmeir and G. Tutz, Multivariate Statistical Modelling Based on Generalized Linear Models, Springer, New York, NY, USA, 2nd edition, 2001.
[25]  S. Roweis and Z. Ghahramani, “A unifying review of linear Gaussian models,” Neural Computation, vol. 11, no. 2, pp. 305–345, 1999.
[26]  D. J. Daley and D. Vere-Jones, An Introduction to the Theory of Point Processes, Springer, New York, NY, USA, 2nd edition, 2003.
[27]  J. D. Kalbfleisch and R. L. Prentice, The Statistical Analysis of Failure Time Data, John Wiley & Sons, Hoboken, NJ, USA, 2nd edition, 2002.
[28]  E. N. Brown, “Theory of point processes for neural systems,” in Methods and Models in Neurophysics, C. C. Chow, B. Gutkin, D. Hansel, C. Meunier, and J. Dalibard, Eds., pp. 691–726, Elsevier, Paris, France, 2005.
[29]  A. P. Dempster, N. M. Laird, and D. B. Rubin, “Maximum likelihood from incomplete data via the EM algorithm,” Journal of the Royal Statistical Society, Series B, vol. 39, pp. 1–38, 1977.
[30]  P. G. Hoel, S. C. Port, and C. J. Stone, Introduction to Probability Theory, Houghton Mifflin, Boston, Mass, USA, 1971.
[31]  A. C. Smith, M. R. Stefani, B. Moghaddam, and E. N. Brown, “Analysis and design of behavioral experiments to characterize population learning,” Journal of Neurophysiology, vol. 93, no. 3, pp. 1776–1792, 2005.
[32]  N. L. Johnson and S. Kotz, Continuous Univariate Distributions, John Wiley & Sons, New York, NY, USA, 1970.
[33]  R. E. Kass, V. Ventura, and C. Cai, “Statistical smoothing of neuronal data,” Network: Computation in Neural Systems, vol. 14, no. 1, pp. 5–15, 2003.
[34]  D. G. T. Denison, B. K. Mallick, and A. F. M. Smith, “Automatic Bayesian curve fitting,” Journal of the Royal Statistical Society, Series B, vol. 60, no. 2, pp. 333–350, 1998.
[35]  G. E. P. Box, G. M. Jenkins, and G. C. Reinsel, Time Series Analysis: Forecasting and Control, John Wiley & Sons, Hoboken, NJ, USA, 4th edition, 2008.
[36]  A. C. Smith, L. M. Frank, S. Wirth, et al., “Dynamic analysis of learning in behavioral experiments,” Journal of Neuroscience, vol. 24, no. 2, pp. 447–461, 2004.
[37]  P. Congdon, Applied Bayesian Modelling, John Wiley & Sons, Chichester, UK, 2003.
