

An Overview of Bayesian Methods for Neural Spike Train Analysis

DOI: 10.1155/2013/251905


Abstract:

Neural spike train analysis is an important task in computational neuroscience, aimed at understanding neural mechanisms and gaining insight into neural circuits. With the advancement of multielectrode recording and imaging technologies, there is a growing demand for statistical tools to analyze the spiking activity of large neuronal ensembles. Here we present a tutorial overview of Bayesian methods and their representative applications in neural spike train analysis, at both the single-neuron and population levels. On the theoretical side, we focus on various approximate Bayesian inference techniques as applied to latent state and parameter estimation. On the application side, the topics include spike sorting, tuning curve estimation, neural encoding and decoding, deconvolution of spike trains from calcium imaging signals, and inference of neuronal functional connectivity and synchrony. Some research challenges and opportunities for neural spike train analysis are also discussed.

1. Introduction

Neuronal action potentials, or spikes, are the basic language that neurons use to represent and transmit information. Understanding the neuronal representations carried by spike trains remains a fundamental task in computational neuroscience [1, 2]. With the advancement of multielectrode array and imaging technologies, neuroscientists are now able to record large populations of neurons at fine temporal and spatial resolution [3]. To extract ("read out") information from, or inject/restore ("write in") signals to, neural circuits [4], there is an emerging need for modeling and analyzing neural spike trains recorded directly or extracted indirectly from neural signals, as well as for building closed-loop brain-machine interfaces (BMIs). Many good examples and applications can be found in the current and other special issues [5, 6]. In recent years, cutting-edge Bayesian methods have gained increasing attention in the analysis of neural data and neural spike trains.
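As a minimal illustration (not drawn from this paper) of the Bayesian machinery applied to a spike train, the sketch below performs conjugate Bayesian inference of a single neuron's firing rate from binned spike counts, assuming a Poisson observation model with a Gamma prior on the rate; all variable names and parameter values are hypothetical.

```python
import numpy as np

# Hypothetical sketch: infer a neuron's firing rate lambda (spikes/s)
# from binned spike counts. Model: counts ~ Poisson(lambda * dt),
# prior: lambda ~ Gamma(a0, b0) (shape/rate). The Gamma prior is
# conjugate to the Poisson likelihood, so the posterior is Gamma
# in closed form: Gamma(a0 + sum(counts), b0 + n_bins * dt).

rng = np.random.default_rng(0)
dt = 0.1                    # bin width in seconds (assumed)
true_rate = 20.0            # ground-truth rate used only for simulation
counts = rng.poisson(true_rate * dt, size=200)  # simulated spike counts

a0, b0 = 1.0, 1.0           # weak Gamma(shape, rate) prior
a_post = a0 + counts.sum()           # posterior shape
b_post = b0 + len(counts) * dt       # posterior rate

post_mean = a_post / b_post              # posterior mean firing rate
post_sd = np.sqrt(a_post) / b_post       # posterior standard deviation
print(f"posterior rate: {post_mean:.1f} +/- {post_sd:.1f} spikes/s")
```

With 20 s of simulated data, the posterior concentrates near the true rate; the same conjugate update underlies many of the tuning curve and encoding models surveyed below, with approximate inference taking over when conjugacy is lost.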
Despite the theoretical principle being well established since the inception of Bayes' rule [7], Bayesian machinery was not widely used in large-scale data analysis until very recently. This delay can be partially ascribed to two factors: first, the development of new methodologies and effective algorithms; second, the ever-increasing computing power. The major theoretical and methodological developments have been reported in the field of statistics, and numerous algorithms have been developed in applied statistics and machine learning for successful real-world applications [8]. It is time to push this research frontier to

References

[1]  E. N. Brown, R. E. Kass, and P. P. Mitra, “Multiple neural spike train data analysis: state-of-the-art and future challenges,” Nature Neuroscience, vol. 7, no. 5, pp. 456–461, 2004.
[2]  S. Grün and S. Rotter, Analysis of Parallel Spike Trains, Springer, New York, NY, USA, 2010.
[3]  I. H. Stevenson and K. P. Kording, “How advances in neural recording affect data analysis,” Nature Neuroscience, vol. 14, no. 2, pp. 139–142, 2011.
[4]  G. B. Stanley, “Reading and writing the neural code,” Nature Neuroscience, vol. 16, pp. 259–263, 2013.
[5]  Z. Chen, T. W. Berger, A. Cichocki, K. G. Oweiss, R. Quian Quiroga, and N. V. Thakor, “Signal processing for neural spike trains,” Computational Intelligence and Neuroscience, vol. 2010, Article ID 698751, 2 pages, 2010.
[6]  J. Macke, P. Berens, and M. Bethge, “Statistical analysis of multi-cell recordings: linking population coding models to experimental data,” Frontiers in Computational Neuroscience, vol. 5, article 35, 2011.
[7]  J. Bernardo and A. F. M. Smith, Bayesian Theory, John Wiley & Sons, New York, NY, USA, 1994.
[8]  A. Gelman, J. B. Carlin, H. S. Stern, and D. B. Rubin, Bayesian Data Analysis, Chapman & Hall/CRC, New York, NY, USA, 2nd edition, 2004.
[9]  Y. Pawitan, In All Likelihood: Statistical Modelling and Inference Using Likelihood, Clarendon Press, New York, NY, USA, 2001.
[10]  D. J. Daley and D. Vere-Jones, An Introduction to the Theory of Point Processes, Springer, New York, NY, USA, 2nd edition, 2003.
[11]  E. N. Brown, R. Barbieri, U. T. Eden, and L. M. Frank, “Likelihood methods for neural data analysis,” in Computational Neuroscience: A Comprehensive Approach, J. Feng, Ed., pp. 253–286, CRC Press, New York, NY, USA, 2003.
[12]  E. N. Brown, “Theory of point processes for neural systems,” in Methods and Models in Neurophysics, C. C. Chow, B. Gutkin, D. Hansel, et al., Eds., pp. 691–727, Elsevier, San Diego, Calif, USA, 2005.
[13]  Z. Chen, R. Barbieri, and E. N. Brown, “State-space modeling of neural spike train and behavioral data,” in Statistical Signal Processing for Neuroscience and Neurotechnology, K. Oweiss, Ed., pp. 161–200, Elsevier, San Diego, Calif, USA, 2010.
[14]  W. Truccolo, U. T. Eden, M. R. Fellows, J. P. Donoghue, and E. N. Brown, “A point process framework for relating neural spiking activity to spiking history, neural ensemble, and extrinsic covariate effects,” Journal of Neurophysiology, vol. 93, no. 2, pp. 1074–1089, 2005.
[15]  P. McCullagh and J. A. Nelder, Generalized Linear Models, Chapman & Hall/CRC Press, New York, NY, USA, 2nd edition, 1989.
[16]  E. Schneidman, M. J. Berry II, R. Segev, and W. Bialek, “Weak pairwise correlations imply strongly correlated network states in a neural population,” Nature, vol. 440, no. 7087, pp. 1007–1012, 2006.
[17]  H. Nasser, O. Marre, and B. Cessac, “Spatio-temporal spike train analysis for large scale networks using the maximum entropy principle and Monte Carlo method,” Journal of Statistical Mechanics, vol. 2013, Article ID P03006, 2013.
[18]  E. N. Brown, R. Barbieri, V. Ventura, R. E. Kass, and L. M. Frank, “The time-rescaling theorem and its application to neural spike train data analysis,” Neural Computation, vol. 14, no. 2, pp. 325–346, 2002.
[19]  S. Julier, J. Uhlmann, and H. F. Durrant-Whyte, “A new method for the nonlinear transformation of means and covariances in filters and estimators,” IEEE Transactions on Automatic Control, vol. 45, no. 3, pp. 477–482, 2000.
[20]  S. Särkkä, “On unscented Kalman filtering for state estimation of continuous-time nonlinear systems,” IEEE Transactions on Automatic Control, vol. 52, no. 9, pp. 1631–1641, 2007.
[21]  M. I. Jordan, Z. Ghahramani, T. S. Jaakkola, and L. K. Saul, “Introduction to variational methods for graphical models,” Machine Learning, vol. 37, no. 2, pp. 183–233, 1999.
[22]  H. Attias, “A variational Bayesian framework for graphical models,” in Advances in Neural Information Processing Systems (NIPS) 12, S. A. Solla, T. K. Leen, and K. R. Müller, Eds., MIT Press, Boston, Mass, USA, 2000.
[23]  M. Beal and Z. Ghahramani, “Variational Bayesian learning of directed graphical models,” Bayesian Analysis, vol. 1, no. 4, pp. 793–832, 2006.
[24]  D. J. MacKay, Information Theory, Inference, and Learning Algorithms, Cambridge University Press, New York, NY, USA, 2003.
[25]  C. M. Bishop, Pattern Recognition and Machine Learning, Springer, New York, NY, USA, 2006.
[26]  K. P. Murphy, Machine Learning: A Probabilistic Perspective, MIT Press, Cambridge, Mass, USA, 2012.
[27]  D. Barber, Bayesian Reasoning and Machine Learning, Cambridge University Press, New York, NY, USA, 2012.
[28]  D. Barber, A. T. Cemgil, and S. Chiappa, Bayesian Time Series Models, Cambridge University Press, New York, NY, USA, 2011.
[29]  T. M. Cover and J. A. Thomas, Elements of Information Theory, John Wiley & Sons, New York, NY, USA, 2nd edition, 2006.
[30]  A. Dempster, N. Laird, and D. B. Rubin, “Maximum likelihood from incomplete data via the EM algorithm,” Journal of the Royal Statistical Society B, vol. 39, pp. 1–38, 1977.
[31]  K. Katahira, K. Watanabe, and M. Okada, “Deterministic annealing variant of variational Bayes method,” Journal of Physics: Conference Series, vol. 95, no. 1, Article ID 012015, 2008.
[32]  K. Kurihara and M. Welling, “Bayesian k-means as a ‘maximization-expectation’ algorithm,” Neural Computation, vol. 21, no. 4, pp. 1145–1172, 2009.
[33]  J. Sung, Z. Ghahramani, and S.-Y. Bang, “Latent-space variational Bayes,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 30, no. 12, pp. 2236–2242, 2008.
[34]  J. Sung, Z. Ghahramani, and S.-Y. Bang, “Second-order latent-space variational Bayes for approximate Bayesian inference,” IEEE Signal Processing Letters, vol. 15, pp. 918–921, 2008.
[35]  R. E. Turner and M. Sahani, “Two problems with variational expectation maximisation for time series models,” in Bayesian Time Series Models, D. Barber, A. T. Cemgil, and S. Chiappa, Eds., pp. 115–138, Cambridge University Press, New York, NY, USA, 2011.
[36]  K. Watanabe, “An alternative view of variational Bayes and asymptotic approximations of free energy,” Machine Learning, vol. 86, no. 2, pp. 273–293, 2012.
[37]  A. Honkela, T. Raiko, M. Kuusela, M. Tornio, and J. Karhunen, “Approximate Riemannian conjugate gradient learning for fixed-form variational Bayes,” Journal of Machine Learning Research, vol. 11, pp. 3235–3268, 2010.
[38]  T. P. Minka, A family of algorithms for approximate Bayesian inference [Ph.D. thesis], Department of EECS, Massachusetts Institute of Technology, Cambridge, Mass, USA, 2001.
[39]  S.-I. Amari and H. Nagaoka, Methods of Information Geometry, Oxford University Press, New York, NY, USA, 2007.
[40]  W. R. Gilks, S. Richardson, and D. J. Spiegelhalter, Markov Chain Monte Carlo in Practice, Chapman & Hall/CRC, New York, NY, USA, 1995.
[41]  C. P. Robert and G. Casella, Monte Carlo Statistical Methods, Springer, New York, NY, USA, 2nd edition, 2004.
[42]  N. Metropolis, A. W. Rosenbluth, M. N. Rosenbluth, A. H. Teller, and E. Teller, “Equation of state calculations by fast computing machines,” The Journal of Chemical Physics, vol. 21, no. 6, pp. 1087–1092, 1953.
[43]  W. K. Hastings, “Monte Carlo sampling methods using Markov chains and their applications,” Biometrika, vol. 57, no. 1, pp. 97–109, 1970.
[44]  S. Geman and D. Geman, “Stochastic relaxation, Gibbs distributions, and the Bayesian restoration of images,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 6, no. 6, pp. 721–741, 1984.
[45]  R. M. Neal, “Suppressing random walks in Markov chain Monte Carlo using ordered overrelaxation,” Tech. Rep. 9508, Department of Statistics, University of Toronto, 1995.
[46]  T. Marshall and G. Roberts, “An adaptive approach to Langevin MCMC,” Statistics and Computing, vol. 22, no. 5, pp. 1041–1057, 2012.
[47]  Y. Qi and T. P. Minka, “Hessian-based Markov chain Monte-Carlo algorithms,” in Proceedings of the 1st Cape Cod Workshop on Monte Carlo Methods, Cape Cod, Mass, USA, September 2002.
[48]  P. J. Green, “Reversible jump Markov chain Monte Carlo computation and Bayesian model determination,” Biometrika, vol. 82, no. 4, pp. 711–732, 1995.
[49]  R. E. Kass and A. E. Raftery, “Bayes factors,” Journal of the American Statistical Association, vol. 90, no. 430, pp. 773–795, 1995.
[50]  M. Lavine and M. J. Schervish, “Bayes factors: what they are and what they are not,” American Statistician, vol. 53, no. 2, pp. 119–122, 1999.
[51]  S. M. Lewis and A. E. Raftery, “Estimating Bayes factors via posterior simulation with the Laplace-Metropolis estimator,” Journal of the American Statistical Association, vol. 92, no. 438, pp. 648–655, 1997.
[52]  T. Toni and M. P. H. Stumpf, “Simulation-based model selection for dynamical systems in systems and population biology,” Bioinformatics, vol. 26, no. 1, pp. 104–110, 2009.
[53]  R. M. Neal, Bayesian Learning for Neural Networks, Springer, New York, NY, USA, 1996.
[54]  J. A. Hoeting, D. Madigan, A. E. Raftery, and C. T. Volinsky, “Bayesian model averaging: a tutorial,” Statistical Science, vol. 14, no. 4, pp. 382–417, 1999.
[55]  A. E. Raftery, “Approximate Bayes factors and accounting for model uncertainty in generalised linear models,” Biometrika, vol. 83, no. 2, pp. 251–266, 1996.
[56]  Z. Chen and E. N. Brown, “State space model,” Scholarpedia, vol. 8, no. 3, Article ID 30868, 2013.
[57]  L. Paninski, Y. Ahmadian, D. G. Ferreira et al., “A new look at state-space models for neural data,” Journal of Computational Neuroscience, vol. 29, no. 1-2, pp. 107–126, 2010.
[58]  A. Papoulis, Probability, Random Variables, and Stochastic Processes, McGraw-Hill, New York, NY, USA, 4th edition, 2002.
[59]  C. P. Robert, T. Rydén, and D. M. Titterington, “Bayesian inference in hidden Markov models through the reversible jump Markov chain Monte Carlo method,” Journal of the Royal Statistical Society B, vol. 62, no. 1, pp. 57–75, 2000.
[60]  S. L. Scott, “Bayesian methods for hidden Markov models: recursive computing in the 21st century,” Journal of the American Statistical Association, vol. 97, no. 457, pp. 337–351, 2002.
[61]  Z. Ghahramani, “Learning dynamic Bayesian networks,” in Adaptive Processing of Sequences and Data Structures, C. L. Giles and M. Gori, Eds., pp. 168–197, Springer, New York, NY, USA, 1998.
[62]  R. E. Kalman, “A new approach to linear filtering and prediction problems,” Transactions of the ASME, vol. 82, pp. 35–45, 1960.
[63]  W. Wu, Y. Gao, E. Bienenstock, J. P. Donoghue, and M. J. Black, “Bayesian population decoding of motor cortical activity using a Kalman filter,” Neural Computation, vol. 18, no. 1, pp. 80–118, 2006.
[64]  W. Wu, J. E. Kulkarni, N. G. Hatsopoulos, and L. Paninski, “Neural decoding of hand motion using a linear state-space model with hidden states,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 17, no. 4, pp. 370–378, 2009.
[65]  E. N. Brown, L. M. Frank, D. Tang, M. C. Quirk, and M. A. Wilson, “A statistical paradigm for neural spike train decoding applied to position prediction from ensemble firing patterns of rat hippocampal place cells,” Journal of Neuroscience, vol. 18, no. 18, pp. 7411–7425, 1998.
[66]  A. C. Smith and E. N. Brown, “Estimating a state-space model from point process observations,” Neural Computation, vol. 15, no. 5, pp. 965–991, 2003.
[67]  U. T. Eden, L. M. Frank, R. Barbieri, V. Solo, and E. N. Brown, “Dynamic analysis of neural encoding by point process adaptive filtering,” Neural Computation, vol. 16, no. 5, pp. 971–998, 2004.
[68]  S. Koyama, L. Castellanos Pérez-Bolde, C. Rohilla Shalizi, and R. E. Kass, “Approximate methods for state-space models,” Journal of the American Statistical Association, vol. 105, no. 489, pp. 170–180, 2010.
[69]  A. Doucet, N. de Freitas, and N. Gordon, Sequential Monte Carlo Methods in Practice, Springer, New York, NY, USA, 2001.
[70]  A. E. Brockwell, A. L. Rojas, and R. E. Kass, “Recursive Bayesian decoding of motor cortical signals by particle filtering,” Journal of Neurophysiology, vol. 91, no. 4, pp. 1899–1907, 2004.
[71]  A. Ergun, R. Barbieri, U. T. Eden, M. A. Wilson, and E. N. Brown, “Construction of point process adaptive filter algorithms for neural system using sequential Monte Carlo methods,” IEEE Transactions on Biomedical Engineering, vol. 54, pp. 419–428, 2007.
[72]  V. Šmídl and A. Quinn, “Variational Bayesian filtering,” IEEE Transactions on Signal Processing, vol. 56, no. 10, pp. 5020–5030, 2008.
[73]  Y. Salimpour, H. Soltanian-Zadeh, S. Salehi, N. Emadi, and M. Abouzari, “Neuronal spike train analysis in likelihood space,” PLoS ONE, vol. 6, no. 6, Article ID e21256, 2011.
[74]  N. L. Hjort, C. Holmes, P. Müller, and S. G. Walker, Bayesian Nonparametrics, Cambridge University Press, New York, NY, USA, 2010.
[75]  Z. Ghahramani, “Bayesian nonparametrics and the probabilistic approach to modeling,” Philosophical Transactions of the Royal Society A, vol. 371, Article ID 20110553, 2013.
[76]  E. Fox, E. Sudderth, M. Jordan, and A. Willsky, “Bayesian nonparametric methods for learning Markov switching processes,” IEEE Signal Processing Magazine, vol. 27, no. 6, pp. 43–54, 2010.
[77]  C. E. Rasmussen and C. K. I. Williams, Gaussian Processes for Machine Learning, MIT Press, Cambridge, Mass, USA, 2005.
[78]  J. Van Gael, Y. Saatci, Y. W. Teh, and Z. Ghahramani, “Beam sampling for the infinite hidden Markov model,” in Proceedings of the 25th International Conference on Machine Learning (ICML '08), pp. 1088–1095, Helsinki, Finland, July 2008.
[79]  F. Gabbiani and C. Koch, “Principles of spike train analysis,” in Methods in Neuronal Modeling: From Synapses to Networks, C. Koch and I. Segev, Eds., pp. 313–360, MIT Press, Boston, Mass, USA, 2nd edition, 1998.
[80]  R. E. Kass, V. Ventura, and E. N. Brown, “Statistical issues in the analysis of neuronal data,” Journal of Neurophysiology, vol. 94, no. 1, pp. 8–25, 2005.
[81]  J. S. Prentice, J. Homann, K. D. Simmons, G. Tkačik, V. Balasubramanian, and P. C. Nelson, “Fast, scalable, Bayesian spike identification for multi-electrode arrays,” PLoS ONE, vol. 6, no. 7, Article ID e19884, 2011.
[82]  F. Wood, M. J. Black, C. Vargas-Irwin, M. Fellows, and J. P. Donoghue, “On the variability of manual spike sorting,” IEEE Transactions on Biomedical Engineering, vol. 51, no. 6, pp. 912–918, 2004.
[83]  C. Ekanadham, D. Tranchina, and E. P. Simoncelli, “A blind deconvolution method for neural spike identification,” in Proceedings of the 25th Annual Conference on Neural Information Processing Systems (NIPS '11), vol. 23, MIT Press, December 2011.
[84]  M. S. Lewicki, “A review of methods for spike sorting: the detection and classification of neural action potentials,” Network, vol. 9, no. 4, pp. R53–R78, 1998.
[85]  D. P. Nguyen, L. M. Frank, and E. N. Brown, “An application of reversible-jump Markov chain Monte Carlo to spike classification of multi-unit extracellular recordings,” Network, vol. 14, no. 1, pp. 61–82, 2003.
[86]  F. Wood and M. J. Black, “A nonparametric Bayesian alternative to spike sorting,” Journal of Neuroscience Methods, vol. 173, no. 1, pp. 1–12, 2008.
[87]  J. A. Herbst, S. Gammeter, D. Ferrero, and R. H. R. Hahnloser, “Spike sorting with hidden Markov models,” Journal of Neuroscience Methods, vol. 174, no. 1, pp. 126–134, 2008.
[88]  A. Calabrese and L. Paninski, “Kalman filter mixture model for spike sorting of non-stationary data,” Journal of Neuroscience Methods, vol. 196, no. 1, pp. 159–169, 2011.
[89]  V. Ventura, “Automatic spike sorting using tuning information,” Neural Computation, vol. 21, no. 9, pp. 2466–2501, 2009.
[90]  V. Ventura, “Traditional waveform based spike sorting yields biased rate code estimates,” Proceedings of the National Academy of Sciences of the United States of America, vol. 106, no. 17, pp. 6921–6926, 2009.
[91]  M. Park and J. W. Pillow, “Receptive field inference with localized priors,” PLoS Computational Biology, vol. 7, no. 10, Article ID e1002219, 2011.
[92]  I. M. Park and J. W. Pillow, “Bayesian spike-triggered covariance analysis,” in Advances in Neural Information Processing Systems (NIPS), J. Shawe-Taylor, R. Zemel, P. Bartlett, F. Fereira, and K. Q. Weinberger, Eds., vol. 24, pp. 1692–1700, MIT Press, Boston, Mass, USA, 2011.
[93]  D. Endres and M. Oram, “Feature extraction from spike trains with Bayesian binning: ‘Latency is where the signal starts’,” Journal of Computational Neuroscience, vol. 29, no. 1-2, pp. 149–169, 2010.
[94]  I. Dimatteo, C. R. Genovese, and R. E. Kass, “Bayesian curve-fitting with free-knot splines,” Biometrika, vol. 88, no. 4, pp. 1055–1071, 2001.
[95]  A. C. Smith, J. D. Scalon, S. Wirth, M. Yanike, W. A. Suzuki, and E. N. Brown, “State-space algorithms for estimating spike rate functions,” Computational Intelligence and Neuroscience, vol. 2010, Article ID 426539, 2010.
[96]  B. Cronin, I. H. Stevenson, M. Sur, and K. P. Körding, “Hierarchical Bayesian modeling and Markov chain Monte Carlo sampling for tuning-curve analysis,” Journal of Neurophysiology, vol. 103, no. 1, pp. 591–602, 2010.
[97]  H. Taubman, E. Vaadia, R. Paz, and G. Chechik, “A Bayesian approach for characterizing direction tuning curves in the supplementary motor area of behaving monkeys,” Journal of Neurophysiology, 2013.
[98]  L. Paninski, J. Pillow, and J. Lewi, “Statistical models for neural encoding, decoding, and optimal stimulus design,” in Computational Neuroscience: Theoretical Insights Into Brain Function, P. Cisek, T. Drew, and J. Kalaska, Eds., Elsevier, 2007.
[99]  S. Gerwinn, J. H. Macke, M. Seeger, and M. Bethge, “Bayesian inference for spiking neuron models with a sparsity prior,” in Advances in Neural Information Processing Systems (NIPS), J. C. Platt, D. Koller, Y. Singer, and S. Roweis, Eds., vol. 20, pp. 529–536, MIT Press, Boston, Mass, USA, 2008.
[100]  J. W. Pillow and J. G. Scott, “Fully Bayesian inference for neural models with negative-binomial spiking,” in Advances in Neural Information Processing Systems (NIPS), P. Bartlett, F. C. N. Pereira, C. J. C. Burges, L. Bottou, and K. Q. Weinberger, Eds., vol. 25, pp. 1907–1915, MIT Press, Boston, Mass, USA, 2012.
[101]  S. Koyama, U. T. Eden, E. N. Brown, and R. E. Kass, “Bayesian decoding of neural spike trains,” Annals of the Institute of Statistical Mathematics, vol. 62, no. 1, pp. 37–59, 2010.
[102]  S. Boyd and L. Vandenberghe, Convex Optimization, Cambridge University Press, New York, NY, USA, 2004.
[103]  J. W. Pillow, Y. Ahmadian, and L. Paninski, “Model-based decoding, information estimation, and change-point detection techniques for multineuron spike trains,” Neural Computation, vol. 23, no. 1, pp. 1–45, 2011.
[104]  A. D. Ramirez, Y. Ahmadian, J. Schumacher, D. Schneider, S. M. N. Woolley, and L. Paninski, “Incorporating naturalistic correlation structure improves spectrogram reconstruction from neuronal activity in the songbird auditory midbrain,” Journal of Neuroscience, vol. 31, no. 10, pp. 3828–3842, 2011.
[105]  Z. Chen, K. Takahashi, and N. G. Hatsopoulos, “Sparse Bayesian inference methods for decoding 3D reach and grasp kinematics and joint angles with primary motor cortical ensembles,” in Proceedings of the 35th Annual International Conference of the IEEE Engineering in Medicine and Biology (EMBC '13), pp. 5930–5933, 2013.
[106]  K. Zhang, I. Ginzburg, B. L. McNaughton, and T. J. Sejnowski, “Interpreting neuronal population activity by reconstruction: unified framework with application to hippocampal place cells,” Journal of Neurophysiology, vol. 79, no. 2, pp. 1017–1044, 1998.
[107]  W. Truccolo, G. M. Friehs, J. P. Donoghue, and L. R. Hochberg, “Primary motor cortex tuning to intended movement kinematics in humans with tetraplegia,” Journal of Neuroscience, vol. 28, no. 5, pp. 1163–1178, 2008.
[108]  W. Truccolo and J. P. Donoghue, “Nonparametric modeling of neural point processes via stochastic gradient boosting regression,” Neural Computation, vol. 19, no. 3, pp. 672–705, 2007.
[109]  T. P. Coleman and S. S. Sarma, “A computationally efficient method for nonparametric modeling of neural spiking activity with point processes,” Neural Computation, vol. 22, no. 8, pp. 2002–2030, 2010.
[110]  M. M. Shanechi, E. N. Brown, and Z. M. Williams, “Neural population partitioning and a concurrent brain-machine interface for sequential motor function,” Nature Neuroscience, vol. 15, pp. 1715–1722, 2012.
[111]  M. M. Shanechi, G. W. Wornell, Z. Williams, and E. N. Brown, “A parallel point-process filter for estimation of goal-directed movements from neural signals,” in Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP '10), pp. 521–524, Dallas, Tex, USA, March 2010.
[112]  Y. Ahmadian, J. W. Pillow, and L. Paninski, “Efficient Markov chain monte carlo methods for decoding neural spike trains,” Neural Computation, vol. 23, no. 1, pp. 46–96, 2011.
[113]  A. K. Bansal, W. Truccolo, C. E. Vargas-Irwin, and J. P. Donoghue, “Decoding 3D reach and grasp from hybrid signals in motor and premotor cortices: spikes, multiunit activity, and local field potentials,” Journal of Neurophysiology, vol. 107, no. 5, pp. 1337–1355, 2012.
[114]  V. Ventura, “Spike train decoding without spike sorting,” Neural Computation, vol. 20, no. 4, pp. 923–963, 2008.
[115]  Z. Chen, F. Kloosterman, S. Layton, and M. A. Wilson, “Transductive neural decoding of unsorted neuronal spikes of rat hippocampus,” in Proceedings of the 34th Annual International Conference of the IEEE Engineering in Medicine and Biology (EMBC '12), pp. 1310–1313, August 2012.
[116]  F. Kloosterman, S. Layton, Z. Chen, and M. A. Wilson, “Bayesian decoding of unsorted spikes in the rat hippocampus,” Journal of Neurophysiology, 2013.
[117]  A. L. Jacobs, G. Fridman, R. M. Douglas et al., “Ruling out and ruling in neural codes,” Proceedings of the National Academy of Sciences of the United States of America, vol. 106, no. 14, pp. 5936–5941, 2009.
[118]  D. H. Johnson, “Information theory and neural information processing,” IEEE Transactions on Information Theory, vol. 56, no. 2, pp. 653–666, 2010.
[119]  C. Smith and L. Paninski, “Computing loss of efficiency in optimal Bayesian decoders given noisy or incomplete spike trains,” Network, vol. 24, no. 2, pp. 75–98, 2013.
[120]  D. S. Greenberg, A. R. Houweling, and J. N. D. Kerr, “Population imaging of ongoing neuronal activity in the visual cortex of awake rats,” Nature Neuroscience, vol. 11, no. 7, pp. 749–751, 2008.
[121]  J. T. Vogelstein, B. O. Watson, A. M. Packer, R. Yuste, B. Jedynak, and L. Paninski, “Spike inference from calcium imaging using sequential Monte Carlo methods,” Biophysical Journal, vol. 97, no. 2, pp. 636–655, 2009.
[122]  J. T. Vogelstein, A. M. Packer, T. A. Machado et al., “Fast nonnegative deconvolution for spike train inference from population calcium imaging,” Journal of Neurophysiology, vol. 104, no. 6, pp. 3691–3704, 2010.
[123]  C. Andrieu, E. Barat, and A. Doucet, “Bayesian deconvolution of noisy filtered point processes,” IEEE Transactions on Signal Processing, vol. 49, no. 1, pp. 134–146, 2001.
[124]  J. Oñativia, S. R. Schultz, and P. L. Dragotti, “A finite rate of innovation algorithm for fast and accurate spike detection from two-photon calcium imaging,” Journal of Neural Engineering, vol. 10, Article ID 046017, 2013.
[125]  J. W. Pillow, J. Shlens, L. Paninski et al., “Spatio-temporal correlations and visual signalling in a complete neuronal population,” Nature, vol. 454, no. 7207, pp. 995–999, 2008.
[126]  W. Truccolo, L. R. Hochberg, and J. P. Donoghue, “Collective dynamics in human and monkey sensorimotor cortex: predicting single neuron spikes,” Nature Neuroscience, vol. 13, no. 1, pp. 105–111, 2010.
[127]  E. S. Chornoboy, L. P. Schramm, and A. F. Karr, “Maximum likelihood identification of neural point process systems,” Biological Cybernetics, vol. 59, no. 4-5, pp. 265–275, 1988.
[128]  M. Okatan, M. A. Wilson, and E. N. Brown, “Analyzing functional connectivity using a network likelihood model of ensemble neural spiking activity,” Neural Computation, vol. 17, no. 9, pp. 1927–1961, 2005.
[129]  F. Rigat, M. de Gunst, and J. van Pelt, “Bayesian modelling and analysis of spatio-temporal neuronal networks,” Bayesian Analysis, vol. 1, no. 4, pp. 733–764, 2006.
[130]  I. H. Stevenson, J. M. Rebesco, N. G. Hatsopoulos, Z. Haga, L. E. Miller, and K. P. Kording, “Bayesian inference of functional connectivity and network structure from spikes,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 17, no. 3, pp. 203–213, 2009.
[131]  Z. Chen, D. F. Putrino, S. Ghosh, R. Barbieri, and E. N. Brown, “Statistical inference for assessing functional connectivity of neuronal ensembles with sparse spiking data,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 19, no. 2, pp. 121–135, 2011.
[132]  S. Eldawlatly, Y. Zhou, R. Jin, and K. G. Oweiss, “On the use of dynamic Bayesian networks in reconstructing functional neuronal networks from spike train ensembles,” Neural Computation, vol. 22, no. 1, pp. 158–189, 2010.
[133]  Y. Mishchenko, J. Vogelstein, and L. Paninski, “A Bayesian approach for inferring neuronal connectivity from calcium fluorescent imaging data,” Annals of Applied Statistics, vol. 5, pp. 1229–1261, 2011.
[134]  L. Martignon, G. Deco, K. Laskey, M. Diamond, W. Freiwald, and E. Vaadia, “Neural coding: higher-order temporal patterns in the neurostatistics of cell assemblies,” Neural Computation, vol. 12, no. 11, pp. 2621–2653, 2000.
[135]  H. Shimazaki, S. Amari, E. N. Brown, and S. Grün, “State-space analysis of time-varying higher-order spike correlation for multiple neural spike train data,” PLoS Computational Biology, vol. 8, no. 3, Article ID e1002385, 2012.
[136]  B. M. Turner, B. U. Forstmann, E.-J. Wagenmakers, S. D. Brown, P. B. Sederberg, and M. Steyvers, “A Bayesian framework for simultaneously modeling neural and behavioral data,” NeuroImage, vol. 72, pp. 193–206, 2013.
[137]  J. W. Pillow and P. Latham, “Neural characterization in partially observed populations of spiking neurons,” in Advances in Neural Information Processing Systems (NIPS), J. C. Platt, D. Koller, Y. Singer, and S. Roweis, Eds., vol. 20, pp. 1161–1168, MIT Press, Boston, Mass, USA, 2008.
[138]  L. Li, I. M. Park, S. Seth, J. C. Sanchez, and J. C. Príncipe, “Functional connectivity dynamics among cortical neurons: a dependence analysis,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 20, no. 1, pp. 18–30, 2012.
[139]  R. E. Kass, R. C. Kelly, and W.-L. Loh, “Assessment of synchrony in multiple neural spike trains using loglinear point process models,” The Annals of Applied Statistics, vol. 5, no. 2B, pp. 1262–1292, 2011.
[140]  S. Kim, D. Putrino, S. Ghosh, and E. N. Brown, “A Granger causality measure for point process models of ensemble neural spiking activity,” PLoS Computational Biology, vol. 7, no. 3, Article ID e1001110, 2011.
[141]  R. Vicente, M. Wibral, M. Lindner, and G. Pipa, “Transfer entropy-a model-free measure of effective connectivity for the neurosciences,” Journal of Computational Neuroscience, vol. 30, no. 1, pp. 45–67, 2011.
[142]  P. Berkes, F. Wood, and J. Pillow, “Characterizing neural dependencies with copula models,” in Advances in Neural Information Processing Systems (NIPS), J. C. Platt, D. Koller, Y. Singer, and S. Roweis, Eds., vol. 20, MIT Press, Boston, Mass, USA, 2008.
[143]  M. S. Smith, “Bayesian approaches to copula modelling,” in Bayesian Theory and Applications, P. Damien, P. Dellaportas, N. Polson, and D. Stephens, Eds., Oxford University Press, New York, NY, USA, 2013.
[144]  B. M. Yu, J. P. Cunningham, G. Santhanam, S. I. Ryu, K. V. Shenoy, and M. Sahani, “Gaussian-process factor analysis for low-dimensional single-trial analysis of neural population activity,” Journal of Neurophysiology, vol. 102, no. 1, pp. 614–635, 2009.
[145]  Z. Chen, F. Kloosterman, E. N. Brown, and M. A. Wilson, “Uncovering spatial topology represented by rat hippocampal population neuronal codes,” Journal of Computational Neuroscience, vol. 33, no. 2, pp. 227–255, 2012.
[146]  Z. Chen, S. N. Gomperts, J. Yamamoto, and M. A. Wilson, “Neural representation of spatial topology in the rodent hippocampus,” Neural Computation, vol. 26, no. 1, pp. 1–39, 2014.
[147]  Z. Chen and M. A. Wilson, “A variational nonparametric Bayesian approach for inferring rat hippocampal population codes,” in Proceedings of the 35th Annual International Conference of the IEEE Engineering in Medicine and Biology (EMBC '13), pp. 7092–7095, 2013.
[148]  K. Famm, B. Litt, K. J. Tracey, E. S. Boyden, and M. Slaoui, “A jump-start for electroceuticals,” Nature, vol. 496, pp. 159–161, 2013.
