Entropy, as a complexity measure, is a fundamental concept in time series analysis. Among the many available methods, sample entropy (SampEn) has emerged as a robust, powerful measure for quantifying the complexity of a time series owing to its insensitivity to data length and its immunity to noise. Despite its popular use, SampEn is computed on standardized data, so the variance, which may provide additional information for discriminant analysis, is routinely discarded. Here we designed a simple yet efficient complexity measure, variance entropy (VarEn), which integrates SampEn with variance to achieve effective discriminant analysis. We applied VarEn to local field potential (LFP) data collected from the visual cortex of a macaque monkey performing a generalized flash suppression task, in which the visual stimulus is dissociated from the perceptual experience, to study the neural complexity of perceptual awareness. We evaluated the performance of VarEn, in comparison with SampEn, on the LFP data at both single and multiple scales in discriminating different perceptual conditions. Our results showed that perceptual visibility could be differentiated by VarEn with significantly better discriminative performance than SampEn. These findings demonstrate that VarEn is a sensitive measure of perceptual visibility and can therefore be used to probe perceptual awareness of a stimulus.

1. Introduction

Over the past decades, entropy [1] has been widely used in the analysis of dynamic systems. Among the many measures, sample entropy (SampEn) is regarded as an effective, robust method because of its insensitivity to data length and its immunity to noise [2]. To date, SampEn has been successfully applied to discriminant analysis of cardiovascular data [3], electroencephalogram data [4], and many other signals [5]. In addition, SampEn has been used in multiscale analysis to compute entropy over the multiple time scales inherent in a time series. For example, multiscale entropy [6] and adaptive multiscale entropy (AME) [7] both use SampEn to estimate entropy over multiple scales of a time series. Despite its popularity, an inherent drawback of SampEn for discriminant analysis is not well recognized: the calculation of SampEn is routinely based on normalized data, so the variance, which may provide additional information for discrimination, is discarded [8]. The normalization essentially rescales the data, which is appropriate if the analysis is driven by the search for order in the dynamics, but is otherwise inappropriate for discriminant analysis of two data sets.
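To make the normalization issue concrete, the following is a minimal Python sketch of a conventional SampEn computation (an illustration under the commonly used parameter choices m = 2 and r = 0.2 in units of standard deviation, not the authors' implementation or the VarEn method itself). The z-scoring in the first step is where the variance that VarEn aims to retain is discarded.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Conventional sample entropy of a one-dimensional series.

    The series is z-scored first, so the tolerance r is expressed in units
    of standard deviation and the original variance is discarded -- the
    normalization step discussed in the text.
    """
    x = np.asarray(x, dtype=float)
    x = (x - x.mean()) / x.std()  # standardization: variance information is lost here
    n = len(x)

    def count_matches(length):
        # All overlapping templates of the given length.
        templates = np.array([x[i:i + length] for i in range(n - length)])
        count = 0
        for i in range(len(templates)):
            # Chebyshev distance from template i to every later template
            # (self-matches are excluded by construction).
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(d <= r)
        return count

    b = count_matches(m)      # template pairs matching at length m
    a = count_matches(m + 1)  # template pairs still matching at length m + 1
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

# Two series that differ only by an overall amplitude scaling give identical
# SampEn values, because the scaling is removed by the standardization.
rng = np.random.default_rng(0)
noise = rng.standard_normal(1000)
print(sample_entropy(noise), sample_entropy(5.0 * noise))
```

As the usage example shows, a fivefold difference in variance between two series is invisible to SampEn after standardization; VarEn is designed to reintroduce this discarded amplitude information for discrimination.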
References
[1] S. M. Pincus, “Approximate entropy as a measure of system complexity,” Proceedings of the National Academy of Sciences of the United States of America, vol. 88, no. 6, pp. 2297–2301, 1991.
[2] J. S. Richman and J. R. Moorman, “Physiological time-series analysis using approximate and sample entropy,” American Journal of Physiology, vol. 278, no. 6, pp. H2039–H2049, 2000.
[3] D. E. Lake, J. S. Richman, M. P. Griffin, and J. R. Moorman, “Sample entropy analysis of neonatal heart rate variability,” American Journal of Physiology, vol. 283, no. 3, pp. R789–R797, 2002.
[4] E. N. Bruce, M. C. Bruce, and S. Vennelaganti, “Sample entropy tracks changes in electroencephalogram power spectrum with sleep state and aging,” Journal of Clinical Neurophysiology, vol. 26, no. 4, pp. 257–266, 2009.
[5] S. Ramdani, B. Seigle, J. Lagarde, F. Bouchara, and P. L. Bernard, “On the use of sample entropy to analyze human postural sway data,” Medical Engineering and Physics, vol. 31, no. 8, pp. 1023–1031, 2009.
[6] M. Costa, A. L. Goldberger, and C. K. Peng, “Multiscale entropy analysis of complex physiologic time series,” Physical Review Letters, vol. 89, no. 6, Article ID 068102, 4 pages, 2002.
[7] M. Hu and H. Liang, “Adaptive multiscale entropy analysis of multivariate neural data,” IEEE Transactions on Biomedical Engineering, vol. 59, no. 1, pp. 12–15, 2012.
[8] J. S. Richman, D. E. Lake, and J. R. Moorman, “Sample entropy,” Methods in Enzymology, vol. 384, pp. 172–184, 2004.
[9] M. Wilke, N. K. Logothetis, and D. A. Leopold, “Generalized flash suppression of salient visual targets,” Neuron, vol. 39, no. 6, pp. 1043–1052, 2003.
[10] P. Grassberger and I. Procaccia, “Estimation of the Kolmogorov entropy from a chaotic signal,” Physical Review A, vol. 28, no. 4, pp. 2591–2593, 1983.
[11] J. P. Eckmann and D. Ruelle, “Ergodic theory of chaos and strange attractors,” Reviews of Modern Physics, vol. 57, no. 3, pp. 617–656, 1985.
[12] S. Lu, X. Chen, J. K. Kanters, I. C. Solomon, and K. H. Chon, “Automatic selection of the threshold value r for approximate entropy,” IEEE Transactions on Biomedical Engineering, vol. 55, no. 8, pp. 1966–1972, 2008.
[13] N. Rehman and D. P. Mandic, “Multivariate empirical mode decomposition,” Proceedings of the Royal Society A, vol. 466, no. 2117, pp. 1291–1302, 2010.
[14] M. Wilke, N. K. Logothetis, and D. A. Leopold, “Local field potential reflects perceptual suppression in monkey visual cortex,” Proceedings of the National Academy of Sciences of the United States of America, vol. 103, no. 46, pp. 17507–17512, 2006.
[15] L. V. Hedges and I. Olkin, Statistical Methods for Meta-Analysis, Academic Press, Orlando, Fla, USA, 1985.