A two-parameter generalization of the Boltzmann-Gibbs-Shannon entropy based on the natural logarithm is introduced. The generalization of the Shannon-Khinchin axioms corresponding to this two-parameter entropy is proposed and verified. We present the relative entropy and the Jensen-Shannon divergence measure and check their properties. The Fisher information measure, the relative Fisher information, and the Jensen-Fisher information corresponding to this entropy are also derived. The Lesche stability and the thermodynamic stability conditions are verified as well. We propose a generalization of a complexity measure and apply it to a two-level system and to a system obeying an exponential distribution. Using different distance measures, we define the statistical complexity and analyze it for two-level and five-level systems.

1. Introduction

Entropy is a fundamental quantity that plays a key role in many aspects of statistical mechanics and information theory. The most widely used form of entropy was given by Boltzmann and Gibbs from the statistical mechanics point of view and by Shannon from the information theory point of view. Later, other generalized measures of entropy, such as the Rényi entropy [1] and the Sharma-Mittal-Taneja entropy [2, 3], were introduced and their information theoretic aspects were investigated. More recently, a new expression for the entropy was proposed in [4] as a generalization of the Boltzmann-Gibbs entropy, and the necessary properties, namely concavity, Lesche stability, and thermodynamic stability, were verified. This entropy has been applied to a wide variety of physical systems, in particular to long-range interacting systems [5, 6] and non-Markovian systems [7]. Most of the generalized entropies introduced so far were constructed using a deformed logarithm. Two generalized entropies, however, the fractal entropy and the fractional entropy, were built on the natural logarithm. The fractal entropy, introduced in [8], attempts to describe complex systems whose phase space is fractal or chaotic. The fractional entropy was put forward in [9] and later applied to the study of anomalous diffusion [10]. Merging these two entropies, in the present work we propose a fractional entropy in a fractal phase space. The resulting entropy has two parameters, one characterizing its fractional nature and the other describing the fractal dimension of the phase space, while its functional form still depends only on the natural logarithm. We give the generalized Shannon-Khinchin axioms corresponding to this two-parameter entropy.
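For concreteness, the two building blocks can be written explicitly. The following is a minimal sketch for a discrete distribution {p_i} (with k_B = 1); the two-parameter expression shown last is only an illustrative merger consistent with the description above, not necessarily the exact form derived later in the paper:

% Fractal entropy of Wang [8]: the natural logarithm is retained and the
% fractal character of the phase space enters through the exponent q,
% together with the incomplete normalization \sum_i p_i^q = 1.
S^{W}_{q} = -\sum_i p_i^{q} \ln p_i

% Fractional entropy of Ubriaco [9]: the fractional order \alpha deforms
% the information content -\ln p_i rather than the probability weights.
S^{U}_{\alpha} = \sum_i p_i \, (-\ln p_i)^{\alpha}, \qquad 0 < \alpha \le 1

% Hypothetical two-parameter merger, shown for orientation only: it reduces
% to S^{W}_{q} at \alpha = 1, to S^{U}_{\alpha} at q = 1, and to the
% Boltzmann-Gibbs-Shannon entropy -\sum_i p_i \ln p_i at q = \alpha = 1.
S_{q,\alpha} = \sum_i p_i^{q} \, (-\ln p_i)^{\alpha}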
References
[1] A. Rényi, "On measures of entropy and information," in Proceedings of the 4th Berkeley Symposium on Mathematical Statistics and Probability, vol. 1, pp. 547–561, 1961.
[2] D. P. Mittal, "On some functional equations concerning entropy, directed divergence and inaccuracy," Metrika, vol. 22, pp. 35–45, 1975.
[3] B. D. Sharma and I. J. Taneja, "Entropy of type (α, β) and other generalized measures in information theory," Metrika, vol. 22, no. 4, pp. 205–215, 1975.
[4] C. Tsallis, "Possible generalization of Boltzmann-Gibbs statistics," Journal of Statistical Physics, vol. 52, no. 1-2, pp. 479–487, 1988.
[5] B. J. C. Cabral and C. Tsallis, "Metastability and weak mixing in classical long-range many-rotator systems," Physical Review E, vol. 66, no. 6, Article ID 065101(R), 2002.
[6] A. Pluchino, V. Latora, and A. Rapisarda, "Dynamics and thermodynamics of a model with long-range interactions," Continuum Mechanics and Thermodynamics, vol. 16, no. 3, pp. 245–255, 2004.
[7] A. M. Mariz and C. Tsallis, "Unified long-memory mesoscopic mechanism consistent with nonextensive statistical mechanics," Physics Letters A, vol. 376, no. 45, pp. 3088–3091, 2012.
[8] Q. A. Wang, "Extensive generalization of statistical mechanics based on incomplete information theory," Entropy, vol. 5, no. 2, pp. 220–232, 2003.
[9] M. R. Ubriaco, "Entropies based on fractional calculus," Physics Letters A, vol. 373, no. 30, pp. 2516–2519, 2009.
[10] M. R. Ubriaco, "A simple mathematical model for anomalous diffusion via Fisher's information theory," Physics Letters A, vol. 373, no. 44, pp. 4017–4021, 2009.
[11] R. López-Ruiz, H. L. Mancini, and X. Calbet, "A statistical measure of complexity," Physics Letters A, vol. 209, no. 5-6, pp. 321–326, 1995.
[12] F. Shafee, "A new nonextensive entropy," IMA Journal of Applied Mathematics, 2007.
[13] F. Shafee, "Generalized entropy from mixing: thermodynamics, mutual information and symmetry breaking," http://arxiv.org/abs/0906.2458.
[14] G. Kaniadakis, "Non-linear kinetics underlying generalized statistics," Physica A, vol. 296, no. 3-4, pp. 405–425, 2001.
[15] A. Lavagno, A. M. Scarfone, and P. N. Swamy, "Basic-deformed thermostatistics," Journal of Physics A: Mathematical and Theoretical, vol. 40, no. 30, pp. 8635–8654, 2007.
[16] E. T. Jaynes, "Gibbs vs Boltzmann entropies," American Journal of Physics, vol. 33, pp. 391–398, 1965.
[17] A. F. T. Martins, N. A. Smith, E. P. Xing, P. M. Q. Aguiar, and M. A. T. Figueiredo, "Nonextensive information theoretic kernels on measures," Journal of Machine Learning Research, vol. 10, pp. 935–975, 2009.
[18] J. Lin, "Divergence measures based on the Shannon entropy," IEEE Transactions on Information Theory, vol. 37, no. 1, pp. 145–151, 1991.
[19] G. V. Vstovsky, "Interpretation of the extreme physical information principle in terms of shift information," Physical Review E, vol. 51, p. 975, 1995.
[20] F. Pennini and A. Plastino, "Fisher's information measure in a Tsallis' nonextensive setting and its application to diffusive processes," Physica A, vol. 247, pp. 559–569, 1997.
[21] P. Sánchez-Moreno, A. Zarzo, and J. S. Dehesa, "Jensen divergence based on Fisher's information," Journal of Physics A: Mathematical and Theoretical, vol. 45, Article ID 125305, 2012.
[22] B. Lesche, "Instabilities of Rényi entropies," Journal of Statistical Physics, vol. 27, no. 2, pp. 419–422, 1982.
[23] B. Lesche, "Rényi entropies and observables," Physical Review E, vol. 70, Article ID 017102, 2004.
[24] S. Abe, G. Kaniadakis, and A. M. Scarfone, "Stabilities of generalized entropies," Journal of Physics A: Mathematical and General, vol. 37, no. 44, pp. 10513–10519, 2004.
[25] T. Wada, "Thermodynamic stabilities of the generalized Boltzmann entropies," Physica A, vol. 340, no. 1-3, pp. 126–130, 2004.
[26] A. M. Scarfone and T. Wada, "Thermodynamic equilibrium and its stability for microcanonical systems described by the Sharma-Taneja-Mittal entropy," Physical Review E, vol. 72, Article ID 026123, 2005.
[27] S. Lloyd and H. Pagels, "Complexity as thermodynamic depth," Annals of Physics, vol. 188, no. 1, pp. 186–213, 1988.
[28] J. S. Shiner, M. Davison, and P. T. Landsberg, "Simple measure for complexity," Physical Review E, vol. 59, no. 2, pp. 1459–1464, 1999.
[29] T. Yamano, "A statistical complexity measure with nonextensive entropy and quasi-multiplicativity," Journal of Mathematical Physics, vol. 45, no. 5, pp. 1974–1987, 2004.
[30] M. T. Martin, A. Plastino, and O. A. Rosso, "Statistical complexity and disequilibrium," Physics Letters A, vol. 311, no. 2-3, pp. 126–132, 2003.
[31] W. K. Wootters, "Statistical distance and Hilbert space," Physical Review D, vol. 23, no. 2, pp. 357–362, 1981.
[32] R. Metzler and J. Klafter, "The restaurant at the end of the random walk: recent developments in the description of anomalous transport by fractional dynamics," Journal of Physics A: Mathematical and General, vol. 37, no. 31, pp. R161–R208, 2004.
[33] A. M. Kowalski, M. T. Martín, A. Plastino, O. A. Rosso, and M. Casas, "Distances in probability space and the statistical complexity setup," Entropy, vol. 13, no. 6, pp. 1055–1075, 2011.