oalib
Search Results: 1 - 10 of 100 matches
All listed articles are free for downloading (OA Articles)
Page 1 /100
A non extensive approach to the entropy of symbolic sequences  [PDF]
Marco Buiatti,Paolo Grigolini,Luigi Palatella
Physics , 1999, DOI: 10.1016/S0378-4371(99)00062-X
Abstract: Symbolic sequences with long-range correlations are expected to result in a slow regression to a steady state of entropy increase. However, we prove that even in this case a fast transition to a constant rate of entropy increase can be obtained, provided that the non-extensive entropy of Tsallis with entropic index q is adopted, thereby resulting in a new form of entropy that we shall refer to as Kolmogorov-Sinai-Tsallis (KST) entropy. We assume that the same symbols, either 1 or -1, are repeated in strings of length l, with the probability distribution p(l) proportional to 1/(l^mu). The numerical evaluation of the KST entropy suggests that at the value mu = 2 a sort of abrupt transition might occur. For values of mu in the range 1 < mu < 2 the mean string length diverges, thereby breaking the balance between determinism and randomness in favor of determinism. In the region mu > 2 the entropic index q seems to depend on mu through the power law expression q = (mu-2)^(alpha) with alpha approximately 0.13 (q = 1 with mu > 3). It is argued that this phase-transition-like property signals the onset of the thermodynamical regime at mu = 2.
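The model in this abstract is concrete enough to sketch: strings of repeated +1 or -1 symbols whose lengths follow p(l) proportional to l^(-mu), analyzed with the Tsallis entropy S_q = (1 - sum p_i^q)/(q - 1). The sketch below (function names are illustrative, not from the paper) generates such a sequence and evaluates the Tsallis entropy of a probability vector; it is a minimal illustration of the setup, not the authors' full KST procedure.

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1); Shannon entropy as q -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if abs(q - 1.0) < 1e-12:
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def generate_sequence(n, mu, lmax=1000, rng=None):
    """Symbols +1/-1 repeated in strings of length l drawn with p(l) ~ l^(-mu)."""
    rng = np.random.default_rng(0) if rng is None else rng
    ls = np.arange(1, lmax + 1)
    w = ls ** (-float(mu))
    w /= w.sum()              # normalize the truncated power-law distribution
    seq, sym = [], 1
    while len(seq) < n:
        l = rng.choice(ls, p=w)
        seq.extend([sym] * l)  # a string of l identical symbols
        sym = -sym             # alternate the symbol between strings
    return np.array(seq[:n])
```

For mu < 2 the truncation at lmax matters, since the untruncated mean string length diverges, which is exactly the regime the abstract flags as dominated by determinism.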
Entropy and long-range correlations in random symbolic sequences  [PDF]
S. S. Melnik,O. V. Usatenko
Physics , 2014,
Abstract: The goal of this paper is to develop an estimate for the entropy of random long-range correlated symbolic sequences with elements belonging to a finite alphabet. As a plausible model, we use a high-order additive stationary ergodic Markov chain. Supposing that the correlations between random elements of the chain are weak, we express the differential entropy of the sequence by means of the symbolic pair correlation function. We also examine an algorithm for estimating the differential entropy of finite symbolic sequences. We show that the entropy contains two contributions: a correlation part and a fluctuation part. The analytical results obtained are used for numerical evaluation of the entropy of written English texts and DNA nucleotide sequences. The developed theory opens the way to a more consistent and sophisticated approach for describing systems with strong short-range and weak long-range correlations.
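The "differential entropy" mentioned here is, for a symbolic sequence, the conditional block-entropy increment h_k = H_{k+1} - H_k. A minimal plug-in estimator (names and implementation are illustrative assumptions, not the correlation-function method of the paper) can be sketched as:

```python
from collections import Counter
import math

def block_entropy(seq, k):
    """Plug-in Shannon entropy (in nats) of the empirical distribution of length-k blocks."""
    counts = Counter(tuple(seq[i:i + k]) for i in range(len(seq) - k + 1))
    n = sum(counts.values())
    return -sum((c / n) * math.log(c / n) for c in counts.values())

def entropy_rate_estimate(seq, k):
    """Conditional (differential) entropy h_k = H_{k+1} - H_k."""
    return block_entropy(seq, k + 1) - block_entropy(seq, k)
```

For long-range correlated data this naive estimator suffers from exactly the finite-sample fluctuation contribution the abstract separates out: when k grows, most blocks are seen once and the estimate collapses.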
Generalization of Entropy Based Divergence Measures for Symbolic Sequence Analysis  [PDF]
Miguel A. Ré, Rajeev K. Azad
PLOS ONE , 2014, DOI: 10.1371/journal.pone.0093532
Abstract: Entropy based measures have been frequently used in symbolic sequence analysis. A symmetrized and smoothed form of the Kullback-Leibler divergence or relative entropy, the Jensen-Shannon divergence (JSD), is of particular interest because of the properties it shares with families of other divergence measures and its interpretability in different domains, including statistical physics, information theory and mathematical statistics. The uniqueness and versatility of this measure arise from a number of attributes, including its generalization to any number of probability distributions and the association of weights with the distributions. Furthermore, its entropic formulation allows its generalization in different statistical frameworks, such as non-extensive Tsallis statistics and higher order Markovian statistics. We revisit these generalizations and propose a new generalization of JSD in the integrated Tsallis and Markovian statistical framework. We show that this generalization can be interpreted in terms of mutual information. We also investigate the performance of different JSD generalizations in deconstructing chimeric DNA sequences assembled from bacterial genomes, including those of E. coli, S. enterica Typhi, Y. pestis and H. influenzae. Our results show that the JSD generalizations bring in more pronounced improvements when the sequences being compared are from phylogenetically proximal organisms, which are often difficult to distinguish because of their compositional similarity. While small but noticeable improvements were observed with the Tsallis statistical JSD generalization, relatively large improvements were observed with the Markovian generalization. Moreover, the proposed Tsallis-Markovian generalization yielded more pronounced improvements relative to the Tsallis and Markovian generalizations, specifically when the sequences being compared arose from phylogenetically proximal organisms.
On uniqueness theorems for Tsallis entropy and Tsallis relative entropy  [PDF]
Shigeru Furuichi
Mathematics , 2004,
Abstract: The uniqueness theorem for Tsallis entropy was presented in {\it H.Suyari, IEEE Trans. Inform. Theory, Vol.50, pp.1783-1787 (2004)} by introducing the generalized Shannon-Khinchin's axiom. In the present paper, this result is generalized and simplified as follows: {\it Generalization}: The uniqueness theorem for Tsallis relative entropy is shown by means of the generalized Hobson's axiom. {\it Simplification}: The uniqueness theorem for Tsallis entropy is shown by means of the generalized Faddeev's axiom.
Tsallis entropy: How unique?  [PDF]
Sumiyoshi Abe
Physics , 2003, DOI: 10.1007/s00161-003-0153-1
Abstract: It is shown how, among a class of generalized entropies, the Tsallis entropy can uniquely be identified by the principles of thermodynamics, the concept of stability and the axiomatic foundation.
On Tsallis nonequilibrium entropy evolution  [PDF]
Xing Xiu-San
Physics , 2014,
Abstract: In this paper we derive a 6N-dimensional non-homogeneous evolution equation for the Tsallis non-equilibrium entropy and present a formula for the entropy production rate (i.e., the law of entropy increase) for the Tsallis entropy. The law of entropy increase holds only when the entropic index q > 0; it does not hold when q < 0 or q = 0.
Special Issue: Tsallis Entropy  [PDF]
Anastasios Anastasiadis
Entropy , 2012, DOI: 10.3390/e14020174
Abstract: One of the crucial properties of the Boltzmann-Gibbs entropy in the context of classical thermodynamics is extensivity, namely proportionality with the number of elements of the system. The Boltzmann-Gibbs entropy satisfies this prescription if the subsystems are statistically (quasi-) independent, or typically if the correlations within the system are essentially local. In such cases the energy of the system is typically extensive and the entropy is additive. In general, however, the situation is not of this type and correlations may be far from negligible at all scales. Tsallis in 1988 introduced an entropic expression characterized by an index q which leads to a non-extensive statistics. Tsallis entropy, Sq, is the basis of the so called non-extensive statistical mechanics, which generalizes the Boltzmann-Gibbs theory. Tsallis statistics have found applications in a wide range of phenomena in diverse disciplines such as physics, chemistry, biology, medicine, economics, geophysics, etc. The focus of this special issue of Entropy was to solicit contributions that apply Tsallis entropy in various scientific fields. [...]
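The non-additivity this editorial describes can be checked numerically: for two statistically independent subsystems A and B, the Tsallis entropy S_q = (1 - sum p_i^q)/(q - 1) obeys the pseudo-additivity rule S_q(A+B) = S_q(A) + S_q(B) + (1-q) S_q(A) S_q(B), recovering additivity only at q = 1. A minimal sketch (illustrative distributions):

```python
import numpy as np

def tsallis(p, q):
    """Tsallis entropy S_q; reduces to the Shannon entropy at q = 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if q == 1:
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

# Two independent subsystems: the joint distribution is the outer product.
pa = np.array([0.2, 0.8])
pb = np.array([0.5, 0.5])
joint = np.outer(pa, pb).ravel()

q = 2.0
lhs = tsallis(joint, q)
rhs = tsallis(pa, q) + tsallis(pb, q) + (1 - q) * tsallis(pa, q) * tsallis(pb, q)
# lhs and rhs agree, confirming pseudo-additivity for independent subsystems.
```

The extra cross term (1-q) S_q(A) S_q(B) is precisely what makes S_q non-extensive, and what makes it a candidate for systems with non-negligible correlations at all scales.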
Escort distributions and Tsallis entropy  [PDF]
Nikos Kalogeropoulos
Physics , 2012,
Abstract: We present an argument justifying the origin of the escort distributions used in calculations involving the Tsallis entropy. We rely on an induced hyperbolic Riemannian metric reflecting the generalized composition property of the Tsallis entropy. The mapping of the corresponding Riemannian measure on the space of thermodynamic variables gives the specific form of the escort distributions and provides a geometric interpretation of the non-extensive parameter. In addition, we explain the polynomial rate of increase of the sample space volume for systems described by the Tsallis entropy, thus extending the previously reached conclusions for discrete systems to the case of systems whose evolution is described by flows on Riemannian manifolds.
Is the Tsallis entropy stable?  [PDF]
James F. Lutsko,Jean Pierre Boon,Patrick Grosfils
Physics , 2009, DOI: 10.1209/0295-5075/86/40005
Abstract: The question of whether the Tsallis entropy is Lesche-stable is revisited. It is argued that when physical averages are computed with the escort probabilities, the correct application of the concept of Lesche-stability requires use of the escort probabilities. As a consequence, as shown here, the Tsallis entropy is unstable but the thermodynamic averages are stable. We further show that Lesche stability as well as thermodynamic stability can be obtained if the homogeneous entropy is used as the basis of the formulation of non-extensive thermodynamics. In this approach, the escort distribution arises naturally as a secondary structure.
On the thermodynamic stability conditions of Tsallis' entropy  [PDF]
T. Wada
Physics , 2002, DOI: 10.1016/S0375-9601(02)00378-X
Abstract: The thermodynamic stability condition (TSC) of Tsallis' entropy is revisited. As Ramshaw [Phys. Lett. A {\bf 198} (1995) 119] has already pointed out, the concavity of Tsallis' entropy with respect to the internal energy is not sufficient to guarantee thermodynamic stability for all values of $q$ due to the non-additivity of Tsallis' entropy. Taking account of the non-additivity the differential form of the TSC for Tsallis entropy is explicitly derived. It is shown that the resultant TSC for Tsallis' entropy is equivalent to the positivity of the standard specific heat. These results are consistent with the relation between Tsallis and R\'enyi entropies.
Copyright © 2008-2017 Open Access Library. All rights reserved.