Entropy is a universal concept in science, suitable for quantifying the uncertainty of a series of random events. We define and describe this notion in a manner appropriate for physicists. We start with a brief recapitulation of the basic concepts of probability theory that are needed for the definition of entropy, and we sketch the history of how this concept arrived at its present exact form. We show that the Shannon entropy represents the most adequate measure of the probabilistic uncertainty of a random object.
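For concreteness, recall the standard definition going back to Shannon's 1948 paper: the Shannon entropy of a discrete probability distribution P = (p_1, ..., p_n) is, in LaTeX notation,

H(P) = -\sum_{i=1}^{n} p_i \log p_i , \qquad p_i \ge 0 , \quad \sum_{i=1}^{n} p_i = 1 ,

where the base of the logarithm fixes the unit of uncertainty (base 2 gives bits, base e gives nats). H(P) vanishes for a certain event and reaches its maximum, \log n, for the uniform distribution, in accord with its interpretation as a measure of probabilistic uncertainty.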
Though the notion of entropy was introduced in classical thermodynamics as a thermodynamic state variable, it relies on concepts studied in probability theory and mathematical statistics. We point out that the whole formalism of statistical mechanics can be rewritten in terms of the Shannon entropy.
The notion "entropy" is understood differently in various scientific disciplines: in classical physics it denotes a thermodynamic state variable; in communication theory, the efficiency of a communication transmission; in the theory of general systems, the magnitude of configurational order; in ecology, a measure of biodiversity; in statistics, the degree of disorder; and so on. All these notions can be mapped onto the general mathematical concept of entropy. By means of entropy, the configurational order of complex systems can be exactly quantified. Besides the Shannon entropy, there exists a class of Shannon-like entropies which converge, under certain circumstances, toward the Shannon entropy. A Shannon-like entropy is sometimes easier to handle mathematically than the Shannon entropy. One important Shannon-like entropy is the well-known Tsallis entropy.
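As a concrete example (Tsallis' 1988 definition, with the conventional multiplicative constant set to one), the Tsallis entropy of a distribution P = (p_1, ..., p_n) reads

S_q(P) = \frac{1 - \sum_{i=1}^{n} p_i^{\,q}}{q - 1} , \qquad q > 0 , \; q \neq 1 ,

and in the limit q \to 1 it reduces to the Shannon entropy, \lim_{q \to 1} S_q(P) = -\sum_i p_i \ln p_i, which makes the stated convergence of Shannon-like entropies toward the Shannon entropy explicit.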
The applications of the Shannon and Shannon-like entropies in science are truly versatile. Besides the aforementioned statistical physics, they play a fundamental role in quantum information, in communication theory, in the description of disorder, etc.