
An Informational Proof of H-Theorem

DOI: 10.4236/oalib.1103396, PP. 1-15

Subject Areas: Modern Physics

Keywords: Thermodynamic Entropy, H-Theorem, Information Entropy, Entropic Divergence

Abstract

After a historical reconstruction of Boltzmann's main ideas on statistical mechanics, a discrete version of Boltzmann's H-theorem is proved using basic concepts of information theory. Namely, the H-theorem follows from the central limit theorem, acting inside a closed physical system, and from the maximum entropy law for normal probability distributions, which is a consequence of the positivity of the Kullback-Leibler entropic divergence. Finally, the relevance of discreteness and probability for a deep comprehension of the relationship between physical and informational entropy is analyzed and discussed in the light of new perspectives emerging in computational genomics.
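The positivity of the Kullback-Leibler divergence invoked in the abstract (Gibbs' inequality) can be illustrated numerically for the discrete case. The sketch below is not from the paper; the function names are illustrative. It also checks the identity D(p‖u) = log n − H(p) for the uniform distribution u, which is the discrete form of the maximum entropy law the abstract refers to.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i, in natural units (nats)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) = sum_i p_i log(p_i / q_i)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# A non-uniform distribution p and the uniform distribution u over 4 outcomes.
p = [0.5, 0.25, 0.125, 0.125]
u = [0.25] * 4

# Gibbs' inequality: D(p || u) >= 0, with equality iff p == u.
# Since D(p || u) = log(n) - H(p), this is equivalent to H(p) <= log(n):
# over a finite support, the uniform distribution maximizes entropy.
d = kl_divergence(p, u)
```

Here `d` is strictly positive because `p` differs from `u`; the continuous analogue of this argument, with the normal distribution playing the role of the maximum-entropy distribution for fixed variance, is the one used in the paper's proof.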

Cite this paper

Manca, V. (2017). An Informational Proof of H-Theorem. Open Access Library Journal, 4, e3396. doi: http://dx.doi.org/10.4236/oalib.1103396.

References

[1]  Shannon, C.E. (1948) A Mathematical Theory of Communication. The Bell System Technical Journal, 27, 623-656. https://doi.org/10.1002/j.1538-7305.1948.tb00917.x
[2]  Wheeler, J.A. (1990) Information, Physics, Quantum: The Search for Links. In: Zurek, W.H., Ed., Complexity, Entropy, and the Physics of Information, Addison-Wesley, Redwood City, 354-368.
[3]  Brush, S.G. and Hall, N.S. (Eds.) (2003) The Kinetic Theory of Gases. An Anthology of Classical Papers with Historical Commentary. Imperial College Press, London.
[4]  Carnot, S. (1890) Reflections on the Motive Power of Heat (English translation from French edition of 1824, with introduction by Lord Kelvin). Wiley & Sons, New York.
[5]  Feynman, R.P. (1963) The Feynman Lectures on Physics. Vol. I, Part 2. Addison-Wesley Publishing Company, Inc., California Institute of Technology, Boston.
[6]  Boltzmann, L. (1872) Weitere Studien über das Wärmegleichgewicht unter Gasmolekülen. Sitzungsber. Kais. Akad. Wiss. Wien Math. Naturwiss. Classe, 66, 275-370.
[7]  Boltzmann, L. (1877) Über die Beziehung zwischen dem zweiten Hauptsatze der mechanischen Wärmetheorie und der Wahrscheinlichkeitsrechnung resp. den Sätzen über das Wärmegleichgewicht. Sitzungsber. Kais. Akad. Wiss. Wien Math. Naturwiss. Classe, 76, 373-435.
[8]  Sharp, K. and Matschinsky, F. (2015) Translation of Ludwig Boltzmann’s Paper “On the Relationship between the Second Fundamental Theorem of the Mechanical Theory of Heat and Probability Calculations Regarding the Conditions for Thermal Equilibrium’’. Entropy, 17, 1971-2009. https://doi.org/10.3390/e17041971
[9]  Wiener, N. (1948) Cybernetics or Control and Communication in the Animal and the Machine. Hermann, Paris.
[10]  Planck, M. (1972) Planck’s Original Papers in Quantum Physics. Annotated by H. Kangro; translated by ter Haar, D. and Brush, S.G. Taylor & Francis, London.
[11]  Feller, W. (1968) An Introduction to Probability Theory and Its Applications. Wiley & Sons, New York.
[12]  Brillouin, L. (1953) The Negentropy Principle of Information. Journal of Applied Physics, 24, 1152-1163. https://doi.org/10.1063/1.1721463
[13]  Cover, T.M. and Thomas, J.A. (1991) Elements of Information Theory. Wiley & Sons, New York. https://doi.org/10.1002/0471200611
[14]  Manca, V. (2013) Infobiotics: Information in Biotic Systems. Springer, Berlin. https://doi.org/10.1007/978-3-642-36223-1
[15]  Bonnici, V. and Manca, V. (2016) Informational Laws of Genome Structures. Scientific Reports, 6, Article No. 28840. http://www.nature.com/articles/srep28840 https://doi.org/10.1038/srep28840
[16]  Holzinger, A., Hortenhuber, M., Mayer, C., Bachler, M., Wassertheurer, S., Pinho, A.J. and Koslicki, D. (2014) On Entropy-Based Data Mining. In: Holzinger, A. and Jurisica, I., Eds., Knowledge Discovery and Data Mining, Springer, Berlin, 209-226. https://doi.org/10.1007/978-3-662-43968-5_12
[17]  Jaynes, E.T. (1957) Information Theory and Statistical Mechanics. The Physical Review, 106, 620-630. https://doi.org/10.1103/PhysRev.106.620
[18]  Jaynes, E.T. (1965) Gibbs vs Boltzmann Entropies. American Journal of Physics, 33, 391-398. https://doi.org/10.1119/1.1971557
[19]  Manca, V. (2016) Infogenomics: Genomes as Information Sources. In: Arabnia, H. and Tran, Q., Eds., Emerging Trends in Applications and Infrastructures for Computational Biology, Morgan Kaufmann, Elsevier, 317-324. https://doi.org/10.1016/b978-0-12-804203-8.00021-3
[20]  Brin, M. and Stuck, G. (2002) Introduction to Dynamical Systems. Cambridge University Press, Cambridge. https://doi.org/10.1017/CBO9780511755316
[21]  Srivastav, A., Ray, A. and Gupta, S. (2009) An Information-Theoretic Measure for Anomaly Detection in Complex Dynamical Systems. Mechanical Systems and Signal Processing, 23, 358-371. https://doi.org/10.1016/j.ymssp.2008.04.007
[22]  Cercignani, C. (1988) The Boltzmann Equation and Its Application. Springer, Berlin.
