Abstract:
The first part of the paper is devoted to the foundations, that is, the mathematical and physical justification, of equilibrium statistical mechanics. It is a pedagogical attempt, mostly based on Khinchin's presentation, whose purpose is to clarify some aspects of the development of statistical mechanics. In the second part, we discuss some recent developments in nonequilibrium statistical mechanics, such as the fluctuation theorem and the Jarzynski equality.

Abstract:
We construct classes of stochastic differential equations with fluctuating friction forces that generate a dynamics correctly described by Tsallis statistics and nonextensive statistical mechanics. These systems generalize the way in which ordinary Langevin equations underlie ordinary statistical mechanics to the more general nonextensive case. As a main example, we construct a dynamical model of velocity fluctuations in a turbulent flow, which generates probability densities that fit experimentally measured probability densities in Eulerian and Lagrangian turbulence very well. Our approach provides a dynamical reason why many physical systems with fluctuations in temperature or energy dissipation rate are correctly described by Tsallis statistics.
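The superstatistical mechanism described in this abstract can be illustrated with a minimal simulation (a sketch assuming a chi-squared distributed inverse temperature, not the paper's exact turbulence model): conditional on a fluctuating inverse temperature beta, the velocity is Gaussian with variance 1/beta, and marginally it acquires the power-law tails of a q-Gaussian (Student-t) distribution.

```python
import random
import math

# Sketch of superstatistics: the inverse temperature beta fluctuates
# (here chi-squared with 6 degrees of freedom, rescaled to mean 1);
# conditional on beta, the velocity v is Gaussian with variance 1/beta.
# Marginally, v then follows a q-Gaussian, i.e. a Student-t distribution
# with heavy power-law tails.
random.seed(0)
N = 200_000
samples = []
for _ in range(N):
    # Gamma(shape=3, scale=1/3) is chi^2_6 / 6, which has mean 1.
    beta = random.gammavariate(3.0, 1.0 / 3.0)
    samples.append(random.gauss(0.0, 1.0) / math.sqrt(beta))

var = sum(v * v for v in samples) / N      # expect 6/(6-2) = 1.5 for t_6
sd = math.sqrt(var)
tail = sum(1 for v in samples if abs(v) > 3 * sd) / N
# For a Gaussian the 3-sigma two-sided tail mass is about 0.0027;
# the superstatistical marginal has a substantially heavier tail.
print(var, tail)
```

The heavy tails arise purely from averaging Gaussians over a fluctuating variance, which is the dynamical mechanism the abstract refers to.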

Abstract:
We discuss the foundations of quantum mechanics (interpretations, superposition, the principle of complementarity, locality, hidden variables) and quantum information theory.

Abstract:
This is an extensive review of recent work on the foundations of statistical mechanics. Topics discussed include: interpretation of probability, typicality, recurrence, reversibility, ergodicity, mixing, coarse graining, the past hypothesis, reductionism, phase averages, the thermodynamic limit, interventionism, and entropy.

Abstract:
In a recent letter, Christian Beck described a theoretical link between a family of stochastic differential equations and the probability density functions (PDFs) derived from the formalism of nonextensive statistical mechanics. He applied the theory to explain experimentally measured PDFs from fully developed fluid turbulence. Here we present new experimental results with better statistics, which show that the linear model proposed by C. Beck does not capture the experimental observations.

Abstract:
A new formulation of statistical mechanics is put forward, according to which a random variable characterizing a macroscopic body is postulated to be infinitely divisible. This leads to a parametric representation of the partition function of an arbitrary macroscopic body, to the possibility of describing a macroscopic body under excitation by a gas of elementary quasiparticles, etc. A phase transition is defined as a state of a macroscopic body in which its random variable is stable in the sense of Lévy. From this definition, all the general properties of phase transitions follow by deduction: the existence of the renormalization semigroup, the classification of singularities of thermodynamic functions, and the universality of phase transitions and their universality classes. On this basis we have also built a two-parameter scaling theory of phase transitions, a thermodynamic function for the Ising model, etc.

Abstract:
We consider an alternative approach to the foundations of statistical mechanics, in which subjective randomness, ensemble-averaging or time-averaging are not required. Instead, the universe (i.e. the system together with a sufficiently large environment) is in a quantum pure state subject to a global constraint, and thermalisation results from entanglement between system and environment. We formulate and prove a "General Canonical Principle", which states that the system will be thermalised for almost all pure states of the universe, and provide rigorous quantitative bounds using Lévy's Lemma.
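The concentration-of-measure result invoked here, Lévy's Lemma, can be stated as follows (in one common formulation; the absolute constant $C$ varies across presentations): for a function $f$ with Lipschitz constant $\eta$ on the unit sphere in $d$ real dimensions and a uniformly random point $\varphi$,

```latex
\Pr\bigl[\,\lvert f(\varphi) - \langle f \rangle \rvert \ge \epsilon\,\bigr]
\;\le\; 2\,\exp\!\left(-\frac{C\,d\,\epsilon^{2}}{\eta^{2}}\right).
```

Applied to expectation values of observables over random pure states of the universe, this exponential concentration in the dimension $d$ is what makes "almost all" pure states give thermal-looking reduced states.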

Abstract:
Some 80-90 years ago, George A. Linhart, unlike A. Einstein, P. Debye, M. Planck and W. Nernst, managed to derive a very simple but ultimately general mathematical formula for heat capacity versus temperature from fundamental thermodynamic principles, using what we would nowadays call a "Bayesian approach to probability". Moreover, he successfully applied his result to fit the experimental data for diverse substances in their solid state over a rather broad temperature range. Nevertheless, Linhart's work was undeservedly forgotten, although it represents a valid and fresh standpoint on thermodynamics and statistical physics, which may have significant implications for academic and applied science.

Abstract:
The present paper is meant to give a simple introduction to the problem of the connection between microscopic dynamics and statistical laws. For the sake of simplicity, we mostly refer to non-dissipative dynamics, since dissipation adds technical difficulties to the conceptual issues, although part of our discussion extends beyond this limit. In particular, the relevance of chaos and ergodicity is here confronted with that of the large number of degrees of freedom. In Section 2, we review the microscopic connection, along the lines of Boltzmann's approach and its further developments. In Section 3, we discuss the falsifiability of statistical mechanics and its role as statistical inference. In particular, we argue that the maximum entropy principle is in general not a predictive tool.
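For reference, the maximum entropy principle at issue is the standard variational problem: maximizing the Gibbs-Shannon entropy subject to normalization and a fixed mean energy yields the canonical distribution,

```latex
\max_{p}\; S[p] = -\sum_i p_i \ln p_i
\quad \text{s.t.} \quad \sum_i p_i = 1, \;\; \sum_i p_i E_i = U
\;\;\Longrightarrow\;\;
p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_i e^{-\beta E_i},
```

with $\beta$ the Lagrange multiplier fixed by the energy constraint. The abstract's claim is that this inference rule, by itself, does not predict which systems actually reach such a distribution.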

Abstract:
It has been known for a long time that the fundamental approaches to equilibrium and nonequilibrium statistical mechanics available at present lead to physical and mathematical inconsistencies for dense systems. A new approach, whose foundation lies in the more powerful statistical method of counting complexions, has been formulated which not only overcomes all these difficulties but also yields satisfactory physical results for dense 'hard sphere' systems as well as for systems containing charged particles, for which a mathematically consistent theory cannot even be formulated within the available formalisms. The specific computational techniques rely on the following four recipes, which are also justified theoretically.