Abstract:
The advent of electronic computers has revolutionised the application of statistical mechanics to the liquid state. Computers have permitted, for example, the calculation of the phase diagram of water and ice, the folding of proteins, the behaviour of alkanes adsorbed in zeolites, the formation of liquid-crystal phases, and the process of nucleation. Computer simulations provide, on the one hand, new insights into the physical processes at work and, on the other, quantitative results of ever greater precision. Insights into physical processes facilitate the reductionist agenda of physics, whilst large-scale simulations bring out emergent features that are inherent (although far from obvious) in complex systems consisting of many bodies. It is safe to say that computer simulations are now an indispensable tool for both the theorist and the experimentalist, and in the future their usefulness will only increase. This chapter presents a selective review of some of the remarkable advances in condensed matter physics that could only have been achieved with the use of computers.

Abstract:
The recurrent theme of this paper is that sequences of long temporal patterns, as opposed to sequences of simple statements, are to be fed into computation devices, be they (newly proposed) models for brain activity or multi-core/many-core computers. In such models, parts of these long temporal patterns are already committed while others are predicted. This combination of matching patterns and making predictions appears to be a key element in producing intelligent processing in brain models and in obtaining efficient speculative execution on multi-core/many-core computers. A bridge between these far-apart models of computation could be provided by the appropriate design of massively parallel, interactive programming languages. Agapia is a recently proposed language of this kind, in which user-controlled long high-level temporal structures occur at the interaction interfaces of processes. In this paper Agapia is used to link HTM brain models with TRIPS multi-core/many-core architectures.

Abstract:
The modern (Copenhagen) interpretation of quantum mechanics is widely regarded as correct. However, some physicists have held the opinion that modern quantum mechanics is a phenomenological theory. The suggested theory is a new interpretation of quantum mechanics that is entirely consistent with the modern interpretation and yields a number of results which naturally explain the postulates of modern quantum mechanics.

Abstract:
The principles of constructing a mechanics of structured particles within the framework of Newton's laws are considered. An explanation of how this mechanics accounts for dissipative forces is offered. We discuss why the motion of a system is determined by two types of symmetry, the symmetry of the system and the symmetry of space, and how this leads to two corresponding types of energy and forces. How the mechanics of structured particles leads to thermodynamics, statistical physics and kinetics is explained.

Abstract:
This software package provides an R-based framework for making use of multi-core computers when running analyses in the population genetics program STRUCTURE. It is especially addressed to those users of STRUCTURE dealing with numerous and repeated data analyses, who could take advantage of an efficient script to automatically distribute STRUCTURE jobs among multiple processors. It also provides additional functions to divide analyses among combinations of populations within a single data set without the need to manually produce multiple projects, as is currently the case in STRUCTURE. The package consists of two main functions, MPI_structure() and parallel_structure(), as well as an example data file. We compared the computation time for this example data set on two computer architectures and showed that the use of the present functions can result in several-fold improvements in computation time. ParallelStructure is freely available at https://r-forge.r-project.org/projects/parallstructure/.

Abstract:
Quantum computers are important examples of processes whose evolution can be described in terms of iterations of single-step operators or their adjoints. Based on this, the Hamiltonian evolution of processes with associated step operators $T$ is investigated here. This paper is limited to processes which evolve quantum ballistically, i.e., with motion restricted to a collection of nonintersecting or distinct paths on some basis. The main goal of this paper is the proof of a theorem giving necessary and sufficient conditions that $T$ must satisfy so that there exists a Hamiltonian description of quantum ballistic evolution for the process, namely, that $T$ is a partial isometry and is orthogonality preserving and stable on some basis. Simple examples of quantum ballistic evolution for quantum Turing machines with one and with more than one type of elementary step are discussed. It is seen that for nondeterministic machines the basis set can be quite complex, with much entanglement present. It is also proved that, given a step operator $T$ for an arbitrary deterministic quantum Turing machine, it is decidable whether $T$ is stable and orthogonality preserving, and whether quantum ballistic evolution is possible. The proof fails if $T$ is a step operator for a nondeterministic machine. It is an open question whether such a decision procedure exists for nondeterministic machines. This problem does not occur in classical mechanics.

Abstract:
This paper forms part of a wider campaign: to deny pointillisme. That is, the doctrine that a physical theory's fundamental quantities are defined at points of space or of spacetime, and represent intrinsic properties of such points or of point-sized objects located there, so that the properties of spatial or spatiotemporal regions and their material contents are determined by the point-by-point facts. More specifically, this paper argues against pointillisme about the concept of velocity in classical mechanics, especially against proposals by Tooley, Robinson and Lewis. A companion paper argues against pointillisme about (chrono-)geometry, as proposed by Bricker. To avoid technicalities, I conduct the argument almost entirely in the context of ``Newtonian'' ideas about space and time, and the classical mechanics of point-particles, i.e. extensionless particles moving in a void. But both the debate and my arguments carry over to relativistic physics.

Abstract:
The theory of mesoscopic fluctuations is applied to inhomogeneous solids consisting of chaotically distributed regions with different crystalline structure. This approach makes it possible to describe the statistical properties of such a mixture by constructing a renormalized Hamiltonian. The relative volumes occupied by each of the coexisting structures define the corresponding geometric probabilities. In the case of a frozen heterophase system these probabilities must be given a priori, while in the case of a thermal heterophase mixture the structural probabilities are defined self-consistently by minimizing a thermodynamic potential. This makes it possible to find the temperature behavior of the probabilities, which is especially important near the points of structural phase transitions. The presence of these structural fluctuations yields a softening of the crystal and a decrease of the effective Debye temperature. These effects can be directly seen by nuclear gamma resonance, since the occurrence of structural fluctuations is accompanied by a noticeable sagging of the M\"ossbauer factor at the point of the structural phase transition. The structural fluctuations also lead to the attenuation of sound and an increase of the isothermal compressibility.

Abstract:
In its introductory part, the paper reviews various definitions and interpretations of structural redundancy in mechanics. The study focuses on the general structural redundancy of systems after sequences of component failures followed by possible load redistributions. The second section briefly summarizes Event Oriented System Analysis and structural redundancy in terms of the conditional probabilistic entropy. In this approach, mechanical responses to adverse loads are represented by random operational and failure events over the lifetime. The general redundancy measure in the third section of the paper employs the information entropy and goes beyond existing formulations, since it includes all functional modes in service. The paper continues with a summary of traditional redundancy indices. In addition, it proposes an alternative redundancy index that accounts for the transition to a secondary functional level in the case of failure of primary components. The example of a ship structure illustrates the use of the conditional entropy of subsystems of operational events and compares it to the traditional and newly proposed redundancy indices. Finally, the study investigates how to enhance the safety of structures by using redundancy-based design.

Abstract:
The purpose of this contribution is to provide an introduction, for a general physics audience, to the recent results of Emile Grgin that unify quantum mechanics and relativity within the same mathematical structure. This structure is the algebra of quantions, a non-division algebra that is the natural framework for electroweak theory on curved space-time. Like quaternions, quantions preserve the core features of associativity and complex conjugation, while giving up the historically entrenched but unnecessary property of division. The lack of division makes structural unification with relativity possible (one cannot upgrade the linear Minkowski space to a division algebra because of null light-cone vectors) and demands an adjustment of Born's standard interpretation of the wave function in terms of probability currents. This paper is an overview of the theory of quantions, followed by discussions.